851 results for Meaningful teachings
Abstract:
This paper addresses the issue of activity understanding from video and its semantics-rich description. A novel approach is presented in which activities are characterised and analysed at different resolutions. Semantic information is delivered according to the resolution at which the activity is observed. Furthermore, the multiresolution activity characterisation is exploited to detect abnormal activity. To achieve these system capabilities, the focus is placed on context modelling: a soft computing-based algorithm automatically determines the main activity zones of the observed scene, taking as input the trajectories of detected mobile objects. These zones are learnt at different resolutions (or granularities). In a second stage, the learned zones are used to extract people's activities by relating mobile trajectories to the learned zones. In this way, the activity of a person can be summarised as the series of zones that the person has visited. Exploiting the inherent soft relation properties, the reported activities can be labelled with meaningful semantics. Depending on the granularity at which activity zones and mobile trajectories are considered, the semantic meaning of the activity shifts from broad interpretation to detailed description. Activity information at different resolutions is also employed to perform abnormal activity detection.
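To make the zone-based description concrete, the sketch below illustrates the general idea with a plain k-means clustering rather than the paper's soft-computing algorithm; the trajectory coordinates, the number of zones and the two resolutions are invented for illustration. Zones are learned at a coarse and a fine granularity, and a trajectory is then summarised as the ordered sequence of zones it visits.

```python
# Minimal illustration (not the paper's soft-computing method): learn activity
# zones from trajectory points at two resolutions and describe a trajectory as
# the sequence of zones it visits. All coordinates and zone counts are invented.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic trajectory points (x, y) from detected mobile objects.
points = np.vstack([rng.normal(loc, 0.5, size=(200, 2))
                    for loc in [(0, 0), (5, 0), (5, 5)]])

# Coarse and fine resolutions: fewer zones give a broader semantic description.
coarse = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
fine = KMeans(n_clusters=6, n_init=10, random_state=0).fit(points)

def zone_sequence(trajectory, model):
    """Summarise a trajectory as the ordered list of distinct zones visited."""
    labels = model.predict(trajectory)
    seq = [labels[0]]
    for z in labels[1:]:
        if z != seq[-1]:
            seq.append(z)
    return seq

trajectory = np.array([[0.2, 0.1], [2.5, 0.3], [4.8, 0.2], [5.1, 2.4], [5.0, 4.9]])
print("coarse description:", zone_sequence(trajectory, coarse))
print("fine description:  ", zone_sequence(trajectory, fine))
```

With fewer clusters the same trajectory yields a short, broad description; with more clusters the description becomes more detailed, mirroring the shift in semantic granularity that the abstract describes.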
Abstract:
This research explores whether patterns of typographic differentiation influence readers’ impressions of documents. It develops a systematic approach to typographic investigation that considers relationships between different kinds of typographic attributes, rather than testing the influence of isolated variables. An exploratory study using multiple sort tasks and semantic differential scales identifies that readers form a variety of impressions in relation to how typographic elements are differentiated in document design. Building on the findings of the exploratory study and analysis of a sample of magazines, the research describes three patterns of typographic differentiation: high, moderate, and low. Each pattern comprises clusters of typographic attributes and organisational principles that are articulated in relation to a specified level of typographic differentiation (amplified, medium, or subtle). The patterns are applied to two sets of controlled test material. Using this purpose-designed material, the influence of patterns of typographic differentiation on readers’ impressions of documents is explored in a repertory grid analysis and a paired comparison procedure. The results of these studies indicate that patterns of typographic differentiation consistently shape readers’ impressions of documents, influencing judgments of credibility, document address, and intended readership, and suggesting particular kinds of engagement and genre associations. For example, high-differentiation documents are likely to be considered casual, sensationalist, and young; moderate-differentiation documents are most likely to be seen as formal and serious; and low-differentiation examples are considered calm. Typographic meaning is shown to be created through complex, yet systematic, interrelationships rather than reduced to a linear model of increasing or decreasing variation. The research provides a way of describing typographic articulation that has application across a variety of disciplines and design practice. In particular, it illuminates the ways in which typographic presentation is meaningful to readers, providing knowledge that document producers can use to communicate more effectively.
Abstract:
As satellite technology develops, satellite rainfall estimates are likely to become ever more important in the world of food security. It is therefore vital to identify the uncertainty of such estimates, and for end users to be able to use this information in a meaningful way. This paper presents new developments in the methodology of simulating satellite rainfall ensembles from thermal infrared satellite data. Although the basic sequential simulation methodology has been developed in previous studies, it was not suitable for use in regions with more complex terrain and limited calibration data. Developments in this work include the creation of a multi-threshold, multi-zone calibration procedure, plus investigations into the causes of an overestimation of low rainfall amounts and into the best way to take clustered calibration data into account. A case study of the Ethiopian highlands has been used as an illustration.
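As a schematic illustration of the idea (not the authors' sequential simulation algorithm), the sketch below applies a hypothetical per-zone, multi-threshold calibration of cold cloud duration (CCD) and draws ensemble members around the calibrated rainfall estimate; the zone names, thresholds and error parameters are all invented.

```python
# Schematic sketch only: apply a per-zone, per-threshold calibration to a CCD
# observation and draw rainfall ensemble members with calibrated random noise.
# All parameter values below are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration: for each zone, CCD thresholds (hours) mapped to a
# mean rainfall (mm/day) and an error standard deviation.
CALIBRATION = {
    "highland": [(0, 0.0, 0.0), (3, 2.0, 1.0), (6, 6.0, 2.5), (12, 15.0, 5.0)],
    "lowland":  [(0, 0.0, 0.0), (3, 1.0, 0.8), (6, 4.0, 2.0), (12, 10.0, 4.0)],
}

def rainfall_ensemble(ccd, zone, n_members=50):
    """Simulate an ensemble of rainfall estimates for one pixel."""
    thresholds = CALIBRATION[zone]
    # Pick the highest calibration threshold that the observed CCD exceeds.
    _, mean, sd = max((t for t in thresholds if ccd >= t[0]), key=lambda t: t[0])
    members = rng.normal(mean, sd, size=n_members)
    return np.clip(members, 0.0, None)   # rainfall cannot be negative

ens = rainfall_ensemble(ccd=7.5, zone="highland")
print(f"ensemble mean {ens.mean():.1f} mm/day, spread {ens.std():.1f} mm/day")
```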
Abstract:
Design patterns are a way of sharing evidence-based solutions to educational design problems. The design patterns presented in this paper were produced through a series of workshops, which aimed to identify Massive Open Online Course (MOOC) design principles from workshop participants’ experiences of designing, teaching and learning on these courses. MOOCs present a challenge for the existing pedagogy of online learning, particularly as it relates to promoting peer interaction and discussion. MOOC cohort sizes, participation patterns and diversity of learners mean that discussions can remain superficial, become difficult to navigate, or never develop beyond isolated posts. In addition, MOOC platforms may not provide sufficient tools to support moderation. This paper draws on four case studies of designing and teaching on a range of MOOCs, presenting seven design narratives relating to the experience of these MOOCs. Evidence presented in the narratives is abstracted in the form of three design patterns created through a collaborative process using techniques similar to those used in collective autoethnography. The three patterns, “Special Interest Discussions”, “Celebrity Touch” and “Look and Engage”, draw together shared lessons and present possible solutions to the problem of creating, managing and facilitating meaningful discussion in MOOCs through the careful use of staged learning activities and facilitation strategies.
Abstract:
Recent research on the Reformation has been concerned with the process by which lay people acquired a religious identity, whether it began merely as an act of political obedience or by a sudden ‘conversion’ to new doctrines. Confessional politics made it imperative for rulers to try to control the religious allegiances of their people, but the doctrine of conversion (as a spiritual change) made this theoretically impossible. Instead, a ‘culture of persuasion’ developed by which clerical and secular rulers sought to persuade their people to accept teachings authorized by the state. The possibility of religious dissent, of converting away from the state-sanctioned denomination, made conversion an issue whose importance was far greater than the actual number of converts. The study of confessionalism and conversion emphasises two theses fundamental to Reformation studies: that the era produced radical changes in the ways that people thought about their personal and communal identities, and that it made individuals’ religious choices the urgent concern of their governors.
Abstract:
We know that from mid-childhood onwards most new words are learned implicitly via reading; however, most word learning studies have taught novel items explicitly. We examined incidental word learning during reading by focusing on the well-documented finding that words which are acquired early in life are processed more quickly than those acquired later. Novel words were embedded in meaningful sentences and were presented to adult readers early (day 1) or later (day 2) during a five-day exposure phase. At test, adults read the novel words in semantically neutral sentences. Participants’ eye movements were monitored throughout exposure and test. Adults also completed a surprise memory test in which they had to match each novel word with its definition. Results showed a decrease in reading times for all novel words over exposure, and significantly longer total reading times at test for early than for late novel words. Early-presented novel words were also remembered better in the offline test. Our results show that order of presentation influences processing time early in the course of acquiring a new word, consistent with partial and incremental growth in knowledge occurring as a function of an individual’s experience with each word.
Abstract:
Ethnopharmacological relevance: Cancer patients in all cultures are high consumers of herbal medicines (HMs), usually as part of a regime consisting of several complementary and alternative medicine (CAM) modalities, but the type of patient, the reasons for choosing such HM-CAM regimes, and the benefits they perceive from taking them are poorly understood. There are also concerns that local information may be ignored due to language issues. This study investigates aspects of HM-CAM use in cancer patients using two different types of abstracting source: Medline, which contains only peer-reviewed studies from SCI journals, and, as an example of regional sources that may hold further data, the Thai national databases of HM and CAM. Materials and methods: The international and Thai-language databases were searched separately to identify relevant studies, using key words chosen to include HM use in all traditions. Analysis of these was undertaken to identify socio-demographic and clinical factors, as well as sources of information, which may inform the decision to use HMs. Results: Medline yielded 5,638 records, with 49 papers fitting the criteria for review. The Thai databases yielded 155, with none relevant for review. Factors associated with HM-CAM usage were a younger age, higher education or economic status, multiple chemotherapy treatments, and late stage of disease. The most common purposes for using HM-CAM cited by patients were to improve physical symptoms, support emotional health, stimulate the immune system, improve quality of life, and relieve side-effects of conventional treatment. Conclusions: Several indicators were identified for the cancer patients who are most likely to take HM-CAM. However, interpreting the clinical reasons why patients decide to use HM-CAM is hampered by a lack of standard terminology and thematic coding, because patients' own descriptions are too variable and overlapping for meaningful comparison. Nevertheless, fears that the results of local studies published regionally are being missed, at least in the case of Thailand, appeared to be unfounded.
Abstract:
Transformation of the south-western Australian landscape from deep-rooted woody vegetation systems to shallow-rooted annual cropping systems has resulted in the severe loss of biodiversity, and this loss has been exacerbated by rising groundwaters that have mobilised stored salts, causing extensive dryland salinity. Since the original plant communities were mostly perennial and deep-rooted, the model for sustainable agriculture and landscape water management invariably includes deep-rooted trees. Commercial forestry is, however, only economical in higher-rainfall (>700 mm yr⁻¹) areas, whereas much of the area where biodiversity is threatened has lower rainfall (300–700 mm yr⁻¹). Agroforestry may provide the opportunity to develop new agricultural landscapes that interlace ecosystem services such as carbon mitigation via carbon sequestration and biofuels, biodiversity restoration, and watershed management, while maintaining food production. Active markets are developing for some of these ecosystem services; however, a lack of predictive metrics and the regulatory environment are impeding the adoption of several of them. Nonetheless, a clear opportunity exists for four major issues – the maintenance of food and fibre production, salinisation, biodiversity decline and climate change mitigation – to be managed at a meaningful scale and for a new, sustainable agricultural landscape to be developed.
Abstract:
This case series compares patient experiences and therapeutic processes between two modalities of cognitive behaviour therapy (CBT) for depression: computerized CBT (cCBT) and therapist-delivered CBT (tCBT). In a mixed-methods repeated-measures case series, six participants were offered cCBT and tCBT in sequence, with the order of delivery randomized across participants. Questionnaires about patient experiences were administered after each session and a semi-structured interview was completed with each participant at the end of each therapy modality. Therapy expectations, patient experiences and session impact ratings in this study generally favoured tCBT. Participants typically experienced cCBT sessions as less meaningful, less positive and less helpful compared to tCBT sessions in terms of developing understanding, facilitating problem-solving and building a therapeutic relationship.
Abstract:
While many academics are sceptical about the 'impact agenda', it may offer the potential to re-value feminist and participatory approaches to the co-production of knowledge. Drawing on my experiences of developing a UK Research Excellence Framework (REF) impact case study based on research on young caregiving in the UK, Tanzania and Uganda, I explore the dilemmas and tensions of balancing an ethic of care and participatory praxis with research management demands to evidence 'impact' in the neoliberal academy. The participatory dissemination process enabled young people to identify their support needs, which translated into policy and practice recommendations and, in turn, produced 'impact'. It also revealed a paradox of action-oriented research: this approach may bring greater emotional investment of the participants in the project, in potentially negative as well as positive ways, resulting in disenchantment that the research did not lead to tangible outcomes at the local level. Participatory praxis may also pose ethical dilemmas for researchers who have responsibilities to care for both 'proximate' and 'distant' others. The 'more than research' relationship I developed with practitioners was motivated by my ethic of care rather than by the demands of the audit culture. Furthermore, my research and the impacts cited emerged slowly and incrementally from a series of small grants in an unplanned, serendipitous way and at different scales, which may be difficult to fit within institutional audits of 'impact'. Given the growing pressures on academics, it seems ever more important to embody an ethic of care in university settings, as well as in the 'field'. We need to join the call for 'slow scholarship' and advocate a re-valuing of feminist and participatory action research approaches, which may have most impact at the local level, in order to achieve meaningful shifts in the impact agenda and, more broadly, the academy.
Abstract:
Background: Although a large number of randomized controlled trials (RCTs) have examined the impact of the n-3 (ω-3) fatty acids EPA (20:5n-3) and DHA (22:6n-3) on blood pressure and vascular function, the majority have used doses of EPA+DHA of >3 g per d, which are unlikely to be achieved by diet manipulation. Objective: The objective was to examine, using a retrospective analysis from a multi-center RCT, the impact of recommended, dietary-achievable EPA+DHA intakes on systolic and diastolic blood pressure and microvascular function in UK adults. Design: Healthy men and women (n = 312) completed a double-blind, placebo-controlled RCT, consuming control oil or fish oil providing 0.7 g or 1.8 g EPA+DHA per d, in random order, each for 8 wk. Fasting blood pressure and microvascular function (using laser Doppler iontophoresis) were assessed, and plasma was collected for the quantification of markers of vascular function. Participants were retrospectively genotyped for the eNOS rs1799983 variant. Results: No impact of n-3 fatty acid treatment, or any treatment × eNOS genotype interaction, was evident in the group as a whole for any of the clinical or biochemical outcomes. Assessment of response according to hypertension status at baseline indicated a significant (P = 0.046) fish oil-induced reduction (mean 5 mmHg) in systolic blood pressure specifically in those with isolated systolic hypertension (n = 31). No dose response was observed. Conclusions: These findings indicate that, in those with isolated systolic hypertension, daily doses of EPA+DHA as low as 0.7 g bring about clinically meaningful blood pressure reductions which, at a population level, would be associated with lower cardiovascular disease risk. Confirmation of the findings in an RCT in which participants are prospectively recruited on the basis of blood pressure status is required to draw definite conclusions.
Abstract:
Demand for organic meat is partially driven by consumer perceptions that organic foods are more nutritious than non-organic foods. However, there have been no systematic reviews comparing specifically the nutrient content of organic and conventionally produced meat. In this study, we report results of a meta-analysis based on sixty-seven published studies comparing the composition of organic and non-organic meat products. For many nutritionally relevant compounds (e.g. minerals, antioxidants and most individual fatty acids (FA)), the evidence base was too weak for meaningful meta-analyses. However, significant differences in FA profiles were detected when data from all livestock species were pooled. Concentrations of SFA and MUFA were similar or slightly lower, respectively, in organic compared with conventional meat. Larger differences were detected for total PUFA and n-3 PUFA, which were an estimated 23 (95 % CI 11, 35) % and 47 (95 % CI 10, 84) % higher in organic meat, respectively. However, for these and many other composition parameters, for which meta-analyses found significant differences, heterogeneity was high, and this could be explained by differences between animal species/meat types. Evidence from controlled experimental studies indicates that the high grazing/forage-based diets prescribed under organic farming standards may be the main reason for differences in FA profiles. Further studies are required to enable meta-analyses for a wider range of parameters (e.g. antioxidant, vitamin and mineral concentrations) and to improve both precision and consistency of results for FA profiles for all species. Potential impacts of composition differences on human health are discussed.
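As a worked illustration of how such pooled percentage differences and heterogeneity statistics are typically derived (using synthetic study estimates, not the review's data), the sketch below runs a DerSimonian-Laird random-effects meta-analysis and reports a pooled difference with a 95% CI and an I² value of the kind described as "high" above.

```python
# Illustrative sketch only (synthetic numbers): DerSimonian-Laird random-effects
# meta-analysis of percentage differences between organic and conventional meat.
import numpy as np

# Hypothetical per-study estimates: percent difference and its standard error.
effects = np.array([30.0, 55.0, 10.0, 80.0, 45.0, 20.0])
se = np.array([12.0, 15.0, 9.0, 20.0, 14.0, 11.0])

w = 1.0 / se**2                                   # fixed-effect weights
fe_mean = np.sum(w * effects) / w.sum()
q = np.sum(w * (effects - fe_mean)**2)            # Cochran's Q
df = effects.size - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))  # between-study variance
w_re = 1.0 / (se**2 + tau2)                       # random-effects weights

pooled = np.sum(w_re * effects) / w_re.sum()
pooled_se = np.sqrt(1.0 / w_re.sum())
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
i2 = max(0.0, (q - df) / q) * 100                 # heterogeneity statistic

print(f"pooled difference {pooled:.0f}% (95% CI {ci[0]:.0f}, {ci[1]:.0f}), I^2 = {i2:.0f}%")
```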
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project have been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea-ice parameters. The eddy-permitting nature of the global reanalyses also makes it possible to estimate eddy kinetic energy. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as a single system with regard to its reliability in reproducing climate signals, with both variability and uncertainty assessed through the ensemble spread and the signal-to-noise ratio. The main advantage of having access to several reanalyses that differ in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of the uncertainty arising from the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that the eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
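The sketch below is a minimal illustration of the multi-system ensemble diagnostics described above, using synthetic monthly time series in place of the four MyOcean reanalyses: the ensemble mean stands for the climate signal, the inter-reanalysis spread for the uncertainty arising from the assimilation choices, and their ratio gives a signal-to-noise measure.

```python
# Minimal sketch (synthetic data): ensemble mean, spread and signal-to-noise
# ratio for a set of ocean reanalyses treated as one multi-system ensemble.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(12 * 19)             # monthly steps spanning 1993-2011
signal = 0.002 * months                 # common trend shared by all systems
# Four synthetic "reanalyses": shared signal plus system-dependent noise.
reanalyses = np.array([signal + rng.normal(0, 0.05, months.size) for _ in range(4)])

ens_mean = reanalyses.mean(axis=0)              # estimate of the climate signal
ens_spread = reanalyses.std(axis=0, ddof=1)     # spread across assimilation choices
snr = np.abs(ens_mean) / np.where(ens_spread > 0, ens_spread, np.nan)

print(f"final-month mean {ens_mean[-1]:.3f}, spread {ens_spread[-1]:.3f}, "
      f"signal-to-noise {snr[-1]:.1f}")
```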
Abstract:
In the eighteenth century the printing of Greek texts continued to be central to scholarship and discourse. The typography of Greek texts could be characterised as a continuation of French models from the sixteenth century, with a gradual dilution of the complexity of ligatures and abbreviations, mostly through printers in the Low Countries. In Britain, Greek printing was dominated by the university presses, which conservatively reproduced the continental models – exemplified by Oxford's Fell types, which were Dutch adaptations of earlier French models. Hindsight allows us to identify a meaningful development in the Greek types cut by Alexander Wilson for the Foulis Press in Glasgow, but we can argue that by the middle of the eighteenth century, when Baskerville was considering Greek printing, the typographic environment was ripe for a new style of Greek types. The opportunity to cut the types for a New Testament (in a twin edition that included a generous octavo and a large quarto version) would seem perfect for showcasing Baskerville's capacity for innovation. His Greek type maintained the cursive ductus of earlier models, but abandoned complex ligatures and any hint of scribal flourish. He homogenised the modulation of the letter strokes and the treatment of terminals, and normalised the horizontal alignments of all letters. Although the strokes of some letters are too delicate, the narrow set of the style composes a consistent, uniform texture that is a clean break from contemporaneous models. The argument is made that this is the first Greek typeface that can be described as fully typographic in the context of the technology of the time. It sets a pattern that was to be followed, without acknowledgement, by Richard Porson nearly half a century later. The typeface received little praise from typographic historians, and was condemned by Victor Scholderer in his retrospective of Greek typography. A survey of typeface reviews in the surrounding decades establishes that the commentators were mostly reproducing the views of an arbitrary typographic orthodoxy, for which only types with direct references to Renaissance models were acceptable. In these comments we detect a bias against someone considered an arriviste in the scholarly printing establishment, as well as a conservative attitude to typographic innovation.
Abstract:
This paper explores the role of voice quality in the teaching of pronunciation and argues that since voice quality encompasses so many aspects of phonology, it provides a useful point of departure for pronunciation work. A teaching technique is described, in which the concept of voice quality is used in communicative practice to give students the opportunity to identify meaningful aspects of suprasegmental pronunciation, and see how they fit into the overall pattern of connected speech.