830 results for Translating and interpreting.
Abstract:
Current feed evaluation systems for dairy cattle aim to match nutrient requirements with nutrient intake at pre-defined production levels. These systems were not developed to address, and are not suitable to predict, the responses to dietary changes in terms of production level and product composition, excretion of nutrients to the environment, and nutrition-related disorders. The change from a requirement system to a response system, to meet the needs of various stakeholders, requires prediction of the profile of absorbed nutrients and its subsequent utilisation for various purposes. This contribution examines the challenges to predicting the profile of nutrients available for absorption in dairy cattle and provides guidelines for further improving prediction with regard to animal production responses and environmental pollution. The profile of nutrients available for absorption comprises volatile fatty acids, long-chain fatty acids, amino acids and glucose; the importance of processes in the reticulo-rumen is therefore obvious. Much research into rumen fermentation is aimed at determining substrate degradation rates. Quantitative knowledge on rates of passage of nutrients out of the rumen is rather limited compared with that on degradation rates, and should therefore be an important theme in future research. Current systems largely ignore microbial metabolic variation, and extant mechanistic models of rumen fermentation give only limited attention to explicit representation of microbial metabolic activity. Recent molecular techniques indicate that knowledge on the presence and activity of various microbial species is far from complete. Such techniques may yield a wealth of information, but including such findings in systems that predict the nutrient profile requires close collaboration between molecular scientists and mathematical modellers on interpreting and evaluating quantitative data. Protozoal metabolism is of particular interest here given the paucity of quantitative data. Empirical models lack the biological basis necessary to evaluate mitigation strategies to reduce excretion of waste, including nitrogen, phosphorus and methane, and may have little predictive value when comparing various feeding strategies. Examples include the Intergovernmental Panel on Climate Change (IPCC) Tier II models used to quantify methane emissions and current protein evaluation systems used to evaluate low-protein diets intended to reduce nitrogen losses to the environment. Nutrient-based mechanistic models can address such issues. Since environmental issues generally attract more funding from governmental offices, further development of nutrient-based models may well take place within an environmental framework.
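The abstract's criticism of empirical approaches can be made concrete with the IPCC Tier 2 form it cites: an annual emission factor computed as gross energy intake times a fixed methane conversion factor (Ym), divided by the energy content of methane. The sketch below uses illustrative values (300 MJ/day intake, the commonly cited 6.5% default Ym); the point is that a single fixed Ym cannot respond to diet composition, which is exactly why such models struggle to compare feeding strategies.

```python
# Hedged sketch of the IPCC Tier 2 enteric-methane form cited above. The
# emission factor scales gross energy intake by a single fixed methane
# conversion factor (Ym), so two diets with equal energy but different
# fermentation patterns yield identical predictions, which is the limitation
# the abstract raises. Intake and Ym values below are illustrative assumptions.

ENERGY_CONTENT_CH4 = 55.65  # MJ per kg CH4 (IPCC 2006 constant)

def tier2_methane_emission(gross_energy_mj_per_day: float,
                           ym_percent: float = 6.5) -> float:
    """Annual enteric methane emission factor (kg CH4 per head per year)."""
    daily_ch4_mj = gross_energy_mj_per_day * ym_percent / 100.0
    return daily_ch4_mj * 365.0 / ENERGY_CONTENT_CH4

# A high-yielding dairy cow consuming roughly 300 MJ/day at the default Ym:
print(f"{tier2_methane_emission(300.0):.0f} kg CH4/head/year")  # ~128
```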
Abstract:
Despite the acknowledged benefits of reducing saturated fatty acid (SFA) intake, few countries within the EU meet recognised targets. Milk and dairy products represent the single largest source of dietary SFA in most countries, yet epidemiological evidence indicates that milk has cardioprotective properties, so simply reducing consumption of dairy foods to meet SFA targets may not be a sound public health approach. The present paper explores the options for replacing some of the SFA in milk fat with cis-MUFA through alteration of the diet of the dairy cow, and the evidence that such changes can improve indicators of CHD and CVD in general for the consumer. In addition, the outcome of such changes on risk factors for CHD and CVD at the population level is examined in the light of a modelling exercise involving data for eleven EU member states. Given the current and projected costs of health care, the results indicate that urgent consideration should be given to such a strategy.
Abstract:
Research on social communication skills in individuals with Williams syndrome (WS) has been inconclusive, with some arguing that these skills are a relative strength and others that they are a weakness. The aim of the present study was to investigate social interaction abilities in a group of children with WS and to compare them with a group of children with specific language impairment (SLI) and a group of typically developing children. Semi-structured conversations were conducted, and 100-150 utterances were selected for analysis in terms of exchange structure, turn taking, information transfer and conversational inadequacy. The statistical analyses showed that the children with WS had difficulties with exchange structure and with responding appropriately to the interlocutor's requests for information and clarification. They also had significant difficulties with interpreting meaning and providing enough information for the conversational partner. Despite language abilities similar to those of the children with SLI, the children with WS had different social interaction skills, which suggests that they follow an atypical developmental trajectory and that their neurolinguistic profile does not directly support innate modularity.
Abstract:
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for the British Journal of Mathematical and Statistical Psychology and as a participant in and organizer of the British Psychological Society's Mathematics, Statistics and Computing Section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information is relevant in interpreting the results of significance tests led him to be a persuasive advocate of the 'Weak Fisherian' form of hypothesis testing.
Abstract:
This paper considers problems researchers might face when interpreting the results of studies that employ variants of the preference procedure. Infants tend to shift their preference from familiar to novel stimuli with increasing exposure to the familiar stimulus, a behaviour exploited by the habituation paradigm. This change in attentional preference with exposure leads us to suggest that researchers interested in infants' pre-experimental or spontaneous preferences should beware of the potentially confounding effects of exposing infants to familiarization trials before employing the preference procedure. The notion that infant attentional preference is dynamic also calls into question the use of the direction of post-familiarization preference per se when interpreting the knowledge or strategies available to infants. We examine the results of a cross-modal word-learning study to show how the interpretation of results may be difficult when infants exhibit a significant preference in an unexpected direction. As a possible solution to this problem, we propose that significant preferences in both directions should be sought at multiple intervals over time.
Abstract:
This paper considers how environmental threat may contribute to the child's use of avoidant strategies to regulate negative emotions, and how this may interact with high emotional reactivity to create vulnerability to conduct disorder symptoms. We report a study based on the hypothesis that interpreting others' behaviours in terms of their motives and emotions (using the intentional stance) promotes effective social action but may lead to fear in threatening situations, and that inhibiting the intentional stance may reduce fear but promote conduct disorder symptoms. We assessed 5-year-olds' use of the intentional stance with an intentionality scale, contrasting high- and low-threat doll-play scenarios. In a sample of 47 children of mothers with post-natal depression (PND) and 35 controls, children rated as securely attached to their mothers at the age of 18 months were better able to preserve the intentional stance than insecure children in high-threat scenarios, but not in low-threat scenarios. Girls had higher intentionality scores than boys across all scenarios. Only intentionality in the high-threat scenario was associated with teacher-rated conduct disorder symptoms, and only in the children of women with PND. Intentionality mediated the associations of attachment security and gender with conduct disorder symptoms in the PND group.
Abstract:
Objectives: To examine doctors' (Experiment 1) and doctors' and lay people's (Experiment 2) interpretations of two sets of recommended verbal labels for conveying information about side-effect incidence rates. Method: Both studies used a controlled empirical methodology in which participants were presented with a hypothetical but realistic scenario involving a prescribed medication that was said to be associated with either mild or severe side effects. The probability of each side effect was described using one of the five descriptors advocated by the European Union (Experiment 1) or one of the six descriptors advocated in Calman's risk scale (Experiment 2), and study participants were required to estimate (numerically) the probability of each side effect occurring. Key findings: Experiment 1 showed that the doctors significantly overestimated the risk of side effects occurring when interpreting the five EU descriptors, compared with the assigned probability ranges. Experiment 2 showed that both groups significantly overestimated risk when given the six Calman descriptors, although the degree of overestimation was not as great for the doctors as for the lay people. Conclusion: On the basis of our findings, we argue that we are still a long way from achieving a standardised language of risk for use by both professionals and the general public, although there may be more potential for standardised terms among professionals. In the meantime, the EU, other regulatory bodies and health professionals should be very cautious about advocating particular verbal labels for describing medication side effects.
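For readers unfamiliar with the EU scheme the study tested, the five descriptors carry assigned frequency bands, from very common (more than 1 in 10) down to very rare (fewer than 1 in 10,000). The sketch below encodes those bands together with a simple overestimation measure of the kind the experiments rely on; the band boundaries are the standard EU values, while the scoring helper is an illustrative assumption, not the authors' analysis.

```python
# The five EU verbal descriptors and their assigned frequency bands, as used
# in Experiment 1 (band boundaries are the standard EU values). The scoring
# helper below is an illustrative assumption, not the authors' analysis.

EU_DESCRIPTOR_BANDS = {
    "very common": (0.10, 1.00),     # more than 1 in 10
    "common":      (0.01, 0.10),     # 1 in 100 to 1 in 10
    "uncommon":    (0.001, 0.01),    # 1 in 1,000 to 1 in 100
    "rare":        (0.0001, 0.001),  # 1 in 10,000 to 1 in 1,000
    "very rare":   (0.0, 0.0001),    # fewer than 1 in 10,000
}

def overestimation(descriptor: str, estimated_probability: float) -> float:
    """Amount by which a numeric estimate exceeds the band's upper bound
    (0.0 if the estimate falls within or below the band)."""
    _, upper = EU_DESCRIPTOR_BANDS[descriptor]
    return max(0.0, estimated_probability - upper)

# A participant who reads "common" as a 25% chance overestimates the band's
# upper bound by roughly 15 percentage points:
print(overestimation("common", 0.25))  # ~0.15
```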
Abstract:
There are still major challenges in the area of automatic indexing and retrieval of digital data. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. Research has been ongoing for some years in the field of ontological engineering, with the aim of using ontologies to add knowledge to information. In this paper we describe the architecture of a system designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval.
Abstract:
Automatic indexing and retrieval of digital data pose major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering, with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM demonstrator has been evaluated as deployed in the film post-production phase, supporting the storage, indexing and retrieval of large data sets of special-effects video clips as an exemplar application domain. This paper presents its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes.
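The two abstracts above describe the DREAM architecture only at a high level. The toy sketch below illustrates the core idea of ontology-based indexing (indexing clips under semantic concepts and expanding a query along the concept hierarchy), not the DREAM implementation itself; the concept hierarchy, function names and clip identifiers are invented for illustration.

```python
# Minimal, hypothetical sketch of ontology-driven indexing and retrieval in
# the spirit described for DREAM: clips are indexed under semantic concepts,
# and a query concept is expanded to everything it subsumes before lookup.
# The tiny special-effects ontology below is an assumed example; the actual
# DREAM system uses a network of scalable ontologies, not this toy.

from collections import defaultdict

# child -> parent edges of a tiny illustrative concept hierarchy
PARENT = {
    "fire": "pyrotechnics",
    "explosion": "pyrotechnics",
    "pyrotechnics": "special_effect",
    "rain": "weather_effect",
    "weather_effect": "special_effect",
}

def descendants(concept: str) -> set[str]:
    """All concepts subsumed by `concept`, including itself."""
    result = {concept}
    changed = True
    while changed:
        changed = False
        for child, parent in PARENT.items():
            if parent in result and child not in result:
                result.add(child)
                changed = True
    return result

index: dict[str, set[str]] = defaultdict(set)  # concept -> clip ids

def index_clip(clip_id: str, concepts: list[str]) -> None:
    for concept in concepts:
        index[concept].add(clip_id)

def retrieve(query_concept: str) -> set[str]:
    """Semantic retrieval: match the query concept or anything it subsumes."""
    clips: set[str] = set()
    for concept in descendants(query_concept):
        clips |= index[concept]
    return clips

index_clip("clip_001", ["explosion"])
index_clip("clip_002", ["rain"])
print(retrieve("pyrotechnics"))  # {'clip_001'} via subsumption, not keywords
```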
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm, operating on observed finite data sets and based on a Takagi-Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-based matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix consisting of the corresponding fuzzy membership function values over the training data set. Model transparency is explored by deriving an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average sensitivity of the model output to the fuzzy rule, so that rule bases can be effectively ranked by their identifiability via the A-optimality criterion. The A-optimality criterion applied to the weighting matrices of the fuzzy rules is used to construct an initial model rule base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule bases via an orthogonal subspace decomposition approach, enhancing model transparency with the capability of interpreting the derived rule-base energy levels. The new approach is also computationally simpler than the conventional Gram-Schmidt algorithm for high-dimensional regression problems, for which it is desirable to decompose a complex model into a few submodels rather than fit a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
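As a rough illustration of the rule-identifiability idea described above: if each rule's subspace is the input regression matrix weighted by that rule's membership values, then the A-optimality criterion trace((A_k^T A_k)^-1) scores how well the rule's local parameters can be identified, with smaller values better. The sketch below assumes a diagonal membership weighting, which is one common reading of the construction; the paper's exact matrix conventions may differ.

```python
# Hedged sketch of an A-optimality identifiability score for one fuzzy rule,
# assuming the rule's subspace matrix is A_k = diag(memberships) @ X. A rule
# that fires weakly over the data has a near-singular information matrix and
# therefore a large (poor) A-optimality score.

import numpy as np

def rule_a_optimality(X: np.ndarray, memberships: np.ndarray) -> float:
    """A-optimality score for one fuzzy rule (smaller is better).

    X:           (N, p) input regression matrix over the training set.
    memberships: (N,) firing strengths of the rule for each training sample.
    """
    A = memberships[:, None] * X   # membership-weighted regression matrix
    info = A.T @ A                 # rule information matrix
    return float(np.trace(np.linalg.inv(info)))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
mu_active = np.clip(rng.random(200), 0.5, 1.0)  # rule firing strongly
mu_weak = rng.random(200) * 0.05                # rule barely firing
print(rule_a_optimality(X, mu_active) < rule_a_optimality(X, mu_weak))  # True
```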
Abstract:
A new robust neurofuzzy model construction algorithm is introduced for the modeling of a priori unknown dynamical systems from observed finite data sets, in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. To achieve maximal model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method is introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, enhancing model transparency with the capability of interpreting the derived rule-base energy levels. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, is extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
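A minimal sketch of the combined selection cost this abstract implies: a locally regularized least-squares error for a candidate rule subspace plus a weighted D-optimality term (negative log-determinant of the subspace information matrix) that penalizes poorly identifiable subspaces. The single weighting beta is what makes the procedure automatic; the names lam and beta and the exact cost form are illustrative assumptions and may differ in detail from the paper.

```python
# Illustrative combined selection cost for a candidate rule subspace:
# regularized fit error plus a weighted D-optimality penalty. Lower is better.

import numpy as np

def rule_selection_cost(A: np.ndarray, y: np.ndarray,
                        lam: float = 1e-3, beta: float = 1e-2) -> float:
    """Cost = regularized least-squares error - beta * log det(A^T A).

    A:    (N, p) orthogonalized subspace matrix for the candidate rule.
    y:    (N,) training targets.
    lam:  local regularization parameter.
    beta: D-optimality weighting (trades fit against identifiability).
    """
    info = A.T @ A
    theta = np.linalg.solve(info + lam * np.eye(A.shape[1]), A.T @ y)
    residual = y - A @ theta
    fit = residual @ residual + lam * (theta @ theta)
    sign, logdet = np.linalg.slogdet(info)
    d_penalty = -logdet if sign > 0 else np.inf  # ill-conditioned -> rejected
    return float(fit + beta * d_penalty)

rng = np.random.default_rng(1)
A = rng.normal(size=(100, 4))
y = A @ np.array([1.0, -0.5, 0.2, 0.0]) + 0.1 * rng.normal(size=100)
print(rule_selection_cost(A, y))
```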
Abstract:
Can human social cognitive processes and social motives be grasped by the methods of experimental economics? Experimental studies of strategic cognition and social preferences contribute to our understanding of the social aspects of economic decision making. Yet the papers in this issue argue that the social aspects of decision making introduce several difficulties for interpreting the results of economic experiments. In particular, the laboratory is itself a social context, and in many respects a rather distinctive one, which raises questions of external validity.
Abstract:
The winter climate of Europe and the Mediterranean is dominated by the weather systems of the mid-latitude storm tracks. The behaviour of the storm tracks is highly variable, particularly in the eastern North Atlantic, and has a profound impact on the hydroclimate of the Mediterranean region. A deeper understanding of the storm tracks and the factors that drive them is therefore crucial for interpreting past changes in Mediterranean climate and the civilizations it has supported over the last 12 000 years (broadly the Holocene period). This paper discusses how changes in climate forcing (e.g. orbital variations, greenhouse gases, ice-sheet cover) may have affected the ‘basic ingredients’ controlling the mid-latitude storm tracks over the North Atlantic and the Mediterranean on intermillennial time scales. Idealized simulations using the HadAM3 atmospheric general circulation model (GCM) are used to explore the basic processes, while a series of time-slice simulations from a similar atmospheric GCM coupled to a thermodynamic slab ocean (HadSM3) are examined to identify the impact these drivers have had on the storm track during the Holocene. The results suggest that the North Atlantic storm track has moved northward and strengthened since the Early to Mid-Holocene. In contrast, the Mediterranean storm track may have weakened over the same period. It is emphasized, however, that much remains to be understood about the evolution of the North Atlantic and Mediterranean storm tracks during the Holocene.