973 results for THEORETICAL PREDICTION


Relevance: 30.00%

Abstract:

Background: The increasing prevalence of bovine tuberculosis (bTB) in the UK and the limitations of the currently available diagnostic and control methods require the development of complementary approaches to assist in the sustainable control of the disease. One potential approach is the identification of animals that are genetically more resistant to bTB, to enable breeding of animals with enhanced resistance. This paper focuses on prediction of resistance to bTB. We explore estimation of direct genomic estimated breeding values (DGVs) for bTB resistance in UK dairy cattle, using dense SNP chip data, and test these genomic predictions for situations when disease phenotypes are not available on selection candidates. Methodology/Principal Findings: We estimated DGVs using genomic best linear unbiased prediction (GBLUP) methodology, and assessed their predictive accuracies with a cross-validation procedure and receiver operating characteristic (ROC) curves. Furthermore, these results were compared with theoretical expectations for prediction accuracy and area under the ROC curve (AUC). The dataset comprised 1151 Holstein-Friesian cows (bTB cases or controls). All individuals (592 cases and 559 controls) were genotyped for 727,252 loci (Illumina BeadChip). The estimated observed heritability of bTB resistance was 0.23±0.06 (0.34 on the liability scale), and five-fold cross-validation, replicated six times, provided a prediction accuracy of 0.33 (95% C.I.: 0.26, 0.40). ROC curves, and the resulting AUC, gave a probability of 0.58, averaged across six replicates, of correctly classifying cows as diseased or healthy from SNP chip genotypes alone. Conclusions/Significance: These results provide a first step in the investigation of the potential feasibility of genomic selection for bTB resistance using SNP data. Specifically, they demonstrate that genomic selection is possible, even in populations with no pedigree data and on animals lacking bTB phenotypes. However, a larger training population will be required to improve prediction accuracies. © 2014 Tsairidou et al.
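
The pipeline described above (GBLUP on dense SNP genotypes, five-fold cross-validation, ROC/AUC) can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: the genotype matrix is synthetic, the SNP count is reduced, and a ridge penalty stands in for the GBLUP variance-ratio shrinkage.

```python
# Minimal GBLUP-style sketch with synthetic data (illustrative only):
# ridge regression on standardized SNP genotypes is equivalent to GBLUP,
# and held-out predictions are scored with ROC AUC as in the abstract.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(0)
n, m = 1151, 5000                          # 1151 cows; SNP count reduced for speed
X = rng.binomial(2, 0.3, size=(n, m)).astype(float)  # 0/1/2 genotype codes
X = (X - X.mean(axis=0)) / X.std(axis=0)             # standardize each locus
y = rng.integers(0, 2, size=n)             # synthetic case/control status

aucs = []
for train, test in StratifiedKFold(n_splits=5, shuffle=True, random_state=1).split(X, y):
    model = Ridge(alpha=m)                 # penalty of order m mimics GBLUP shrinkage
    model.fit(X[train], y[train])
    dgv = model.predict(X[test])           # direct genomic values for held-out animals
    aucs.append(roc_auc_score(y[test], dgv))

print(f"mean cross-validated AUC: {np.mean(aucs):.2f}")  # ~0.5 on random genotypes
```

With real genotypes and phenotypes the same loop would produce the kind of AUC (0.58) and accuracy (0.33) reported above; on the random data generated here it hovers around 0.5 by construction.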

Relevance: 30.00%

Abstract:

Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest, and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems mean that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model, and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) the system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.

Relevance: 30.00%

Abstract:

Despite the importance of larval abundance in determining the recruitment of benthic marine invertebrates and as a major factor in marine benthic community structure, relating planktonic larval abundance to post-settlement post-larvae and juveniles in the benthos remains difficult. It is hampered by several methodological difficulties, including sampling frequency, the ability to follow larval and post-larval or juvenile cohorts, and the ability to calculate growth and mortality rates. In our work, an intensive sampling strategy was used: larvae in the plankton were collected at weekly intervals, while post-larvae that settled into collectors were analysed fortnightly. Planktonic larval and benthic post-larval/juvenile cohorts were determined, and growth and mortality rates calculated. Integration of all equations allowed the development of a theoretical formulation that, based on larval abundance and planktonic larval duration, permits estimation of the future abundance of post-larvae/juveniles during the first year of benthic life. The model can be applied to any sample in which only larval length has been measured.
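
The abstract does not reproduce the formulation itself; the skeleton below is only a generic exponential-mortality construction of the same shape, with all symbols assumed rather than taken from the paper.

```latex
% Generic exponential-mortality skeleton (NOT the authors' fitted equations):
% N_L = planktonic larval abundance,  Z_L = planktonic mortality rate,
% T   = planktonic larval duration,   Z_B = benthic mortality rate.
% Larvae surviving the planktonic phase settle as
N_0 = N_L \, e^{-Z_L T},
% and post-larval/juvenile abundance then declines through the first benthic year:
N_J(t) = N_0 \, e^{-Z_B t} = N_L \, e^{-Z_L T - Z_B t}.
```

The paper's contribution is to estimate the cohort-specific rates in such a construction from the weekly plankton and fortnightly collector samples, so that only larval abundance and duration (obtained via larval length) are needed for prediction.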

Relevance: 30.00%

Abstract:

An emerging consensus in cognitive science views the biological brain as a hierarchically-organized predictive processing system, one in which higher-order regions continuously attempt to predict the activity of lower-order regions at a variety of (increasingly abstract) spatial and temporal scales. The brain is thus revealed as a hierarchical prediction machine, constantly engaged in the effort to predict the flow of information originating from the sensory surfaces. Such a view seems to afford a great deal of explanatory leverage when it comes to a broad swathe of seemingly disparate psychological phenomena (e.g., learning, memory, perception, action, emotion, planning, reason, imagination, and conscious experience). In the most positive case, the predictive processing story seems to provide our first glimpse at what a unified (computationally tractable and neurobiologically plausible) account of human psychology might look like. This obviously marks out one reason why such models should be the focus of current empirical and theoretical attention. Another reason, however, is rooted in the potential of such models to advance the current state-of-the-art in machine intelligence and machine learning. Interestingly, the vision of the brain as a hierarchical prediction machine is one that establishes contact with work that goes under the heading of 'deep learning'. Deep learning systems often attempt to make use of predictive processing schemes and (increasingly abstract) generative models as a means of supporting the analysis of large data sets. But are such computational systems sufficient (by themselves) to provide a route to general human-level analytic capabilities? I will argue that they are not, and that closer attention to a broader range of forces and factors (many of which are not confined to the neural realm) may be required to understand what it is that gives human cognition its distinctive (and largely unique) flavour. The vision that emerges is one of 'homomimetic deep learning systems': systems that situate a hierarchically-organized predictive processing core within a larger nexus of developmental, behavioural, symbolic, technological and social influences. Relative to that vision, I suggest that we should see the Web as a form of 'cognitive ecology', one that is as much involved with the transformation of machine intelligence as it is with the progressive reshaping of our own cognitive capabilities.
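
The core mechanism invoked here, higher levels predicting lower-level activity and updating on the residual error, is easy to state concretely. The toy loop below illustrates the general predictive-coding scheme only; it is not a model from the abstract, and the dimensions and learning rate are arbitrary.

```python
# Toy two-level predictive-coding loop (generic illustration, not a specific
# model): a higher level holds a latent estimate r, predicts lower-level
# activity x through generative weights W, and updates r on the error e = x - W r.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 4))                  # generative weights: latent (4) -> sensory (16)
true_r = np.array([1.0, -0.5, 0.3, 0.0])
x = W @ true_r + 0.05 * rng.normal(size=16)   # noisy "sensory" input
r = np.zeros(4)                               # higher-level estimate, initially empty

for _ in range(2000):
    e = x - W @ r                             # prediction error at the lower level
    r += 0.01 * (W.T @ e)                     # error-driven update of the higher level

print("recovered latent:", np.round(r, 2))    # close to true_r
```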

Relevance: 30.00%

Abstract:

A simple theoretical model for the intensification of tropical cyclones and polar lows is developed using a minimal set of physical assumptions. These disturbances are assumed to be balanced systems intensifying through the WISHE (Wind-Induced Surface Heat Exchange) intensification mechanism, driven by surface fluxes of heat and moisture into an atmosphere which is neutral to moist convection. The equation set is linearized about a resting basic state and solved as an initial-value problem. A system is predicted to intensify with an exponential perturbation growth rate scaled by the radial gradient of an efficiency parameter which crudely represents the effects of unsaturated processes. The form of this efficiency parameter is assumed to be defined by initial conditions, dependent on the nature of a pre-existing vortex required to precondition the atmosphere to a state in which the vortex can intensify. Evaluation of the simple model using a primitive-equation, nonlinear numerical model provides support for the prediction of exponential perturbation growth. Good agreement is found between the simple and numerical models for the sensitivities of the measured growth rate to various parameters, including surface roughness, the rate of transfer of heat and moisture from the ocean surface, and the scale for the growing vortex.
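
The stated result has a simple schematic form; the notation below is assumed for illustration and is not the paper's.

```latex
% Schematic of the WISHE result: a balanced perturbation amplitude v' grows
% exponentially from its initial value,
v'(t) = v'(0) \, e^{\sigma t},
% with a growth rate scaled by the radial gradient of the efficiency
% parameter \epsilon(r) set by the pre-existing vortex:
\sigma \propto \frac{\partial \epsilon}{\partial r}.
```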

Relevance: 30.00%

Abstract:

Molecular dynamics simulations of the photodissociated state of carbonmonoxy myoglobin (MbCO) are presented using a fluctuating charge model for CO. A new three-point charge model is fitted to high-level ab initio calculations of the dipole and quadrupole moment functions taken from the literature. The infrared spectrum of the CO molecule in the heme pocket is calculated using the dipole moment time autocorrelation function and shows good agreement with experiment. In particular, the new model reproduces the experimentally observed splitting of the CO absorption spectrum. The splitting of 3–7 cm−1 (compared to the experimental value of 10 cm−1) can be directly attributed to the two possible orientations of CO within the docking site at the edge of the distal heme pocket (the B states), as previously suggested on the basis of experimental femtosecond time-resolved infrared studies. Further information on the time evolution of the position and orientation of the CO molecule is obtained and analyzed. The calculated difference in the free energy between the two possible orientations (Fe···CO and Fe···OC) is 0.3 kcal mol−1 and agrees well with the experimentally estimated value of 0.29 kcal mol−1. A comparison of the new fluctuating charge model with an established fixed charge model reveals some differences that may be critical for the correct prediction of the infrared spectrum and energy barriers. The photodissociation of CO from the myoglobin mutant L29F using the new model shows rapid escape of CO from the distal heme pocket, in good agreement with recent experimental data. The effect of the protein environment on the multipole moments of the CO ligand is investigated and taken into account in a refined model. Molecular dynamics simulations with this refined model are in agreement with the calculations based on the gas-phase model. However, it is demonstrated that even small changes in the electrostatics of CO alter the details of the dynamics.
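
The dipole-autocorrelation route to the IR spectrum used above is a standard recipe: Fourier-transform the autocorrelation of the dipole time series and read line positions in wavenumbers. The sketch below uses a synthetic one-component dipole trace (a ~2100 cm⁻¹ oscillation, roughly the CO stretch region) and an assumed 1 fs sampling interval; it shows the generic method, not the study's analysis code.

```python
# IR line shape from the dipole autocorrelation function (generic recipe,
# synthetic data). A real calculation would use the 3-component dipole from
# the MD trajectory; one component is kept here for brevity.
import numpy as np

dt_fs = 1.0                                   # assumed sampling interval (fs)
c_cm_per_fs = 2.99792458e-5                   # speed of light in cm/fs
t = np.arange(8192) * dt_fs
rng = np.random.default_rng(0)
mu = np.cos(2 * np.pi * c_cm_per_fs * 2100.0 * t) + 0.1 * rng.normal(size=t.size)

acf = np.correlate(mu, mu, mode="full")[mu.size - 1:]   # <mu(0) mu(t)>, lags >= 0
acf *= np.hanning(2 * acf.size)[acf.size:]              # taper to reduce leakage
spectrum = np.abs(np.fft.rfft(acf))                     # line shape ~ FT of the ACF
wavenumber = np.fft.rfftfreq(acf.size, d=dt_fs) / c_cm_per_fs

print(f"peak near {wavenumber[spectrum.argmax()]:.0f} cm^-1")  # ~2100
```

Resolving a 10 cm⁻¹ splitting like the one discussed above requires a trajectory long enough that 1/(cT) falls well below 10 cm⁻¹, i.e. several picoseconds of data.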

Relevance: 30.00%

Abstract:

In this contribution we aim to anchor Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level (i.e., intra-individual) agents, to examine whether they generate better predictions of behavioral intentions and of the behavior itself than standard statistical approaches. Moreover, this contribution tests to what extent the predictive validity of models of attitude, such as the Theory of Planned Behavior (TPB) or the Model of Goal-directed Behavior (MGB), depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run with different degrees of deviation from rationality of the agents, using a trembling-hand method. Two data sets were used, concerning the consumption of soft drinks and physical activity, respectively. Three key findings emerged from the simulations. First, compared to the standard statistical approach, agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, introducing varying degrees of deviation from rationality in agents' behavior can improve the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective for evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as interactions between individuals.
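
The trembling-hand manipulation mentioned above has a very compact core, sketched below with invented parameters (the agent count, tremble probability eps, and intention-to-action rule are illustrative assumptions, not the authors' specification).

```python
# Trembling-hand sketch: each agent normally acts on its intention, but with
# probability eps it "trembles" and acts at random instead. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
intention = rng.uniform(0, 1, n)          # intention strength (e.g., from TPB items)
eps = 0.2                                 # degree of deviation from rationality

rational = rng.uniform(0, 1, n) < intention          # intention-driven action
tremble = rng.uniform(0, 1, n) < eps                 # which agents tremble
random_act = rng.integers(0, 2, n).astype(bool)      # trembling agents act at random
behavior = np.where(tremble, random_act, rational)

print(f"share performing the behavior: {behavior.mean():.2f}")
```

Sweeping eps and comparing the simulated behavior against observed behavior is then what lets one ask, as the abstract does, how much deviation from pure rationality best fits the data.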

Relevance: 30.00%

Abstract:

Predictability is considered in the context of the seamless weather-climate prediction problem, and the notion is developed that there can be predictive power on all time-scales. On every scale there are phenomena occurring on that scale, as well as longer time-scale processes and external conditions, that should combine to give some predictability. To what extent this theoretical predictability may actually be realised and, further, to what extent it may be useful is not yet clear. However, the potential should provide a stimulus to, and a high profile for, our science and its application for many years.

Relevance: 30.00%

Abstract:

One of the fundamental questions in dynamical meteorology, and one of the basic objectives of GARP, is to determine the predictability of the atmosphere. In the early planning and preparation for GARP, a number of theoretical and numerical studies were undertaken, indicating an inherent unpredictability in the atmosphere which, even with the most ideal observing system, would limit useful weather forecasting to 2–3 weeks.

Relevance: 30.00%

Abstract:

In the present work, a new approach for the determination of the partition coefficient at different interfaces, based on density functional theory, is proposed. Our results for log P(ow) at the n-octanol/water interface in a large supercell, -0.30 for acetone and 0.95 for methane, are comparable with the experimental values (-0.24 and 0.78, respectively). We believe that the differences are mainly related to the absence of van der Waals interactions and to the limited number of molecules in the supercell. The numerical deviations are smaller than those observed for interpolation-based tools. As the proposed model is parameter-free, it is not limited to the n-octanol/water interface.
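
The thermodynamic identity underlying any such first-principles log P estimate is standard (this is the general relation, not the paper's specific DFT protocol):

```latex
% Partition coefficient from solvation (transfer) free energies:
\log P_{ow} \;=\; \frac{\Delta G_{solv}^{\,water} - \Delta G_{solv}^{\,octanol}}{2.303\, RT}
% A solute solvated more favourably in n-octanol than in water has
% \log P_{ow} > 0 (methane above); the converse gives \log P_{ow} < 0 (acetone).
```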

Relevance: 30.00%

Abstract:

Chagas disease, caused by the parasite Trypanosoma cruzi, is nowadays one of the most serious parasitic health problems. The great number of deaths and the insufficient effectiveness of drugs against this parasite have alarmed the scientific community worldwide. In an attempt to overcome this problem, a model for the design and prediction of new antitrypanosomal agents was obtained. It used a mixed approach, containing simple fragment-based descriptors and topological substructural molecular design (TOPS-MODE) descriptors. A data set was assembled of 188 compounds, 99 of them with characterized antitrypanosomal activity and 88 belonging to other pharmaceutical categories. The model showed sensitivity, specificity and accuracy values above 85%. Quantitative fragmental contributions were also calculated. Then, to confirm the quality of the model, 15 molecules tested as antitrypanosomal compounds (not included in this study) were predicted, taking into account the information on the calculated fragmental contributions. The model showed an accuracy of 100%, which indicates that the "in silico" methodology developed by our team is promising for the rational design of new antitrypanosomal drugs. © 2009 Wiley Periodicals, Inc. J Comput Chem 31: 882-894, 2010.
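
The three reported indices have the usual confusion-matrix definitions; the counts in the example call below are made up for illustration, not taken from the study.

```python
# Sensitivity, specificity and accuracy from a binary confusion matrix.
def classification_metrics(tp: int, tn: int, fp: int, fn: int):
    sensitivity = tp / (tp + fn)            # actives correctly flagged
    specificity = tn / (tn + fp)            # inactives correctly rejected
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for 99 actives and 88 inactives:
print(classification_metrics(tp=88, tn=78, fp=10, fn=11))  # all ~0.89, i.e. >85%
```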

Relevance: 30.00%

Abstract:

The increasing resistance of Mycobacterium tuberculosis to the existing drugs has alarmed the worldwide scientific community. In an attempt to overcome this problem, two models for the design and prediction of new antituberculosis agents were obtained. The first used a mixed approach, containing fragment-based descriptors and topological substructural molecular design (TOPS-MODE) descriptors. The other model used a combination of two-dimensional (2D) and three-dimensional (3D) descriptors. A data set of 167 compounds with great structural variability, 72 of them antituberculosis agents and 95 belonging to other pharmaceutical categories, was analyzed. The first model showed sensitivity, specificity, and accuracy values above 80%, and the second one values higher than 75% for these statistical indices. Subsequently, 12 imidazole structures not included in this study were designed, taking the two models into account. In both cases accuracy was 100%, showing that the in silico methodology we developed is promising for the rational design of antituberculosis drugs.

Relevance: 30.00%

Abstract:

A very high level of theoretical treatment (complete active space self-consistent field, CASSCF/MRCI/aug-cc-pV5Z) was used to characterize the spectroscopic properties of a manifold of quartet and doublet states of the species BeP, as yet experimentally unknown. Potential energy curves for 11 electronic states were obtained, as well as the associated vibrational energy levels and a whole set of spectroscopic constants. Dipole moment functions and vibrationally averaged dipole moments were also evaluated. Similarities and differences between BeN and BeP were analysed along with the isovalent SiB species. The molecule BeP has an X ⁴Σ⁻ ground state, with an equilibrium bond distance of 2.073 Å and a harmonic frequency of 516.2 cm⁻¹; it is followed closely by the states ²Π (Rₑ = 2.081 Å, ωₑ = 639.6 cm⁻¹) and ²Σ⁻ (Rₑ = 2.074 Å, ωₑ = 536.5 cm⁻¹), at 502 and 1976 cm⁻¹, respectively. The other quartets investigated, A ⁴Π (Rₑ = 1.991 Å, ωₑ = 555.3 cm⁻¹) and B ⁴Σ⁻ (Rₑ = 2.758 Å, ωₑ = 292.2 cm⁻¹), lie at 13 291 and 24 394 cm⁻¹, respectively. The remaining doublets (²Δ, two ²Σ⁺ and three ²Π states) all fall below 28 000 cm⁻¹. Avoided crossings between the ²Σ⁺ states and between the ²Π states add extra complexity to this manifold of states.

Relevance: 30.00%

Abstract:

Signaling models have contributed to the corporate finance literature by formalizing the "informational content of dividends" hypothesis. However, these models have been criticized in empirical work, as only weak evidence has been found supporting one of their main predictions: a positive relation between changes in dividends and changes in earnings. We claim that the failure to verify this prediction does not invalidate the signaling approach. The models developed up to now assume or derive utility functions with the single-crossing property. We show that signaling is possible in the absence of this property and that, in this case, changes in dividends and changes in earnings can be positively or negatively related.
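
For reference, the single-crossing (Spence-Mirrlees) property at issue can be stated as follows; the notation is generic, not the paper's.

```latex
% Single crossing for a utility U(d, t; \theta) over the dividend d and the
% market response t, indexed by firm type \theta: the marginal rate of
% substitution between d and t must be monotone in type,
\frac{\partial}{\partial \theta}
\left( -\, \frac{\partial U / \partial d}{\partial U / \partial t} \right) \;>\; 0 .
```

When this monotonicity fails, separating equilibria can still exist, but the direction of the dividends-earnings relation is no longer pinned down, which is the paper's explanation for the weak empirical evidence.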