95 results for "Multiple-trait model"


Relevance: 100.00%

Abstract:

Genetic parameters and breeding values for dairy cow fertility were estimated from 62 443 lactation records. Two-trait analysis of fertility and milk yield was investigated as a method to estimate fertility breeding values when culling or selection based on milk yield in early lactation determines the presence or absence of fertility observations in later lactations. Fertility traits were calving interval, the intervals from calving to first service, calving to conception and first to last service, conception success at first service and number of services per conception. Milk production traits were 305-day milk, fat and protein yield. For the fertility traits, estimates of heritability (h²) ranged from 0.012 to 0.028 and of permanent environmental variance (c²) from 0.016 to 0.032. Genetic correlations among fertility traits were generally high (>0.70). Genetic correlations of fertility with milk production traits were unfavourable (range -0.11 to 0.46). Single- and two-trait analyses of fertility were compared using the same data set. Estimates of h² and c² were similar for the two types of analysis. However, estimated breeding values and rankings for the same trait differed between single- and multi-trait analyses: rank correlations ranged from 0.69 to 0.83 for all animals in the pedigree and from 0.89 to 0.96 for sires with more than 25 daughters. As the single-trait method is biased by selection on milk yield, a multi-trait evaluation of fertility together with milk yield is recommended.
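The single- versus multi-trait comparison above hinges on rank correlations between breeding-value rankings. A minimal sketch of that calculation, using invented EBVs for six hypothetical sires (not data from the study):

```python
# Sketch: Spearman rank correlation between breeding-value rankings from a
# single-trait and a multi-trait evaluation. EBVs below are illustrative.

def ranks(values):
    """Rank values from 1 (largest) to n, averaging ties."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical EBVs for six sires under the two evaluations
single = [0.12, -0.05, 0.30, 0.02, -0.11, 0.20]
multi = [0.10, -0.02, 0.25, 0.05, -0.13, 0.28]
rho = spearman(single, multi)
```

A rank correlation well below 1 (as the study found for all pedigree animals) signals that the two evaluations would select different animals.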

Relevance: 80.00%

Abstract:

Shelf and coastal seas are regions of exceptionally high biological productivity, high rates of biogeochemical cycling and immense socio-economic importance. They are, however, poorly represented by the present generation of Earth system models, both in terms of resolution and process representation. Hence, these models cannot be used to elucidate the role of the coastal ocean in global biogeochemical cycles or the effects that global change (both direct anthropogenic and climatic) is having on them. Here, we present a system for simulating all the coastal regions around the world (the Global Coastal Ocean Modelling System) in a systematic and practical fashion. It is based on automatically generating multiple nested model domains, using the Proudman Oceanographic Laboratory Coastal Ocean Modelling System coupled to the European Regional Seas Ecosystem Model. Preliminary results from the system are presented. These demonstrate the viability of the concept, and we discuss the prospects for using the system to explore key areas of global change in shelf seas, such as their role in the carbon cycle and climate change effects on fisheries.

Relevance: 80.00%

Abstract:

The prediction of climate variability and change requires the use of a range of simulation models. Multiple climate model simulations are needed to sample the inherent uncertainties in seasonal to centennial prediction. Because climate models are computationally expensive, there is a trade-off between complexity, spatial resolution, simulation length, and ensemble size. The methods used to assess climate impacts are examined in the context of this trade-off. An emphasis on complexity allows simulation of coupled mechanisms, such as the carbon cycle and feedbacks between agricultural land management and climate. In addition to improving skill, greater spatial resolution increases relevance to regional planning. Greater ensemble size improves the sampling of probabilities. Research from major international projects is used to show the importance of synergistic research efforts. The primary climate impact examined is crop yield, although many of the issues discussed are relevant to hydrology and health modeling. Methods used to bridge the scale gap between climate and crop models are reviewed. Recent advances include large-area crop modeling, quantification of uncertainty in crop yield, and fully integrated crop–climate modeling. The implications of trends in computer power, including supercomputers, are also discussed.

Relevance: 80.00%

Abstract:

Aim: To describe the geographical pattern of mean body size of the non-volant mammals of the Nearctic and Neotropics and evaluate the influence of five environmental variables that are likely to affect body size gradients. Location: The Western Hemisphere. Methods: We calculated mean body size (average log mass) values in 110 × 110 km cells covering the continental Nearctic and Neotropics. We also generated cell averages for mean annual temperature, range in elevation, their interaction, actual evapotranspiration, and the global vegetation index and its coefficient of variation. Associations between mean body size and environmental variables were tested with simple correlations and ordinary least squares multiple regression, complemented with spatial autocorrelation analyses and split-line regression. We evaluated the relative support for each multiple-regression model using AIC. Results: Mean body size increases to the north in the Nearctic and is negatively correlated with temperature. In contrast, across the Neotropics mammals are largest in the tropical and subtropical lowlands and smaller in the Andes, generating a positive correlation with temperature. Finally, body size and temperature are nonlinearly related in both regions, and split-line linear regression found temperature thresholds marking clear shifts in these relationships (Nearctic 10.9 °C; Neotropics 12.6 °C). The increase in body sizes with decreasing temperature is strongest in the northern Nearctic, whereas a decrease in body size in mountains dominates the body size gradients in the warmer parts of both regions. Main conclusions: We confirm previous work finding strong broad-scale Bergmann trends in cold macroclimates but not in warmer areas. For the latter regions (i.e. 
the southern Nearctic and the Neotropics), our analyses also suggest that both local and broad-scale patterns of mammal body size variation are influenced in part by the strong mesoscale climatic gradients existing in mountainous areas. A likely explanation is that reduced habitat sizes in mountains limit the presence of larger-sized mammals.
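The temperature thresholds reported above come from split-line regression, which fits separate straight lines on either side of a breakpoint chosen to minimise residual error. A minimal sketch with invented data (the study's thresholds of 10.9 °C and 12.6 °C were estimated from real assemblage data):

```python
# Sketch of split-line (breakpoint) regression: try each candidate breakpoint,
# fit ordinary least squares on each side, keep the split with the lowest
# total sum of squared errors.

def ols(xs, ys):
    """Simple least-squares line fit; returns intercept, slope and SSE."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def split_line(xs, ys):
    """Fit two lines either side of each candidate breakpoint (keeping at
    least two points per segment); return the x minimising total SSE."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best_sse, best_x = None, None
    for cut in range(2, len(xs) - 2):
        left, right = order[:cut], order[cut:]
        _, _, sse_l = ols([xs[i] for i in left], [ys[i] for i in left])
        _, _, sse_r = ols([xs[i] for i in right], [ys[i] for i in right])
        if best_sse is None or sse_l + sse_r < best_sse:
            best_sse, best_x = sse_l + sse_r, xs[order[cut]]
    return best_x

# Invented gradient: mean size falls steeply with temperature below ~10 °C
# and is flat above it
temps = list(range(21))
sizes = [20.0 - 1.5 * t if t < 10 else 5.0 for t in temps]
threshold = split_line(temps, sizes)
```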

Relevance: 80.00%

Abstract:

Background: Robot-mediated therapies offer entirely new approaches to neurorehabilitation. In this paper we present the results obtained from trialling the GENTLE/S neurorehabilitation system, assessed using the upper-limb section of the Fugl-Meyer (FM) outcome measure. Methods: We describe the design of our clinical trial and analyse its results using a novel statistical approach based on a multivariate analytical model. This paper provides the rationale for using multivariate models in robot-mediated clinical trials and draws conclusions from the clinical data gathered during the GENTLE/S study. Results: The FM outcome measures recorded during the baseline (8 sessions), robot-mediated therapy (9 sessions) and sling-suspension (9 sessions) phases were analysed using a multiple regression model. The results indicate positive but modest recovery trends favouring both interventions used in the GENTLE/S clinical trial. The modest recovery occurred at a time late after stroke when changes are not clinically anticipated. Conclusion: This study has applied a new method for analysing clinical data obtained from rehabilitation robotics studies. Although the data gathered during the clinical trial are multivariate, multipoint and progressive in nature, the multiple regression model showed great potential for drawing conclusions from this study. An important conclusion is that both the intervention and the control phase produced changes over a period of 9 sessions relative to baseline. This might indicate that the use of new, challenging and motivational therapies can influence therapy outcomes at a point when clinical changes are not expected. Further work is required to investigate the effects arising from earlier intervention, longer exposure and greater intensity of therapy. Finally, more function-oriented robot-mediated or sling-suspension therapies are needed to clarify the effects resulting from each intervention on stroke recovery.
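The phase comparison described above can be framed as a multiple regression of FM score on session number plus dummy variables for the two intervention phases. A sketch under that assumption, with invented coefficients rather than the trial's data:

```python
# Sketch of a multiple-regression analysis of repeated FM scores across
# trial phases. Phase labels and effect sizes are illustrative only.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def ols_fit(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    p = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    return solve(XtX, Xty)

# Design: intercept, session number, dummies for the two intervention phases
sessions = list(range(26))                  # 8 baseline + 9 robot + 9 sling
phase = ["base"] * 8 + ["robot"] * 9 + ["sling"] * 9
X = [[1.0, s, 1.0 if p == "robot" else 0.0, 1.0 if p == "sling" else 0.0]
     for s, p in zip(sessions, phase)]
# Hypothetical true effects: baseline level 30, session trend 0.2,
# extra gains of 1.5 (robot) and 1.0 (sling)
y = [30 + 0.2 * s + 1.5 * x[2] + 1.0 * x[3] for s, x in zip(sessions, X)]
beta = ols_fit(X, y)    # recovers [30, 0.2, 1.5, 1.0]
```

The phase dummies separate intervention effects from the underlying session trend, which is the point of the multivariate formulation.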

Relevance: 80.00%

Abstract:

We use new neutron scattering instrumentation to follow, in a single quantitative time-resolving experiment, the three key scales of structural development which accompany the crystallisation of synthetic polymers. These length scales span three orders of magnitude of the scattering vector. The study of polymer crystallisation dates back to the pioneering experiments of Keller and others, who discovered the chain-folded nature of the thin lamellar crystals normally found in synthetic polymers. The inherent connectivity of polymers makes their crystallisation a multiscale transformation. Much understanding has developed over the intervening fifty years, but the process has remained something of a mystery. There are three key length scales: the chain-folded lamellar thickness is ~10 nm, the crystal unit cell is ~1 nm and the detail of the chain conformation is ~0.1 nm. In previous work these length scales have been addressed using different instrumentation, or were coupled using compromised geometries. More recently, researchers have attempted to exploit coupled time-resolved small-angle and wide-angle X-ray experiments. These turned out to be challenging experiments, largely because of the difficulty of placing the scattering intensity on an absolute scale. However, they did suggest the possibility of new phenomena in the very early stages of crystallisation. Although there is now considerable doubt about such experiments, they drew attention to the basic question of how crystallisation proceeds in long-chain molecules. We have used NIMROD on the second target station at ISIS to follow all three length scales in a time-resolving manner for poly(ε-caprolactone). The technique provides a single set of data from 0.01 to 100 Å⁻¹ on the same vertical scale. We analyse the results using a multiple-scale model of the crystallisation process in polymers.
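The quoted range of 0.01 to 100 Å⁻¹ covers all three length scales because the scattering vector and the real-space scale it probes are related by Q ≈ 2π/d. A quick check of that mapping:

```python
# Map each of the three structural length scales onto the scattering vector.
import math

def q_from_d(d_angstrom):
    """Magnitude of the scattering vector (1/Å) probing a real-space
    length scale d (Å), via Q ~ 2*pi/d."""
    return 2 * math.pi / d_angstrom

q_lamella = q_from_d(100.0)  # ~10 nm lamellar thickness  -> ~0.063 1/Å
q_cell = q_from_d(10.0)      # ~1 nm crystal unit cell    -> ~0.63 1/Å
q_chain = q_from_d(1.0)      # ~0.1 nm chain conformation -> ~6.3 1/Å
```

All three Q values fall comfortably inside the instrument's 0.01 to 100 Å⁻¹ window, which is what makes the single-experiment approach possible.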

Relevance: 40.00%

Abstract:

MOTIVATION: The accurate prediction of the quality of 3D models is a key component of successful protein tertiary structure prediction methods. Currently, clustering or consensus based Model Quality Assessment Programs (MQAPs) are the most accurate methods for predicting 3D model quality; however, they are often CPU-intensive, as they carry out multiple structural alignments in order to compare numerous models. In this study, we describe ModFOLDclustQ, a novel MQAP that compares 3D models of proteins without the need for CPU-intensive structural alignments by utilising the Q measure for model comparisons. The ModFOLDclustQ method is benchmarked against the top established methods in terms of both accuracy and speed. In addition, the ModFOLDclustQ scores are combined with those from our older ModFOLDclust method to form a new method, ModFOLDclust2, that aims to provide increased prediction accuracy with negligible computational overhead. RESULTS: The ModFOLDclustQ method is competitive with the leading clustering-based MQAPs for the prediction of global model quality, yet it is up to 150 times faster than the previous version of the ModFOLDclust method at comparing models of small proteins (<60 residues) and over 5 times faster at comparing models of large proteins (>800 residues). Furthermore, a significant improvement in accuracy can be gained over the previous clustering-based MQAPs by combining the scores from ModFOLDclustQ and ModFOLDclust to form the new ModFOLDclust2 method, with little impact on the overall time taken for each prediction. AVAILABILITY: The ModFOLDclustQ and ModFOLDclust2 methods are available to download from: http://www.reading.ac.uk/bioinf/downloads/ CONTACT: l.j.mcguffin@reading.ac.uk.
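The Q measure avoids structural alignment because intra-model inter-residue distances are invariant under rotation and translation. The sketch below shows a simplified Q-style score built on that idea; the exact functional form and residue filtering used by ModFOLDclustQ may differ:

```python
# Simplified Q-style model comparison: score two models of the same sequence
# by how well their internal CA-CA distance sets agree, with no superposition.
import math

def q_score(coords_a, coords_b):
    """Mean exp(-(d_a - d_b)^2) over residue pairs with |i - j| >= 2.
    Returns a value in (0, 1]; 1.0 means identical internal geometry."""
    n = len(coords_a)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 2, n):
            da = math.dist(coords_a[i], coords_a[j])
            db = math.dist(coords_b[i], coords_b[j])
            total += math.exp(-((da - db) ** 2))
            pairs += 1
    return total / pairs

# Toy CA traces for two "models" of a 5-residue chain
model_a = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.5, 0.0),
           (4.5, 0.5, 1.0), (6.0, 1.0, 1.0)]
model_b = [(x + 5.0, y, z) for x, y, z in model_a]   # same fold, translated

q_identical = q_score(model_a, model_a)
q_translated = q_score(model_a, model_b)   # still 1.0: no alignment needed
```

Because the translated copy scores identically, no time is spent on superposition, which is where the speed-up over alignment-based clustering comes from.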

Relevance: 40.00%

Abstract:

In this work, a fault-tolerant control scheme is applied to an air-handling unit of a heating, ventilation and air-conditioning system. Using the multiple-model approach, it is possible to identify faults and to control the system effectively under both faulty and normal conditions. Using well-known techniques to model and control the process, this work focuses on the importance of the cost function in fault detection and its influence on the reconfigurable controller. Experimental results show how the control of the terminal unit is affected in the presence of a fault, and how recuperation and reconfiguration of the control action are able to deal with the effects of faults.
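The multiple-model approach can be sketched as a bank of candidate models, each scored by a residual cost function, with the best-fitting model indicating the operating mode. The model set, gains and fault names below are invented for illustration; real air-handling-unit models would be dynamic, not static gains:

```python
# Sketch of multiple-model fault detection: run a bank of models (normal and
# faulty behaviour), score each with a residual cost, declare the best fit.

def detect_mode(u_seq, y_seq, models):
    """models: {mode_name: gain}; each model predicts y = gain * u (a
    deliberately simple static model). Returns (best mode, all costs)."""
    costs = {name: sum((y - gain * u) ** 2 for u, y in zip(u_seq, y_seq))
             for name, gain in models.items()}
    return min(costs, key=costs.get), costs

# Hypothetical model bank for the air-handling unit
models = {"normal": 2.0, "stuck_damper": 0.5, "fan_degraded": 1.2}
u = [1.0, 2.0, 3.0, 2.5]       # actuator commands
y = [0.6, 1.1, 1.4, 1.3]       # measured output: tracks gain ~0.5
mode, costs = detect_mode(u, y, models)   # mode == "stuck_damper"
```

The choice of cost function (here a plain sum of squares) is exactly the design decision the abstract highlights, since it determines which model, and hence which reconfigured controller, is selected.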

Relevance: 40.00%

Abstract:

Proponents of the “fast and frugal” approach to decision-making suggest that inferential judgments are best made on the basis of limited information. For example, if only one of two cities is recognized and the task is to judge which city has the larger population, the recognition heuristic states that the recognized city should be selected. In preference choices with >2 options, it is also standard to assume that a “consideration set”, based upon some simple criterion, is established to reduce the options available. A multinomial processing tree model is outlined which provides the basis for estimating the extent to which recognition is used as a criterion in establishing a consideration set for inferential judgments.
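A minimal version of such a multinomial processing tree: with probability r the recognition heuristic determines the choice, and otherwise the decision falls through to further evaluation that selects the recognized option with probability k. Parameter names and the grid-search estimator are illustrative, not the paper's specification:

```python
# Two-branch processing tree for a pair with exactly one recognized option.
import math

def p_choose_recognized(r, k):
    """P(choose recognized): recognition drives the choice with probability r;
    otherwise knowledge-based evaluation picks it with probability k."""
    return r + (1 - r) * k

def fit_r(k, n_recognized_chosen, n_other_chosen):
    """Crude grid-search maximum-likelihood estimate of r for a fixed k."""
    best_r, best_ll = None, None
    for i in range(1, 100):
        r = i / 100
        p = p_choose_recognized(r, k)
        ll = (n_recognized_chosen * math.log(p)
              + n_other_chosen * math.log(1 - p))
        if best_ll is None or ll > best_ll:
            best_r, best_ll = r, ll
    return best_r

# 90 of 100 pairs resolved in favour of the recognized option with k = 0.5
# implies r + (1 - r) * 0.5 = 0.9, i.e. r = 0.8
r_hat = fit_r(0.5, 90, 10)
```

Separating r from k is what lets the tree model estimate how often recognition alone, rather than further knowledge, settles the judgment.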

Relevance: 40.00%

Abstract:

Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2 to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence-to-structure alignment methods tested. In addition, using accurate global model quality scores for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
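Quality-guided alignment selection of the kind described can be sketched as re-ranking candidate target-template alignments by a predicted global quality score and keeping those near the best. The scores and tolerance below are invented; they do not reproduce ModFOLDclust2's scoring itself:

```python
# Sketch: re-rank candidate alignments by predicted quality, then keep every
# alignment within a tolerance of the best for multi-template modelling.

def select_alignments(scored, tol=0.1):
    """scored: list of (alignment_id, predicted_quality in [0, 1]).
    Returns ids within `tol` of the best score, best first."""
    best = max(q for _, q in scored)
    return [a for a, q in sorted(scored, key=lambda t: -t[1])
            if best - q <= tol]

# Hypothetical candidates for one target
candidates = [("tmpl_A", 0.62), ("tmpl_B", 0.71),
              ("tmpl_C", 0.45), ("tmpl_D", 0.68)]
chosen = select_alignments(candidates)
```

A selection rule like this also gives a natural signal for when multiple templates will not help: if only one alignment survives the cut, single-template modelling is the safer choice.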

Relevance: 40.00%

Abstract:

The traditional economic approach for appraising the costs and benefits of construction projects, the Net Present Value, involves the calculation of net returns for each investment option under different discount rates. An alternative approach uses multiple, project-specific discount rates based on risk modelling. The example of a portfolio of microgeneration renewable energy technologies (MRET) is presented to demonstrate that risks and the future budget available for re-investment can be taken into account when setting discount rates for construction project specifications in the presence of uncertainty. A formal demonstration is carried out through a reversed intertemporal approach of applied general equilibrium. It is shown that risk and the estimated budget available for future re-investment can be included in the simultaneous assessment of the costs and benefits of multiple projects.
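The baseline NPV appraisal mentioned above discounts each option's net returns at a chosen rate, so a project-specific, risk-adjusted rate directly changes the result. A minimal sketch with invented cash flows for a hypothetical MRET installation:

```python
# Sketch: NPV of one investment option under two discount rates, showing how
# a higher risk-adjusted rate penalises the same cash flows.

def npv(rate, cash_flows):
    """cash_flows[0] is at time 0 (typically the negative outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical installation: outlay now, equal annual returns for 5 years
flows = [-10_000.0, 2_800.0, 2_800.0, 2_800.0, 2_800.0, 2_800.0]
npv_low_risk = npv(0.05, flows)    # modest discount rate
npv_high_risk = npv(0.12, flows)   # higher risk-adjusted rate lowers NPV
```

Applying a different risk-modelled rate to each project in a portfolio, rather than one common rate, is the essence of the alternative approach the abstract formalises.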

Relevance: 40.00%

Abstract:

It has long been supposed that preference judgments between sets of to-be-considered possibilities are made by initially winnowing the most promising-looking alternatives down to smaller "consideration sets" (Howard, 1963; Wright & Barbour, 1977). In preference choices with >2 options, it is standard to assume that a "consideration set", based upon some simple criterion, is established to reduce the options available. Inferential judgments, in contrast, have more frequently been investigated in situations in which only two possibilities need to be considered (e.g., which of these two cities is the larger?). Proponents of the "fast and frugal" approach to decision-making suggest that such judgments are also made on the basis of limited, simple criteria. For example, if only one of two cities is recognized and the task is to judge which city has the larger population, the recognition heuristic states that the recognized city should be selected. A multinomial processing tree model is outlined which provides the basis for estimating the extent to which recognition is used as a criterion in establishing a consideration set for inferential judgments between three possible options.