907 results for Models and Methods


Relevance:

100.00%

Publisher:

Abstract:

The aim of this review article is to provide an overview of the role of pigs as a biomedical model for humans. The usefulness and limitations of porcine models are discussed in terms of metabolic, cardiovascular, digestive and bone diseases in humans. Domestic pigs and minipigs are the main categories of pigs used as biomedical models. One drawback of minipigs is that they are in short supply and expensive compared with domestic pigs, which in contrast cost more to house, feed and medicate. Different porcine breeds show different responses to the induction of specific diseases. For example, Ossabaw minipigs provide a better model than Yucatan minipigs for the metabolic syndrome, as they exhibit obesity, insulin resistance and hypertension, all of which are absent in the Yucatan. Similar metabolic/physiological differences exist between domestic breeds (e.g. Meishan v. Pietrain). The modern commercial domestic pig (e.g. Large White) has been the preferred model for developmental programming, because the 2- to 3-fold variation in body weight among littermates provides a natural form of foetal growth retardation not observed in ancient domestic breeds (e.g. Meishan). Pigs have been increasingly used to study chronic ischaemia, therapeutic angiogenesis, hypertrophic cardiomyopathy and abdominal aortic aneurysm, as their coronary anatomy and physiology are similar to those of humans. Type I and type II diabetes can be induced in swine using dietary regimes and/or administration of streptozotocin. Pigs are a good and extensively used model for specific nutritional studies, as their protein and lipid metabolism is comparable with that of humans, although pigs are not as sensitive to protein restriction as rodents. Neonatal and weanling pigs have been used to examine the pathophysiology and prevention/treatment of microbial-associated diseases and immune system disorders. A porcine model mimicking various degrees of prematurity in infants receiving total parenteral nutrition has been established to investigate gut development, amino acid metabolism and non-alcoholic fatty liver disease. Endoscopic therapeutic methods for upper gastrointestinal tract bleeding are also being developed in pigs. The bone remodelling cycle in pigs is histologically more similar to that of humans than to that of rats or mice, and is used to examine the relationship between menopause and osteoporosis. Work has also been conducted on dental implants in pigs to study loading, although with caution, as porcine bone remodels slightly faster than human bone. We conclude that pigs are a valuable translational model to bridge the gap between classical rodent models and humans in developing new therapies to aid human health.

Relevance:

100.00%

Publisher:

Abstract:

Earth system models (ESMs) are increasing in complexity by incorporating more processes than their predecessors, making them potentially important tools for studying the evolution of climate and associated biogeochemical cycles. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes. For example, coupled climate–carbon cycle models that represent land-use change simulate total land carbon stores at 2100 that vary by as much as 600 Pg C, given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous methods of model evaluation. Here we assess the state-of-the-art in evaluation of ESMs, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeodata and (ii) metrics for evaluation. We note that the practice of averaging results from many models is unreliable and no substitute for proper evaluation of individual models. We discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute to the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but also presents a challenge. Improved knowledge of data uncertainties is still necessary to move the field of ESM evaluation away from a "beauty contest" towards the development of useful constraints on model outcomes.
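One of the evaluation strategies the abstract names, the emergent constraint, can be illustrated with a toy numerical sketch. Everything below is synthetic: `x` stands for an observable that each ensemble member simulates, `y` for a projected quantity correlated with it, and `x_obs` for a hypothetical real-world measurement.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 20-member "ensemble": each model simulates an observable x
# and projects a quantity y that is tightly related to x.
n_models = 20
x = rng.normal(0.0, 1.0, n_models)
y = 2.0 * x + rng.normal(0.0, 0.5, n_models)

# Emergent constraint: regress y on x across the ensemble, then use the
# real-world observation of x to narrow the projected range of y.
slope, intercept = np.polyfit(x, y, 1)
x_obs = 0.3                                   # hypothetical observation
y_constrained = slope * x_obs + intercept

raw_spread = y.std()                          # unconstrained ensemble spread
residual_spread = np.std(y - (slope * x + intercept))
print(y_constrained, raw_spread, residual_spread)
```

The point of the exercise is that the residual spread about the regression line is much smaller than the raw ensemble spread, which is why an observation of `x` constrains `y`, and also why such constraints are only as trustworthy as the process-level reason for the correlation.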

Relevance:

100.00%

Publisher:

Abstract:

Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and with the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and to assess the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles, together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to compare model outputs manually with observations from real landscapes so as to obtain parameters that produce acceptable fits of model to data. However, this procedure can be convoluted and can lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
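As a concrete illustration of ABC in this setting, the sketch below calibrates the single birth-rate parameter of a toy stochastic population model by rejection sampling. The model, prior, summary statistic and tolerance are all hypothetical choices made for the example, not those of any study discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic "IBM": each step, every individual produces a Poisson
# number of offspring with mean b; a run is summarised by its final size.
def simulate_population(b, n0=10, steps=20):
    n = n0
    for _ in range(steps):
        n += rng.poisson(b * n)
    return n

# Pseudo-observed data generated with a "true" birth rate of 0.05.
observed = simulate_population(0.05)

# ABC rejection sampling: draw b from the prior, simulate, and keep
# draws whose summary statistic lies within a relative tolerance eps.
def abc_rejection(observed, n_draws=5000, eps=0.1):
    accepted = []
    for _ in range(n_draws):
        b = rng.uniform(0.0, 0.2)          # flat prior on the birth rate
        if abs(simulate_population(b) - observed) / observed < eps:
            accepted.append(b)
    return np.array(accepted)

posterior = abc_rejection(observed)
print(len(posterior), posterior.mean())
```

The accepted draws approximate the posterior for the birth rate; tightening `eps` sharpens the approximation at the cost of more rejected simulations, which is the central trade-off when applying ABC to expensive IBMs.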

Relevance:

100.00%

Publisher:

Abstract:

As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single-column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined by profiles obtained from the same model in radiative-convective equilibrium. We investigate the sensitivity of each model to changes in SST, given a fixed reference state, and perform a systematic comparison of the WTG and DGW methods across models, as well as of each model's behavior under the two methods. The sensitivity to SST depends on both the large-scale parameterization method and the choice of cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or the DGW method show an increase of precipitation with SST, while SCMs show sensitivities that are not always monotonic. CRMs using either method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles that are smoother and less top-heavy than those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool for identifying the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
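A minimal sketch of the WTG diagnostic, under the commonly stated assumption that large-scale vertical advection relaxes the free-tropospheric temperature anomaly toward the reference profile over a fixed timescale; the profiles, grid and timescale below are illustrative, not taken from the intercomparison.

```python
import numpy as np

# WTG balance assumed here:  w_ls * dtheta/dz = (theta - theta_ref) / tau,
# i.e. adiabatic cooling by the diagnosed large-scale ascent removes the
# potential-temperature anomaly over the relaxation timescale tau.
def wtg_vertical_velocity(theta, theta_ref, z, tau=3 * 3600.0):
    dtheta_dz = np.gradient(theta_ref, z)           # static stability (K/m)
    dtheta_dz = np.maximum(dtheta_dz, 1e-4)         # guard against neutral layers
    return (theta - theta_ref) / (tau * dtheta_dz)  # w_ls in m/s

# Toy profiles: a +1 K mid-tropospheric warm anomaly implies large-scale ascent.
z = np.linspace(1000.0, 15000.0, 50)                # height (m), free troposphere
theta_ref = 300.0 + 4e-3 * z                        # reference profile, 4 K/km
theta = theta_ref + np.exp(-((z - 8000.0) / 2000.0) ** 2)
w_ls = wtg_vertical_velocity(theta, theta_ref, z)
print(w_ls.max())
```

With these numbers the diagnosed ascent peaks near 0.023 m/s at the anomaly maximum; the DGW method instead obtains the large-scale velocity from a damped gravity-wave momentum balance, which is why its profiles come out smoother.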

Relevance:

100.00%

Publisher:

Abstract:

This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random effects models. In variance component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed; it was found to correct the bias in genetic variance component estimation and to improve the precision, and hence the power, of QTL mapping. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data for an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance rather than the mean, which validated the idea of variance-controlling genes. The work in the thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).

Relevance:

100.00%

Publisher:

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration; the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously perform better in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
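The kind of Monte Carlo exercise described can be sketched for the unrestricted baseline case (no cointegration or SCCF restrictions imposed); the coefficient matrix and sample size below are illustrative choices, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(2)

# True stationary bivariate VAR(1):  y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])

def simulate_var1(A, T, rng):
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t] = A @ y[t - 1] + rng.normal(0, 1, 2)
    return y

y = simulate_var1(A, T=500, rng=rng)

# Unrestricted OLS estimation: regress y_t on y_{t-1}.
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

forecast = A_hat @ y[-1]        # one-step-ahead point forecast
print(np.max(np.abs(A_hat - A)))
```

Imposing cointegration or SCCF restrictions amounts to constraining `A_hat` (rank restrictions on the long-run matrix, common factors in the short-run dynamics), which is exactly where the reported forecasting gains come from: fewer free parameters, less estimation noise.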

Relevance:

100.00%

Publisher:

Abstract:

Ties among event times are often recorded in survival studies. For example, in a two-week laboratory study where event times are measured in days, ties are very likely to occur. The proportional hazards model might be used in this setting with an approximated partial likelihood function; this approximation works well when the number of ties is small. On the other hand, discrete regression models are suggested when the data are heavily tied. However, in many situations it is not clear which approach should be used in practice. In this work, empirical guidelines based on Monte Carlo simulations are provided. These recommendations are based on a measure of the amount of tied data present and on the mean square error. An example illustrates the proposed criterion.
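One simple, hypothetical measure of the amount of tied data (illustrative only; not necessarily the measure used in the paper) is the fraction of events that share their time with at least one other event:

```python
import numpy as np

def tie_fraction(times):
    # Fraction of observations whose event time occurs more than once.
    times = np.asarray(times)
    _, counts = np.unique(times, return_counts=True)
    return counts[counts > 1].sum() / times.size

# Event times from a two-week study measured in whole days: heavily tied.
event_days = [1, 2, 2, 3, 3, 3, 5, 7, 7, 10, 12, 14, 14, 14]
print(tie_fraction(event_days))   # 10 of 14 events share a day
```

A rule of the kind the abstract describes would compare such a statistic against a simulation-derived threshold to decide between the approximated partial likelihood and a discrete regression model.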

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

The aims of this study were: (1) to verify the validity of previously proposed models to estimate the lowest exercise duration (T_LOW) and the highest intensity (I_HIGH) at which VO2max is reached; and (2) to test the hypothesis that the parameters involved in these models, and hence the validity of the models, are affected by aerobic training status. Thirteen cyclists (EC), eleven runners (ER) and ten untrained (U) subjects performed several cycle-ergometer exercise tests to fatigue in order to determine and estimate T_LOW (ET_LOW) and I_HIGH (EI_HIGH). The relationship between the time to achieve VO2max and the time to exhaustion (T_lim) was used to estimate ET_LOW. EI_HIGH was estimated using the critical power model. I_HIGH was taken as the highest intensity at which VO2 was equal to or higher than the average of the VO2max values minus one typical error, and T_LOW was taken as the T_lim associated with I_HIGH. No differences were found in T_LOW between ER (170 +/- 31 s) and U (209 +/- 29 s); however, both showed higher values than EC (117 +/- 29 s). I_HIGH was similar between U (269 +/- 73 W) and ER (319 +/- 50 W), and both were lower than EC (451 +/- 33 W). EI_HIGH was similar to, and significantly correlated with, I_HIGH only in U (r = 0.87) and ER (r = 0.62). ET_LOW and T_LOW differed only for U and were not significantly correlated in any group. These data suggest that aerobic training status affects the validity of the proposed models for estimating I_HIGH.
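The critical power model used to obtain EI_HIGH takes the linear form P = CP + W'/T_lim, so CP and W' can be estimated by regressing power output on the reciprocal of time to exhaustion. The trial data below are illustrative, not the study's measurements.

```python
import numpy as np

# Critical power model: P = CP + W'/T_lim, linear in 1/T_lim, so CP
# (intercept, in W) and W' (slope, in J) come from ordinary least squares.
t_lim = np.array([150.0, 240.0, 420.0, 720.0])   # time to exhaustion (s)
power = np.array([380.0, 340.0, 305.0, 285.0])   # work rate (W)

X = np.column_stack([np.ones_like(t_lim), 1.0 / t_lim])
cp, w_prime = np.linalg.lstsq(X, power, rcond=None)[0]

# The fitted hyperbola then gives the intensity sustainable for any
# chosen duration, e.g. a hypothetical 180 s exhaustion time:
ei_high = cp + w_prime / 180.0
print(cp, w_prime, ei_high)
```

With these numbers CP comes out near 262 W and W' near 18 kJ; the study's finding is that an estimate built this way agreed with the measured I_HIGH in the untrained and runner groups but not in the trained cyclists.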

Relevance:

100.00%

Publisher:

Abstract:

Data for four networks that can be used in comparative studies of methods for transmission network expansion planning are given. These networks are of various types and different levels of complexity. The main mathematical formulations used in transmission expansion studies (transportation models, hybrid models, DC power flow models, and disjunctive models) are also summarised and compared, and the main algorithm families are reviewed: analytical, combinatorial and heuristic approaches. Optimal solutions are not yet known for some of the four networks when more accurate models (e.g. the DC model) are used to represent the power flow equations; the state of the art in this regard is also summarised. This should serve as a challenge to authors searching for new, more efficient methods.
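Of the formulations compared, the DC power flow model is the easiest to sketch: bus injections relate linearly to voltage angles through the susceptance matrix, P = Bθ, and line flows follow from angle differences. The 3-bus network and per-unit values below are illustrative, not one of the four benchmark networks.

```python
import numpy as np

b = 10.0   # susceptance of every line (p.u.), lines: 0-1, 1-2, 0-2
# Bus susceptance matrix B: diagonal = sum of incident line susceptances,
# off-diagonal = minus the susceptance of the connecting line.
B = np.array([[ 2*b,  -b,   -b ],
              [ -b,   2*b,  -b ],
              [ -b,   -b,   2*b]])
P = np.array([0.0, 1.0, -1.0])        # net injections (p.u.); bus 0 is slack

# Fix the slack angle to zero and solve the reduced linear system.
theta = np.zeros(3)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Flow on line 0-1 (negative sign: net flow runs from bus 1 toward bus 0).
flow_01 = b * (theta[0] - theta[1])
print(theta, flow_01)
```

Expansion planning wraps a combinatorial layer around this linear core, deciding which candidate lines (which entries of B) to build, which is why even the "easy" DC formulation leads to hard optimisation problems on the benchmark networks.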

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Background: The search for alternative and effective forms of training simulation is needed due to the ethical and medico-legal aspects involved in training surgical skills on living patients, human cadavers and living animals. Aims: To evaluate whether bench model fidelity interferes in the acquisition of elliptical excision skills by novice medical students. Materials and Methods: Forty novice medical students were randomly assigned to 5 practice conditions with instructor-directed elliptical excision skills training (n = 8 each): didactic materials (control); organic bench model (low fidelity); ethylene-vinyl acetate bench model (low fidelity); chicken leg skin bench model (high fidelity); or pig foot skin bench model (high fidelity). Pre- and post-tests were applied. A global rating scale, effect size, and self-perceived confidence based on a Likert scale were used to evaluate all elliptical excision performances. Results: The analysis showed that after training, the students practicing on bench models performed better on the global rating scale (all P < 0.0001) and felt more confident performing elliptical excision skills (all P < 0.0001) than the control. There was no significant difference (all P > 0.05) between the groups that trained on bench models. The magnitude of the effect (basic cutaneous surgery skills training) was considered large (>0.80) in all measurements. Conclusion: The acquisition of elliptical excision skills after instructor-directed training on low-fidelity bench models was similar to that after training on high-fidelity bench models, and students who trained on any of the simulators improved their elliptical excision performance more substantially than those who learned from didactic materials alone.