907 results for Models and Methods


Relevance: 100.00%

Abstract:

Climate has been changing over the last fifty years in China and will continue to change regardless of any mitigation efforts. Agriculture is a climate-dependent activity that is highly sensitive to climate change and climate variability. Understanding the interactions between climate change and agricultural production is essential for the stable development of Chinese society. The first task is to understand how to predict future climate and link it to the agricultural production system. In this paper, recent domestic and international studies are reviewed to provide an overall picture of progress in climate change research. Methods for constructing climate change scenarios are introduced, the pivotal techniques linking crop models with climate models are systematically assessed, and modelled climate change impacts on Chinese crop yields are summarized. The review finds that simulated grain crop production inherits uncertainty from the use of different climate models, emission scenarios and crop simulation models. Moreover, studies differ in spatial resolution and in the methods used to downscale general circulation models (GCMs), which increases the uncertainty of regional impact assessments. Nevertheless, the magnitude of change in crop production due to climate change (at 700 ppm CO2-eq) appears to be within ±10% for China in these assessments. In most of the literature, yields of the three cereal crops decline under climate change scenarios, with increases only for wheat in some regions. Finally, the paper points out several gaps in current research that require further study before the impacts of climate change on crops can be assessed objectively. The uncertainty in crop yield projections is associated with climate change scenarios, CO2 fertilization effects and adaptation options. Therefore, more studies are needed in areas such as free-air CO2 enrichment (FACE) experiments and the adaptation measures implemented in practice.

Relevance: 100.00%

Abstract:

This paper investigates the challenge of representing structural differences in river channel cross-section geometry for regional to global scale river hydraulic models and the effect this can have on simulations of wave dynamics. Classically, channel geometry is defined using data, yet at larger scales the necessary information and model structures do not exist to take this approach. We therefore propose a fundamentally different approach where the structural uncertainty in channel geometry is represented using a simple parameterization, which could then be estimated through calibration or data assimilation. This paper first outlines the development of a computationally efficient numerical scheme to represent generalised channel shapes using a single parameter, which is then validated using a simple straight channel test case and shown to predict wetted perimeter to within 2% for the channels tested. An application to the River Severn, UK is also presented, along with an analysis of model sensitivity to channel shape, depth and friction. The channel shape parameter was shown to improve model simulations of river level, particularly for more physically plausible channel roughness and depth parameter ranges. Calibrating channel Manning’s coefficient in a rectangular channel provided similar water level simulation accuracy in terms of Nash-Sutcliffe efficiency to a model where friction and shape or depth were calibrated. However, the calibrated Manning coefficient in the rectangular channel model was ~2/3 greater than the likely physically realistic value for this reach and this erroneously slowed wave propagation times through the reach by several hours. Therefore, for large scale models applied in data sparse areas, calibrating channel depth and/or shape may be preferable to assuming a rectangular geometry and calibrating friction alone.
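
The abstract does not reproduce the single-parameter scheme itself, so the following is only an illustrative Python sketch that assumes a hypothetical power-law cross-section whose exponent s plays the role of the shape parameter; it shows the kind of wetted-perimeter calculation such a parameterization controls.

import numpy as np

def wetted_perimeter(width, depth, s, n=2000):
    """Wetted perimeter of a bank-full power-law channel z(x) = depth*(2|x|/width)**s.

    s is a hypothetical single shape parameter: s = 1 gives a triangular
    section, s = 2 a parabolic one, and large s approaches a rectangle.
    The perimeter is computed as the polyline arc length of the bed profile.
    """
    x = np.linspace(-width / 2.0, width / 2.0, n)
    z = depth * (2.0 * np.abs(x) / width) ** s
    return np.sum(np.hypot(np.diff(x), np.diff(z)))

# Illustrative comparison for a 50 m wide, 3 m deep reach
for s in (1.0, 2.0, 10.0):
    print(f"s = {s:4.1f}  wetted perimeter = {wetted_perimeter(50.0, 3.0, s):6.1f} m")

With these illustrative numbers, s = 1 gives the triangular limit (about 50.4 m), s = 2 a parabolic section, and large s tends toward the rectangular bank-full value of width + 2 * depth = 56 m.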

Relevance: 100.00%

Abstract:

We review the effects of dynamical variability on clouds and radiation in observations and models and discuss their implications for cloud feedbacks. Jet shifts produce robust meridional dipoles in upper-level clouds and longwave cloud-radiative effect (CRE), but low-level clouds, which do not simply shift with the jet, dominate the shortwave CRE. Because the effect of jet variability on CRE is relatively small, future poleward jet shifts with global warming are only a second-order contribution to the total CRE changes around the midlatitudes, suggesting a dominant role for thermodynamic effects. This implies that constraining the dynamical response is unlikely to reduce the uncertainty in extratropical cloud feedback. However, we argue that uncertainty in the cloud-radiative response does affect the atmospheric circulation response to global warming, by modulating patterns of diabatic forcing. How cloud feedbacks can affect the dynamical response to global warming is an important topic of future research.

Relevance: 100.00%

Abstract:

The level of agreement between climate model simulations and observed surface temperature change is a topic of scientific and policy concern. While the Earth system continues to accumulate energy due to anthropogenic and other radiative forcings, estimates of recent surface temperature evolution fall at the lower end of climate model projections. Global mean temperatures from climate model simulations are typically calculated using surface air temperatures, while the corresponding observations are based on a blend of air and sea surface temperatures. This work quantifies a systematic bias in model-observation comparisons arising from differential warming rates between sea surface temperatures and surface air temperatures over oceans. A further bias arises from the treatment of temperatures in regions where the sea ice boundary has changed. Applying the methodology of the HadCRUT4 record to climate model temperature fields accounts for 38% of the discrepancy in trend between models and observations over the period 1975–2014.
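
The HadCRUT4 processing applied in the paper is not reproduced here; the Python sketch below, with hypothetical array names, only illustrates the core blending idea of using air temperature over land and sea ice and SST over open ocean, which is what makes a blended model diagnostic warm slightly more slowly than a pure air-temperature global mean.

import numpy as np

def blended_anomaly(tas, tos, land_frac, ice_frac, weights):
    """Area-weighted global mean of a blended temperature anomaly field.

    tas: near-surface air temperature anomaly (used over land and sea ice)
    tos: sea surface temperature anomaly (used over open ocean)
    land_frac, ice_frac: fraction of each grid cell covered by land / sea ice
    weights: grid-cell area weights (e.g. proportional to cos(latitude))
    All arguments are hypothetical arrays of the same shape.
    """
    air_frac = np.clip(land_frac + (1.0 - land_frac) * ice_frac, 0.0, 1.0)
    blended = air_frac * tas + (1.0 - air_frac) * tos
    return np.average(blended, weights=weights)

def air_only_anomaly(tas, weights):
    """Global mean using air temperature everywhere, the usual model diagnostic."""
    return np.average(tas, weights=weights)

Comparing linear trends of the two resulting time series over a common period gives the kind of blending difference the paper quantifies.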

Relevance: 100.00%

Abstract:

Population ecology is a discipline that studies changes in the number and composition (age, sex) of the individuals that form a population. Many of the mechanisms that generate these changes are associated with individual behavior, for example how individuals defend their territories, find mates or disperse. It is therefore important to model population dynamics while considering the potential influence of behavior on the modeled dynamics. This study illustrates the diversity of behaviors that influence population dynamics and describes several methods for integrating behavior into population models, ranging from simple models that only consider the number of individuals to complex individual-based models that capture great levels of detail. A series of examples shows the importance of explicitly considering behavior in population modeling to avoid reaching erroneous conclusions. This integration is particularly relevant for conservation, as incorrect predictions regarding the dynamics of populations of conservation interest can lead to inadequate assessment and management. Improved predictions can favor the effective protection of species and better use of limited financial and human conservation resources.
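
As a deliberately minimal, hypothetical contrast between the two ends of that modelling spectrum, the Python sketch below compares a count-only logistic update with a toy individual-based step in which a behavioral constraint (a limited number of territories) caps reproduction; all parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(1)

def logistic_step(n, r=0.5, k=100):
    """Count-based model: individuals are interchangeable numbers."""
    return n + r * n * (1 - n / k)

def ibm_step(n, territories=60, offspring_mean=1.2, survival=0.7):
    """Individual-based sketch: only individuals that secure one of a limited
    number of territories reproduce, a behavioral constraint that the
    count-based model cannot represent."""
    breeders = min(n, territories)
    recruits = rng.poisson(offspring_mean * breeders)
    survivors = rng.binomial(n, survival)
    return survivors + recruits

n_logistic = n_ibm = 20
for _ in range(25):
    n_logistic = logistic_step(n_logistic)
    n_ibm = ibm_step(n_ibm)
print(round(n_logistic), n_ibm)  # the two formulations settle at different abundances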

Relevance: 100.00%

Abstract:

Accurate knowledge of the location and magnitude of ocean heat content (OHC) variability and change is essential for understanding the processes that govern decadal variations in surface temperature, quantifying changes in the planetary energy budget, and developing constraints on the transient climate response to external forcings. We present an overview of the temporal and spatial characteristics of OHC variability and change as represented by an ensemble of dynamical and statistical ocean reanalyses (ORAs). Spatial maps of the 0–300 m layer show large regions of the Pacific and Indian Oceans where the interannual variability of the ensemble mean exceeds ensemble spread, indicating that OHC variations are well-constrained by the available observations over the period 1993–2009. At deeper levels, the ORAs are less well-constrained by observations, with the largest differences across the ensemble mostly associated with areas of high eddy kinetic energy, such as the Southern Ocean and boundary current regions. Spatial patterns of OHC change for the period 1997–2009 show good agreement in the upper 300 m and are characterized by a strong dipole pattern in the Pacific Ocean. There is less agreement in the patterns of change at deeper levels, potentially linked to differences in the representation of ocean dynamics, such as water mass formation processes. However, the Atlantic and Southern Oceans are regions in which many ORAs show widespread warming below 700 m over the period 1997–2009. Annual time series of global and hemispheric OHC change for 0–700 m show the largest spread for the data-sparse Southern Hemisphere, and a number of ORAs seem to be subject to a large initialization ‘shock’ over the first few years. In agreement with previous studies, a number of ORAs exhibit enhanced ocean heat uptake below 300 and 700 m during the mid-1990s or early 2000s. The ORA ensemble mean (±1 standard deviation) of rolling 5-year trends in full-depth OHC shows a relatively steady heat uptake of approximately 0.9 ± 0.8 W m^-2 (expressed relative to Earth’s surface area) between 1995 and 2002, which reduces to about 0.2 ± 0.6 W m^-2 between 2004 and 2006, in qualitative agreement with recent analyses of Earth’s energy imbalance. There is a marked reduction in the ensemble spread of OHC trends below 300 m as the Argo profiling float observations become available in the early 2000s. In general, we suggest that ORAs should be treated with caution when employed to understand past ocean warming trends, especially when considering the deeper ocean, where there is little in the way of observational constraints. The current work emphasizes the need to better observe the deep ocean, both to provide observational constraints for future ocean state estimation efforts and to develop improved models and data assimilation methods.
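
As a worked example of the unit convention used above (heat uptake in W m^-2 relative to Earth's surface area), the Python sketch below converts the linear trend of a hypothetical OHC series from joules per year into a planetary heat-uptake rate; the Earth-surface-area and seconds-per-year constants are standard, while the OHC numbers are invented for illustration.

import numpy as np

EARTH_AREA_M2 = 5.1e14      # Earth's total surface area in m^2
SECONDS_PER_YEAR = 3.15576e7

def heat_uptake_w_per_m2(years, ohc_joules):
    """Linear trend of an ocean heat content series (in J), expressed as a
    planetary heat-uptake rate in W m^-2 relative to Earth's surface area."""
    slope_j_per_yr = np.polyfit(years, ohc_joules, 1)[0]
    return slope_j_per_yr / SECONDS_PER_YEAR / EARTH_AREA_M2

# Hypothetical example: OHC rising by ~1.4e22 J per year over five years
years = np.arange(1995, 2000)
ohc = 1.4e22 * (years - years[0])
print(round(heat_uptake_w_per_m2(years, ohc), 2), "W m^-2")  # ~0.87 W m^-2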

Relevance: 100.00%

Abstract:

Human body thermoregulation models have been widely used in the fields of human physiology and thermal comfort research. However, there are few studies on methods for evaluating these models. This paper summarises the existing evaluation methods and critically analyses their flaws. On this basis, a method for evaluating the accuracy of human body thermoregulation models is proposed. The new evaluation method contributes to the development of human body thermoregulation models, validates their accuracy both statistically and empirically, and allows the accuracy of different models to be compared. Furthermore, the new method is not only suitable for evaluating human body thermoregulation models, but can in principle also be applied to evaluating the accuracy of population-based models in other research fields.

Relevance: 100.00%

Abstract:

Phylogenetic comparative methods are increasingly used to give new insights into the dynamics of trait evolution in deep time. For continuous traits the core of these methods is a suite of models that attempt to capture evolutionary patterns by extending the Brownian constant-variance model. However, the properties of these models are often poorly understood, which can lead to the misinterpretation of results. Here we focus on one of these models – the Ornstein-Uhlenbeck (OU) model. We show that the OU model is frequently incorrectly favoured over simpler models when using likelihood ratio tests, and that many studies fitting this model use datasets that are small and prone to this problem. We also show that very small amounts of error in datasets can have profound effects on the inferences derived from OU models. Our results suggest that simulating fitted models and comparing them with empirical results is critical when fitting OU and other extensions of the Brownian model. We conclude by making recommendations for best practice in fitting OU models in phylogenetic comparative analyses, and for interpreting the parameters of the OU model.
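
The OU model referred to above is the mean-reverting diffusion dX = alpha*(theta - X) dt + sigma dW, with Brownian motion recovered at alpha = 0. The Python sketch below is only an Euler-Maruyama simulation of that process (not the phylogenetic likelihood machinery); it illustrates how an OU trait's variance saturates near sigma^2/(2*alpha) while a Brownian trait's variance keeps growing, which is why the two can be hard to tell apart when alpha is small or data are few.

import numpy as np

def simulate_ou(x0=0.0, alpha=0.5, theta=0.0, sigma=1.0, t_max=10.0, dt=0.01, rng=None):
    """Euler-Maruyama simulation of dX = alpha*(theta - X) dt + sigma dW.

    alpha = 0 recovers Brownian motion, the constant-variance null model.
    """
    rng = rng or np.random.default_rng()
    n = int(t_max / dt)
    x = np.empty(n + 1)
    x[0] = x0
    noise = rng.standard_normal(n) * np.sqrt(dt)
    for i in range(n):
        x[i + 1] = x[i] + alpha * (theta - x[i]) * dt + sigma * noise[i]
    return x

rng = np.random.default_rng(42)
bm = simulate_ou(alpha=0.0, rng=rng)   # Brownian motion
ou = simulate_ou(alpha=2.0, rng=rng)   # strongly constrained OU trait
print(bm.std(), ou.std())              # OU spread stays near sqrt(sigma^2 / (2*alpha))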

Relevance: 100.00%

Abstract:

We present a comprehensive analysis of the spatial, kinematic and chemical properties of stars and globular clusters (GCs) in the 'ordinary' elliptical galaxy NGC 4494 using data from the Keck and Subaru telescopes. We derive galaxy surface brightness and colour profiles out to large galactocentric radii. We compare the latter to metallicities derived using the near-infrared Calcium Triplet. We obtain stellar kinematics out to ~3.5 effective radii. The latter appear flattened or elongated beyond ~1.8 effective radii, in contrast to the relatively round photometric isophotes. In fact, NGC 4494 may be a flattened galaxy, possibly even an S0, seen at an inclination of ~45 degrees. We publish a catalogue of 431 GC candidates brighter than i_0 = 24 based on the photometry, of which 109 are confirmed spectroscopically and 54 have measured spectroscopic metallicities. We also report the discovery of three spectroscopically confirmed ultra-compact dwarfs around NGC 4494 with measured metallicities of -0.4 ≲ [Fe/H] ≲ -0.3. Based on their properties, we conclude that they are simply bright GCs. The metal-poor GCs are found to be rotating with an amplitude similar to that of the galaxy stars, while the metal-rich GCs show marginal rotation. We supplement our analysis with available literature data and results. Using model predictions of galaxy formation, and a suite of merger simulations, we find that many of the observational properties of NGC 4494 may be explained by formation in a relatively recent gas-rich major merger. Complete studies of individual galaxies incorporating a range of observational avenues and methods, such as the one presented here, will be an invaluable tool for constraining the fine details of galaxy formation models, especially at large galactocentric radii.

Relevance: 100.00%

Abstract:

Aim of the study: Magnolia ovata (A.St.-Hil.) Spreng (formerly Talauma ovata), known as "pinha-do-brejo" or "baguacu", is a large tree widely distributed in Brazil. Its trunk bark has been used in folk medicine against fever. However, no data have been published to support this antipyretic ethnopharmacological use. This study investigated the antipyretic and anti-inflammatory effects of the ethanolic extract (EEMO), the dichloromethane fraction (DCM), and the isolated compound costunolide. Materials and methods: The antipyretic and anti-inflammatory activities were evaluated in experimental models of fever and inflammation in mice. Results: The oral administration of EEMO, DCM and costunolide inhibited carrageenan (Cg)-induced paw oedema (ID50 values of 72.35 (38.64-135.46) mg/kg, 5.8 (2.41-14.04) mg/kg and 0.18 (0.12-0.27) mg/kg, respectively) and was effective in abolishing lipopolysaccharide (LPS)-induced fever (30 mg/kg, 4.5 mg/kg and 0.15 mg/kg, respectively). EEMO was also effective in reducing cell migration in the pleurisy model. Intraplantar injection of costunolide also reduced the paw oedema and the myeloperoxidase and N-acetyl-glucosaminidase activity induced by Cg in mice. Conclusions: Collectively, these results show, for the first time, that extracts obtained from Magnolia ovata possess antipyretic and anti-inflammatory properties, and that costunolide appears to be the compound responsible for these effects.

Relevance: 100.00%

Abstract:

Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Generally, normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subject variation. In this article, we utilize skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent distributions form an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study.
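
As an illustration of what an asymmetric random-effects assumption means in practice, the Python sketch below simulates a random-intercept model whose subject effects are drawn from a skew-normal distribution via the standard stochastic representation delta*|U0| + sqrt(1 - delta^2)*U1 with delta = lambda/sqrt(1 + lambda^2); the fixed-effect values and variable names are hypothetical, and no Bayesian fitting is attempted here.

import numpy as np

rng = np.random.default_rng(7)

def rskewnorm(size, scale=1.0, shape=3.0):
    """Skew-normal draws via delta*|U0| + sqrt(1 - delta^2)*U1,
    with delta = shape / sqrt(1 + shape^2)."""
    delta = shape / np.sqrt(1.0 + shape**2)
    u0 = rng.standard_normal(size)
    u1 = rng.standard_normal(size)
    return scale * (delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1)

# Random-intercept mixed model with right-skewed subject effects
n_subjects, n_obs = 50, 8
beta0, beta1, sigma_e = 200.0, 2.5, 10.0           # illustrative fixed effects
b = rskewnorm(n_subjects, scale=25.0, shape=4.0)   # skewed random intercepts
age = rng.uniform(30, 60, size=(n_subjects, n_obs))
y = beta0 + beta1 * age + b[:, None] + rng.normal(0, sigma_e, size=(n_subjects, n_obs))
print(np.mean(b), np.median(b))  # mean exceeds median: asymmetric among-subject variation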

Relevance: 100.00%

Abstract:

Two stochastic epidemic lattice models, the susceptible-infected-recovered and the susceptible-exposed-infected models, are studied on a Cayley tree of coordination number k. The spreading of the disease in the former is found to occur when the infection probability b is larger than b_c = k/[2(k - 1)]. In the latter, which is equivalent to a dynamic site percolation model, the spreading occurs when the infection probability p is greater than p_c = 1/(k - 1). We set up and solve the time evolution equations for both models and determine the final and time-dependent properties, including the epidemic curve. We show that the two models are closely related by revealing that their relevant properties are exactly mapped into each other when p = b/[k - (k - 1)b]. These include the cluster size distribution and the density of individuals of each type, quantities that have been determined in closed form.
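
As a consistency check using only the quantities quoted above, substituting the SIR threshold b_c into the stated mapping recovers the site-percolation threshold p_c:

\[
p\big|_{b = b_c} \;=\; \frac{b_c}{k - (k-1)\,b_c}
\;=\; \frac{k/[2(k-1)]}{\,k - k/2\,}
\;=\; \frac{k}{2(k-1)} \cdot \frac{2}{k}
\;=\; \frac{1}{k-1} \;=\; p_c .
\]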

Relevance: 100.00%

Abstract:

Human parasitic diseases are the foremost threat to human health and welfare around the world. Trypanosomiasis is a very serious infectious disease against which the currently available drugs are limited and not effective. Therefore, there is an urgent need for new chemotherapeutic agents. One attractive drug target is the major cysteine protease from Trypanosoma cruzi, cruzain. In the present work, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) studies were conducted on a series of thiosemicarbazone and semicarbazone derivatives as inhibitors of cruzain. Molecular modeling studies were performed in order to identify the preferred binding mode of the inhibitors in the enzyme active site, and to generate structural alignments for the three-dimensional quantitative structure-activity relationship (3D QSAR) investigations. Statistically significant models were obtained (CoMFA, r^2 = 0.96 and q^2 = 0.78; CoMSIA, r^2 = 0.91 and q^2 = 0.73), indicating their predictive ability for untested compounds. The models were externally validated employing a test set, and the predicted values were in good agreement with the experimental results. The final QSAR models and the information gathered from the 3D CoMFA and CoMSIA contour maps provided important insights into the chemical and structural basis involved in the molecular recognition process of this family of cruzain inhibitors, and should be useful for the design of new structurally related analogs with improved potency.
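
The q^2 values quoted above are leave-one-out cross-validated coefficients of determination, q^2 = 1 - PRESS/SS. The Python sketch below computes that quantity for a plain least-squares model on synthetic descriptors; it is only a stand-in for the PLS-based CoMFA/CoMSIA field models, which are not reproduced here.

import numpy as np

def q2_loo(X, y):
    """Leave-one-out cross-validated q^2 = 1 - PRESS / SS for a linear
    least-squares model (a stand-in for the PLS models used in CoMFA/CoMSIA)."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        press += (y[i] - X[i] @ coef) ** 2
    ss = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss

# Tiny synthetic example: 20 compounds, 3 descriptors plus an intercept column
rng = np.random.default_rng(3)
descriptors = rng.normal(size=(20, 3))
X = np.column_stack([np.ones(20), descriptors])
y = 1.0 + descriptors @ np.array([0.8, -0.5, 0.3]) + rng.normal(0, 0.2, 20)
print(round(q2_loo(X, y), 2))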

Relevance: 100.00%

Abstract:

In this article, we discuss inferential aspects of measurement error regression models with null intercepts when the unknown quantity x (a latent variable) follows a skew-normal distribution. We first examine the maximum-likelihood approach to estimation via the EM algorithm by exploring statistical properties of the model considered. Then, the marginal likelihood, the score function and the observed information matrix of the observed quantities are presented, allowing direct implementation of inference. In order to discuss diagnostic techniques for this class of models, we derive the appropriate matrices for assessing the local influence on the parameter estimates under different perturbation schemes. The results and methods developed in this paper are illustrated using part of a real data set from Hadgu and Koch [1999, Application of generalized estimating equations to a dental randomized clinical trial. Journal of Biopharmaceutical Statistics, 9, 161-178].

Relevance: 100.00%

Abstract:

The class of symmetric linear regression models has the normal linear regression model as a special case and includes several models that assume that the errors follow a symmetric distribution with longer-than-normal tails. An important member of this class is the t linear regression model, which is commonly used as an alternative to the usual normal regression model when the data contain extreme or outlying observations. In this article, we develop second-order asymptotic theory for score tests in this class of models. We obtain Bartlett-corrected score statistics for testing hypotheses on the regression and the dispersion parameters. The corrected statistics have chi-squared distributions with errors of order O(n^(-3/2)), n being the sample size. The corrections represent an improvement over the corresponding original Rao's score statistics, which are chi-squared distributed up to errors of order O(n^(-1)). Simulation results show that the corrected score tests perform much better than their uncorrected counterparts in samples of small or moderate size.
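
For orientation, Bartlett-type corrections of the kind referred to here (in the spirit of Cordeiro and Ferrari) rescale the score statistic S by a polynomial factor; a generic form, with the paper's specific coefficients not reproduced, is

\[
S^{*} \;=\; S\left[\,1 - \left(c + b\,S + a\,S^{2}\right)\right],
\]

where a, b and c are quantities of order O(n^(-1)) built from cumulants of log-likelihood derivatives, chosen so that S^{*} follows its reference chi-squared distribution with error O(n^(-3/2)) rather than the O(n^(-1)) error of the uncorrected statistic.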