953 results for Models and Principles


Relevance:

100.00%

Publisher:

Abstract:

The level of agreement between climate model simulations and observed surface temperature change is a topic of scientific and policy concern. While the Earth system continues to accumulate energy due to anthropogenic and other radiative forcings, estimates of recent surface temperature evolution fall at the lower end of climate model projections. Global mean temperatures from climate model simulations are typically calculated using surface air temperatures, while the corresponding observations are based on a blend of air and sea surface temperatures. This work quantifies a systematic bias in model-observation comparisons arising from differential warming rates between sea surface temperatures and surface air temperatures over oceans. A further bias arises from the treatment of temperatures in regions where the sea ice boundary has changed. Applying the methodology of the HadCRUT4 record to climate model temperature fields accounts for 38% of the discrepancy in trend between models and observations over the period 1975–2014.
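To make the blending issue concrete, here is a minimal, self-contained sketch (not the paper's code; all grids and coverage fractions are synthetic) of how a HadCRUT4-style global mean differs from a pure surface-air-temperature mean when air and sea surface temperatures warm at different rates and coverage is incomplete:

```python
import numpy as np

# Synthetic anomaly fields on a lat-lon grid (degC): in this toy setup the
# surface air temperature (SAT) warms slightly faster than the sea surface
# temperature (SST), mimicking the differential warming discussed above.
rng = np.random.default_rng(0)
nlat, nlon = 36, 72
sat = rng.normal(0.8, 0.3, (nlat, nlon))
sst = sat - rng.normal(0.1, 0.05, (nlat, nlon))

land_frac = rng.uniform(0.0, 1.0, (nlat, nlon))   # land fraction per cell
obs_mask = rng.uniform(size=(nlat, nlon)) < 0.85  # cells with observations

# "Model-style" global mean: SAT everywhere, full coverage.
tas_mean = sat.mean()

# "Observation-style" mean: SAT over land, SST over ocean, observed cells only.
blend = land_frac * sat + (1.0 - land_frac) * sst
blend_mean = blend[obs_mask].mean()

print(f"global SAT mean:      {tas_mean:+.3f} degC")
print(f"blended, masked mean: {blend_mean:+.3f} degC")
# Area (cos-latitude) weighting and changing sea-ice edges are omitted here,
# although the latter is part of the bias quantified in the paper.
```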

Relevance:

100.00%

Publisher:

Abstract:

Two stochastic epidemic lattice models, the susceptible-infected-recovered (SIR) and the susceptible-exposed-infected (SEI) models, are studied on a Cayley tree of coordination number k. The spreading of the disease in the former is found to occur when the infection probability b is larger than b_c = k/[2(k - 1)]. In the latter, which is equivalent to a dynamic site percolation model, the spreading occurs when the infection probability p is greater than p_c = 1/(k - 1). We set up and solve the time evolution equations for both models and determine the final and time-dependent properties, including the epidemic curve. We show that the two models are closely related by revealing that their relevant properties are exactly mapped into each other when p = b/[k - (k - 1)b]. These include the cluster size distribution and the density of individuals of each type, quantities that have been determined in closed form.
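The thresholds and the mapping quoted above can be checked numerically; the small script below (illustrative only, not taken from the paper) verifies that the SIR threshold maps onto the SEI threshold under p = b/[k - (k - 1)b]:

```python
from fractions import Fraction

def b_c(k):
    """SIR spreading threshold on a Cayley tree of coordination number k."""
    return Fraction(k, 2 * (k - 1))

def p_c(k):
    """SEI (dynamic site percolation) threshold on the same tree."""
    return Fraction(1, k - 1)

def sir_to_sei(b, k):
    """Mapping between the two models: p = b / [k - (k - 1) b]."""
    return b / (k - (k - 1) * b)

for k in (3, 4, 5, 6):
    # The SIR threshold should land exactly on the SEI threshold.
    assert sir_to_sei(b_c(k), k) == p_c(k)
    print(f"k={k}: b_c={b_c(k)}, p_c={p_c(k)}")
```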

Relevance:

100.00%

Publisher:

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria that select lag and rank simultaneously perform better in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
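As a rough illustration of the kind of restricted-model forecasting exercise described above, the sketch below selects a lag length and a cointegration rank and fits a cointegration-restricted VECM with statsmodels on simulated data. SCCF restrictions are not imposed (they are not available in the library), and all data and settings are placeholders rather than the paper's setup:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_order, select_coint_rank

# Simulate a small cointegrated system: x follows a random walk and
# y tracks x plus stationary noise, so (y - x) is the cointegrating relation.
rng = np.random.default_rng(1)
n = 300
x = np.cumsum(rng.normal(size=n))
y = x + rng.normal(scale=0.5, size=n)
data = np.column_stack([x, y])

# Lag-length selection via standard criteria, rank selection via trace test.
lag_order = select_order(data, maxlags=8)
rank_test = select_coint_rank(data, det_order=0, k_ar_diff=lag_order.aic)

# Fit the cointegration-restricted VECM and forecast 10 steps ahead.
model = VECM(data, k_ar_diff=lag_order.aic, coint_rank=rank_test.rank)
res = model.fit()
print(res.predict(steps=10))
```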

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates which properties money-demand functions must satisfy to be consistent with multidimensional extensions of Lucas's (2000) versions of the Sidrauski (1967) and shopping-time models. We also investigate how these classes of models relate to each other regarding the rationalization of money demands. We conclude that money-demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
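For reference, the two counterexample specifications are conventionally written as follows (standard notation, not necessarily the paper's; m denotes real money balances and r the nominal interest rate):

```latex
% Log-log money demand: constant interest-rate elasticity \eta (here \eta \ge 1)
\[
  \ln m = \ln A - \eta \ln r .
\]
% Semi-log money demand: constant interest-rate semi-elasticity \xi
\[
  \ln m = \ln B - \xi r .
\]
```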

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

VAMP (variable-mass particle) scenarios, in which the mass of the cold dark matter particles is a function of the scalar field responsible for the present acceleration of the Universe, have been proposed as a solution to the cosmic coincidence problem, since in the attractor regime both dark energy and dark matter scale in the same way. We find that only a narrow region in parameter space leads to models with viable values for the Hubble constant and dark energy density today. In the allowed region, the dark energy density starts to dominate around the present epoch and consequently such models cannot solve the coincidence problem. We show that the age of the Universe in this scenario is considerably higher than the age for noncoupled dark energy models, and conclude that more precise independent measurements of the age of the Universe would be useful in distinguishing between coupled and noncoupled dark energy models.
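In generic VAMP scenarios the dark matter energy density carries a scalar-field-dependent particle mass; the form below is the standard one for this class of models and is given only for orientation, not necessarily the exact parametrization adopted in the paper:

```latex
\[
  \rho_{\mathrm{dm}}(a) \;=\; m(\phi)\, n_{\mathrm{dm}}(a)
  \;=\; m(\phi)\, n_{0}\, a^{-3} ,
\]
```

so the field dependence of m(φ) modifies the usual a^{-3} dilution and is what allows dark matter and dark energy to scale alike on the attractor.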

Relevance:

100.00%

Publisher:

Abstract:

A general construction of affine nonabelian (NA) Toda models in terms of the axial and vector gauged two-loop WZNW model is discussed. They represent integrable perturbations of the conformal sigma-models (with tachyons included) describing (charged) black-hole-type string backgrounds. We study the off-critical T-duality between certain families of axial- and vector-type integrable models for the case of affine NA-Toda theories with one global U(1) symmetry. In particular, we find the Lie-algebraic condition defining a subclass of T-self-dual torsionless NA-Toda models and their zero-curvature representation. (C) 2001 Academic Press.

Relevance:

100.00%

Publisher:

Abstract:

Data for four networks that can be used in comparative studies of methods for transmission network expansion planning are given. These networks are of various types and different levels of complexity. The main mathematical formulations used in transmission expansion studies (transportation models, hybrid models, DC power flow models, and disjunctive models) are also summarised and compared. The main algorithm families are reviewed: analytical, combinatorial, and heuristic approaches. Optimal solutions are not yet known for some of the four networks when more accurate models (e.g. the DC model) are used to represent the power flow equations; the state of the art in this regard is also summarised. This should serve as a challenge to authors searching for new, more efficient methods.
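As a point of reference for the DC power flow model mentioned above, here is a minimal sketch (toy three-bus data, not one of the four benchmark networks) that solves B·θ = P and recovers the branch flows f_km = (θ_k - θ_m)/x_km:

```python
import numpy as np

# DC power flow on a toy 3-bus network, with bus 0 taken as the slack.
branches = [  # (from_bus, to_bus, reactance x in p.u.)
    (0, 1, 0.10),
    (0, 2, 0.20),
    (1, 2, 0.25),
]
injections = np.array([0.0, 1.5, -1.5])  # net injection per bus (p.u.)

n_bus = 3
B = np.zeros((n_bus, n_bus))  # susceptance matrix of the DC model
for k, m, x in branches:
    b = 1.0 / x
    B[k, k] += b; B[m, m] += b
    B[k, m] -= b; B[m, k] -= b

# Solve B * theta = P on the non-slack buses (theta_slack = 0).
theta = np.zeros(n_bus)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])

for k, m, x in branches:
    print(f"flow {k}->{m}: {(theta[k] - theta[m]) / x:+.3f} p.u.")
```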

Relevance:

100.00%

Publisher:

Abstract:

Linear mixed effects models have been widely used in the analysis of data where responses are clustered around some random effects, so that it is not reasonable to assume independence between observations in the same cluster. In most biological applications, it is assumed that the distributions of the random effects and of the residuals are Gaussian, which makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions for robust inference are described. Specific distributions examined include univariate and multivariate versions of the Student-t, the slash, and the contaminated normal. A Bayesian framework is adopted, and Markov chain Monte Carlo is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
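The normal/independent construction behind the Student-t case can be illustrated with a short simulation (not from the paper): a t-distributed residual is generated as a normal whose scale is rescaled by a Gamma-distributed weight, which is exactly what makes Gibbs-style data augmentation convenient:

```python
import numpy as np

# Normal/independent representation: e | w ~ N(0, sigma^2 / w) with
# w ~ Gamma(nu/2, rate=nu/2) gives e ~ Student-t with nu degrees of freedom.
rng = np.random.default_rng(42)
nu, sigma, n = 4.0, 1.0, 200_000

w = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)    # mixing weights
e_mix = rng.normal(0.0, sigma / np.sqrt(w))              # normal given w
e_t = sigma * rng.standard_t(df=nu, size=n)              # direct t draws

# The two constructions agree in distribution (compare tail quantiles).
for q in (0.95, 0.99):
    print(f"q={q}: mixture {np.quantile(e_mix, q):.3f}  "
          f"direct t {np.quantile(e_t, q):.3f}")
```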

Relevance:

100.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Background: Alternative and effective forms of simulation training are needed because of the ethical and medico-legal issues involved in training surgical skills on living patients, human cadavers, and living animals. Aims: To evaluate whether bench model fidelity affects the acquisition of elliptical excision skills by novice medical students. Materials and Methods: Forty novice medical students were randomly assigned to five practice conditions with instructor-directed elliptical excision skills training (n = 8 each): didactic materials (control); organic bench model (low fidelity); ethylene-vinyl acetate bench model (low fidelity); chicken leg skin bench model (high fidelity); or pig foot skin bench model (high fidelity). Pre- and post-tests were applied. A global rating scale, effect size, and self-perceived confidence based on a Likert scale were used to evaluate all elliptical excision performances. Results: After training, the students practicing on bench models performed better on the global rating scale (all P < 0.0000) and felt more confident in performing elliptical excision skills (all P < 0.0000) than the control group. There was no significant difference (all P > 0.05) between the groups that trained on bench models. The magnitude of the effect (basic cutaneous surgery skills training) was considered large (>0.80) in all measurements. Conclusion: The acquisition of elliptical excision skills after instructor-directed training on low-fidelity bench models was similar to that after training on high-fidelity bench models, and students who trained on any of the simulators improved their elliptical excision performance more substantially than those who learned from didactic materials alone.