922 results for: model with default Vasicek model and CIR model for the short rate
Abstract:
The MATLAB model is contained within the compressed folders (versions are available as .zip and .tgz). This model uses MERRA reanalysis data (>34 years available) to estimate the hourly aggregated wind power generation for a predefined (fixed) distribution of wind farms. A ready-made example is included for the wind farm distribution of Great Britain as of April 2014 ("CF.dat"). This consists of an hourly time series of GB-total capacity factor spanning the period 1980-2013 inclusive. Given the global nature of reanalysis data, the model can be applied to any specified distribution of wind farms in any region of the world. Users are, however, strongly advised to bear in mind the limitations of reanalysis data when using this model/data. This is discussed in our paper: Cannon, Brayshaw, Methven, Coker, Lenaghan. "Using reanalysis data to quantify extreme wind power generation statistics: a 33 year case study in Great Britain". Submitted to Renewable Energy in March 2014. Additional information about the model is contained in the model code itself, in the accompanying ReadMe file, and on our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/
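As a quick orientation to the data product, the sketch below shows how one might load and summarize an hourly capacity-factor series such as "CF.dat" in Python; the single-column plain-text layout assumed here is an illustration, since the distributed model itself is MATLAB code and the actual file format may differ.

```python
# Illustrative sketch (not the paper's MATLAB code): load an hourly
# capacity-factor series such as "CF.dat" and summarize it. A single-column
# plain-text layout is assumed here; the actual file format may differ.
import numpy as np
import pandas as pd

cf = np.loadtxt("CF.dat")                                # hourly GB-total capacity factor
hours = pd.date_range("1980-01-01", periods=cf.size, freq="h")
series = pd.Series(cf, index=hours)                      # 1980-2013 inclusive

print("mean capacity factor:", series.mean())
print(series.groupby(series.index.year).mean())          # annual means
```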
Abstract:
Observations of the amplitudes and Doppler shifts of received HF radio waves are compared with model predictions made using a two-dimensional ray-tracing program. The signals are propagated over a sub-auroral path, which is shown to lie along the latitudes of the mid-latitude trough at times of low geomagnetic activity. Generalizing the predictions to include a simple model of the trough in the density and height of the F2 peak explains the anomalous diurnal variations observed. The behavior of received amplitude, Doppler shift, and signal-to-noise ratio as a function of the Kp index, the time of day, and the season (in 17 months of continuous recording) is found to agree closely with that predicted using the statistical position of the trough as deduced from 8 years of Alouette satellite soundings. The variation with Kp in the times at which large signal amplitudes are observed, and the complete absence of such amplitudes when Kp exceeds 2.75, are two features that implicate the trough in these effects.
Abstract:
Model intercomparisons have identified important deficits in the representation of the stable boundary layer by the turbulence parametrizations used in current weather and climate models. However, detrimental impacts of more realistic schemes on the large-scale flow have hindered progress in this area. Here we implement a total turbulent energy (TTE) scheme in the climate model ECHAM6. The TTE scheme considers the effects of Earth's rotation and static stability on the turbulence length scale. In contrast to the previously used turbulence scheme, it also implicitly represents the entrainment flux in a dry convective boundary layer. Reducing the previously exaggerated surface drag in stable boundary layers indeed increases southern-hemispheric zonal winds and large-scale pressure gradients beyond observed values. These biases can be largely removed by increasing the parametrized orographic drag. Reducing the neutral-limit turbulent Prandtl number warms and moistens low-latitude boundary layers and acts to reduce longstanding radiation biases in the stratocumulus regions, the Southern Ocean and the equatorial cold tongue that are common to many climate models.
Abstract:
The predictability of high-impact weather events on multiple time scales is a crucial issue in both scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max Planck Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979-2005. The SDD approach is trained on COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990-1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95% peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in the reanalysis data for this period. To evaluate both the skill of the decadal prediction system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD-simulated regional peak winds. Skill scores are predominantly positive, with the highest values for short lead times and for (peak) wind speeds equal to or above the 75% quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD-simulated peak winds are slightly lower than those for large-scale wind speeds. This behavior can be largely attributed to the fact that peak winds are a proxy for gusts and thus have higher variability than wind speeds. The introduced cost-efficient downscaling technique has the advantage of estimating not only wind speeds but also peak winds (a proxy for gusts), and it can easily be applied to large ensemble datasets like operational decadal prediction systems.
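For readers unfamiliar with quantile verification, the following minimal Python sketch computes a quantile skill score based on the usual pinball (check-function) loss against a climatological reference; the paper's exact score definition may differ, and all data here are synthetic.

```python
# Minimal sketch of a quantile verification skill score using pinball loss;
# the paper's exact score definition may differ, and all data are synthetic.
import numpy as np

def pinball_loss(y, q, tau):
    """Mean check-function loss of quantile forecasts q at level tau."""
    u = y - q
    return np.mean(np.where(u >= 0, tau * u, (tau - 1.0) * u))

def quantile_skill_score(y, q_fcst, q_ref, tau):
    """1 - loss(forecast)/loss(reference); positive = forecast beats reference."""
    return 1.0 - pinball_loss(y, q_fcst, tau) / pinball_loss(y, q_ref, tau)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)                        # hypothetical large-scale predictor
y = 8.0 + 3.0 * x + rng.normal(0.0, 1.0, 2000)   # stand-in for regional peak winds
tau = 0.75
q_ref = np.full_like(y, np.quantile(y, tau))     # climatological reference forecast
q_fcst = 8.0 + 3.0 * x + 0.674                   # conditional 75% quantile (z_0.75)
print(quantile_skill_score(y, q_fcst, q_ref, tau))   # > 0: beats climatology
```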
Abstract:
Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding depends strongly on the modelling approach and the accuracy of topographic data. Here, the areas at risk of seawater flooding were quantified for the London boroughs, based on the SLR scenarios projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally-adjusted bathtub modelling approach. Medium- to very-high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and the DEMs used, it is estimated that 3%-8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models based on SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for the London boroughs.
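A bathtub model of this kind is conceptually simple: a DEM cell floods if it lies below the projected water level and is hydraulically connected to the sea. The Python sketch below illustrates that test on a toy grid; the tidal adjustment and the paper's actual implementation are not reproduced, and all values are hypothetical.

```python
# Illustrative sketch of a connectivity-enforced bathtub inundation test on a
# DEM grid; not the paper's implementation, and the tidal adjustment is omitted.
import numpy as np
from scipy import ndimage

def bathtub_inundation(dem, water_level, sea_mask):
    """Cells below water_level that are hydraulically connected to the sea."""
    below = dem < water_level
    labels, _ = ndimage.label(below)             # 4-connected low-lying patches
    sea_labels = np.unique(labels[sea_mask & below])
    sea_labels = sea_labels[sea_labels != 0]
    return np.isin(labels, sea_labels)

# toy grid: a coastal strip plus low cells cut off from the sea by high ground
dem = np.array([[0.2, 0.5, 2.0, 0.3],
                [0.4, 1.8, 2.2, 0.2],    # right-hand 0.3/0.2 cells are isolated
                [0.6, 2.1, 2.4, 2.5]])
sea = np.zeros_like(dem, dtype=bool)
sea[:, 0] = True                          # left edge borders the sea
print(bathtub_inundation(dem, water_level=1.0, sea_mask=sea))
```

Note that a naive bathtub model without the connectivity step would also flood the isolated depression, which is one reason inundation extents are so sensitive to DEM quality.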
Abstract:
The benefits of breastfeeding for children's health have been highlighted in many studies. The innovative aspect of the present study lies in its use of a multilevel model, a technique that has rarely been applied to studies on breastfeeding. The data reported were collected from a larger study, the Family Budget Survey (Pesquisa de Orçamentos Familiares), carried out between 2002 and 2003 in Brazil, which involved a sample of 48,470 households. A representative national sample of 1,477 infants aged 0-6 months was used. The statistical analysis was performed using a multilevel model with two levels, grouped by region. In Brazil, breastfeeding prevalence was 58%. The factors that negatively influenced breastfeeding were more than four residents living in the same household [odds ratio (OR) = 0.68, 90% confidence interval (CI) = 0.51-0.89] and mothers aged 30 years or more (OR = 0.68, 90% CI = 0.53-0.89). The factors that positively influenced breastfeeding were higher socio-economic level (OR = 1.37, 90% CI = 1.01-1.88), families with more than two infants under 5 years (OR = 1.25, 90% CI = 1.00-1.58) and residence in rural areas (OR = 1.25, 90% CI = 1.00-1.58). Although the majority of mothers were aware of the value of maternal milk and breastfed their babies, the prevalence of breastfeeding remains lower than the rate advised by the World Health Organization, and the number of residents living in the same household, along with maternal age of 30 years or older, were both factors associated with cessation of breastfeeding before 6 months.
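To make the modelling strategy concrete, here is a minimal Python sketch of a two-level (region-grouped) random-intercept logistic model fitted to simulated data; the covariates, effect sizes and fitting method are hypothetical stand-ins chosen to mimic the direction of the abstract's odds ratios, not the study's actual analysis.

```python
# Sketch of a two-level logistic model (region random intercepts) on simulated
# data; variables and effects are hypothetical, loosely mimicking the abstract.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n, n_regions = 1500, 5
region = rng.integers(0, n_regions, n)
region_effect = rng.normal(0.0, 0.4, n_regions)   # level-2 random intercepts
mother_30plus = rng.integers(0, 2, n)
rural = rng.integers(0, 2, n)
# true model: odds lowered for mothers 30+ (~OR 0.68), raised in rural areas (~OR 1.25)
logit = 0.3 - 0.39 * mother_30plus + 0.22 * rural + region_effect[region]
breastfed = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame(dict(breastfed=breastfed, mother_30plus=mother_30plus,
                       rural=rural, region=region.astype(str)))
model = BinomialBayesMixedGLM.from_formula(
    "breastfed ~ mother_30plus + rural", {"region": "0 + C(region)"}, df)
fit = model.fit_vb()                               # variational Bayes fit
print(fit.summary())                               # exp(coef) ~ odds ratios
```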
Abstract:
A previously proposed model describing the trapping site of interstitial atomic hydrogen in borate glasses is analyzed. In this model, the atomic hydrogen is stabilized by van der Waals forces at the centers of oxygen polygons belonging to B-O ring structures in the glass network. The previously reported isothermal decay data for atomic hydrogen are discussed in the light of this microscopic model. A system of coupled differential equations for the observed decay kinetics was solved numerically using the Runge-Kutta method. The experimental untrapping activation energy of 0.7 x 10^-19 J is in good agreement with the calculated dispersion interaction between the stabilized atomic hydrogen and the neighboring oxygen atoms at the vertices of hexagonal ring structures.
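As an illustration of the numerical approach mentioned above, the sketch below integrates a toy coupled decay system (trapped hydrogen releasing into a mobile population that recombines) with a classic fixed-step fourth-order Runge-Kutta scheme in Python; the rate equations and constants are hypothetical, not the paper's fitted kinetics.

```python
# Classic fixed-step RK4 on a toy coupled decay system; rate constants and the
# model itself are hypothetical, not the paper's fitted kinetics.
import numpy as np

def rhs(t, y, k_untrap=0.05, k_rec=0.2):
    n_t, n_m = y                                         # trapped, mobile H
    return np.array([-k_untrap * n_t,                    # untrapping
                     k_untrap * n_t - k_rec * n_m**2])   # release minus recombination

def rk4(f, y0, t0, t1, n_steps):
    t, y = t0, np.asarray(y0, float)
    h = (t1 - t0) / n_steps
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

print(rk4(rhs, y0=[1.0, 0.0], t0=0.0, t1=50.0, n_steps=5000))
```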
Abstract:
Item response theory (IRT) comprises a set of statistical models that are useful in many fields, especially when there is interest in studying latent variables. These latent variables are considered directly in item response models (IRMs) and are usually called latent traits. A usual assumption for parameter estimation in IRMs, when a single group of examinees is considered, is that the latent traits are random variables following a standard normal distribution. However, many works suggest that this assumption does not hold in many cases. Furthermore, when this assumption fails, the parameter estimates tend to be biased and inference can be misleading. It is therefore important to model the distribution of the latent traits properly. In this paper we present an alternative model for the latent traits based on the so-called skew-normal distribution; see Genton (2004). We use the centred parameterization proposed by Azzalini (1985). This approach ensures model identifiability, as pointed out by Azevedo et al. (2009b). A Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm was built for parameter estimation using an augmented-data approach. A simulation study was performed to assess parameter recovery under the proposed model and estimation method, and the effect of the asymmetry level of the latent trait distribution on parameter estimation. We also compared our approach with other estimation methods that assume a symmetric normal distribution for the latent traits. The results indicated that our proposed algorithm recovers all parameters properly. Specifically, the greater the asymmetry level, the better the performance of our approach compared with the alternatives, especially for small sample sizes (numbers of examinees). Furthermore, we analyzed a real data set that shows evidence of asymmetry in the latent trait distribution. The results obtained with our approach confirmed strong negative asymmetry of the latent trait distribution.
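The centred parameterization mentioned above maps the mean, variance and skewness of the skew-normal to its direct parameters. The Python sketch below implements the standard moment relations and checks them by simulation; it is intended only as an illustration of Azzalini's parameterization, not of the paper's MHWGS sampler.

```python
# Skew-normal centred parameterization: map (mean, variance, skewness) to the
# direct parameters (xi, omega, alpha) via the standard moment relations.
import numpy as np
from scipy import stats

def centred_to_direct(mu, sigma, g1):
    r = 2.0 * g1 / (4.0 - np.pi)                # |g1| must be < ~0.9953
    t = np.sign(r) * abs(r) ** (1.0 / 3.0)
    m = t / np.sqrt(1.0 + t * t)                # m = E[Z], Z the standardized SN
    delta = m * np.sqrt(np.pi / 2.0)
    omega = sigma / np.sqrt(1.0 - m * m)
    xi = mu - omega * m
    alpha = delta / np.sqrt(1.0 - delta * delta)
    return xi, omega, alpha

xi, omega, alpha = centred_to_direct(0.0, 1.0, g1=-0.6)   # negative asymmetry
z = stats.skewnorm.rvs(alpha, loc=xi, scale=omega, size=200_000, random_state=0)
print(z.mean(), z.std(), stats.skew(z))                   # ~0, ~1, ~-0.6
```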
Abstract:
We introduce a stochastic heterogeneous interacting-agent model for the short-time non-equilibrium evolution of excess demand and price in a stylized asset market. We consider a combination of social interaction within peer groups and individually heterogeneous fundamentalist trading decisions that take into account the market price and the perceived fundamental value of the asset. The resulting excess demand is coupled to the market price. Rigorous analysis reveals that this feedback may lead to price oscillations, a single bounce, or monotonic price behaviour. The model is a rare example of an analytically tractable interacting-agent model that allows us to deduce in detail the origin of these different collective patterns. For a natural choice of initial distribution, the results are independent of the graph structure that models the peer network of agents whose decisions influence each other.
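To see how a demand-price feedback can produce either oscillatory or monotonic relaxation, consider the deliberately simplified linear toy system below. This is not the paper's model; it only illustrates, in Python, how the sign of a discriminant separates the two regimes.

```python
# Toy sketch (not the paper's model): a linear price/excess-demand feedback
#   dp/dt = lam * D,   dD/dt = -gamma * D + beta * (v - p),
# relaxing toward the perceived fundamental value v either with oscillations
# or monotonically, depending on the sign of gamma**2 - 4*lam*beta.
import numpy as np

def simulate(lam, gamma, beta, v=1.0, p0=0.5, d0=0.0, dt=0.01, steps=5000):
    p, d = p0, d0
    path = []
    for _ in range(steps):
        p += dt * lam * d                         # price driven by excess demand
        d += dt * (-gamma * d + beta * (v - p))   # fundamentalist response
        path.append(p)
    return np.array(path)

for lam, gamma, beta in [(1.0, 0.2, 1.0), (1.0, 3.0, 1.0)]:
    kind = "oscillatory" if gamma**2 - 4.0 * lam * beta < 0 else "monotonic"
    print(kind, "final price:", simulate(lam, gamma, beta)[-1].round(3))
```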
Abstract:
The Grubbs measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative for producing robust estimates while keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for parameter estimation, and to use the local influence method to assess the robustness of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we apply the local influence procedure in a study comparing the precision of several thermocouples.
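The robustness mechanism behind scale mixtures of normals can be shown in a few lines: EM for a Student-t model (a normal scale mixture) downweights outlying observations through latent weights. The Python sketch below demonstrates this generic mechanism on simulated data; it is not the paper's Grubbs-model algorithm.

```python
# EM for a Student-t location/scale model (fixed nu): the latent scale weights
# w downweight outliers. Generic mechanism only, not the paper's algorithm.
import numpy as np

def t_em(y, nu=4.0, iters=50):
    mu, sigma2 = np.median(y), np.var(y)
    for _ in range(iters):
        d2 = (y - mu) ** 2 / sigma2
        w = (nu + 1.0) / (nu + d2)           # E-step: latent mixing weights
        mu = np.sum(w * y) / np.sum(w)       # M-step: weighted mean
        sigma2 = np.mean(w * (y - mu) ** 2)  # M-step: weighted scale
    return mu, sigma2

rng = np.random.default_rng(2)
y = np.concatenate([rng.normal(10.0, 1.0, 95),
                    rng.normal(30.0, 1.0, 5)])          # 5 gross outliers
print("sample mean:", round(y.mean(), 2),
      " robust EM mean:", round(t_em(y)[0], 2))         # EM stays near 10
```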
Abstract:
In clinical trials, it may be of interest to take physical and emotional well-being into account, in addition to survival, when comparing treatments. Quality-adjusted survival time has the advantage of incorporating information about both survival time and quality of life. In this paper, we discuss the estimation of the expected quality-adjusted survival based on multistate models for the sojourn times in health states. Semiparametric and parametric (exponential-distribution) approaches are considered. A simulation study is presented to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to compute the bias and variance of the estimator.
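As a reminder of how the jackknife yields bias and variance estimates, here is a compact Python sketch applied to a toy quality-adjusted survival statistic; the data and estimator are illustrative, not the paper's multistate estimator.

```python
# Jackknife bias/variance of an estimator, shown on a toy mean
# "quality-adjusted" time; not the paper's multistate estimator.
import numpy as np

def jackknife(estimator, x):
    n = len(x)
    theta_hat = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])  # leave-one-out
    bias = (n - 1) * (loo.mean() - theta_hat)
    var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
    return theta_hat - bias, var        # bias-corrected estimate, variance

rng = np.random.default_rng(3)
utility = rng.uniform(0.5, 1.0, 200)          # health-state utility weights
time_in_state = rng.exponential(12.0, 200)    # sojourn times (months)
qas = utility * time_in_state                 # toy quality-adjusted survival
print(jackknife(np.mean, qas))
```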
The shoving model for the glass-former LiCl·6H2O: A molecular dynamics simulation study
Abstract:
Molecular dynamics (MD) simulations of LiCl·6H2O showed that the diffusion coefficient D, and also the structural relaxation time
Abstract:
The nonadiabatic photochemistry of the guanine molecule (2-amino-6-oxopurine) and some of its tautomers has been studied by means of the high-level ab initio quantum chemistry methods CASSCF and CASPT2. Accurate computations, based for the first time on minimum-energy reaction paths, state minima, transition states, reaction barriers, and conical intersections on the potential energy hypersurfaces of the molecules, allow the photochemistry of guanine and its derivatives to be interpreted within a three-state model. As in the other purine DNA nucleobase, adenine, the ultrafast subpicosecond fluorescence decay measured in guanine is attributed to the barrierless character of the path leading from the initially populated ¹(ππ* La) spectroscopic state of the molecule toward the low-lying methanamine-like conical intersection (gs/ππ* La)CI. In contrast, other tautomers are shown to have a reaction energy barrier along the main relaxation profile. A second, slower decay is attributed to a path involving switches toward two other states, ¹(ππ* Lb) and, in particular, ¹(nOπ*), ultimately leading to conical intersections with the ground state. A common framework for the ultrafast relaxation of the natural nucleobases is obtained, in which the predominant role of a ππ*-type state is confirmed.
Abstract:
Setup time reduction facilitates the flexibility needed for just-in-time production. An integrated steel mill with a meltshop, a continuous caster and a hot rolling mill is often operated as a set of decoupled processes. Setup time reduction provides the flexibility needed to reduce buffering, shorten lead times and create an integrated process flow. The interdependency of setup times, process flexibility and integration was analysed through system dynamics simulation. The results showed significant reductions in energy consumption and tied-up capital. It was concluded that setup time reduction in the hot strip mill can aid process integration and hence improve production economy while reducing environmental impact.
Abstract:
In infinite-horizon financial market economies, competitive equilibria fail to exist unless one imposes restrictions on agents' trades that rule out Ponzi schemes. When there is limited commitment and collateral repossession is the only default punishment, Araujo, Páscoa and Torres-Martínez (2002) proved that Ponzi schemes are ruled out without imposing any exogenous or endogenous debt constraints on agents' trades. Recently, Páscoa and Seghir (2009) showed that this positive result is not robust to the presence of additional default punishments. They provide several examples showing that, in the absence of debt constraints, harsh default penalties may induce agents to run Ponzi schemes that jeopardize equilibrium existence. The objective of this paper is to close a theoretical gap in the literature by identifying endogenous borrowing constraints that rule out Ponzi schemes and ensure the existence of equilibria in a model with limited commitment and (possible) default. We appropriately modify the definition of finitely effective debt constraints, introduced by Levine and Zame (1996) (see also Levine and Zame (2002)), to encompass models with limited commitment, default penalties and collateral. Along this line, we introduce, in the setting of Araujo, Páscoa and Torres-Martínez (2002), Kubler and Schmedders (2003) and Páscoa and Seghir (2009), the concept of actions with finite equivalent payoffs. We show that, independently of the level of default penalties, restricting plans to have finite equivalent payoffs rules out Ponzi schemes and guarantees the existence of an equilibrium that is compatible with the minimal ability to borrow and lend that we expect in our model. An interesting feature of our debt constraints is that they give rise to budget sets that coincide with the standard budget sets of economies having a collateral structure but no penalties (as defined in Araujo, Páscoa and Torres-Martínez (2002)). This illustrates the hidden relation between finitely effective debt constraints and collateral requirements.