894 results for model with default Vasicek model and CIR model for the short rate
Abstract:
Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding depends strongly on the modelling approach and on the accuracy of the topographic data. Here, the areas at risk of sea-water flooding in the London boroughs were quantified based on the SLR scenarios projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) were used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and the DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models from SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for the London boroughs.
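As a rough illustration of the bathtub approach described above, the sketch below floods every DEM cell lying at or below a projected water level and keeps only cells hydrologically connected to the sea. It is a minimal sketch under stated assumptions, not the study's code: the DEM, the sea mask marking cells bordering open water, the cell size and the water level are all hypothetical.

```python
# Minimal bathtub inundation sketch (illustrative, not the paper's code).
# Assumes: `dem` is a 2-D NumPy array of ground elevations in metres,
# `cell_area` the area of one DEM cell, and `sea_level` the projected
# water level (mean sea level + SLR scenario + tidal surge allowance).
import numpy as np
from scipy import ndimage

def bathtub_inundation(dem, sea_level, sea_mask, cell_area=1.0):
    """Return inundated area, keeping only cells connected to the sea."""
    below = dem <= sea_level                          # cells low enough to flood
    labels, _ = ndimage.label(below)                  # group contiguous low-lying cells
    sea_labels = np.unique(labels[sea_mask & below])  # components touching open water
    sea_labels = sea_labels[sea_labels != 0]
    flooded = np.isin(labels, sea_labels)             # hydrologically connected flooding
    return flooded.sum() * cell_area

# Hypothetical usage: 5 m resolution DEM, 1 m of SLR plus a 1.5 m surge allowance.
dem = np.random.default_rng(0).uniform(0.0, 10.0, size=(200, 200))
sea_mask = np.zeros_like(dem, dtype=bool)
sea_mask[:, 0] = True                                 # assume the left edge borders the estuary
print(bathtub_inundation(dem, sea_level=2.5, sea_mask=sea_mask, cell_area=25.0))
```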
Abstract:
The benefits of breastfeeding for children's health have been highlighted in many studies. The innovative aspect of the present study lies in its use of a multilevel model, a technique that has rarely been applied to studies on breastfeeding. The data were drawn from a larger study, the Family Budget Survey (Pesquisa de Orçamentos Familiares), carried out in Brazil between 2002 and 2003, which involved a sample of 48,470 households. A representative national sample of 1477 infants aged 0-6 months was used. The statistical analysis was performed using a multilevel model, with two levels grouped by region. In Brazil, breastfeeding prevalence was 58%. The factors that bore a negative influence on breastfeeding were more than four residents living in the same household [odds ratio (OR) = 0.68, 90% confidence interval (CI) = 0.51-0.89] and mothers aged 30 years or more (OR = 0.68, 90% CI = 0.53-0.89). The factors that positively influenced breastfeeding were the following: higher socio-economic level (OR = 1.37, 90% CI = 1.01-1.88), families with more than two children under 5 years of age (OR = 1.25, 90% CI = 1.00-1.58) and residence in rural areas (OR = 1.25, 90% CI = 1.00-1.58). Although the majority of mothers were aware of the value of maternal milk and breastfed their babies, the prevalence of breastfeeding remains lower than the rate advised by the World Health Organization, and the number of residents living in the same household and maternal age of 30 years or older were both associated with cessation of breastfeeding before 6 months.
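For readers unfamiliar with the reporting convention, the snippet below shows how an odds ratio and its 90% confidence interval follow from a logistic-regression coefficient and standard error. The coefficient and standard error are invented for illustration (chosen to land near the reported OR of 0.68); they are not the study's estimates.

```python
# Converting a logistic-regression coefficient to an odds ratio with a 90% CI.
# The coefficient and standard error are hypothetical, not the study's estimates.
import math
from scipy.stats import norm

beta, se = -0.386, 0.165          # illustrative log-odds coefficient and its SE
z = norm.ppf(0.95)                # two-sided 90% CI -> 5% in each tail (z ~ 1.645)
or_hat = math.exp(beta)
ci = (math.exp(beta - z * se), math.exp(beta + z * se))
print(f"OR = {or_hat:.2f}, 90% CI = {ci[0]:.2f}-{ci[1]:.2f}")
```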
Abstract:
A previously proposed model describing the trapping site of interstitial atomic hydrogen in borate glasses is analyzed. In this model the atomic hydrogen is stabilized by van der Waals forces at the centers of oxygen polygons belonging to B-O ring structures in the glass network. The previously reported isothermal decay data for atomic hydrogen are discussed in the light of this microscopic model. A system of coupled differential equations describing the observed decay kinetics was solved numerically using the Runge-Kutta method. The experimental untrapping activation energy of 0.7 × 10^-19 J is in good agreement with the calculated dispersion interaction between the stabilized atomic hydrogen and the neighboring oxygen atoms at the vertices of hexagonal ring structures. (C) 2009 Elsevier B.V. All rights reserved.
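A minimal sketch of how such coupled decay kinetics can be integrated with a Runge-Kutta scheme is given below. The two-species rate equations, attempt frequency, temperature and recombination constant are hypothetical placeholders; only the activation energy quoted in the abstract is reused.

```python
# Solving a coupled decay system with an explicit Runge-Kutta method (RK45).
# The rate equations and constants are hypothetical stand-ins for the paper's
# trapped-hydrogen kinetics; only the numerical approach is illustrated.
import numpy as np
from scipy.integrate import solve_ivp

K_B = 1.380649e-23   # Boltzmann constant, J/K
E_A = 0.7e-19        # untrapping activation energy from the abstract, J

def rates(t, y, nu, T, k2):
    n_trapped, n_free = y
    k_untrap = nu * np.exp(-E_A / (K_B * T))        # Arrhenius-type untrapping rate
    return [-k_untrap * n_trapped,                  # trapped H released thermally
            k_untrap * n_trapped - k2 * n_free**2]  # free H recombines bimolecularly

sol = solve_ivp(rates, (0.0, 3600.0), [1.0, 0.0],
                args=(1e12, 150.0, 1e-3), method="RK45")
print(sol.y[:, -1])   # remaining trapped and free fractions after one hour
```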
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is interest in studying latent variables. These latent variables are directly considered in Item Response Models (IRM) and are usually called latent traits. A usual assumption for parameter estimation of the IRM, considering one group of examinees, is that the latent traits are random variables following a standard normal distribution. However, many works suggest that this assumption does not hold in many cases, and when it fails the parameter estimates tend to be biased and inference can be misleading. Therefore, it is important to model the distribution of the latent traits properly. In this paper we present an alternative latent-trait model based on the so-called skew-normal distribution; see Genton (2004). We used the centred parameterization proposed by Azzalini (1985). This approach ensures model identifiability, as pointed out by Azevedo et al. (2009b). Also, a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm was built for parameter estimation using an augmented data approach. A simulation study was performed in order to assess parameter recovery under the proposed model and estimation method, and the effect of the asymmetry level of the latent trait distribution on parameter estimation. A comparison of our approach with other estimation methods (which assume symmetric normality for the latent trait distribution) was also considered. The results indicated that our proposed algorithm properly recovers all parameters. Specifically, the greater the asymmetry level, the better the performance of our approach compared with other approaches, mainly in the presence of small sample sizes (numbers of examinees). Furthermore, we analyzed a real data set which presents indications of asymmetry in the latent trait distribution. The results obtained using our approach confirmed the presence of strong negative asymmetry in the latent trait distribution. (C) 2010 Elsevier B.V. All rights reserved.
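The centred parameterization mentioned above works with the mean, standard deviation and Pearson skewness of the skew-normal rather than its direct parameters. The sketch below converts centred parameters to Azzalini's direct parameters and draws latent traits; it is an illustrative fragment based on the standard skew-normal formulas, not the authors' MHWGS implementation, and all names are made up.

```python
# Mapping the centred parameterization (mean, SD, skewness) of the skew-normal
# to Azzalini's direct parameters (location xi, scale omega, shape alpha), then
# drawing latent traits. Names are illustrative; this is not the paper's code.
import numpy as np
from scipy.stats import skewnorm

def cp_to_dp(mean, sd, gamma1):
    """Centred -> direct skew-normal parameters; requires |gamma1| < 0.9953."""
    r = np.sign(gamma1) * (2.0 * abs(gamma1) / (4.0 - np.pi)) ** (1.0 / 3.0)
    mu_z = r / np.sqrt(1.0 + r ** 2)          # mean of the standardized SN variate
    delta = mu_z * np.sqrt(np.pi / 2.0)
    alpha = delta / np.sqrt(1.0 - delta ** 2)
    omega = sd / np.sqrt(1.0 - mu_z ** 2)
    xi = mean - omega * mu_z
    return xi, omega, alpha

# Standard-normal-like latent scale (mean 0, SD 1) with strong negative asymmetry.
xi, omega, alpha = cp_to_dp(0.0, 1.0, -0.8)
theta = skewnorm.rvs(alpha, loc=xi, scale=omega, size=1000, random_state=1)
print(theta.mean(), theta.std(), alpha)
```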
Abstract:
We introduce a stochastic heterogeneous interacting-agent model for the short-time non-equilibrium evolution of excess demand and price in a stylized asset market. We consider a combination of social interaction within peer groups and individually heterogeneous fundamentalist trading decisions which take into account the market price and the perceived fundamental value of the asset. The resulting excess demand is coupled to the market price. Rigorous analysis reveals that this feedback may lead to price oscillations, a single bounce, or monotonic price behaviour. The model is a rare example of an analytically tractable interacting-agent model which allows us to deduce in detail the origin of these different collective patterns. For a natural choice of initial distribution, the results are independent of the graph structure that models the peer network of agents whose decisions influence each other. (C) 2009 Elsevier B.V. All rights reserved.
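A schematic numerical illustration of the demand-price feedback (not the paper's exact specification) is given below: heterogeneous fundamentalists compare the current price with their perceived fundamental values, and the price adjusts in proportion to the aggregate excess demand. The tanh demand rule, the parameters and the absence of peer interaction are simplifying assumptions for illustration only.

```python
# Schematic excess-demand/price feedback with heterogeneous fundamentalists.
# The update rules and parameters are illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(42)
n_agents = 500
fundamentals = rng.normal(10.0, 1.0, n_agents)        # heterogeneous perceived values

price, lam, beta, dt = 8.0, 0.5, 1.0, 0.01
prices = []
for _ in range(2000):
    demand = np.tanh(beta * (fundamentals - price))   # buy if undervalued, sell if overvalued
    excess = demand.mean()                            # aggregate excess demand
    price += lam * excess * dt                        # price reacts to excess demand
    prices.append(price)

print(prices[0], prices[-1])  # price drifts toward the average perceived fundamental
```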
Abstract:
The Grubbs measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions are an interesting alternative for producing robust estimates while keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for parameter estimation, and to use the local influence method to assess the robustness of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we use the local influence procedure in a study comparing the precision of several thermocouples. (C) 2008 Elsevier B.V. All rights reserved.
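To make the model structure concrete, the sketch below simulates data from a Grubbs-type setup in which each device adds its own bias and heavy-tailed error, with a Student-t error written explicitly as a scale mixture of normals. The biases, scales, degrees of freedom and sample size are invented; the paper's EM-type estimation step is not reproduced here.

```python
# Simulating data from a Grubbs-type measurement model with heavy-tailed errors.
# y[i, j] = alpha[j] + x[i] + e[i, j]; e is Student-t, written explicitly as a
# scale mixture of normals (normal error divided by a Gamma-mixed factor).
# Device biases, scales and sizes are illustrative, not the paper's data.
import numpy as np

rng = np.random.default_rng(7)
n_items, nu = 100, 4                           # items measured; t degrees of freedom
alpha = np.array([0.0, 0.3, -0.2])             # additive bias of each device
sigma = np.array([0.5, 0.7, 0.6])              # error scale of each device

x = rng.normal(20.0, 2.0, size=n_items)        # true (latent) values of the items
u = rng.gamma(nu / 2.0, 2.0 / nu, size=(n_items, 1))   # mixing weights, mean 1
e = rng.normal(0.0, 1.0, size=(n_items, 3)) * sigma / np.sqrt(u)
y = x[:, None] + alpha + e                     # one row per item, one column per device

print(y.mean(axis=0))   # column means reflect the common mean of x plus each device's bias
```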
Abstract:
In clinical trials, it may be of interest to take into account physical and emotional well-being in addition to survival when comparing treatments. Quality-adjusted survival time has the advantage of incorporating information about both survival time and quality of life. In this paper, we discuss the estimation of the expected value of quality-adjusted survival, based on multistate models for the sojourn times in health states. Semiparametric and parametric (exponential distribution) approaches are considered. A simulation study is presented to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to compute the bias and variance of the estimator. (C) 2007 Elsevier B.V. All rights reserved.
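As a tiny worked illustration of the quantity being estimated, quality-adjusted survival weights the time spent in each health state by a utility coefficient and sums the results. The states, utility weights and mean sojourn times below are hypothetical.

```python
# Quality-adjusted survival as a utility-weighted sum of sojourn times.
# States, utility weights and mean sojourn times (months) are hypothetical.
utilities = {"remission": 1.0, "relapse": 0.5, "toxicity": 0.3}
mean_sojourn = {"remission": 18.0, "relapse": 6.0, "toxicity": 2.0}

qas = sum(utilities[s] * mean_sojourn[s] for s in utilities)
print(f"Expected quality-adjusted survival: {qas:.1f} quality-adjusted months")
```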
The shoving model for the glass-former LiCl·6H₂O: A molecular dynamics simulation study
Abstract:
Molecular dynamics (MD) simulations of LiCl·6H₂O showed that the diffusion coefficient D, and also the structural relaxation time
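For context, the shoving model referenced in the title relates the structural relaxation time to the instantaneous shear modulus through an elastic activation energy; in its standard form (after Dyre and co-workers) it reads $\tau(T) = \tau_0 \exp\!\left[V_c\, G_\infty(T) / (k_B T)\right]$, where $\tau_0$ is a microscopic vibration time, $V_c$ a characteristic volume and $G_\infty$ the high-frequency plateau shear modulus. This expression is quoted only as general background on the model, not as the paper's specific result.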
Abstract:
The nonadiabatic photochemistry of the guanine molecule (2-amino-6-oxopurine) and some of its tautomers has been studied by means of the high-level ab initio quantum chemistry methods CASSCF and CASPT2. Accurate computations, based for the first time on minimum energy reaction paths, state minima, transition states, reaction barriers, and conical intersections on the potential energy hypersurfaces of the molecules, lead to an interpretation of the photochemistry of guanine and its derivatives within a three-state model. As in the other purine DNA nucleobase, adenine, the ultrafast subpicosecond fluorescence decay measured in guanine is attributed to the barrierless character of the path leading from the initially populated ¹(ππ* La) spectroscopic state of the molecule toward the low-lying methanamine-like conical intersection (gs/ππ* La)CI. In contrast, other tautomers are shown to have a reaction energy barrier along the main relaxation profile. A second, slower decay is attributed to a path involving switches toward two other states, ¹(ππ* Lb) and, in particular, ¹(nOπ*), ultimately leading to conical intersections with the ground state. A common framework for the ultrafast relaxation of the natural nucleobases is obtained, in which the predominant role of a ππ*-type state is confirmed.
Abstract:
Setup time reduction facilitates the flexibility needed for just-in-time production. An integrated steel mill with meltshop, continuous caster and hot rolling mill is often operated as decoupled processes. Setup time reduction provides the flexibility needed to reduce buffering, shorten lead times and create an integrated process flow. The interdependency of setup times, process flexibility and integration was analysed through system dynamics simulation. The results showed significant reductions in energy consumption and tied-up capital. It was concluded that setup time reduction in the hot strip mill can aid process integration and hence improve production economy while reducing environmental impact.
Abstract:
In infinite-horizon financial market economies, competitive equilibria fail to exist if one does not impose restrictions on agents' trades that rule out Ponzi schemes. When there is limited commitment and collateral repossession is the only default punishment, Araujo, Páscoa and Torres-Martínez (2002) proved that Ponzi schemes are ruled out without imposing any exogenous/endogenous debt constraints on agents' trades. Recently, Páscoa and Seghir (2009) showed that this positive result is not robust to the presence of additional default punishments. They provide several examples showing that, in the absence of debt constraints, harsh default penalties may induce agents to run Ponzi schemes that jeopardize equilibrium existence. The objective of this paper is to close a theoretical gap in the literature by identifying endogenous borrowing constraints that rule out Ponzi schemes and ensure existence of equilibria in a model with limited commitment and (possible) default. We appropriately modify the definition of finitely effective debt constraints, introduced by Levine and Zame (1996) (see also Levine and Zame (2002)), to encompass models with limited commitment, default penalties and collateral. Along this line, we introduce in the setting of Araujo, Páscoa and Torres-Martínez (2002), Kubler and Schmedders (2003) and Páscoa and Seghir (2009) the concept of actions with finite equivalent payoffs. We show that, independently of the level of default penalties, restricting plans to have finite equivalent payoffs rules out Ponzi schemes and guarantees the existence of an equilibrium that is compatible with the minimal ability to borrow and lend that we expect in our model. An interesting feature of our debt constraints is that they give rise to budget sets that coincide with the standard budget sets of economies having a collateral structure but no penalties (as defined in Araujo, Páscoa and Torres-Martínez (2002)). This illustrates the hidden relation between finitely effective debt constraints and collateral requirements.
Abstract:
In a country with a high probability of default, higher interest rates may render the currency less attractive if sovereign default is costly. This paper develops that intuition in a simple model and estimates the effect of changes in interest rates on the exchange rate in Brazil, using data from the dates surrounding monetary policy committee meetings and the methodology of identification through heteroskedasticity. Indeed, we find that unexpected increases in interest rates tend to lead the Brazilian currency to depreciate. It follows that granting more independence to a central bank that focuses solely on inflation is not always a free lunch.
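A compact sketch of identification through heteroskedasticity in its simplest two-regime, event-study form appears below: the response of the exchange rate to the interest rate is recovered from the shift in covariances between policy-meeting days (when interest-rate shocks are louder) and control days. The data are simulated and the parameters invented; this is the textbook estimator, not necessarily the paper's exact specification.

```python
# Identification through heteroskedasticity, simplest two-regime version:
# beta = [cov_P(i, e) - cov_C(i, e)] / [var_P(i) - var_C(i)],
# where P = policy-meeting days (louder rate shocks) and C = control days.
# The simultaneous system and all parameters are simulated/illustrative.
import numpy as np

rng = np.random.default_rng(0)
BETA, GAMMA = -0.4, 0.1      # true FX response to rates; rates' response to FX

def simulate(n, sigma_policy):
    eps_i = rng.normal(0.0, sigma_policy, n)    # monetary-policy shock
    eps_e = rng.normal(0.0, 1.0, n)             # other exchange-rate shocks
    denom = 1.0 - BETA * GAMMA
    di = (eps_i + GAMMA * eps_e) / denom        # reduced form of the simultaneous system
    de = (BETA * eps_i + eps_e) / denom
    return di, de

di_p, de_p = simulate(5000, sigma_policy=3.0)   # policy days: larger rate shocks
di_c, de_c = simulate(5000, sigma_policy=1.0)   # control days

num = np.cov(di_p, de_p)[0, 1] - np.cov(di_c, de_c)[0, 1]
den = di_p.var(ddof=1) - di_c.var(ddof=1)
print(num / den)                                # recovers BETA (~ -0.4) despite simultaneity
```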
Abstract:
The scalar sector of the simplest version of the 3-3-1 electroweak model is constructed with three Higgs triplets only. We show that a relation involving two of the constants of the model, two vacuum expectation values of the neutral scalars, and the mass of the doubly charged Higgs boson leads to important information concerning the signals of this scalar particle.
Abstract:
Ionospheric scintillations are caused by time-varying electron density irregularities in the ionosphere, occurring more often at equatorial and high latitudes. This paper focuses exclusively on experiments undertaken in Europe, at geographic latitudes between approximately 50°N and 80°N, where a network of GPS receivers capable of monitoring Total Electron Content and ionospheric scintillation parameters was deployed. The widely used ionospheric scintillation indices S4 and σφ represent a practical measure of the intensity of amplitude and phase scintillation affecting GNSS receivers. However, they do not provide sufficient information regarding the actual tracking errors that degrade GNSS receiver performance. Suitable receiver tracking models, sensitive to ionospheric scintillation, allow the computation of the variance of the output error of the receiver PLL (Phase Locked Loop) and DLL (Delay Locked Loop), which expresses the quality of the range measurements used by the receiver to calculate user position. The ability of such models to incorporate phase and amplitude scintillation effects into the variance of these tracking errors underpins our proposed method of applying relative weights to measurements from different satellites. This gives the least squares stochastic model used for position computation a more realistic representation than the otherwise 'equal weights' model. For pseudorange processing, relative weights were computed so that a 'scintillation-mitigated' solution could be performed and compared to the (non-mitigated) 'equal weights' solution. An improvement of between 17% and 38% in height accuracy was achieved when an epoch-by-epoch differential solution was computed over baselines ranging from 1 to 750 km. The method was then compared with alternative approaches that can be used to improve the least squares stochastic model, such as weighting according to satellite elevation angle or by the inverse of the square of the standard deviation of the code/carrier divergence (σCCDiv). The influence of multipath effects on the proposed mitigation approach is also discussed. With the use of high-rate scintillation data in addition to the scintillation indices, a carrier-phase-based mitigated solution was also implemented and compared with the conventional solution. During a period of high phase scintillation it was observed that problems related to ambiguity resolution can be reduced by using the proposed mitigated solution.
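A generic sketch of how such tracking-error variances enter the position solution is shown below: each satellite's measurement is weighted by the inverse of its tracking-jitter variance in a weighted least-squares update, so heavily scintillating satellites contribute less. The geometry matrix, residuals and variances are invented for illustration, and the specific PLL/DLL variance formulas (for instance the commonly used Conker et al. jitter model) are not reproduced.

```python
# Weighted least-squares position/clock update with per-satellite weights taken
# as the inverse of the tracking-error variance (sigma^2 from PLL/DLL jitter
# models driven by S4 and sigma_phi). Geometry and variances are illustrative.
import numpy as np

# Design matrix: unit line-of-sight components plus a clock column (one row per satellite).
A = np.array([
    [ 0.42, -0.55,  0.72, 1.0],
    [-0.61,  0.30,  0.73, 1.0],
    [ 0.10,  0.80,  0.59, 1.0],
    [ 0.75,  0.20,  0.63, 1.0],
    [-0.35, -0.60,  0.72, 1.0],
])
residuals = np.array([1.8, -0.9, 0.4, 2.5, -1.2])   # prefit pseudorange residuals, m

sigma2 = np.array([0.4, 0.3, 0.5, 4.0, 0.6])        # tracking-error variances, m^2
W = np.diag(1.0 / sigma2)                           # scintillating satellite gets low weight

dx = np.linalg.solve(A.T @ W @ A, A.T @ W @ residuals)
print(dx)   # corrections to (x, y, z, receiver clock)
```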