967 results for Generalized Driven Nonlinear Threshold Model
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to allow. In particular, we postulate that runs of very large returns can be predictable over short time periods. To show this we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that set it apart from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with General Motors stock price data covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second.
JEL classification: C12; C15; C22; C51
Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
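As a concrete illustration of this model class, the following is a minimal simulation sketch of a three-regime TAR process with GARCH(1,1) errors in which the regime is selected by the standardized shock of the preceding period. The thresholds, AR coefficients, and GARCH parameters below are illustrative assumptions, not the values estimated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not the paper's estimates).
thresholds = (-2.0, 2.0)               # regime cut-offs on the lagged shock
phi = {0: 0.3, 1: 0.05, 2: 0.3}        # AR(1) coefficient per regime
omega, alpha, beta = 0.05, 0.08, 0.9   # GARCH(1,1) parameters

T = 1000
r = np.zeros(T)                                  # returns
h = np.full(T, omega / (1 - alpha - beta))       # conditional variance
eps = np.zeros(T)                                # shocks

for t in range(1, T):
    # GARCH(1,1) variance recursion.
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    # Regime determined by the standardized shock of the preceding period.
    z_prev = eps[t - 1] / np.sqrt(h[t - 1])
    if z_prev < thresholds[0]:
        regime = 0            # extreme negative shock
    elif z_prev > thresholds[1]:
        regime = 2            # extreme positive shock
    else:
        regime = 1            # ordinary regime
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    r[t] = phi[regime] * r[t - 1] + eps[t]
```

Under this parameterization an extreme shock in period t-1 switches the process into an outer regime with stronger serial dependence, which is the mechanism that produces predictable runs of extremes.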
Abstract:
In this thesis we implement estimation procedures for the threshold parameters of continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, namely conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models, such as those built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also applied to real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund considered is the Converging Europe Bond fund from the Schroder company, with daily prices from the year 2005.
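As an illustration of the conditional least squares idea in this setting, the sketch below grid-searches the threshold of a two-regime threshold Brownian motion with drift from an Euler-discretized sample path. All parameter values and the discretization scheme are assumptions for demonstration, not those of the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a two-regime threshold Brownian motion with drift:
# dX = mu1 dt + sigma dW if X < r, and mu2 dt + sigma dW otherwise.
mu1, mu2, sigma, r_true, dt, n = 1.0, -1.0, 0.5, 0.0, 0.01, 20000
x = np.zeros(n)
for t in range(1, n):
    mu = mu1 if x[t - 1] < r_true else mu2
    x[t] = x[t - 1] + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()

def cls_objective(r, x, dt):
    """Sum of squared one-step residuals with regime-wise LS drift estimates."""
    dx, lag = np.diff(x), x[:-1]
    sse = 0.0
    for mask in (lag < r, lag >= r):
        if mask.sum() > 0:
            mu_hat = dx[mask].mean() / dt        # LS drift in this regime
            sse += ((dx[mask] - mu_hat * dt) ** 2).sum()
    return sse

grid = np.linspace(np.quantile(x, 0.1), np.quantile(x, 0.9), 200)
r_hat = grid[np.argmin([cls_objective(r, x, dt) for r in grid])]
print(f"true threshold {r_true:.2f}, CLS estimate {r_hat:.2f}")
```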
Abstract:
The objectives of the current study were to assess the feasibility of using stayability traits to improve fertility of Nellore cows and to examine the genetic relationship among stayabilities at different ages. Stayability was defined as whether a cow calved every year up to 5 (Stay5), 6 (Stay6), or 7 (Stay7) yr of age or more, given that she was provided the opportunity to breed. Data were analyzed with a maximum a posteriori probit threshold model to predict breeding values on the liability scale, whereas the Gibbs sampler was used to estimate variance components. The EBV were obtained using either all animals included in the pedigree or bulls with at least 10 daughters with stayability observations, and average genetic trends were obtained on the liability scale and transformed to the probability scale. Additional analyses were performed to study the genetic relationship among stayability traits, which were compared by contrasting results in terms of EBV and of the average genetic superiority as a function of the selected proportion of sires. Heritability estimates and SD were 0.25 +/- 0.02, 0.22 +/- 0.03, and 0.28 +/- 0.03 for Stay5, Stay6, and Stay7, respectively. Average genetic trends, by year, were 0.51, 0.34, and 0.38% for Stay5, Stay6, and Stay7, respectively. Estimates of EBV SD, on the probability scale, for all animals included in the pedigree and for bulls with at least 10 daughters with stayability observations were 7.98 and 12.95, 6.93 and 11.38, and 8.24 and 14.30% for Stay5, Stay6, and Stay7, respectively. A reduction in the average genetic superiority in Stay7 would be expected if selection were based on Stay5 or Stay6. Nonetheless, the reduction in EPD, depending on selection intensity, is on average 0.74 and 1.55%, respectively. Regressions of the sires' EBV for Stay5 and Stay6 on the sires' EBV for Stay7 confirmed these results. The heritability and genetic trend estimates for all stayability traits indicate that it is possible to improve fertility through selection based on a threshold analysis of stayability. The SD of EBV for the stayability traits show that there is adequate genetic variability among animals to justify including stayability as a selection criterion. The potentially linear relationship among stayability traits indicates that selection for improved female fertility would be more effective when based on predictions for the Stay5 trait.
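For readers unfamiliar with the liability scale, the snippet below shows the standard probit threshold-model mapping from a breeding value on the latent, normally distributed liability scale to the observed probability scale. The base mean, threshold, and EBVs used here are hypothetical, chosen only to make the transformation concrete.

```python
from scipy.stats import norm

# Under a probit threshold model, an observed binary trait (e.g., stayability)
# equals 1 when a latent normal "liability" exceeds a fixed threshold tau.
# A sire's EBV shifts the mean liability of its daughters, so the expected
# frequency of the trait on the observed scale is Phi(mu + ebv - tau).
mu, tau = 0.0, 0.0            # hypothetical base liability mean and threshold
for ebv in (-0.2, 0.0, 0.2):  # hypothetical EBVs on the liability scale
    p = norm.cdf(mu + ebv - tau)
    print(f"liability EBV {ebv:+.1f} -> stayability probability {p:.3f}")
```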
Abstract:
Among several sources of process variability, valve friction and inadequate controller tuning are thought to be two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process), and three industrial datasets corresponding to real control loops. The results demonstrate that friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy in friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process.
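The joint friction/process estimation idea can be sketched as follows, using a deliberately crude one-parameter dead-band valve model and a linear first-order process in place of the paper's nonlinear structure. Every model form, signal, and parameter here is an assumption for illustration only: a candidate friction band reconstructs the valve stem position, the process parameters are then fit by least squares, and the band minimizing the residual error is retained.

```python
import numpy as np

rng = np.random.default_rng(2)

def valve(u, d):
    """Crude stiction model: the stem only moves when the control signal
    escapes a dead band of width d around the current position."""
    pos = np.zeros_like(u)
    for t in range(1, len(u)):
        pos[t] = u[t] if abs(u[t] - pos[t - 1]) > d else pos[t - 1]
    return pos

# Simulate data: wandering control signal, sticky valve, first-order process.
n, d_true, a_true, b_true = 2000, 0.5, 0.9, 0.1
u = np.cumsum(0.1 * rng.standard_normal(n))
pos = valve(u, d_true)
y = np.zeros(n)
for t in range(1, n):
    y[t] = a_true * y[t - 1] + b_true * pos[t - 1] + 0.01 * rng.standard_normal()

def sse_for(d):
    """For a candidate friction band d, reconstruct the stem position and
    fit the linear process parameters by least squares."""
    p = valve(u, d)
    X = np.column_stack([y[:-1], p[:-1]])
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return ((y[1:] - X @ theta) ** 2).sum()

grid = np.linspace(0.0, 1.0, 51)
d_hat = grid[np.argmin([sse_for(d) for d in grid])]
print(f"true friction band {d_true}, estimated {d_hat:.2f}")
```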
Abstract:
In a sample of censored survival times, the presence of an immune proportion of individuals who are not subject to death, failure or relapse may be indicated by a relatively high number of individuals with large censored survival times. In this paper the generalized log-gamma model is modified to allow for the possibility that long-term survivors are present in the data. The model attempts to separately estimate the effects of covariates on the surviving fraction, that is, the proportion of the population for which the event never occurs. The logistic function is used for the regression model of the surviving fraction. Inference for the model parameters is considered via maximum likelihood. Some influence methods, such as local influence and the total local influence of an individual, are derived, analyzed and discussed. Finally, a data set from the medical area is analyzed under the generalized log-gamma mixture model. A residual analysis is performed in order to select an appropriate model.
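In the standard mixture-cure formulation this abstract describes, with a logistic model for the surviving (cured) fraction, the population survival function takes the following form; here $S(t)$ denotes the proper survival function of the susceptible individuals (generalized log-gamma in this paper), and the notation $\mathbf{x}$, $\boldsymbol{\beta}$ is assumed for the covariates and regression coefficients:

```latex
S_{\mathrm{pop}}(t \mid \mathbf{x})
  = p(\mathbf{x}) + \bigl(1 - p(\mathbf{x})\bigr)\, S(t),
\qquad
p(\mathbf{x})
  = \frac{\exp(\mathbf{x}^{\top}\boldsymbol{\beta})}
         {1 + \exp(\mathbf{x}^{\top}\boldsymbol{\beta})}
```

Since $S(t) \to 0$ as $t \to \infty$, the population survival curve levels off at the surviving fraction $p(\mathbf{x})$, which is what a relatively high number of large censored times reveals.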
Abstract:
A case-sensitive intelligent model editor has been developed for constructing consistent lumped dynamic process models and for simplifying them using modelling assumptions. The approach is based on a systematic assumption-driven modelling procedure and on the syntax and semantics of process models and of the simplifying assumptions.
Abstract:
We investigate the influence of a single-mode cavity on the Autler-Townes doublet that arises when a three-level atom is strongly driven by a laser field tuned to one of the atomic transitions and probed by a tunable, weak field coupled to the other transition. We assume that the cavity mode is coupled to the driven transition and that the cavity and laser frequencies are equal to the atomic transition frequency. We find that the Autler-Townes spectrum can have one, two or three peaks depending on the relative magnitudes of the Rabi frequencies of the cavity and driving fields. We show that, in order to understand the three-peaked spectrum, it is necessary to go beyond the secular approximation, leading to interesting quantum interference effects. We find that the positions and relative intensities of the three spectral components are strongly affected by the atom-cavity coupling strength g and the cavity damping K. For increasing g and/or decreasing K the triplet evolves into a single peak. This results in an 'undressing' of the system such that the atom collapses into its ground state. We interpret the spectral features in terms of the semiclassical dressed-atom model, and also provide complementary views of the cavity effects in terms of quantum Langevin equations and the fully quantized, 'double-dressing' model.
Abstract:
Proceedings of International Conference - SPIE 7477, Image and Signal Processing for Remote Sensing XV - 28 September 2009
Abstract:
In the context of the two-stage threshold model of decision making, with the agent's choices determined by the interaction of three "structural variables," we study the restrictions on behavior that arise when one or more variables are exogenously known. Our results supply necessary and sufficient conditions for consistency with the model for all possible states of partial knowledge, and for both single- and multivalued choice functions.
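As a purely illustrative reading of a two-stage threshold procedure (the paper's exact structural variables are not reproduced here), the sketch below shortlists the alternatives whose value on a first criterion falls within a threshold of the best, then maximizes a second criterion over the shortlist. All names and numbers are hypothetical.

```python
def two_stage_choice(menu, u, v, theta):
    """Stage 1: keep alternatives whose u-value is within theta of the best.
    Stage 2: choose the v-maximizer among the survivors.
    (Illustrative only; not the paper's exact formalism.)"""
    best_u = max(u[a] for a in menu)
    shortlist = [a for a in menu if u[a] >= best_u - theta]
    return max(shortlist, key=lambda a: v[a])

# Hypothetical example: three alternatives evaluated on two criteria.
u = {"a": 10, "b": 9, "c": 5}   # first-stage criterion
v = {"a": 1, "b": 3, "c": 9}    # second-stage criterion
print(two_stage_choice({"a", "b", "c"}, u, v, theta=2))  # -> 'b'
```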
Abstract:
This paper presents a general equilibrium model of money demand where the velocity of money changes in response to endogenous fluctuations in the interest rate. The parameter space can be divided into two subsets: one where velocity is constant and equal to one, as in cash-in-advance models, and another where velocity fluctuates, as in Baumol (1952). Despite its simplicity in terms of parameters to calibrate, the model performs surprisingly well. In particular, it approximates the variability of money velocity observed in the U.S. for the post-war period. The model is then used to analyze the welfare costs of inflation under uncertainty. This application quantifies the errors that arise from computing the costs of inflation with deterministic models. It turns out that the size of this difference is small, at least for the levels of uncertainty estimated for the U.S. economy.
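For reference, in the Baumol (1952) inventory-theoretic benchmark that the fluctuating-velocity region of the parameter space resembles, average money holdings follow the square-root rule, so velocity rises with the nominal interest rate. Here $Y$ is spending, $i$ the nominal interest rate, and $b$ the fixed cost per trip to the bank; this is the textbook formula, not the paper's own model:

```latex
M^{*} = \sqrt{\frac{bY}{2i}},
\qquad
V \equiv \frac{Y}{M^{*}} = \sqrt{\frac{2iY}{b}}
```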
Abstract:
Division of labor in social insects is a key determinant of their ecological success. Recent models emphasize that division of labor is an emergent property of the interactions among nestmates obeying simple behavioral rules. However, the role of evolution in shaping these rules has been largely neglected. Here, we investigate a model that integrates the perspectives of self-organization and evolution. Our point of departure is the response threshold model, where we allow thresholds to evolve. We ask whether the thresholds will evolve to a state where division of labor emerges in a form that fits the needs of the colony. We find that division of labor can indeed evolve through the evolutionary branching of thresholds, leading to workers that differ in their tendency to take on a given task. However, the conditions under which division of labor evolves depend on the strength of selection on the two fitness components considered: the amount of work performed and the distribution of workers over tasks. When selection is strongest on the amount of work performed, division of labor evolves if switching tasks is costly. When selection is strongest on worker distribution, division of labor is less likely to evolve. Furthermore, we show that a biased distribution (such as 3:1) of workers over tasks is not easily achievable by a threshold mechanism, even under strong selection. Contrary to expectation, multiple matings by colony foundresses impede the evolution of specialization. Overall, our model sheds light on the importance of considering the interaction between specific mechanisms and ecological requirements to better understand the evolutionary scenarios that lead to division of labor in complex systems. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00265-012-1343-2) contains supplementary material, which is available to authorized users.
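The fixed-threshold version of the response threshold model that serves as the paper's point of departure can be sketched as follows: each worker engages in a task with a probability that increases with the task's stimulus and decreases with the worker's own threshold, in the classic form P = s^2 / (s^2 + theta^2). The stimulus dynamics and parameter values below are illustrative assumptions; the thesis of the paper (evolving thresholds) is not simulated here.

```python
import numpy as np

rng = np.random.default_rng(3)

n_workers, n_steps = 20, 500
theta = rng.uniform(1, 10, n_workers)   # per-worker response thresholds
s = 0.0                                  # task stimulus level
delta, alpha = 1.0, 0.2                  # stimulus growth / work efficiency
acts = np.zeros(n_workers)               # how often each worker engaged

for _ in range(n_steps):
    # Probability of engaging rises with stimulus, falls with threshold.
    p = s**2 / (s**2 + theta**2)
    working = rng.random(n_workers) < p
    acts += working
    s = max(0.0, s + delta - alpha * working.sum())  # work reduces stimulus

# Low-threshold workers end up doing most of the work: the strongly negative
# correlation between threshold and activity is incipient specialization.
print(np.corrcoef(theta, acts)[0, 1])
```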
Abstract:
Nature is full of phenomena we call "chaotic", the weather being a prime example. What we mean by this is that we cannot predict it to any significant accuracy, either because the system is inherently complex or because some of the governing factors are not deterministic. During recent years, however, it has become clear that random behaviour can occur even in very simple systems with a very small number of degrees of freedom, without any need for complexity or indeterminacy. The discovery that chaos can be generated by systems with completely deterministic rules - often models of natural phenomena - has stimulated a great deal of recent research interest. This chaos is not without underlying order, but the order is of a subtle kind that has taken a great deal of ingenuity to unravel. In the present thesis the author introduces a new nonlinear model, a 'modulated' logistic map, and analyses it from the viewpoint of 'deterministic chaos'.
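Since the abstract does not spell out the thesis's exact modulation scheme, the sketch below shows one generic 'modulated' logistic map in which the control parameter is itself driven by a second logistic map; this particular coupling is an assumption for illustration, not necessarily the author's construction.

```python
import numpy as np

# Plain logistic map: x_{n+1} = a * x_n * (1 - x_n).
# "Modulated" variant (assumed form): the parameter a_n is driven by a
# second logistic map and rescaled into the chaotic band [3.57, 4.0].
n = 10000
x, y = 0.3, 0.6
b = 3.9                      # parameter of the modulating map
xs = np.empty(n)
for i in range(n):
    y = b * y * (1 - y)      # modulating logistic map
    a = 3.57 + 0.43 * y      # modulated parameter a_n in [3.57, 4.0]
    x = a * x * (1 - x)      # modulated logistic map
    xs[i] = x

print(xs[-5:])               # a chaotic-looking orbit, bounded in [0, 1]
```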