58 results for Ornstein-Uhlenbeck
Abstract:
Phylogenetic comparative methods are increasingly used to give new insights into the dynamics of trait evolution in deep time. For continuous traits the core of these methods is a suite of models that attempt to capture evolutionary patterns by extending the Brownian constant variance model. However, the properties of these models are often poorly understood, which can lead to the misinterpretation of results. Here we focus on one of these models – the Ornstein-Uhlenbeck (OU) model. We show that the OU model is frequently incorrectly favoured over simpler models when using likelihood ratio tests, and that many studies fitting this model use datasets that are small and prone to this problem. We also show that very small amounts of error in datasets can have profound effects on the inferences derived from OU models. Our results suggest that simulating fitted models and comparing with empirical results is critical when fitting OU and other extensions of the Brownian model. We conclude by making recommendations for best practice in fitting OU models in phylogenetic comparative analyses, and for interpreting the parameters of the OU model.
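The pitfall described above can be reproduced in miniature outside phylogenetics with a plain time series: a discretized OU model (in effect an AR(1) regression) nests the random walk, so its conditional likelihood is never lower, and on short series the difference is easily over-read. A minimal Python sketch, with function names of our own (not from the paper):

```python
import numpy as np

def loglik_bm(x):
    """Conditional log-likelihood of a driftless random walk (discretized BM)."""
    d = np.diff(x)
    s2 = np.mean(d ** 2)             # MLE of the step variance
    return -0.5 * d.size * (np.log(2 * np.pi * s2) + 1.0)

def loglik_ou(x):
    """Conditional log-likelihood of a discretized OU process, i.e. an AR(1) fit."""
    y, z = x[1:], x[:-1]
    phi, c = np.polyfit(z, y, 1)     # y = c + phi * z + residual
    v = np.mean((y - (c + phi * z)) ** 2)
    return -0.5 * y.size * (np.log(2 * np.pi * v) + 1.0)

def lrt_statistic(x):
    """Likelihood-ratio statistic for OU against its random-walk special case."""
    return 2.0 * (loglik_ou(x) - loglik_bm(x))
```

Because the OU fit has extra free parameters, the statistic is nonnegative by construction even when a pure random walk generated the data, which is why small samples need simulation-based checks rather than naive threshold tests.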
Abstract:
Pair trading is an old and well-known technique among traders. In this paper, we discuss an important element not commonly debated in Brazil: the cointegration between pairs, which would guarantee the stability of the spread. We run the Dickey-Fuller test to check cointegration, and then compare the results with non-cointegrated pairs. We found that the Sharpe ratio of cointegrated pairs is greater than that of non-cointegrated ones. We also use the Ornstein-Uhlenbeck equation in order to calculate the half-life of the pairs. Again, this improves their performance. Last, we use the leverage suggested by the Kelly formula, once again improving the results.
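The half-life computation mentioned above is commonly done by fitting the discrete AR(1) form of the OU equation to the spread; a minimal sketch (our own illustration, not the authors' code):

```python
import numpy as np

def ou_half_life(spread):
    """Estimate the OU mean-reversion half-life of a spread series.

    Fits the regression dS_t = a + b * S_{t-1} + eps, so the implied
    continuous-time reversion speed is kappa = -ln(1 + b) and the
    half-life is ln(2) / kappa (in the same time units as the sampling).
    """
    s = np.asarray(spread, dtype=float)
    lagged = s[:-1]
    delta = np.diff(s)
    b, a = np.polyfit(lagged, delta, 1)   # slope, intercept
    kappa = -np.log(1.0 + b)
    return np.log(2.0) / kappa
```

A half-life much longer than the intended holding period is a common reason to discard a pair even when the cointegration test passes.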
Abstract:
We study the Ornstein-Uhlenbeck operator and the Ornstein-Uhlenbeck semigroup in an open convex subset $\Omega$ of a separable Banach space $X$ endowed with a centred non-degenerate Gaussian measure $\gamma$. In particular we prove the logarithmic Sobolev inequality and the Poincaré inequality, and from these inequalities we deduce spectral properties of the Ornstein-Uhlenbeck operator. We also study the elliptic equation $\lambda u+L^{\Omega}u=f$ in $\Omega$, where $L^\Omega$ is the Ornstein-Uhlenbeck operator. We prove that for $\lambda>0$ and $f\in L^2(\Omega,\gamma)$ the weak solution $u$ belongs to the Sobolev space $W^{2,2}(\Omega,\gamma)$. Moreover we prove that $u$ satisfies the Neumann condition in the sense of traces at the boundary of $\Omega$. This is done via finite-dimensional approximation.
Abstract:
2000 Mathematics Subject Classification: 60J60, 62M99.
Abstract:
In this thesis we implement estimating procedures in order to estimate threshold parameters for continuous time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund to be considered is the Converging Europe Bond fund from the Schroder company and the data are daily prices from the year 2005.
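For reference, the Ornstein-Uhlenbeck process named above can be simulated with a basic Euler-Maruyama scheme; a generic sketch (not code from the thesis):

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n, rng):
    """Euler-Maruyama simulation of dX = theta * (mu - X) dt + sigma dW."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        drift = theta * (mu - x[t - 1]) * dt
        diffusion = sigma * np.sqrt(dt) * rng.standard_normal()
        x[t] = x[t - 1] + drift + diffusion
    return x
```

Simulated paths of this kind are the usual test bed for threshold-estimation procedures before they are run on real fund prices.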
Abstract:
Dissertation submitted for the degree of Master in Mathematics and Applications
Abstract:
The relationship between body size and geographic range was analyzed for 70 species of terrestrial Carnivora ("fissipeds") of the New World, after controlling for phylogenetic patterns in the data using phylogenetic eigenvector regression. The analysis with the EcoSim software showed that the variables are related as a triangular envelope. Phylogenetic patterns in the data were detected by means of phylogenetic correlograms, and 200 simulations of phenotypic evolution were also performed over the phylogeny. For body size, the simulations suggested a non-linear relationship for the evolution of this character along the phylogeny. For geographic range size, the correlogram showed no phylogenetic patterns. A phylogenetic eigenvector regression was performed on the original data and on data simulated under an Ornstein-Uhlenbeck process. Since neither character evolved under a simple Brownian motion process, the Type I errors should be around 10%, compatible with other methods for analyzing correlated evolution. The significant correlation of the original data (r = 0.38; P < 0.05), as well as the triangular envelope, thus indicates ecological and adaptive processes connecting the two variables, such as those proposed in minimum viable population models.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying fixed or dynamic sets of rules to determine trading orders. It has grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data at a one-minute sampling frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal generated by them passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading or market-neutral strategy is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a cointegration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done in a high-frequency sampling manner to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as a part of this thesis. No other mathematical or statistical software was used.
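The Markov-time requirement discussed above, that a signal be decidable from information up to the current time, can be illustrated with a toy moving-average crossover rule (our own example, not one of the project's strategies):

```python
import numpy as np

def crossover_signal(prices, short=5, long=20):
    """Return +1/-1/0 signals computed only from information available up
    to each time t, so the entry rule defines a Markov (stopping) time
    with respect to the price filtration."""
    p = np.asarray(prices, dtype=float)
    sig = np.zeros(p.size, dtype=int)
    for t in range(long, p.size):
        s = p[t - short + 1:t + 1].mean()   # short moving average, past data only
        l = p[t - long + 1:t + 1].mean()    # long moving average, past data only
        sig[t] = 1 if s > l else -1
    return sig
```

A rule that peeked at `p[t + 1:]` (say, "buy at a local minimum") would fail the Markov-time test: the event could not be decided from the information available at time t.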
Abstract:
This paper presents a two-factor (Vasicek-CIR) model of the term structure of interest rates and develops its pricing and empirical properties. We assume that default free discount bond prices are determined by the time to maturity and two factors, the long-term interest rate and the spread. Assuming a certain process for both factors, a general bond pricing equation is derived and a closed-form expression for bond prices is obtained. Empirical evidence of the model's performance in comparison with a double Vasicek model is presented. The main conclusion is that modeling the volatility in the long-term rate process helps considerably to fit the observed data and improves, to a reasonable degree, the prediction of future movements in medium- and long-term interest rates. However, for shorter maturities, it is shown that the pricing errors are basically negligible and it is not so clear which is the best model to be used.
Abstract:
This paper presents a two-factor model of the term structure of interest rates. We assume that default free discount bond prices are determined by the time to maturity and two factors, the long-term interest rate and the spread (difference between the long-term rate and the short-term (instantaneous) riskless rate). Assuming that both factors follow a joint Ornstein-Uhlenbeck process, a general bond pricing equation is derived. We obtain a closed-form expression for bond prices and examine its implications for the term structure of interest rates. We also derive a closed-form solution for interest rate derivatives prices. This expression is applied to price European options on discount bonds and more complex types of options. Finally, empirical evidence of the model's performance is presented.
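For the single-factor Ornstein-Uhlenbeck (Vasicek) building block of such models, the closed-form discount bond price is standard; a sketch, with parameter names of our own choosing:

```python
import math

def vasicek_bond_price(r, tau, a, b, sigma):
    """Closed-form zero-coupon bond price P(tau) under the Vasicek model
    dr = a * (b - r) dt + sigma dW (risk-neutral dynamics assumed).

    Uses the affine form P = A(tau) * exp(-B(tau) * r)."""
    B = (1.0 - math.exp(-a * tau)) / a
    lnA = (b - sigma ** 2 / (2 * a ** 2)) * (B - tau) - sigma ** 2 * B ** 2 / (4 * a)
    return math.exp(lnA - B * r)
```

Two-factor OU models of the kind in the abstract yield a bond price of the same affine exponential form, with one B-type loading per factor.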
Abstract:
We consider a general class of non-Markovian processes defined by stochastic differential equations with Ornstein-Uhlenbeck noise. We present a general formalism to evaluate relaxation times associated with correlation functions in the steady state. This formalism is a generalization of a previous approach for Markovian processes. The theoretical results are shown to be in satisfactory agreement both with experimental data for a cubic bistable system and with a computer simulation of the Stratonovich model. We comment on the dynamical role of non-Markovianity in different situations.
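A relaxation time of the kind discussed above can be estimated numerically as the integral of the normalized autocorrelation function of a simulated stationary series; a generic sketch (our own illustration, not the paper's formalism):

```python
import numpy as np

def relaxation_time(x, dt, max_lag):
    """Estimate the relaxation time as the trapezoid-rule integral of the
    normalized autocorrelation function over the first max_lag lags.
    For a stationary OU process with rate theta this tends to 1/theta."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.dot(x, x) / x.size
    acf = np.array([np.dot(x[:-k], x[k:]) / (x.size - k) / c0
                    for k in range(1, max_lag + 1)])
    return dt * (0.5 + np.sum(acf))   # trapezoid with acf at lag 0 equal to 1
```

The truncation lag must cover several relaxation times, or the integral is biased low; for non-Markovian dynamics the ACF need not be a single exponential, which is precisely when an integral definition is preferable to an exponential fit.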
Abstract:
The evolution of continuous traits is the central component of comparative analyses in phylogenetics, and the comparison of alternative models of trait evolution has greatly improved our understanding of the mechanisms driving phenotypic differentiation. Several factors influence the comparison of models, and we explore the effects of random errors in trait measurement on the accuracy of model selection. We simulate trait data under a Brownian motion model (BM) and introduce different magnitudes of random measurement error. We then evaluate the resulting statistical support for this model against two alternative models: Ornstein-Uhlenbeck (OU) and accelerating/decelerating rates (ACDC). Our analyses show that even small measurement errors (10%) consistently bias model selection towards erroneous rejection of BM in favour of more parameter-rich models (most frequently the OU model). Fortunately, methods that explicitly incorporate measurement errors in phylogenetic analyses considerably improve the accuracy of model selection. Our results call for caution in interpreting the results of model selection in comparative analyses, especially when complex models garner only modest additional support. Importantly, as measurement errors occur in most trait data sets, we suggest that estimation of measurement errors should always be performed during comparative analysis to reduce chances of misidentification of evolutionary processes.
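One intuition for why measurement error biases selection away from BM: adding independent error to a random-walk trait makes successive differences negatively autocorrelated, which mimics the restoring pull of an OU process. A small check (our own illustration, not the paper's simulations):

```python
import numpy as np

def lag1_autocorr_of_diffs(x):
    """Lag-1 autocorrelation of the first differences of a series.
    Near zero for a clean random walk; negative when independent
    measurement error is superimposed on the walk."""
    d = np.diff(np.asarray(x, dtype=float))
    d = d - d.mean()
    return float(np.dot(d[1:], d[:-1]) / np.dot(d, d))
```

With error variance v and step variance s, the differences have theoretical lag-1 autocorrelation -v / (s + 2v), strictly negative for any v > 0, which is the mean-reversion-like signature that inflates support for OU.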
Abstract:
Extreme times techniques, generally applied to nonequilibrium statistical mechanical processes, are also useful for a better understanding of financial markets. We present a detailed study of the mean first-passage time for the volatility of return time series. The empirical results extracted from daily data of major indices seem to follow the same law regardless of the kind of index, thus suggesting a universal pattern. The empirical mean first-passage time to a certain level L is fairly different from that of the Wiener process, showing dissimilar behavior depending on whether L is higher or lower than the average volatility. All of this indicates a more complex dynamics in which a reverting force drives volatility toward its mean value. We thus present the mean first-passage time expressions of the most common stochastic volatility models, whose approach is comparable to the random diffusion description. We discuss asymptotic approximations of these models and compare them with empirical results, finding good agreement with the exponential Ornstein-Uhlenbeck model.
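Mean first-passage times of the kind studied above can be estimated for an OU process by simple Monte Carlo; a sketch (our own, not the paper's analytical approach):

```python
import numpy as np

def mean_first_passage_time(L, theta, mu, sigma, x0, dt, n_paths, max_steps, rng):
    """Monte Carlo mean first-passage time of the OU process
    dX = theta * (mu - X) dt + sigma dW from x0 to the level L.
    Paths that never cross within max_steps are censored at the horizon."""
    times = []
    for _ in range(n_paths):
        x = x0
        for step in range(max_steps):
            if (x0 < L and x >= L) or (x0 > L and x <= L):
                times.append(step * dt)
                break
            x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        else:
            times.append(max_steps * dt)   # censored path
    return float(np.mean(times))
```

Because the reverting force opposes excursions away from the mean, passage times to levels beyond the average grow rapidly with L, the qualitative asymmetry the abstract reports for empirical volatility.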