937 results for: algorithmic pairs trading, statistical arbitrage, Kalman filter, mean reversion.
Abstract:
SANTANA, André M.; SOUZA, Anderson A. S.; BRITTO, Ricardo S.; ALSINA, Pablo J.; MEDEIROS, Adelardo A. D. Localization of a mobile robot based on odometry and natural landmarks using extended Kalman Filter. In: INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, 5., 2008, Funchal, Portugal. Proceedings... Funchal, Portugal: ICINCO, 2008.
Abstract:
This paper describes a system for building visual maps ("mosaics") and estimating motion for a set of AUVs (Autonomous Underwater Vehicles). Each robot is equipped with a down-looking camera, which is used to estimate its motion with respect to the seafloor and to build an online mosaic. As the mosaic increases in size, a systematic bias is introduced in its alignment, resulting in erroneous output. The theoretical concepts associated with the Augmented State Kalman Filter (ASKF) were applied to optimally estimate both the visual map and the fleet position.
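The augmented-state idea, jointly estimating the vehicle and the map in a single filter, can be sketched in one dimension. This is a toy illustration under assumed noise values, not the paper's ASKF: the state stacks a vehicle position with a static map-feature position, so one update corrects both and builds the correlation between them.

```python
import numpy as np

# State x = [vehicle, feature]; the vehicle moves with noisy odometry,
# the feature is static. Measurements: vehicle position and the relative
# offset feature - vehicle. All noise values are illustrative assumptions.

rng = np.random.default_rng(7)
true_vehicle, true_feature = 0.0, 5.0

x = np.array([0.0, 3.0])                 # initial estimate (feature is wrong)
P = np.diag([0.01, 4.0])                 # feature initially very uncertain
Q = np.diag([0.05, 0.0])                 # only the vehicle has process noise
H = np.array([[1.0, 0.0],                # vehicle position measurement
              [-1.0, 1.0]])              # relative offset measurement
R = np.diag([0.1, 0.01])

for _ in range(200):
    u = 0.1                              # commanded odometry step
    true_vehicle += u + rng.normal(0, np.sqrt(0.05))
    x[0] += u                            # predict (identity dynamics + input)
    P = P + Q
    z = np.array([true_vehicle + rng.normal(0, np.sqrt(0.1)),
                  true_feature - true_vehicle + rng.normal(0, 0.1)])
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(x[1])  # feature estimate converges near its true position, 5.0
```

The key point is that the feature is never observed directly, yet the joint covariance lets vehicle corrections propagate into the map estimate.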
Abstract:
The increasing interest in decarbonization has led to a rapidly growing trend of electrification in the automotive industry. In particular, OEMs are pushing the development and production of efficient electric vehicles, and research on electric motors and their control is exploding in popularity. The increase in the computational power of embedded control hardware allows the development of new control algorithms, such as sensorless control strategies. Such strategies reduce the number of sensors, which implies reduced cost and increased system reliability. The objective of this thesis is to realize sensorless control for high-performance automotive motors. Several rotor-angle observer algorithms are implemented in the MATLAB and Simulink environment, with emphasis on the Kalman observer. One of the Kalman algorithms already available in the literature was selected, implemented and benchmarked, with emphasis on its comparison with the sliding-mode observer. Models of increasing complexity are simulated: first a simplified synchronous motor with constant parameters controlled by an ideal inverter, followed by a complete model defined by real motor maps and controlled by a switching inverter. Finally, the developed algorithm was tested on a real electric motor mounted on a test bench. A wide range of electric motors was simulated, leading to an exhaustive review of the sensorless control algorithm. The final results underline the capability of the Kalman observer to effectively control the motor on a real test bench.
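A Kalman observer for rotor angle can be sketched minimally as follows. This is an assumed constant-speed linear model with illustrative noise values, not the thesis implementation: the state is [angle, speed] and only the (noisy) angle is measured.

```python
import numpy as np

# Minimal linear Kalman observer sketch: state x = [theta, omega].
# All covariance values are tuning assumptions for this toy example.

dt = 1e-3                      # sample time [s]
F = np.array([[1.0, dt],       # state transition: theta += omega * dt
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])     # measurement picks out the angle
Q = np.diag([1e-8, 1e-4])      # process noise covariance
R = np.array([[1e-3]])         # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle of the observer."""
    x = F @ x                              # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a rotor spinning at 100 rad/s observed through noisy angle readings.
rng = np.random.default_rng(0)
true_omega = 100.0
x_est = np.zeros(2)
P = np.diag([1.0, 1e4])        # speed initially unknown
for k in range(2000):
    z = np.array([true_omega * k * dt + rng.normal(0, 0.03)])
    x_est, P = kalman_step(x_est, P, z)

print(x_est[1])  # estimated speed, close to 100 rad/s
```

A real sensorless drive would replace this toy model with the motor's electrical equations (an extended Kalman filter on currents and flux), but the predict/update structure is the same.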
Stochastic particle models: mean reversion and Burgers dynamics. An application to commodity markets
Abstract:
The aim of this study is to propose a stochastic model for commodity markets linked to the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, confirmed by the model's ability to reproduce price spikes when their effects occur over a sufficiently long period of time.
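A hypothetical sketch of such an interacting-particle scheme (our own illustration with assumed parameters, not the paper's calibration): each of N "trader" particles drifts according to the fraction of particles below it, i.e. a Heaviside interaction kernel, plus Brownian noise, so the empirical distribution follows a Burgers-type dynamic as N grows.

```python
import numpy as np

# Euler-Maruyama simulation of N interacting particles with a Heaviside
# kernel: drift of particle i = (1/N) * #{j : X_j <= X_i} (self included).
# Parameter values are illustrative assumptions.

rng = np.random.default_rng(1)
N, T, dt, sigma = 500, 1.0, 0.01, 0.1
X = rng.normal(0.0, 1.0, N)            # initial particle positions (prices)
steps = int(T / dt)

for _ in range(steps):
    # Heaviside interaction: each particle's drift is its empirical CDF value
    drift = (X[:, None] >= X[None, :]).mean(axis=1)
    X = X + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)

print(X.mean())  # the cloud drifts upward at roughly +0.5 per unit time
```

The average drift is the mean of the rank fractions, about 1/2, so the ensemble mean advances by roughly 0.5 over T = 1; the O(N^2) pairwise comparison is fine at this scale but would need sorting tricks for large N.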
Abstract:
This paper will show that short-horizon stock returns for UK portfolios are more predictable than suggested by sample autocorrelation coefficients. Four capitalisation-based portfolios are constructed for the period 1976-1991. It is shown that the first-order autocorrelation coefficient of monthly returns can explain no more than 10% of the variation in monthly portfolio returns. Monthly autocorrelation coefficients assume that each weekly return of the previous month contains the same amount of information. However, this will not be the case if short-horizon returns contain predictable components that dissipate rapidly. In that case, the return of the most recent week says much more about the future monthly portfolio return than the other weeks. This suggests that when predicting future monthly portfolio returns, more weight should be given to the most recent weeks of the previous month, because the most recent weekly returns provide the most information about the subsequent month's performance. We construct a model that exploits the mean-reverting characteristics of monthly portfolio returns and use it to forecast future monthly portfolio returns. Compared with forecasts that use the autocorrelation statistic, the model that exploits the mean-reverting characteristics of monthly portfolio returns forecasts future returns better, both in and out of sample.
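The core claim, that rapidly dissipating predictability makes the most recent week the most informative, can be illustrated with a simulation of our own (not the paper's UK data): when weekly returns follow an AR(1), the last week of a month correlates with next month's return far more than the first week does.

```python
import numpy as np

# Simulate weekly AR(1) returns, group them into 4-week "months", and compare
# how well the first vs. the last week of a month predicts the next month.
# phi and the sample size are illustrative assumptions.

rng = np.random.default_rng(2)
phi, n_months = 0.4, 20000
weekly = np.zeros(4 * (n_months + 1))
for t in range(1, len(weekly)):
    weekly[t] = phi * weekly[t - 1] + rng.normal()

weeks = weekly[: 4 * n_months].reshape(n_months, 4)    # weeks within each month
next_month = weekly[4:].reshape(n_months, 4).sum(axis=1)  # following month's return

corr_first = np.corrcoef(weeks[:, 0], next_month)[0, 1]  # first week of month
corr_last = np.corrcoef(weeks[:, 3], next_month)[0, 1]   # most recent week

print(corr_first, corr_last)  # the most recent week is far more informative
```

With phi = 0.4 the last week's covariance with the next month involves lags 1 through 4, while the first week's involves lags 4 through 7, which decay geometrically; hence the gap.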
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject on which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time: that is, we can tell whether the signal has occurred or not by examining the information up to the current time or, more technically, whether the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a cointegration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
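The Ornstein-Uhlenbeck spread model behind pairs trading can be sketched as follows (parameter values and the z-score rule are illustrative assumptions, not the project's calibrated strategy): the spread mean-reverts to mu at speed theta, and a simple threshold rule trades the deviations.

```python
import numpy as np

# Euler-Maruyama simulation of dS = theta*(mu - S) dt + sigma dW, a simple
# z-score trading rule, and a regression estimate of theta. Note the z-score
# uses the full-sample mean/std for brevity (look-ahead a live strategy avoids).

rng = np.random.default_rng(3)
theta, mu, sigma, dt, n = 5.0, 0.0, 0.2, 1 / 252, 252 * 20
S = np.empty(n)
S[0] = mu
for t in range(1, n):
    S[t] = S[t - 1] + theta * (mu - S[t - 1]) * dt \
           + sigma * np.sqrt(dt) * rng.normal()

z = (S - S.mean()) / S.std()
position = np.where(z > 1, -1, np.where(z < -1, 1, 0))  # short rich, long cheap
pnl = (position[:-1] * np.diff(S)).sum()

# Estimate the mean-reversion speed by regressing dS on (mu - S)
dS = np.diff(S)
theta_hat = (dS @ (mu - S[:-1])) / ((mu - S[:-1]) @ (mu - S[:-1]) * dt)
print(theta_hat, pnl)  # theta_hat near 5; trading against deviations profits
```

On a genuinely mean-reverting spread, fading one-sigma deviations earns the reversion drift; the same rule applied to a random walk would earn nothing in expectation, which is why testing for mean reversion comes first.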
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project would not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than those of market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of the quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest on real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
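The recalibration point can be sketched with an assumed setup (not the thesis code): rather than fitting signal parameters once over the whole sample, re-estimate the spread's mean and volatility on a sliding window so the signal tracks current conditions even when the market regime shifts.

```python
import numpy as np

# Rolling z-score: each point is standardized against a trailing window,
# so a level shift is flagged when it happens and then re-absorbed.
# The synthetic spread and window length are illustrative assumptions.

def rolling_zscore(spread, window):
    """z-score of the latest point against a trailing window."""
    z = np.full(len(spread), np.nan)
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        z[t] = (spread[t] - hist.mean()) / hist.std()
    return z

rng = np.random.default_rng(4)
# A spread whose level shifts halfway through, mimicking evolving "laws"
spread = np.concatenate([rng.normal(0.0, 1.0, 500),
                         rng.normal(5.0, 1.0, 500)])
z = rolling_zscore(spread, window=60)

print(np.nanmax(z[500:520]))        # large spike right after the shift
print(np.nanmean(np.abs(z[-100:]))) # signal re-centres once the window adapts
```

A full-sample (static) z-score would stay biased after the shift; the rolling one spikes at the break and then settles back to an ordinary range, which is exactly the adaptivity the calibration discussion calls for.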
Abstract:
This research tests the effectiveness of an interest-rate arbitrage strategy in Brazil based on the dynamic Nelson-Siegel model, applied to the curve of one-day interest rate futures contracts traded on BM&FBovespa for the period from January 2, 2008 to December 3, 2012. The work adapts to the Brazilian market the original model proposed by Nelson and Siegel (1987) and some of its extensions and interpretations, arriving at one of the models proposed by Diebold, Rudebusch and Aruoba (2006), in which the Nelson-Siegel parameters are estimated in a single step by casting the model in state-space form and using the Kalman filter to forecast the factors, assuming they follow a first-order VAR. The model thus has the advantage that all parameters are estimated simultaneously, and the authors showed that it has good predictive power. The results of the adopted strategy were encouraging when trading was restricted to the first 7 maturities open for trading on BM&FBovespa, whose maximum maturity is close to 1 year.
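The factor structure underlying the strategy can be sketched with the standard Nelson-Siegel loadings. The formula is the standard one; the parameter values below, including the decay lambda, are illustrative assumptions, not the study's estimates.

```python
import numpy as np

# Nelson-Siegel yield curve: y(tau) = beta0 + beta1 * slope + beta2 * curvature,
# where the loadings depend on maturity tau and a decay parameter lambda.

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield for maturity tau (in years) under the Nelson-Siegel model."""
    x = lam * tau
    slope_loading = (1 - np.exp(-x)) / x          # -> 1 as tau -> 0
    curv_loading = slope_loading - np.exp(-x)     # humped in mid maturities
    return beta0 + beta1 * slope_loading + beta2 * curv_loading

taus = np.array([0.25, 0.5, 1.0, 2.0, 5.0, 10.0])
# Illustrative factors: 11% level, upward slope, slight curvature
y = nelson_siegel(taus, beta0=0.11, beta1=-0.02, beta2=0.01, lam=1.4)
print(y)  # long-end yields approach the level factor beta0
```

In the Diebold-Rudebusch-Aruoba setup, (beta0, beta1, beta2) become the latent state of a Kalman filter with VAR(1) dynamics, and this function is the measurement equation mapping factors to observed yields.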
Abstract:
One of the main advantages of pairs trading strategies is their low correlation with market returns. By taking long and short positions, these strategies can control the magnitude of the market beta, keeping it practically zero or statistically insignificant. The idea is to perform statistical arbitrage, exploiting deviations from long-run equilibrium prices. As such, they involve equilibrium-correction models for the pairs of asset returns. We show how to construct a pairs trading strategy that benefits not only from the long-run equilibrium relation between the pairs of asset prices in the portfolio, but also from the speed with which prices adjust deviations back to equilibrium. Until now, the great majority of pairs trading strategies have relied on the hypothesis that positive returns stem from the mean reversion characterized by the pairs' cointegration relation, while ignoring the possibility of selecting pairs by testing the speed of adjustment in the error-correction vector of that relation. The results of this work indicate low levels of correlation with the market and neutrality of the strategies, together with annualized net returns and Sharpe ratio of 15.05% and 1.96, respectively.
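The speed-of-adjustment idea can be sketched on simulated data (our own illustration, not the study's sample): for a cointegrated pair, the error-correction coefficient is estimated by regressing price changes on the lagged spread, and a more negative coefficient means faster reversion to equilibrium, hence a more attractive pair.

```python
import numpy as np

# Two cointegrated prices: pB is a random walk (shared trend), pA adds a
# stationary spread with adjustment speed alpha. alpha and the noise scales
# are illustrative assumptions.

rng = np.random.default_rng(5)
n, alpha = 2000, -0.2                    # true adjustment speed of asset A
common = np.cumsum(rng.normal(size=n))   # shared stochastic trend
spread = np.zeros(n)
for t in range(1, n):
    spread[t] = (1 + alpha) * spread[t - 1] + rng.normal(0, 0.5)
pA = common + spread
pB = common

# Error-correction regression: dA_t = alpha * (pA - pB)_{t-1} + noise
dA = np.diff(pA)
lagged_spread = (pA - pB)[:-1]
alpha_hat = (dA @ lagged_spread) / (lagged_spread @ lagged_spread)
print(alpha_hat)  # recovers a value close to -0.2
```

Ranking candidate pairs by alpha_hat (rather than by the cointegration test alone) is the selection criterion the abstract argues for: faster adjustment means deviations are corrected, and profits realized, sooner.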
Abstract:
Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The QCS relies on quality observations measured with a traversing scanner that follows a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine-direction (MD) and cross-direction (CD) variations, because the QCS works separately in MD and CD. Traditionally this is done simply by taking one scan of the zigzag path as the CD profile and its mean value as one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. The Fourier transform is used to move to the frequency domain, and the Fourier components are then used as the state vector in a Kalman filter, a widely used data assimilation tool for combining noisy observations with a model. The observations here are the quality measurements and the model comprises the Fourier frequency components. By implementing the two-dimensional Fourier transform in the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and the tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results are based on a very short sample of a paper roll, the method shows great potential for later use as part of a quality control system.
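The MD/CD separation in the frequency domain can be sketched with a toy of our own (not the thesis setup): a synthetic sheet is the sum of an MD wave (constant across CD) and a CD profile (constant along MD). In the 2-D Fourier transform, pure MD variation lives entirely in the zero-CD-frequency row and pure CD variation in the zero-MD-frequency column, so masking the spectrum separates them exactly.

```python
import numpy as np

# Build field = MD component + CD component and recover each via 2-D FFT
# masking. Grid sizes and wave numbers are illustrative assumptions.

ny, nx = 64, 128                                       # CD x MD positions
md = np.sin(2 * np.pi * 3 * np.arange(nx) / nx)        # MD trend (same in CD)
cd = 0.5 * np.cos(2 * np.pi * 5 * np.arange(ny) / ny)  # CD profile (same in MD)
field = md[None, :] + cd[:, None]

F = np.fft.fft2(field)
# Keep only the zero CD-frequency row -> MD part; zero MD-frequency column -> CD part
md_part = np.real(np.fft.ifft2(np.where(np.arange(ny)[:, None] == 0, F, 0)))
cd_part = np.real(np.fft.ifft2(np.where(np.arange(nx)[None, :] == 0, F, 0)))

err_md = np.abs(md_part - md[None, :]).max()
err_cd = np.abs(cd_part - cd[:, None]).max()
print(err_md, err_cd)  # both recoveries are exact up to round-off
```

The thesis goes further by feeding zigzag-path observations of such a field into a Kalman filter whose state is the set of Fourier coefficients; the masking above shows why that state naturally splits into MD and CD parts.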
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
Abstract:
Ensemble clustering (EC) can arise in data assimilation with ensemble square root filters (EnSRFs) using non-linear models: an M-member ensemble splits into a single outlier and a cluster of M-1 members. The stochastic Ensemble Kalman Filter does not present this problem. Modifications to the EnSRFs by periodic resampling of the ensemble through random rotations have been proposed to address it. We introduce a metric to quantify the presence of EC and present evidence to dispel the notion that EC leads to filter failure. Starting from a univariate model, we show that EC is not a permanent but a transient phenomenon; it occurs intermittently in non-linear models. We perform a series of data assimilation experiments using a standard EnSRF and an EnSRF modified by resampling through random rotations. The modified EnSRF thus alleviates issues associated with EC, but at the cost of the traceability of individual ensemble trajectories, and it cannot use some of the algorithms that enhance the performance of the standard EnSRF. In the non-linear regimes of low-dimensional models, the analysis root mean square error of the standard EnSRF slowly grows with ensemble size if the size is larger than the dimension of the model state. However, we do not observe this problem in a more complex model that uses an ensemble size much smaller than the dimension of the model state, along with inflation and localisation. Overall, we find that transient EC does not handicap the performance of the standard EnSRF.
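For contrast with the deterministic square root filters above, the stochastic EnKF analysis step (the variant the abstract notes is free of ensemble clustering) can be sketched for a scalar state. This is an illustrative toy with assumed numbers, not the paper's experimental setup: each member assimilates an independently perturbed copy of the observation.

```python
import numpy as np

# Stochastic EnKF analysis for a scalar state with H = 1: the perturbed
# observations keep the analysis ensemble statistically consistent with the
# posterior, at the price of extra sampling noise.

rng = np.random.default_rng(6)
M = 50                                   # ensemble size
ensemble = rng.normal(0.0, 2.0, M)       # prior ensemble (mean ~0, spread ~2)
y_obs, r = 3.0, 1.0                      # observation and its error variance

pb = ensemble.var(ddof=1)                # sample background variance
k = pb / (pb + r)                        # Kalman gain for H = 1
perturbed = y_obs + rng.normal(0.0, np.sqrt(r), M)   # one perturbed obs per member
analysis = ensemble + k * (perturbed - ensemble)

print(analysis.mean(), analysis.var(ddof=1))
```

With pb near 4 and r = 1, the analysis mean moves most of the way toward the observation and the analysis variance contracts toward pb*r/(pb + r), about 0.8; an EnSRF achieves the same two moments deterministically, which is where the clustering behaviour studied in the paper can enter.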