894 results for robust extended kalman filter


Relevance: 100.00%

Abstract:

For the past 20 years, researchers have applied the Kalman filter to modeling and forecasting the term structure of interest rates. Despite its impressive in-sample performance in fitting yield curves, little research has focused on out-of-sample forecasts of yield curves using the Kalman filter. The goal of this thesis is to develop a unified dynamic model based on Diebold and Li (2006) and Nelson and Siegel's (1987) three-factor model, and to estimate this dynamic model using the Kalman filter. We compare both the in-sample and out-of-sample performance of our dynamic method with that of various other models in the literature. We find that our dynamic model dominates existing models in medium- and long-horizon yield curve predictions. However, it should be used with caution when forecasting short-maturity yields.
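As a rough illustration of the kind of model described (a sketch with invented parameter values, not the thesis's specification), a dynamic Nelson-Siegel model places the three factors in a state vector, maps them to yields through the Nelson-Siegel loadings, and runs a standard Kalman filter:

```python
import numpy as np

def ns_loadings(taus, lam=0.5):
    """Nelson-Siegel loadings (level, slope, curvature) for maturities taus."""
    x = lam * taus
    slope = (1.0 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(taus), slope, slope - np.exp(-x)])

def kalman_filter(y, Z, T, Q, H):
    """Filtered factor estimates for a (n_obs, n_yields) panel of yields."""
    m = T.shape[0]
    a, P = np.zeros(m), np.eye(m)          # vague initial state
    out = []
    for yt in y:
        a, P = T @ a, T @ P @ T.T + Q      # predict
        F = Z @ P @ Z.T + H                # innovation covariance
        K = P @ Z.T @ np.linalg.inv(F)     # Kalman gain
        a = a + K @ (yt - Z @ a)           # update
        P = P - K @ Z @ P
        out.append(a.copy())
    return np.array(out)

taus = np.array([0.25, 1.0, 3.0, 5.0, 10.0])   # maturities in years (invented)
Z = ns_loadings(taus)
T = np.diag([0.99, 0.95, 0.90])                # factor persistence (invented)
Q = 1e-4 * np.eye(3)                           # factor shock covariance (invented)
H = 1e-4 * np.eye(len(taus))                   # measurement noise (invented)
yields = 0.03 + 0.002 * np.random.default_rng(0).standard_normal((100, len(taus)))
factors = kalman_filter(yields, Z, T, Q, H)    # (100, 3) filtered factor paths
```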

Relevance: 100.00%

Abstract:

Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.

Relevance: 100.00%

Abstract:

My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first article, we develop a computationally efficient state-smoothing procedure for linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyse the computational efficiency of methods based on the Kalman filter, of the Cholesky factor algorithm, and of our new method, using operation counts and computational experiments. We show that our method is more efficient in many important cases. The gains are particularly large when the dimension of the observed variables is large, or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, used to analyse counts of transaction data in financial markets. In the second chapter, we propose a new technique for analysing multivariate stochastic volatility models. The proposed method is based on efficiently drawing the volatility from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginals with return-specific degrees of freedom to capture the heterogeneity of returns. We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of one return's volatility given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and two multivariate models. In the third chapter, we evaluate the information contributed by realized volatility measures to volatility estimation and forecasting when prices are measured with and without error. We use stochastic volatility models. We take the viewpoint of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity containing information about it. We use Bayesian Markov chain Monte Carlo methods to estimate the models, which allow us to form not only posterior densities of volatility but also predictive densities of future volatility. We compare volatility forecasts, and their hit rates, with and without the information contained in realized volatility. This approach differs from existing ones in the empirical literature, which are mostly limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns on stock indices and exchange rates. The competing models are applied to the second half of 2008, a notable period in the recent financial crisis.
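For flavour, the family of smoothers built on the Cholesky factor algorithm exploits the banded precision matrix of the states. Below is a toy precision-based posterior draw for a local-level model; it sketches the general idea only, not the thesis's algorithm, and all variances are invented:

```python
import numpy as np
from scipy.linalg import cholesky_banded, cho_solve_banded, solve_banded

def draw_states(y, sig2_eps, sig2_eta, rng=np.random.default_rng(0)):
    """One exact posterior draw of the latent level in a local-level model.
    The posterior precision Omega = I/sig2_eps + D'D/sig2_eta is tridiagonal
    (D is the first-difference matrix), so a banded Cholesky factor yields
    both the posterior mean and a draw in O(n) operations."""
    n = len(y)
    diag = 1.0 / sig2_eps + np.r_[1.0, np.full(n - 2, 2.0), 1.0] / sig2_eta
    off = np.full(n, -1.0 / sig2_eta)
    off[0] = 0.0                                   # padding for banded storage
    U = cholesky_banded(np.vstack([off, diag]))    # Omega = U'U
    mean = cho_solve_banded((U, False), y / sig2_eps)
    z = rng.standard_normal(n)
    return mean + solve_banded((0, 1), U, z)       # mean + U^{-1} z

rng = np.random.default_rng(1)
y = np.cumsum(0.1 * rng.standard_normal(500)) + 0.3 * rng.standard_normal(500)
x_draw = draw_states(y, sig2_eps=0.09, sig2_eta=0.01)
```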

Relevance: 100.00%

Abstract:

Underwater target localization and tracking attract tremendous research interest because the noisy ocean environment impedes the estimation task in numerous ways. This thesis envisages the implementation of a prototype automated system for underwater target localization, tracking and classification using passive listening buoy systems and target identification techniques. An autonomous three-buoy system has been developed and field trials have been conducted successfully. Inaccuracies in the localization results, due to changes in environmental parameters, measurement errors and theoretical approximations, are refined using the Kalman filter approach. Simulation studies have been conducted for tracking targets in different scenarios, including maneuvering situations. The system can also be used to classify unknown targets by extracting features of the noise emanations from the targets.
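A hedged sketch of the refinement step described (not the project's code): a constant-velocity Kalman filter smoothing noisy 2-D position fixes such as those a buoy-based localisation might produce, with invented noise levels:

```python
import numpy as np

def track_target(fixes, dt=1.0, q=0.05, r=4.0):
    """Constant-velocity Kalman filter refining noisy 2-D position fixes.
    State is [x, y, vx, vy]; q and r are assumed noise intensities."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0   # only position observed
    Q, R = q * np.eye(4), r * np.eye(2)
    x, P = np.zeros(4), 100.0 * np.eye(4)            # vague initial state
    track = []
    for z in fixes:
        x, P = F @ x, F @ P @ F.T + Q                # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)                      # update with the fix
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

fixes = np.array([[0.0, 0.0], [1.1, 0.4], [2.3, 0.9], [3.0, 1.6]])
refined = track_target(fixes)
```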

Relevance: 100.00%

Abstract:

This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique combines probabilistic scan matching, using range scans gathered from a mechanical scanning imaging sonar (MSIS), with dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKF). The first estimates the local path travelled by the robot while grabbing the scan, together with its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and maintains the poses of the registered scans. The raw sensor data are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
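An augmented-state EKF keeps one pose per registered scan by cloning the current vehicle pose into the state vector. A minimal sketch of that augmentation step follows; the state layout (vehicle pose in the first three entries, [x, y, yaw]) is an assumption for illustration, not the paper's implementation:

```python
import numpy as np

def augment_state(x, P):
    """Clone the current vehicle pose (assumed to occupy the first three
    state entries) into the state vector, so a newly registered scan keeps
    its own pose estimate with a consistent joint covariance."""
    n = len(x)
    J = np.vstack([np.eye(n), np.eye(3, n)])   # cloning Jacobian
    return J @ x, J @ P @ J.T

# Example: start with the vehicle pose only, then register two scans.
x = np.array([10.0, 5.0, 0.3])
P = np.diag([0.5, 0.5, 0.05])
x, P = augment_state(x, P)                     # state now holds 2 poses
x, P = augment_state(x, P)                     # and now 3
```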

Relevance: 100.00%

Abstract:

Pairs trading investment strategies are based on price deviations between pairs of correlated stocks, and have been widely implemented by investment funds, which take long and short positions in the selected stocks when divergences arise and realise a profit by closing the position upon convergence. A mean-reversion model is described to analyse the dynamics followed by the price spread between ordinary and preferred shares of the same company in the same market. The long-run convergence mean is obtained with a moving-average filter; the parameters of the mean-reversion model are then estimated with a Kalman filter under a state-space formulation on the historical series. The algorithmic pairs trading strategy based on the proposed model is backtested, indicating potential profits in financial markets observed out of equilibrium. Applications of the results could reveal opportunities to improve portfolio performance, correct mispricing, and better weather periods of low returns.
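A minimal sketch of the two-stage scheme the abstract outlines: a trailing moving average for the long-run mean, then a scalar Kalman filter for a discretised mean-reverting (Ornstein-Uhlenbeck-style) spread observed with noise. All parameter values are invented and the divergence signal is one possible choice, not the paper's:

```python
import numpy as np

def filter_spread(y, kappa=0.05, sig2_eta=1e-4, sig2_eps=4e-4, window=60):
    """Scalar Kalman filter for a noisily observed mean-reverting spread:
        s_t = mu_t + phi * (s_{t-1} - mu_t) + eta_t,  phi = exp(-kappa)
        y_t = s_t + eps_t
    with the long-run mean mu_t taken from a trailing moving average."""
    phi = np.exp(-kappa)
    mu = np.array([y[max(0, t - window + 1):t + 1].mean() for t in range(len(y))])
    a, P = float(y[0]), sig2_eps
    s_hat, z = [a], [0.0]
    for t in range(1, len(y)):
        a = mu[t] + phi * (a - mu[t])          # predict latent spread
        P = phi ** 2 * P + sig2_eta
        K = P / (P + sig2_eps)                 # scalar Kalman gain
        a, P = a + K * (y[t] - a), (1.0 - K) * P
        s_hat.append(a)
        z.append((a - mu[t]) / np.sqrt(P + sig2_eta))  # one possible signal
    return np.array(s_hat), np.array(z)

# A simple rule opens a position when |z| is large and closes near zero.
y = 0.01 * np.random.default_rng(3).standard_normal(500).cumsum()
s_hat, z = filter_spread(y)
```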

Relevance: 100.00%

Abstract:

This paper proposes a model for the term structure of interbank risk based on the spread between Interest Rate Swaps (IRS) and Overnight Indexed Swaps (OIS) in US dollars during the 2007-08 financial crisis and the 2010 euro crisis. It also decomposes interbank risk into default and non-default (liquidity) components. The results suggest that the financial crisis had important repercussions on the term structure of interbank risk and its components: in the years before the crisis, non-default risk explained most of interbank risk, whereas during and after the crisis, default risk drove the behaviour of interbank risk. In addition, the term structure of each component shows that the financial crisis was more a short-term than a long-term problem, in contrast with the 2010 euro crisis. These results follow Filipovic & Trolle (2012) and carry important implications for interbank risk during periods of financial stress.

Relevance: 100.00%

Abstract:

Large-scale image mosaicing methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots for gathering optical data from the seafloor. Cost and weight constraints mean that low-cost remotely operated vehicles (ROVs) usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predetermined trajectory that provides several non-time-consecutive overlapping image pairs. Finding these pairs (a process known as topology estimation) is indispensable to obtaining globally consistent mosaics and accurate trajectory estimates, which are necessary for a global view of the surveyed area, especially when optical sensors are the only data source. This thesis presents a set of consistent methods aimed at creating large-area image mosaics from optical data obtained during surveys with low-cost underwater vehicles. First, a global alignment method developed within a feature-based image mosaicing (FIM) framework, in which nonlinear minimisation is substituted by two linear steps, is discussed. Then, a simple four-point mosaic rectifying method is proposed to reduce distortions that might occur due to lens distortion, error accumulation and the difficulties of optical imaging in an underwater medium. The topology estimation problem is addressed by means of a combined augmented-state and extended Kalman filter framework, aimed at minimising the total number of matching attempts while simultaneously obtaining the best possible trajectory. Potential image pairs are predicted by taking into account the uncertainty in the trajectory. The contribution of matching an image pair is investigated using information theory principles. Lastly, a different solution to the topology estimation problem is proposed in a bundle adjustment framework. Innovative aspects include the use of a fast image similarity criterion combined with a minimum spanning tree (MST) solution to obtain a tentative topology. This topology is improved by attempting image matching with the pairs for which there is the most overlap evidence. Unlike previous approaches to large-area mosaicing, our framework deals naturally with cases where time-consecutive images cannot be matched successfully, such as completely unordered sets. Finally, the efficiency of the proposed methods is discussed and compared with other state-of-the-art approaches, using a series of challenging datasets in underwater scenarios.
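The idea of predicting overlapping pairs from trajectory uncertainty can be caricatured as an uncertainty-aware gating test. A sketch under simplifying assumptions (2-D camera positions, a fixed overlap radius, a conservative 3-sigma bound); this is not the thesis's criterion:

```python
import numpy as np

def candidate_pairs(means, covs, overlap_radius=5.0, nsigma=3.0):
    """Return non-time-consecutive image pairs worth a matching attempt:
    pairs whose camera positions could lie within overlap_radius of each
    other once an nsigma-wide uncertainty in the relative position is
    allowed for. means is (n, 2); covs is (n, 2, 2)."""
    pairs = []
    for i in range(len(means)):
        for j in range(i + 2, len(means)):            # skip consecutive pairs
            d = np.linalg.norm(means[j] - means[i])
            S = covs[i] + covs[j]                     # relative-position cov.
            sigma = np.sqrt(np.linalg.eigvalsh(S).max())
            if d - nsigma * sigma <= overlap_radius:
                pairs.append((i, j))
    return pairs

rng = np.random.default_rng(4)
means = rng.uniform(0.0, 50.0, (20, 2))               # filtered image positions
covs = np.tile(np.eye(2), (20, 1, 1))                 # pose uncertainties
print(candidate_pairs(means, covs))
```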

Relevance: 100.00%

Abstract:

Data assimilation – the set of techniques whereby information from observing systems and models is combined optimally – is rapidly becoming prominent in endeavours to exploit Earth Observation for Earth sciences, including climate prediction. This paper explains the broad principles of data assimilation, outlining different approaches (optimal interpolation, three-dimensional and four-dimensional variational methods, the Kalman Filter), together with the approximations that are often necessary to make them practicable. After pointing out a variety of benefits of data assimilation, the paper then outlines some practical applications of the exploitation of Earth Observation by data assimilation in the areas of operational oceanography, chemical weather forecasting and carbon cycle modelling. Finally, some challenges for the future are noted.
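The optimal-interpolation and Kalman-filter approaches mentioned here share the same analysis equation, x_a = x_b + K(y - Hx_b) with gain K = BH'(HBH' + R)^-1. A toy numeric illustration, with all covariances invented:

```python
import numpy as np

# Background state on a 3-point grid, observed at grid points 0 and 2.
xb = np.array([10.0, 12.0, 11.0])                # background (model forecast)
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])                  # observation operator
y = np.array([10.8, 10.2])                       # observations
dist = np.abs(np.subtract.outer(np.arange(3), np.arange(3)))
B = 1.0 * np.exp(-dist / 2.0)                    # correlated background errors
R = 0.25 * np.eye(2)                             # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)     # gain matrix
xa = xb + K @ (y - H @ xb)                       # analysis state
print(xa)  # the unobserved middle point also moves, via correlations in B
```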

Relevance: 100.00%

Abstract:

Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
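A hedged sketch of state augmentation in a 3D-Var setting: the uncertain parameter joins the control vector, so minimising the usual cost function updates state and parameter together. The function names and the toy observation operator below are illustrative assumptions, not the paper's scheme:

```python
import numpy as np
from scipy.optimize import minimize

def var3d_augmented(xb, pb, y, Binv, Rinv, obs_op):
    """3D-Var with state augmentation: the control vector w = [x, p] stacks
    the physical state x with uncertain parameters p, so minimising the
    usual quadratic cost updates both at once. Binv is the inverse
    background covariance of the augmented vector; obs_op(x, p) maps the
    control vector to observation space."""
    wb = np.concatenate([xb, pb])
    nx = len(xb)
    def cost(w):
        dw = w - wb
        innov = y - obs_op(w[:nx], w[nx:])
        return 0.5 * dw @ Binv @ dw + 0.5 * innov @ Rinv @ innov
    return minimize(cost, wb, method="BFGS").x

# Toy usage: one bed level observed through a parameter-scaled operator.
xb, pb = np.array([1.0]), np.array([0.5])
Binv = np.linalg.inv(np.diag([0.1, 0.05]))
Rinv = np.linalg.inv(np.diag([0.02]))
w_a = var3d_augmented(xb, pb, np.array([1.8]), Binv, Rinv,
                      obs_op=lambda x, p: x * (1.0 + p))
x_a, p_a = w_a
```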

Relevance: 100.00%

Abstract:

The impact of targeted sonde observations on the 1-3 day forecasts for northern Europe is evaluated using the Met Office four-dimensional variational data assimilation scheme and a 24 km gridlength limited-area version of the Unified Model (MetUM). The targeted observations were carried out during February and March 2007 as part of the Greenland Flow Distortion Experiment, using a research aircraft based in Iceland. Sensitive area predictions using either total energy singular vectors or an ensemble transform Kalman filter were used to predict where additional observations should be made to reduce errors in the initial conditions of forecasts for northern Europe. Targeted sonde data was assimilated operationally into the MetUM. Hindcasts show that the impact of the sondes was mixed. Only two out of the five cases showed clear forecast improvement; the maximum forecast improvement seen over the verifying region was approximately 5% of the forecast error 24 hours into the forecast. These two cases are presented in more detail: in the first the improvement propagates into the verification region with a developing polar low; and in the second the improvement is associated with an upper-level trough. The impact of cycling targeted data in the background of the forecast (including the memory of previous targeted observations) is investigated. This is shown to cause a greater forecast impact, but does not necessarily lead to a greater forecast improvement. Finally, the robustness of the results is assessed using a small ensemble of forecasts.

Relevance: 100.00%

Abstract:

This paper aims to summarise the current performance of ozone data assimilation (DA) systems, to show where they can be improved, and to quantify their errors. It examines 11 sets of ozone analyses from 7 different DA systems. Two are numerical weather prediction (NWP) systems based on general circulation models (GCMs); the other five use chemistry transport models (CTMs). The systems examined contain either linearised or detailed ozone chemistry, or no chemistry at all. In most analyses, MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) ozone data are assimilated; two assimilate SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) observations instead. Analyses are compared to independent ozone observations covering the troposphere, stratosphere and lower mesosphere during the period July to November 2003. Biases and standard deviations are largest, and show the largest divergence between systems, in the troposphere, in the upper-troposphere/lower-stratosphere, in the upper-stratosphere and mesosphere, and in the Antarctic ozone hole region. However, in any particular area, apart from the troposphere, at least one system can be found that agrees well with independent data. In general, none of the differences can be linked to the assimilation technique (Kalman filter, three- or four-dimensional variational methods, direct inversion) or the type of system (CTM or NWP). Where results diverge, a main explanation is the way ozone is modelled. It is important to correctly model transport at the tropical tropopause, to avoid positive biases and excessive structure in the ozone field. In the southern hemisphere ozone hole, only the analyses which correctly model heterogeneous ozone depletion are able to reproduce the near-complete ozone destruction over the pole. In the upper-stratosphere and mesosphere (above 5 hPa), some ozone photochemistry schemes caused large but easily remedied biases. The diurnal cycle of ozone in the mesosphere is not captured, except by the one system that includes a detailed treatment of mesospheric chemistry. These results indicate that when good observations are available for assimilation, the first priority for improving ozone DA systems is to improve the models. The analyses benefit strongly from the good quality of the MIPAS ozone observations. Using the analyses as a transfer standard, it is seen that MIPAS is around 5% higher than HALOE (Halogen Occultation Experiment) in the mid and upper stratosphere and mesosphere (above 30 hPa), and of order 10% higher than ozonesondes and HALOE in the lower stratosphere (100 hPa to 30 hPa). Analyses based on SCIAMACHY total columns are almost as good as the MIPAS analyses; analyses based on SCIAMACHY limb profiles are worse in some areas, due to problems in the SCIAMACHY retrievals.

Relevance: 100.00%

Abstract:

Remote sensing from space-borne platforms is often seen as an appealing method of monitoring components of the hydrological cycle, including river discharge, due to its spatial coverage. However, data from these platforms are often less than ideal because the geophysical properties of interest are rarely measured directly and the measurements that are taken can be subject to significant errors. This study assimilated water levels derived from a TerraSAR-X synthetic aperture radar image and digital aerial photography with simulations from a two-dimensional hydraulic model to estimate discharge, inundation extent, depths and velocities at the confluence of the rivers Severn and Avon, UK. An ensemble Kalman filter was used to assimilate spot-height water levels derived by intersecting shorelines from the imagery with a digital elevation model. Discharge was estimated from the ensemble of simulations using state augmentation and then compared with gauge data. Assimilating the real data reduced the error between analysed mean water levels and levels from three gauging stations to less than 0.3 m, which is less than typically found in post-event water mark data from the field at these scales. Measurement bias was evident, but the method still provided a means of improving estimates of discharge for high flows where gauge data are unavailable or of poor quality. Posterior estimates of discharge had standard deviations between 63.3 m³s⁻¹ and 52.7 m³s⁻¹, below 15% of the gauged flows along the reach. Therefore, assuming a roughness uncertainty of 0.03-0.05 and no model structural errors, discharge could be estimated by the EnKF with accuracy similar to that arguably expected from gauging stations during flood events. Quality control prior to assimilation, where measurements were rejected for being in areas of high topographic slope or close to tall vegetation and trees, was found to be essential. The study demonstrates the potential, but also the significant limitations, of currently available imagery to reduce discharge uncertainty in un-gauged or poorly gauged basins when combined with model simulations in a data assimilation framework.
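A minimal sketch of the kind of assimilation step the study describes: a stochastic EnKF analysis in which the state vector is augmented with the inflow discharge, so water-level observations correct the discharge like any other state variable. Ensemble size, noise levels and the observation operator below are invented for illustration:

```python
import numpy as np

def enkf_update(ens, obs, obs_var, H, rng=np.random.default_rng(0)):
    """Stochastic EnKF analysis step. `ens` has shape (n_ens, n_state); here
    the last state entry is the augmented discharge, corrected by the
    water-level observations through its sampled covariance with them."""
    n_ens = ens.shape[0]
    A = ens - ens.mean(axis=0)                        # state anomalies
    HX = ens @ H.T                                    # simulated observations
    HA = HX - HX.mean(axis=0)
    Pyy = HA.T @ HA / (n_ens - 1) + obs_var * np.eye(len(obs))
    Pxy = A.T @ HA / (n_ens - 1)                      # state-obs covariance
    K = Pxy @ np.linalg.inv(Pyy)                      # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), (n_ens, len(obs)))
    return ens + (perturbed - HX) @ K.T

# Toy: three water-level states plus discharge as the augmented 4th entry.
rng = np.random.default_rng(1)
ens = rng.normal([2.0, 2.2, 2.1, 150.0], [0.3, 0.3, 0.3, 40.0], (50, 4))
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])                  # levels 1 and 3 observed
ens_a = enkf_update(ens, np.array([2.4, 2.3]), 0.01, H, rng)
print(ens_a[:, -1].mean())                            # updated discharge estimate
```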

Relevance: 100.00%

Abstract:

In this paper, we present an on-line estimation algorithm for an uncertain time delay in a continuous system, based on observational input-output data subject to observational noise. The first-order Padé approximation is used to approximate the time delay. At each time step, the algorithm combines the well-known Kalman filter algorithm and the recursive instrumental variable least squares (RIVLS) algorithm in cascade form. The instrumental variable least squares algorithm is used to achieve consistency of the delay parameter estimate, since an errors-in-variables model is involved. An illustrative example demonstrates the efficacy of the proposed approach.
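The first-order Padé approximation replaces the irrational delay term e^(-s*tau) with a rational transfer function, which admits a finite-dimensional state-space realisation and is therefore tractable for a Kalman filter. A small illustration of the approximation itself (not the paper's estimator):

```python
import numpy as np
from scipy import signal

def pade_delay(tau):
    """First-order Pade approximation of a pure delay exp(-s*tau):
    (1 - tau*s/2) / (1 + tau*s/2), as a rational transfer function."""
    return signal.TransferFunction([-tau / 2.0, 1.0], [tau / 2.0, 1.0])

# Step responses for two candidate delays; the initial undershoot is the
# price of truncating the Pade expansion at first order.
t = np.linspace(0.0, 5.0, 500)
responses = {tau: signal.step(pade_delay(tau), T=t)[1] for tau in (0.5, 1.0)}
```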