922 results for High-frequency data
Abstract:
The goal of this paper is twofold. First, using five of the most actively traded stocks in the Brazilian financial market, this paper shows that the normality assumption commonly used in the risk management area to describe the distributions of returns standardized by volatilities is not compatible with volatilities estimated by EWMA or GARCH models. In sharp contrast, when the information contained in high-frequency data is used to construct realized volatility measures, the standardized returns are close to normal, promising improvements in Value at Risk statistics. We also describe the distributions of volatilities of the Brazilian stocks, showing that they are nearly lognormal. Second, we estimate a simple linear model for the log of realized volatility that differs from those in other studies. The main difference is that we do not find evidence of long memory. The estimated model is compared with commonly used alternatives in an out-of-sample experiment.
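To make the standardization concrete, here is a minimal Python sketch of dividing daily returns by a realized-volatility measure built from intraday returns; all data are simulated, and this is not the authors' code.

```python
# A minimal sketch (simulated data, not the authors' code) of standardizing
# daily returns by a realized-volatility measure built from intraday returns.
import numpy as np

rng = np.random.default_rng(0)
n_days, n_intraday = 500, 48                     # e.g. 48 five-minute returns per day

# Simulate intraday returns whose day-level volatility is roughly lognormal
daily_vol = np.exp(rng.normal(-4.5, 0.35, n_days))
intraday = rng.normal(size=(n_days, n_intraday)) * (daily_vol[:, None] / np.sqrt(n_intraday))

daily_returns = intraday.sum(axis=1)
realized_vol = np.sqrt((intraday ** 2).sum(axis=1))   # square root of realized variance

standardized = daily_returns / realized_vol
print(f"mean = {standardized.mean():.3f}, std = {standardized.std():.3f}")
# Per the paper's finding, returns standardized this way should be close to N(0, 1).
```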
Abstract:
This work proposes a method to examine variations in the cointegration relation between preferred and common stocks in the Brazilian stock market via Markovian regime switches. It aims to contribute to future work on "pairs trading" and, more specifically, on price discovery, given that, conditional on the state, the system is assumed stationary. This implies that there exists a (conditional) moving average representation from which measures of "information share" (IS) can be extracted. For identification purposes, the Markov error correction model is estimated within a Bayesian MCMC framework. Inference and the ability to detect regime changes are demonstrated in a Monte Carlo experiment. I also highlight the need to model the effects present in high-frequency financial data to obtain reliable inference.
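As a rough illustration of the regime-switching error-correction idea, the following sketch simulates a spread whose speed of mean reversion depends on a two-state Markov chain; the parameter values are illustrative assumptions, not the paper's estimates.

```python
# Illustrative simulation (assumed parameters, not the paper's estimates) of a
# cointegration spread whose error-correction speed switches with a two-state
# Markov chain.
import numpy as np

rng = np.random.default_rng(1)
T = 1000
P = np.array([[0.98, 0.02],       # regime transition probabilities
              [0.05, 0.95]])
alpha = np.array([-0.05, -0.40])  # regime-dependent adjustment speeds

s = np.zeros(T, dtype=int)        # regime path
z = np.zeros(T)                   # cointegration error (spread)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
    z[t] = (1 + alpha[s[t]]) * z[t - 1] + rng.normal(0, 0.1)

print("fraction of time in the fast mean-reverting regime:", (s == 1).mean())
```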
Abstract:
The objective of this paper is to verify and analyze the existence in Brazil of stylized facts observed in financial time series: volatility clustering, probability distributions with fat tails, the presence of long-run memory in absolute return series, absence of linear return autocorrelation, gain/loss asymmetry, aggregational Gaussianity, slow decay of absolute return autocorrelation, trading volume/volatility correlation, and the leverage effect. We analyzed intraday prices for 10 stocks traded on the BM&FBovespa, responsible for 52.1% of the Ibovespa portfolio on Sept. 1, 2009. The data analysis confirms the stylized facts, whose behavior is consistent with what is observed in international markets.
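Two of these stylized facts are easy to illustrate numerically: raw returns show almost no autocorrelation, while absolute returns stay positively autocorrelated at long lags. The sketch below does this on simulated GARCH-type returns, not the BM&FBovespa sample.

```python
# Illustrative check of two stylized facts on simulated GARCH(1,1) returns:
# near-zero autocorrelation of raw returns versus slowly decaying
# autocorrelation of absolute returns (volatility clustering).
import numpy as np

def acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(2)
T, omega, a, b = 20000, 0.05, 0.10, 0.85
r, sig2 = np.zeros(T), np.full(T, omega / (1 - a - b))
for t in range(1, T):
    sig2[t] = omega + a * r[t - 1] ** 2 + b * sig2[t - 1]
    r[t] = np.sqrt(sig2[t]) * rng.normal()

for lag in (1, 5, 20):
    print(lag, round(acf(r, lag), 3), round(acf(np.abs(r), lag), 3))
# raw-return ACF is ~0 at every lag; |return| ACF stays positive and decays slowly
```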
Abstract:
This thesis consists of three self-contained papers. In the first paper I analyze the labor supply behavior of Bologna pizza delivery vendors. Recent influential papers analyze the labor supply behavior of taxi drivers (Camerer et al., 1997; Crawford and Meng, 2011) and suggest that reference-dependent preferences have an important influence on drivers' labor-supply decisions. Unlike previous papers, I am able to identify an exogenous and transitory change in labor demand. Using high-frequency data on orders and rainfall as an exogenous demand shifter, I find that reference-dependent preferences play no role in vendors' labor-supply decisions and that the behavior of pizza vendors is fully consistent with the predictions of the standard model of labor supply. In the second paper, I investigate how the voting behavior of Members of Parliament is influenced by the Members seated nearby. Exploiting the random seating arrangements in the Icelandic Parliament, I show that being seated next to Members of a different party increases the probability of not being aligned with one's own party. Using the exact spatial orientation of the peers, I provide evidence supporting the hypothesis that interaction is the main channel that explains these results. In the third paper, I estimate the trade flows that would have occurred between the UK and Europe if the UK had joined the Euro. As an alternative to the standard log-linear gravity equation, I employ the synthetic control method. I show that aggregate trade flows between Britain and Europe would have been 13% higher if the UK had adopted the Euro.
Abstract:
The recent development of in-situ monitoring devices, such as UV spectrometers, makes the study of short-term stream chemistry variation relevant, especially the study of diurnal cycles, which are not yet fully understood. Our study is based on high-frequency data from an agricultural catchment (Studienlandschaft Schwingbachtal, Germany). We propose a novel approach, the combination of cluster analysis and Linear Discriminant Analysis, to mine nitrate behavior patterns from these data. As a result, we observe a seasonality of nitrate diurnal cycles that differs from the most common cycle seasonality described in the literature, i.e. pre-dawn peaks in spring. Our cycles appear in summer, and the maximum and minimum shift to a later time in late summer/autumn. This is observed for both water- and energy-limited years, potentially stressing the role of evapotranspiration. This concluding hypothesis on the role of evapotranspiration in stream nitrate concentration, obtained through data mining, broadens the perspective on the diurnal cycling of stream nitrate concentrations.
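A minimal sketch of the cluster-then-discriminate idea follows: cluster daily diurnal profiles, then use LDA to see which ancillary variables separate the clusters. The feature names and data here are hypothetical, not the Schwingbachtal measurements.

```python
# Sketch (hypothetical data) of combining cluster analysis with Linear
# Discriminant Analysis to mine diurnal-cycle patterns.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
profiles = rng.normal(size=(200, 24))     # 200 days x 24 hourly nitrate values
covariates = rng.normal(size=(200, 3))    # e.g. discharge, temperature, PET (assumed)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
lda = LinearDiscriminantAnalysis().fit(covariates, labels)
print("covariates explain cluster membership with accuracy",
      round(lda.score(covariates, labels), 2))
```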
Abstract:
Are the learning procedures of genetic algorithms (GAs) able to generate optimal architectures for artificial neural networks (ANNs) applied to high-frequency data? In this experimental study, GAs are used to identify the best architecture for ANNs. Additional learning is undertaken by the ANNs to forecast daily excess stock returns. No ANN architecture was able to outperform a random walk, despite the finding of non-linearity in the excess returns. This failure is attributed to the absence of suitable ANN structures and further implies that researchers need to be cautious when making inferences from ANN results that use high-frequency data.
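The architecture-search loop can be pictured with a toy GA over hidden-layer sizes; the fitness function below is a stand-in, whereas in the study it would be the out-of-sample forecast accuracy of the trained network.

```python
# Toy GA over ANN hidden-layer sizes. The fitness function is a placeholder
# assumption; in practice it would train a network and score its forecasts.
import random

random.seed(4)
SIZES = [2, 4, 8, 16, 32]

def fitness(arch):
    # stand-in for: train an ANN with this architecture, return validation score
    return -abs(sum(arch) - 24) + random.gauss(0, 0.5)

pop = [[random.choice(SIZES) for _ in range(2)] for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                    # selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(a, b)]   # crossover
        if random.random() < 0.2:                             # mutation
            child[random.randrange(2)] = random.choice(SIZES)
        children.append(child)
    pop = parents + children
print("best architecture found:", max(pop, key=fitness))
```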
Abstract:
Purpose – The purpose of this paper is to investigate the impact of foreign exchange and interest rate changes on US banks' stock returns. Design/methodology/approach – The approach employs an EGARCH model to account for the ARCH effects in daily returns. Most prior studies have used standard OLS estimation methods, with the result that the presence of ARCH effects would have affected estimation efficiency. For comparative purposes, the standard OLS estimation method is also used to measure sensitivity. Findings – The findings are as follows: under the conditional t-distributional assumption, the EGARCH model generated a much better fit to the data, although the goodness-of-fit of the model is not entirely satisfactory; the market index return accounts for most of the variation in stock returns at both the individual bank and portfolio levels; and the degree of sensitivity of the stock returns to interest rate and FX rate changes is not very pronounced despite the use of high-frequency data. Earlier results had indicated that daily data provided greater evidence of exposure sensitivity. Practical implications – Assuming that banks do not hedge perfectly, these findings have important financial implications, as they suggest that the hedging policies of the banks are not reflected in their stock prices. Alternatively, it is possible that different GARCH-type models might be more appropriate when modelling high-frequency returns. Originality/value – The paper contributes to existing knowledge in the area by showing that ARCH effects do impact measures of sensitivity.
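For readers who want to reproduce the general approach, an EGARCH fit with a Student-t conditional distribution can be set up as below using the Python `arch` package; the package choice and the placeholder data are assumptions, since the paper does not specify software.

```python
# Sketch of an EGARCH(1,1) fit with Student-t errors via the `arch` package.
# `returns` is placeholder data; substitute a pandas Series of daily returns.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(5)
returns = pd.Series(rng.standard_t(df=6, size=1500))   # placeholder data

model = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="t")
result = model.fit(disp="off")
print(result.summary())
```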
Abstract:
This study examines the selectivity and timing performance of 218 UK investment trusts over the period July 1981 to June 2009. We estimate the Treynor and Mazuy (1966) and Henriksson and Merton (1981) models augmented with the size, value, and momentum factors, either under the OLS method adjusted with the Newey-West procedure or under the GARCH(1,1)-in-mean method following the specification of Glosten et al. (1993; hereafter GJR-GARCH-M). We find that the OLS method provides little evidence in favour of selectivity and timing ability, consistent with previous studies. Interestingly, the GJR-GARCH-M method reverses this result, showing relatively strong evidence of favourable selectivity ability, particularly for international funds, as well as favourable timing ability, particularly for domestic funds. We conclude that the GJR-GARCH-M method performs better in evaluating fund performance than the OLS method and the non-parametric approach, as it accounts for the time-varying character of factor loadings and hence obtains more reliable results, in particular when high-frequency data, such as daily returns, are used in the analysis. Our results are robust to various in-sample and out-of-sample tests and have valuable implications for practitioners making asset allocation decisions across different fund styles.
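For reference, the two timing regressions named above take the following standard textbook forms before the size, value, and momentum factors are added (notation assumed from the original papers; timing ability corresponds to gamma being positive):

```latex
% Treynor and Mazuy (1966): timing shows up as curvature in the market factor
r_{p,t} - r_{f,t} = \alpha_p + \beta_p (r_{m,t} - r_{f,t})
                  + \gamma_p (r_{m,t} - r_{f,t})^2 + \varepsilon_{p,t}

% Henriksson and Merton (1981): timing as an option-like payoff on the market
r_{p,t} - r_{f,t} = \alpha_p + \beta_p (r_{m,t} - r_{f,t})
                  + \gamma_p \max\{0,\, r_{f,t} - r_{m,t}\} + \varepsilon_{p,t}
```

In both specifications, alpha captures selectivity, which is why the two abilities can be evaluated within a single regression.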
Abstract:
The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69–104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite-sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
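The positive-definiteness claim is easy to verify in isolation: the matrix exponential of any real symmetric matrix has strictly positive eigenvalues. A minimal numerical check follows; this is an illustration only, not the RMESV-ALM estimation code.

```python
# Why the matrix-exponential transformation guarantees positive definiteness:
# expm of a real symmetric matrix always has strictly positive eigenvalues.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(6)
A = rng.normal(size=(3, 3))
A = (A + A.T) / 2            # symmetrize: A plays the role of a log-covariance
Sigma = expm(A)              # candidate covariance matrix

print(np.linalg.eigvalsh(Sigma))   # all eigenvalues > 0, so Sigma is PD
```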
Abstract:
Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to ambiguous definition of the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on the accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customized. IMUs are subject to significant error accumulation, but they can be used for statistical parametrisation of an Eulerian-Lagrangian model, for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at a 4 Hz frequency from a spherical instrumented particle (111 mm diameter, 2383 kg/m³ density) in a series of entrainment threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximated at 44 and 51 mg for slopes of 0.026 and 0.037 respectively. The saddle inertial acceleration entrainment threshold was at 32 and 25 mg for slopes of 0.044 and 0.057 respectively. For the evaluation of the complete Eulerian-Lagrangian model, two prototype sensors are presented: an idealised (spherical) sensor with a diameter of 90 mm, and an ellipsoidal sensor with axes of 100, 70 and 30 mm. Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrise sediment movement, but they do not contain positional information. The two sensors (spherical and ellipsoidal) were tested in a series of entrainment experiments, similar to the evaluation of the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
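As a rough illustration of the kinetic-energy bookkeeping described above, the sketch below computes translational plus rotational kinetic energy for a solid sphere from IMU-style acceleration and gyro series. All input values are assumptions for illustration, not the thesis data, and the naive velocity integration shown is exactly the kind of step that drifts over longer windows, which is why the thesis restricts analysis to short entrainment events.

```python
# Assumed-value sketch: kinetic energy of a solid sphere from IMU-style data.
import numpy as np

d, rho = 0.111, 2383.0                 # sphere diameter (m) and density (kg/m^3)
r = d / 2
m = rho * (4 / 3) * np.pi * r ** 3     # mass, ~1.71 kg
I = 0.4 * m * r ** 2                   # moment of inertia of a solid sphere

dt = 1 / 50                            # 50 Hz IMU sampling
accel = np.full(25, 0.8)               # m/s^2: assumed linear acceleration, 0.5 s event
omega = np.linspace(0.0, 12.0, 25)     # rad/s: assumed gyro angular velocity

v = np.cumsum(accel) * dt              # naive velocity integration (drifts in practice)
E = 0.5 * m * v[-1] ** 2 + 0.5 * I * omega[-1] ** 2
print(f"kinetic energy at end of event: {E:.2f} J")
```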
Abstract:
We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly well suited to high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that, for Colombian stock market securities, the asynchronicity of intraday data restricts the analysis to a selected group of stocks; nevertheless, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
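The tracking-portfolio construction reduces to a constrained least-squares problem: choose non-negative weights summing to one so that a combination of peer stocks matches the treated asset's pre-event returns. A minimal sketch on simulated data (not the paper's code) follows.

```python
# Sketch of fitting synthetic-portfolio weights by constrained least squares.
# Simulated returns; in the paper, pre-event intraday returns would be used.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
pre_event = 250
peers = rng.normal(0, 0.01, (pre_event, 5))          # returns of 5 peer stocks
treated = peers @ np.array([0.5, 0.3, 0.2, 0, 0]) + rng.normal(0, 0.002, pre_event)

def loss(w):
    # squared tracking error between treated asset and weighted peer portfolio
    return np.sum((treated - peers @ w) ** 2)

n = peers.shape[1]
res = minimize(loss, np.full(n, 1 / n), bounds=[(0, 1)] * n,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
print("synthetic-portfolio weights:", res.x.round(3))
```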
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high-frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low-frequency data (such as data from a fleet of private cars, buses or light-duty vehicles, or from smartphones), their performance falls to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low-frequency GPS data. Therefore, this paper develops a new weight-based shortest-path and vehicle-trajectory-aided map-matching (stMM) algorithm that enhances the map-matching of low-frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and the vehicle trajectory are considered: one shortest-path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for 30 s GPS data. Omitting the information from the shortest path and vehicle trajectory, the accuracy of the algorithm falls to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
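The A* shortest-path step at the core of stMM can be illustrated on a toy road graph; the graph, coordinates, and `networkx` usage below are illustrative assumptions, and the real algorithm additionally applies turn restrictions and the two trajectory-based weights described above.

```python
# Toy illustration of the A* shortest-path step used inside stMM.
import networkx as nx

G = nx.DiGraph()
edges = [("A", "B", 100), ("B", "C", 120), ("A", "D", 90),
         ("D", "C", 200), ("C", "E", 80)]
for u, v, length in edges:
    G.add_edge(u, v, weight=length)        # edge weight = link length (m)

coords = {"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (0, 1), "E": (3, 0)}

def heuristic(u, v):
    # straight-line distance: admissible on this toy scale, so A* stays optimal
    (x1, y1), (x2, y2) = coords[u], coords[v]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

path = nx.astar_path(G, "A", "E", heuristic=heuristic, weight="weight")
print("shortest path:", path)
```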
Abstract:
Most information retrieval (IR) models treat the presence of a term within a document as an indication that the document is somehow "about" that term; they do not take into account when a term might be explicitly negated. Medical data, by its nature, contains a high frequency of negated terms, e.g. "review of systems showed no chest pain or shortness of breath". This paper presents a study of the effects of negation on information retrieval. We present a number of experiments to determine whether negation has a significant negative effect on IR performance and whether language models that take negation into account might improve performance. We use a collection of real medical records as our test corpus. Our findings are that negation has some effect on system performance, but this will likely be confined to domains, such as medical data, where negation is prevalent.
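A toy NegEx-style sketch shows the core problem: terms falling in the scope of a negation trigger should not count toward "aboutness". The trigger list and fixed scope window below are assumptions for illustration, not the paper's method.

```python
# Toy negation-scope detector: flag tokens within a fixed window after a
# negation trigger so retrieval can treat them differently.
TRIGGERS = {"no", "not", "denies", "without"}
WINDOW = 5   # number of tokens after a trigger treated as negated (assumed)

def negated_terms(text):
    tokens = [t.strip(".,;").lower() for t in text.split()]
    negated = set()
    for i, tok in enumerate(tokens):
        if tok in TRIGGERS:
            negated.update(tokens[i + 1:i + 1 + WINDOW])
    return negated

print(negated_terms("review of systems showed no chest pain or shortness of breath"))
# -> {'chest', 'pain', 'or', 'shortness', 'of'}: a naive scope, but enough to
#    show why "about chest pain" is the wrong reading of this record
```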