933 results for High-Frequency Market Making
Abstract:
I develop a new methodology for measuring tail risks using the cross section of bid-ask spreads. Market makers embed tail risk information into spreads because (1) they lose to arbitrageurs when changes to asset values exceed the cost of liquidity and (2) underlying price movements and potential costs are linear in factor loadings. Using this insight, simple cross-sectional regressions relating spreads and trading volume to factor betas can recover tail risks in real time for priced or non-priced return factors. The methodology disentangles financial and aggregate market risks during the 2007-2008 Financial Crisis; anticipates jump risks associated with Federal Open Market Committee announcements; and quantifies a sharp, temporary increase in market tail risk before and throughout the 2010 Flash Crash. The recovered time series of implied market risks also aligns closely with both realized market jumps and the VIX.
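The core recovery step lends itself to a short illustration. The sketch below is a minimal, hypothetical version of the idea on synthetic data: if spreads compensate market makers in proportion to |beta| times the potential factor move, the slope of a cross-sectional regression of spreads on absolute betas recovers the implied tail risk. The linear specification and all parameter values are assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_assets, jump_risk = 500, 0.03          # hypothetical factor jump magnitude to recover
betas = rng.normal(1.0, 0.5, n_assets)   # factor loadings
base_cost = 0.001                        # spread component unrelated to tail risk
spreads = base_cost + np.abs(betas) * jump_risk + rng.normal(0, 1e-4, n_assets)

# Cross-sectional regression of spreads on |beta|: the slope is the implied tail risk.
X = np.column_stack([np.ones(n_assets), np.abs(betas)])
coef, *_ = np.linalg.lstsq(X, spreads, rcond=None)
print(f"implied tail risk: {coef[1]:.4f} (true value: {jump_risk})")
```

Run period by period, the slope traces out a time series of implied tail risk, which is the object the abstract compares with realized jumps and the VIX.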
Abstract:
Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90.
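The estimation method described (matching the auxiliary score under a GMM criterion) is in the spirit of the efficient method of moments. Below is a toy sketch under strong simplifying assumptions: a Gaussian AR(1) serves as both the auxiliary model and the structural simulator, standing in for the paper's monetary model and nonparametric score.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

def simulate_ar1(phi, shocks):
    y = np.zeros(len(shocks))
    for t in range(1, len(shocks)):
        y[t] = phi * y[t - 1] + shocks[t]
    return y

# "Observed" data from an AR(1) with phi = 0.6 (a stand-in for currency returns).
y = simulate_ar1(0.6, rng.standard_normal(2000))

# Step 1: fit the auxiliary model (Gaussian AR(1)) to the observed data.
phi_a = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
sig2_a = np.mean((y[1:] - phi_a * y[:-1]) ** 2)

def avg_score(series):
    """Auxiliary-model score (w.r.t. phi_a and sig2_a), averaged over a series."""
    e = series[1:] - phi_a * series[:-1]
    s_phi = np.mean(e * series[:-1]) / sig2_a
    s_sig = np.mean(e**2 - sig2_a) / (2 * sig2_a**2)
    return np.array([s_phi, s_sig])

# Step 2: pick the structural parameter so a long simulated path zeroes the score.
shocks = np.random.default_rng(2).standard_normal(20000)

def objective(phi):
    g = avg_score(simulate_ar1(phi, shocks))
    return g @ g  # identity-weighted GMM criterion

res = minimize_scalar(objective, bounds=(0.0, 0.95), method="bounded")
print(f"score-matching estimate of phi: {res.x:.3f}")
```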
Abstract:
We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN has a positive association with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, in which a negative liquidity shock boosts VPIN, which in turn leads to a further liquidity drain. Our study provides empirical evidence of the intrinsic game between informed traders and market makers facing toxic information in the high-frequency trading world.
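A minimal sketch of Bulk Volume VPIN, the metric found to have the best risk-warning effect. For brevity it classifies fixed time bars rather than equal-volume buckets, and all inputs and window choices are simulated placeholders rather than the paper's exact procedure.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
price_chg = rng.normal(0, 0.01, 5000)             # per-bar price changes (stand-in)
volume = rng.integers(100, 1000, 5000).astype(float)

# Bulk classification: split each bar's volume by the standardized price change.
buy_frac = norm.cdf(price_chg / price_chg.std())
buy_vol, sell_vol = volume * buy_frac, volume * (1 - buy_frac)

window = 50                                       # bars per VPIN estimate
imbalance = np.abs(buy_vol - sell_vol)
vpin = (np.convolve(imbalance, np.ones(window), "valid")
        / np.convolve(volume, np.ones(window), "valid"))
print(f"latest VPIN: {vpin[-1]:.3f}")
```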
Abstract:
The U.S. equity market has evolved rapidly over the last decade. It has become an open architecture in which participants with innovative technology can compete effectively. Several regulatory changes and technological innovations enabled profound changes in market structure. These changes, together with the development of high-speed networks, acted as a catalyst, giving rise to a new form of trading known as High-Frequency Trading (HFT). HFT firms emerged and appropriated, on a large scale, the market-making business of supplying liquidity. Although HFT has grown massively, over the last four years its profitability has fallen significantly as more firms joined the sector and margins narrowed. Against this backdrop, this thesis presents a brief review of HFT activity, followed by an analysis of the boundaries of the sector and of the characteristics of HFT's macro-environment. To that end, the thesis carried out an extensive review of the literature and of qualitative public documents such as newspapers, meeting minutes, and official reports. The thesis employed the analytical frameworks Barriers to Entry and Mobility (Porter, 1980), Models of Industry Evolution (McGahan, 2004), and the Structure of the Information-Intensive Sector (Sampler, 1998) to analyze the boundaries of the HFT sector. Additionally, it employed Models of Industry Evolution (McGahan, 2004) and PESTEL (Johnson, Scholes, and Whittington, 2011) to analyze the sector and the context surrounding the HFT business. The analysis concluded that the firms that use HFT to operate and compete in the equity market constitute an independent sector.
Abstract:
Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics.

The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, to a significant degree, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling.

The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison with another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management.

The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing, carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those of earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results aid risk management and market mechanism design.
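As a reference point for the second essay, here is a sketch of the standard ACD(1,1) baseline that the essay generalizes, fitted by exponential quasi-maximum likelihood; the duration data and starting values are simulated placeholders.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

def simulate_acd(omega, alpha, beta, n, rng):
    """Standard ACD(1,1): x_t = psi_t * eps_t with unit-exponential eps_t."""
    x, psi = np.empty(n), np.empty(n)
    psi[0] = omega / (1 - alpha - beta)
    x[0] = psi[0] * rng.exponential()
    for t in range(1, n):
        psi[t] = omega + alpha * x[t - 1] + beta * psi[t - 1]
        x[t] = psi[t] * rng.exponential()
    return x

durations = simulate_acd(0.1, 0.1, 0.8, 5000, rng)

def neg_loglik(params):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf
    psi = np.empty_like(durations)
    psi[0] = durations.mean()
    for t in range(1, len(durations)):
        psi[t] = omega + alpha * durations[t - 1] + beta * psi[t - 1]
    return np.sum(np.log(psi) + durations / psi)  # exponential QMLE criterion

res = minimize(neg_loglik, x0=[0.2, 0.05, 0.7], method="Nelder-Mead")
print("omega, alpha, beta:", np.round(res.x, 3))
```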
Abstract:
In this thesis, we develop bootstrap methods for high-frequency financial data. The first two essays focus on bootstrap methods applied to the pre-averaging approach that are robust to the presence of microstructure noise. Pre-averaging reduces the influence of microstructure effects before realized volatility is computed. Building on this approach to estimating integrated volatility in the presence of microstructure noise, we develop several bootstrap methods that preserve the dependence structure and the heterogeneity in the mean of the original data. The third essay develops a bootstrap method under the assumption of local Gaussianity of high-frequency financial data.

The first chapter is entitled "Bootstrap inference for pre-averaged realized volatility based on non-overlapping returns". In this chapter we propose bootstrap methods that are robust to the presence of microstructure noise. In particular, we focus on realized volatility based on the pre-averaged returns proposed by Podolskij and Vetter (2009), where pre-averaged returns are constructed over non-overlapping blocks of consecutive high-frequency returns. The non-overlapping blocks make the pre-averaged returns asymptotically independent but possibly heteroskedastic, which motivates the application of the wild bootstrap in this context. We establish the theoretical validity of the bootstrap for constructing percentile and percentile-t intervals. Monte Carlo simulations show that the bootstrap can improve the finite-sample properties of the integrated volatility estimator relative to the asymptotic results, provided the external variable is chosen appropriately. We illustrate these methods using real financial data.

The second chapter is entitled "Bootstrapping pre-averaged realized volatility under market microstructure noise". In this chapter we develop a block bootstrap method based on the pre-averaging approach of Jacod et al. (2009), where pre-averaged returns are constructed over overlapping blocks of consecutive high-frequency returns. The overlap induces strong dependence in the structure of pre-averaged returns: they are m-dependent with m growing more slowly than the sample size n, which motivates a specific block bootstrap. We show that the block bootstrap suggested by Bühlmann and Künsch (1995) is valid only when volatility is constant, owing to the heterogeneity in the mean of squared pre-averaged returns when volatility is stochastic. We therefore propose a new bootstrap procedure that combines the wild bootstrap and the block bootstrap, so that the serial dependence of pre-averaged returns is preserved within blocks and the homogeneity condition needed for bootstrap validity is satisfied. Under conditions on the block size, we show that this method is consistent. Monte Carlo simulations show that the bootstrap improves the finite-sample properties of the integrated volatility estimator relative to the asymptotic results. We illustrate this method using real financial data.

The third chapter is entitled "Bootstrapping realized covolatility measures under local Gaussianity assumption". In this chapter we show how, and to what extent, the distributions of estimators of covolatility measures can be approximated under the assumption of local Gaussianity of returns. In particular, we propose a new bootstrap method under these assumptions, focusing on realized volatility and the realized beta. We show that the new bootstrap method applied to the realized beta replicates the second-order cumulants, while it provides a third-order refinement when applied to realized volatility. These results improve on existing results in this literature, notably those of Gonçalves and Meddahi (2009) and Dovonon, Gonçalves and Meddahi (2013). Monte Carlo simulations show that the bootstrap improves the finite-sample properties of the integrated volatility estimator relative to the asymptotic results and to existing bootstrap results. We illustrate this method using real financial data.
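A minimal sketch of the first chapter's setting: a wild bootstrap applied to squared pre-averaged returns built on non-overlapping blocks. The noise level, block size, weight function, and Gaussian external variable are illustrative choices (the chapter analyzes how the external variable should be chosen), and the scaling constants and bias correction of the full estimator are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated noisy high-frequency prices: constant volatility plus i.i.d. noise.
n, kn = 23400, 30                         # ticks per day; pre-averaging block size
sigma = 0.2 / np.sqrt(n)
obs = np.cumsum(sigma * rng.standard_normal(n)) + 5e-4 * rng.standard_normal(n)

# Non-overlapping pre-averaged returns with weight g(x) = min(x, 1 - x).
r = np.diff(obs)
m = len(r) // kn
x = np.arange(1, kn + 1) / kn
g = np.minimum(x, 1 - x)
pre = (r[: m * kn].reshape(m, kn) * g).sum(axis=1)

stat = np.sum(pre**2)                     # un-normalized pre-averaged RV

# Wild bootstrap: rescale each pre-averaged return by an external variable.
B = 999
eta = rng.standard_normal((B, m))         # E[eta^2] = 1; one simple choice
boot = np.sum((pre * eta) ** 2, axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"statistic: {stat:.6e}; 95% bootstrap percentile band: [{lo:.6e}, {hi:.6e}]")
```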
Abstract:
Many recent papers have documented periodicities in returns, return volatility, bid-ask spreads and trading volume, in both equity and foreign exchange markets. We propose and employ a new test for detecting subtle periodicities in time series data based on a signal coherence function. The technique is applied to a set of seven half-hourly exchange rate series. Overall, we find the signal coherence to be maximal at the 8-hour and 12-hour frequencies. Retaining only the most coherent frequencies for each series, we implement a trading rule based on the observed periodicities. Our results demonstrate in all cases except one that, in gross terms, the rules can generate returns considerably greater than those of a buy-and-hold strategy, although they cannot retain their profitability net of transaction costs. We conjecture that this methodology could constitute an important tool for financial market researchers, enabling them to better detect, quantify, and rank the various periodic components in financial data.
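A sketch of the detection step, using a plain periodogram as a simple stand-in for the paper's signal coherence function; the synthetic half-hourly series embeds a 12-hour cycle that the peak frequency recovers.

```python
import numpy as np

rng = np.random.default_rng(6)
bars_per_day = 48                            # half-hourly sampling
t = np.arange(bars_per_day * 250)            # roughly one year of bars
# Embedded cycle with a 24-bar (12-hour) period, buried in noise.
returns = 1e-3 * np.sin(2 * np.pi * t / 24) + 5e-3 * rng.standard_normal(len(t))

power = np.abs(np.fft.rfft(returns - returns.mean())) ** 2
freqs = np.fft.rfftfreq(len(t), d=0.5)       # 0.5 hours per bar -> cycles/hour
top = power[1:].argmax() + 1                 # skip the zero-frequency bin
print(f"dominant period: {1 / freqs[top]:.1f} hours")
```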
Abstract:
This paper forecasts daily Sterling exchange rate returns using various naive, linear and non-linear univariate time-series models. The accuracy of the forecasts is evaluated using mean squared error and sign prediction criteria. These criteria show only a very modest improvement over forecasts generated by a random walk model. The Pesaran–Timmermann test and a comparison with artificially generated forecasts show that even the best models display no evidence of market timing ability.
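For reference, a sketch of the Pesaran–Timmermann (1992) sign test used to judge market timing ability, applied here to simulated placeholder forecasts.

```python
import numpy as np
from scipy.stats import norm

def pesaran_timmermann(actual, forecast):
    """PT statistic: is the sign-hit rate better than chance under independence?"""
    n = len(actual)
    py, px = np.mean(actual > 0), np.mean(forecast > 0)
    p_hat = np.mean((actual > 0) == (forecast > 0))
    p_star = py * px + (1 - py) * (1 - px)
    v_hat = p_star * (1 - p_star) / n
    v_star = ((2 * py - 1) ** 2 * px * (1 - px)
              + (2 * px - 1) ** 2 * py * (1 - py)
              + 4 * py * px * (1 - py) * (1 - px) / n) / n
    stat = (p_hat - p_star) / np.sqrt(v_hat - v_star)
    return stat, 1 - norm.cdf(stat)          # one-sided p-value

rng = np.random.default_rng(7)
r = rng.normal(0, 0.005, 1000)               # daily returns (stand-in)
f = 0.1 * r + rng.normal(0, 0.005, 1000)     # weakly informative forecasts
stat, pval = pesaran_timmermann(r, f)
print(f"PT statistic: {stat:.2f}, p-value: {pval:.3f}")
```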
Abstract:
Doctor-patient jokes are universally popular because of the information asymmetries within the diagnostic relationship. We contend that entrepreneurial diagnosis is present in markets where consumers are unable to diagnose their own problems and, instead, may rely on the entrepreneur to diagnose them. Entrepreneurial diagnosis is a cognitive skill possessed by the entrepreneur; it is an identifiable subset of entrepreneurial judgment and can be modeled, which we attempt to do here. To overcome the information asymmetries and exploit opportunities, we suggest that entrepreneurs must invest in market-making innovations (as distinct from product innovations) such as trustworthy reputations. The diagnostic entrepreneur described in this paper represents a creative response to difficult diagnostic problems and helps to explain the success of many firms whose products are not particularly innovative but which are perceived as offering high standards of service. These firms are trusted not only for their truthfulness about the quality of their product, but also for their honesty, confidentiality and understanding in helping customers identify the product most appropriate to their needs.
Abstract:
This paper develops a framework to test whether discrete-valued, irregularly-spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric-information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed: the Markov assumption is rejected for three of the five stocks analyzed.
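A regression-based stand-in for the intuition behind the test (the paper's procedure is nonparametric): under the subordinated-Markov null, the lagged duration should have no explanatory power for the current duration once the current price realization is controlled for. The data-generating process below is illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 2000
price = np.cumsum(rng.choice([-1, 1], n)) * 0.01 + 50   # discrete price path
# Under the null, durations depend on the current price only.
dur = rng.exponential(1 + 0.1 * np.abs(price - 50))

# Regress current duration on the price state and the lagged duration;
# the lagged-duration t-statistic should be near zero under the null.
X = sm.add_constant(np.column_stack([np.abs(price[1:] - 50), dur[:-1]]))
fit = sm.OLS(dur[1:], X).fit()
print(f"t-stat on lagged duration: {fit.tvalues[2]:.2f}")
```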
Abstract:
This work focuses on applying the HEAVY model of daily volatility to financial data from the Brazilian market. Closely related to GARCH, the model harnesses high-frequency data to achieve its objectives. Four variations of it were implemented and their fit compared with GARCH equivalents using metrics from the literature. The results suggest that, in this market, HEAVY does specify daily volatility better, but does not necessarily produce better predictions of it, which is normally the ultimate goal. The dataset consists of intraday trades of U.S. Dollar and Ibovespa futures contracts from BM&FBovespa.
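For orientation, a minimal sketch of the HEAVY(1,1) conditional-variance recursion, which replaces the lagged squared daily return in GARCH with a lagged realized measure built from intraday data; the inputs and parameter values are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(9)
T = 500
rm = rng.gamma(shape=2.0, scale=5e-5, size=T)   # daily realized measures (stand-in)
omega, alpha, beta = 1e-6, 0.35, 0.60           # illustrative parameters

# HEAVY recursion: h_t = omega + alpha * RM_{t-1} + beta * h_{t-1};
# GARCH would use the lagged squared daily return where RM appears.
h = np.empty(T)
h[0] = rm.mean()
for t in range(1, T):
    h[t] = omega + alpha * rm[t - 1] + beta * h[t - 1]

print(f"last conditional variance: {h[-1]:.2e}")
```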
Abstract:
This dissertation presents the theory and the work that led to the construction of a high-voltage, high-frequency arbitrary-waveform voltage generator. The generator has been specifically designed to supply power to a wide range of plasma actuators. The system has been completely designed, manufactured and tested at the Department of Electrical, Electronic and Information Engineering of the University of Bologna. The generator is based on the single-phase cascaded H-bridge multilevel topology and comprises 24 elementary units connected in series to form the typical staircase output voltage waveform of a multilevel converter. The generator can produce a total of 49 voltage levels. Each level is 600 V, making the peak-to-peak output voltage 28.8 kV. The large number of levels provides high output-voltage resolution and thus the ability to generate arbitrary waveforms. The maximum operating frequency is 20 kHz. A study of the relevant literature shows that this is the first time a cascaded multilevel converter of such dimensions has been constructed. Isolation and control challenges had to be solved to realize the system. The biggest problem with current power-supply technology for plasma actuators is load matching. Resonant converters, the most widely used power supplies, are seriously affected by this problem. The manufactured generator completely solves this issue, providing a consistent voltage output independent of the connected load. This is very important when running tests and comparing results, because all measurements should be comparable and not depend on matching issues. The use of a multilevel converter to power a plasma actuator is a real technological breakthrough that has provided, and will continue to provide, very significant experimental results.
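A small numerical sketch of the quantization described above: 24 series-connected cells, each contributing -600 V, 0 V, or +600 V, approximate a target waveform with a 49-level staircase. The 1 kHz sine target is an illustrative choice.

```python
import numpy as np

cells, step = 24, 600.0                          # H-bridge count; volts per level
t = np.linspace(0, 1e-3, 1000)                   # one period at 1 kHz
target = 14_400 * np.sin(2 * np.pi * 1e3 * t)    # peak = 24 * 600 V

# Round the target to the nearest reachable level: -24..+24 steps = 49 levels.
levels = np.clip(np.round(target / step), -cells, cells)
staircase = levels * step
print(f"levels used: {len(np.unique(levels))}, "
      f"max error: {np.max(np.abs(staircase - target)):.0f} V")
```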
Abstract:
This thesis describes an industrial research project carried out in collaboration with STC Components, Harlow, Essex. Technical and market trends in the use of surface acoustic wave (SAW) devices are reviewed. As a result, three areas not previously addressed by STC were identified: lower insertion loss designs, higher operating frequencies and improved temperature-dependent stability. A review of the temperature performance of alternative lower insertion loss designs shows that greater use could be made of the on-site quartz growing plant. Data is presented for quartz cuts in the ST-AT range, and this data is used to modify the temperature performance of a SAW filter. Several recently identified quartz orientations have been tested: SST, LST and X33. Problems associated with each cut are described and devices demonstrated. LST quartz, although sensitive to accuracy of cut, is shown to have an improved temperature coefficient over the normal ST orientation. Results show that its use is restricted by insertion loss variations with temperature. Effects associated with split-finger transducers on LST quartz are described. Two low-loss options are studied: coupled resonator filters for very narrow bandwidth applications and single phase unidirectional transducers (SPUDT) for fractional bandwidths up to about 1%. Both designs can be implemented with quarter-wavelength transducer geometries at operating frequencies up to 1 GHz. The SPUDT design utilised an existing impulse response model to provide analysis of ladder or rung transducers. A coupled resonator filter at 400 MHz is demonstrated with a matched insertion loss of less than 3.5 dB and a bandwidth of 0.05%. A SPUDT device is designed as a re-timing filter for timing extraction in a long-haul PCM transmission system. Filters operating at 565 MHz are demonstrated with insertion losses of less than 6 dB. The basic SPUDT design is extended to a maximally distributed version and demonstrated at 450 MHz with 9.8 dB insertion loss.
Abstract:
Data from the World Federation of Exchanges show that Brazil's Sao Paulo stock exchange is one of the largest worldwide in terms of market value. The objective of this study is to obtain univariate and bivariate forecasting models based on intraday data from the futures and spot markets of the BOVESPA index, in order to verify whether arbitrage opportunities exist in the Brazilian financial market. To this end, three econometric forecasting models were built: ARFIMA, vector autoregressive (VAR), and vector error correction (VEC). Furthermore, we present the results of a Granger causality test for these series. Such a study is important both for identifying arbitrage opportunities in financial markets and, in particular, for applying these models to data of this nature. Among the forecasts produced by these models, the VEC model showed the best results. The causality test shows that the BOVESPA futures index Granger-causes the spot BOVESPA index. This result may indicate arbitrage opportunities in Brazil.
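A sketch of the futures/spot exercise on simulated cointegrated series, using statsmodels for the VECM fit and the Granger causality test; the data-generating process and lag orders are illustrative, not the study's specification.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(10)
n = 1000
common = np.cumsum(rng.normal(0, 1, n))             # shared stochastic trend
futures = common + rng.normal(0, 0.5, n)
spot = np.roll(futures, 2) + rng.normal(0, 0.5, n)  # spot lags the futures
data = np.column_stack([spot, futures])[5:]         # drop the wrapped-around start

# Fit a VECM with one cointegrating relation and inspect the loadings.
vecm = VECM(data, k_ar_diff=2, coint_rank=1).fit()
print("error-correction loadings:", vecm.alpha.ravel().round(3))

# Does the futures series (column 2) Granger-cause the spot series (column 1)?
grangercausalitytests(data, maxlag=3)
```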