115 results for Intraday volatility
Abstract:
Abstract: We scrutinize the realized stock-bond correlation based upon high-frequency returns. We use quantile regressions to pin down the systematic variation of the extreme tails with respect to their economic determinants. The correlation dependence behaves differently when the correlation is strongly negative than when it is strongly positive. The important explanatory variables at the extreme low quantile are the short rate, the yield spread, and the volatility index. At the extreme high quantile, bond market liquidity is also important. The empirical findings are only partially robust to using less precise measures of the stock-bond correlation. The results are not caused by the recent financial crisis. Keywords: Extreme returns; Financial crisis; Realized stock-bond correlation; Quantile regressions; VIX. JEL Classifications: C22; G01; G11; G12
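The tail-specific regressions described above can be sketched with quantile regression on simulated data. This is a minimal illustration, not the paper's estimation: the variable names (short rate, yield spread, VIX) follow the abstract, but the data-generating process and coefficients are invented assumptions.

```python
# Sketch: quantile regression of a simulated realized stock-bond correlation
# on hypothetical determinants (short rate, yield spread, VIX).
# All data and coefficients are illustrative, not the paper's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "short_rate": rng.normal(2.0, 1.0, n),
    "yield_spread": rng.normal(1.5, 0.5, n),
    "vix": rng.normal(20.0, 5.0, n),
})
# tanh keeps the simulated correlation in (-1, 1)
df["corr"] = np.tanh(-0.05 * df["vix"] + 0.1 * df["yield_spread"]
                     + 0.05 * df["short_rate"] + rng.normal(0, 0.5, n))

# Fit the extreme low, median, and extreme high quantiles separately,
# so each tail gets its own coefficient vector
for q in (0.05, 0.50, 0.95):
    res = smf.quantreg("corr ~ short_rate + yield_spread + vix", df).fit(q=q)
    print(f"q={q}: vix coefficient {res.params['vix']:.4f}")
```

Comparing the fitted coefficients across quantiles is what lets the dependence differ between the strongly negative and strongly positive correlation regimes.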
Abstract:
Abstract: We analyze the realized stock-bond correlation. Gradual transitions between negative and positive stock-bond correlation are accommodated by the smooth transition regression (STR) model. The changes in regime are defined by economic and financial transition variables. Both in-sample and out-of-sample results document that STR models with multiple transition variables outperform STR models with a single transition variable. The most important transition variables are the short rate, the yield spread, and the VIX volatility index. Keywords: realized correlation; smooth transition regressions; stock-bond correlation; VIX index JEL Classifications: C22; G11; G12; G17
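The regime mechanism of an STR model can be sketched in a few lines: a logistic transition function maps a transition variable into a weight in [0, 1] that blends two regime-specific linear predictions. The parameter values below (using VIX as the transition variable, switching around 25) are illustrative assumptions, not estimates from the paper.

```python
# Sketch of the logistic transition function at the core of an STR model.
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s) in (0, 1); gamma controls smoothness, c the regime threshold."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def str_predict(x, s, beta_low, beta_high, gamma, c):
    """Blend two linear regimes with weight G(s) on the high regime."""
    g = logistic_transition(s, gamma, c)
    return (1 - g) * (x @ beta_low) + g * (x @ beta_high)

# Example: VIX as transition variable, switching smoothly around c = 25
s = np.array([10.0, 25.0, 40.0])
print(logistic_transition(s, gamma=0.5, c=25.0))
```

With multiple transition variables, as in the abstract, each variable gets its own transition function and the regimes are combined accordingly.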
Abstract:
The Stability and Growth Pact (SGP) was established to govern discretionary fiscal policy in the European Monetary Union. This article studies the effects created when there is uncertainty about the members’ commitment to respecting the established deficit limits in the SGP. We will show that, even if countries respect the SGP deficit ceiling, the presence of uncertainty about their compliance will bring about higher volatility in key economic variables, which could, in turn, affect unemployment and growth negatively. This finding shows that it is important to reduce uncertainty about the members’ commitment towards the SGP. Keywords: fiscal policy rules, monetary union, Stability and Growth Pact, uncertainty, commitment. JEL No.: E63, F55, H62, H87
Abstract:
One of the major problems when using non-dedicated volunteer resources in a distributed network is the high volatility of these hosts, since they can go offline or become unavailable at any time without control. Furthermore, the use of volunteer resources implies some security issues, because they are generally anonymous entities about which we know nothing. So, how can we trust someone we do not know? Over the last years, an important number of reputation-based trust solutions have been designed to evaluate participants' behavior in a system. However, most of these solutions are addressed to P2P and ad-hoc mobile networks and may not fit well with other kinds of distributed systems that could take advantage of volunteer resources, such as recent cloud computing infrastructures. In this paper we propose a first approach to the design of an anonymous reputation mechanism for CoDeS [1], a middleware for building fogs where services are deployed using volunteer resources. The participants are reputation clients (RC), a reputation authority (RA) and a certification authority (CA). Users need a valid public key certificate from the CA to register with the RA and obtain the data needed to participate in the system, namely an opaque identifier that we call here a pseudonym and an initial reputation value that users provide to other users when interacting with each other. The mechanism prevents not only the manipulation of the provided reputation values but also any disclosure of the users' identities to other users or authorities, so anonymity is guaranteed.
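The registration flow described above can be sketched as a toy: the RA accepts a CA-issued certificate and hands back an opaque pseudonym plus an initial reputation value. This is only an illustration under stated assumptions; the class names, the salted-hash pseudonym scheme, and the initial reputation of 0.5 are all invented here, not the CoDeS design.

```python
# Toy sketch of RA-side registration: pseudonym issuance plus an initial
# reputation value. The hashing scheme is an illustrative assumption.
import hashlib
import secrets

class ReputationAuthority:
    def __init__(self, initial_reputation=0.5):
        self.initial_reputation = initial_reputation
        self.registry = {}  # pseudonym -> reputation value

    def register(self, certificate: bytes):
        # Opaque pseudonym: hash of the certificate plus a random salt, so
        # the pseudonym cannot be linked back to the user's real identity.
        salt = secrets.token_bytes(16)
        pseudonym = hashlib.sha256(certificate + salt).hexdigest()
        self.registry[pseudonym] = self.initial_reputation
        return pseudonym, self.initial_reputation

ra = ReputationAuthority()
pseudonym, rep = ra.register(b"certificate-issued-by-CA")
print(len(pseudonym), rep)  # 64 0.5
```

The random salt means even re-registering the same certificate yields an unlinkable pseudonym, which is the unlinkability property the mechanism aims for.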
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal generated by them passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but still lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt to the given market conditions. We find that the parameters from technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
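The Ornstein-Uhlenbeck calibration step at the heart of pairs trading can be sketched as follows (in Python rather than the thesis's MATLAB). One common route, which this sketch illustrates on simulated data, is to fit the exact AR(1) discretization of the OU process by least squares and invert it for the mean-reversion speed and long-run level; all parameter values are illustrative.

```python
# Sketch: calibrate an Ornstein-Uhlenbeck spread dX = theta*(mu - X)dt + sigma*dW
# from its exact AR(1) discretization X[t] = mu + a*(X[t-1] - mu) + eps,
# with a = exp(-theta*dt). Data are simulated; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
theta, mu, sigma, dt, n = 5.0, 0.0, 0.3, 1 / 252, 5000

# Simulate the OU spread with the exact discretization
a = np.exp(-theta * dt)
noise_sd = sigma * np.sqrt((1 - a**2) / (2 * theta))
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + a * (x[t - 1] - mu) + noise_sd * rng.standard_normal()

# Recover parameters from the AR(1) regression x[t] = c + b * x[t-1] + eps
b, c = np.polyfit(x[:-1], x[1:], 1)  # slope b, intercept c
theta_hat = -np.log(b) / dt          # mean-reversion speed
mu_hat = c / (1 - b)                 # long-run level
print(round(theta_hat, 1), round(mu_hat, 3))
```

Re-running this fit on a rolling window is one way to track the parameter drift that the thesis emphasizes.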
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
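The belief-revision step referred to above can be illustrated with constant-gain recursive least squares (RLS), the standard updating rule in the adaptive-learning literature. This is a generic sketch on simulated data, not the paper's model: the regression, gain value, and true coefficients are all assumptions made here.

```python
# Sketch: constant-gain recursive least squares belief updating.
# Agents regress an observed outcome y on regressors x and revise the
# coefficient beliefs (beta) and second-moment estimate (R) each period.
import numpy as np

def rls_step(beta, R, x, y, gain):
    """One constant-gain RLS update of beliefs beta and moment matrix R."""
    R = R + gain * (np.outer(x, x) - R)
    beta = beta + gain * np.linalg.solve(R, x) * (y - x @ beta)
    return beta, R

rng = np.random.default_rng(1)
true_beta = np.array([0.5, -0.2])   # unknown to the learning agent
beta, R = np.zeros(2), np.eye(2)
for _ in range(2000):
    x = np.array([1.0, rng.normal()])
    y = x @ true_beta + 0.1 * rng.normal()
    beta, R = rls_step(beta, R, x, y, gain=0.02)
print(beta.round(2))
```

Because the gain stays constant instead of decreasing, beliefs keep fluctuating around the truth rather than converging exactly, which is one channel through which learning adds volatility and persistence.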
Abstract:
In this paper, we study the determinants of political myopia in a rational model of electoral accountability where the key elements are informational frictions and uncertainty. We build a framework where political ability is ex-ante unknown and policy choices are not perfectly observable. On the one hand, elections improve accountability and make it possible to keep well-performing incumbents. On the other, politicians invest too little in costly policies with future returns in an attempt to signal high ability and increase their reelection probability. Contrary to the conventional wisdom, uncertainty reduces political myopia and may, under some conditions, increase social welfare. We use the model to study how political rewards can be set so as to maximise social welfare, and the desirability of imposing a one-term limit on governments. The predictions of our theory are consistent with a number of stylised facts and with a new empirical observation documented in this paper: aggregate uncertainty, measured by economic volatility, is associated with better fiscal discipline in a panel of 20 OECD countries.
Abstract:
This paper analyzes the implications of pre-trade transparency for market performance. We find that transparency increases the precision of the information held by agents; however, we show that this increase in precision may not be due to prices themselves. In competitive markets, transparency increases market liquidity and reduces price volatility, whereas these results may not hold under imperfect competition. More importantly, market depth and volatility might be positively related under appropriate priors. Moreover, we study the incentives for liquidity traders to engage in sunshine trading. We find that a noise trader's choice between sunshine and dark trading is independent of his order size, with the traders with higher liquidity needs being more interested in sunshine trading, as long as this practice is desirable. Key words: Market Microstructure, Transparency, Prior Information, Market Quality, Sunshine Trading
Abstract:
This paper analyzes how ownership concentration and managerial incentives influence bank risk for a large sample of US banks over the period 1997-2007. Using 2SLS simultaneous equations models, we show that ownership concentration has a positive total effect on bank risk. This is the result of a positive direct effect, which reflects monitoring and opportunistic behavior, and a negative indirect effect, which works through the design of managerial incentive contracts and reflects shareholder preferences toward risk. Large shareholders reduce bank risk by reducing the sensitivity of CEO wealth to stock volatility (Vega) and by increasing CEO pay-performance sensitivity (Delta). In addition, we show that the direct and indirect effects of ownership concentration on bank risk depend on the type of the largest shareholder (a family, a bank, a corporation or an institutional investor), as well as on the total shareholding held by each type as a group. Our results suggest that the positive relation between ownership concentration and risk is not the result of preferences toward more risk. Rather, they point to opportunistic behavior by large shareholders.
Abstract:
In this paper we investigate the goodness of fit of Kirk's approximation formula for spread option prices in the correlated lognormal framework. To this end, we use Malliavin calculus techniques to find an expression for the short-time implied volatility skew of options with random strikes. In particular, we find that this skew is very pronounced in the case of spread options with extremely high correlations, which cannot be reproduced by a constant volatility approximation as in Kirk's formula. This fact agrees with the empirical evidence. Numerical examples are given.
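For reference, Kirk's approximation itself prices a European spread call on two lognormal forwards F1 and F2 with strike K by treating F1/(F2+K) as approximately lognormal with a constant effective volatility. The sketch below implements that standard formula; the input values are illustrative and are not taken from the paper's numerical examples.

```python
# Sketch of Kirk's approximation for a European spread call max(S1 - S2 - K, 0)
# on two correlated lognormal forwards. Inputs are illustrative.
import math
from statistics import NormalDist

def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, r, T):
    N = NormalDist().cdf
    w = F2 / (F2 + K)
    # Constant effective volatility of the ratio F1 / (F2 + K)
    sigma_k = math.sqrt(sigma1**2 - 2 * rho * sigma1 * sigma2 * w
                        + (sigma2 * w) ** 2)
    d1 = (math.log(F1 / (F2 + K)) + 0.5 * sigma_k**2 * T) / (sigma_k * math.sqrt(T))
    d2 = d1 - sigma_k * math.sqrt(T)
    return math.exp(-r * T) * (F1 * N(d1) - (F2 + K) * N(d2))

price = kirk_spread_call(F1=110, F2=100, K=5, sigma1=0.3, sigma2=0.25,
                         rho=0.9, r=0.02, T=1.0)
print(round(price, 2))
```

Note that sigma_k does not depend on the strike beyond the weight w, which is why this constant-volatility approximation cannot generate the pronounced skew the paper documents at very high correlations.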
Abstract:
We analyze how unemployment, job finding and job separation rates react to neutral and investment-specific technology shocks. Neutral shocks increase unemployment and explain a substantial portion of unemployment volatility; investment-specific shocks expand employment and hours worked and mostly contribute to hours worked volatility. Movements in the job separation rate are responsible for the impact response of unemployment, while the job finding rate accounts for movements along its adjustment path. Our evidence qualifies the conclusions of Hall (2005) and Shimer (2007) and warns against using search models with exogenous separation rates to analyze the effects of technology shocks.
Abstract:
This paper studies the apparent contradiction between two strands of the literature on the effects of financial intermediation on economic activity. On the one hand, the empirical growth literature finds a positive effect of financial depth as measured by, for instance, private domestic credit and liquid liabilities (e.g., Levine, Loayza, and Beck 2000). On the other hand, the banking and currency crisis literature finds that monetary aggregates, such as domestic credit, are among the best predictors of crises and their related economic downturns (e.g., Kaminsky and Reinhart 1999). The paper accounts for these contrasting effects based on the distinction between the short- and long-run impacts of financial intermediation. Working with a panel of cross-country and time-series observations, the paper estimates an encompassing model of short- and long-run effects using the Pooled Mean Group estimator developed by Pesaran, Shin, and Smith (1999). The conclusion from this analysis is that a positive long-run relationship between financial intermediation and output growth coexists with a, mostly, negative short-run relationship. The paper further develops an explanation for these contrasting effects by relating them to recent theoretical models, by linking the estimated short-run effects to measures of financial fragility (namely, banking crises and financial volatility), and by jointly analyzing the effects of financial depth and fragility in classic panel growth regressions.
Abstract:
We analyze the labor market effects of neutral and investment-specific technology shocks along the intensive margin (hours worked) and the extensive margin (unemployment). We characterize the dynamic response of unemployment in terms of the job separation and the job finding rate. Labor market adjustments occur along the extensive margin in response to neutral shocks, and along the intensive margin in response to investment-specific shocks. The job separation rate accounts for a major portion of the impact response of unemployment. Neutral shocks prompt a contemporaneous increase in unemployment because of a sharp rise in the separation rate. This increase is prolonged by a persistent fall in the job finding rate. Investment-specific shocks raise employment and hours worked. Neutral shocks explain a substantial portion of the volatility of unemployment and output; investment-specific shocks mainly explain hours worked volatility. This suggests that neutral progress is consistent with Schumpeterian creative destruction, while investment-specific progress operates as in a neoclassical growth model.
Abstract:
We study the quantitative properties of a dynamic general equilibrium model in which agents face both idiosyncratic and aggregate income risk and state-dependent borrowing constraints that bind in some but not all periods, and markets are incomplete. Optimal individual consumption-savings plans and equilibrium asset prices are computed under various assumptions about income uncertainty. Then we investigate whether our general equilibrium model with incomplete markets replicates two empirical observations: the high correlation between individual consumption and individual income, and the equity premium puzzle. We find that, when the driving processes are calibrated according to the data from wage income in different sectors of the US economy, the results move in the direction of explaining these observations, but the model falls short of explaining the observed correlations quantitatively. If the incomes of agents are assumed independent of each other, the observations can be explained quantitatively.
Abstract:
Many empirical studies of business cycles have followed the practice of applying the Hodrick-Prescott filter for cross-country comparisons. The standard procedure is to set the weight \lambda, which determines the 'smoothness' of the trend, equal to 1600. We show that if this value is used for all countries, the results can go against common wisdom about business cycles. As an example, we show that the long recession that occurred in Spain between 1975 and 1985 goes unnoticed by the HP filter. We propose a method for adjusting \lambda by reinterpreting the HP filter as the solution to a constrained minimization problem. We argue that the common practice of fixing \lambda across countries amounts to changing the constraints on trend variability across countries. Our proposed method is easy to apply, retains all the virtues of the standard HP filter and, when applied to Spanish data, yields results in line with economic historians' views. Applying the method to a number of OECD countries we find that, with the exception of Spain, Italy and Japan, the standard choice of \lambda=1600 is sensible.
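The role of \lambda can be made concrete with a direct implementation of the HP filter: the trend \tau minimizes \sum_t (y_t - \tau_t)^2 + \lambda \sum_t (\Delta^2 \tau_t)^2, whose first-order condition gives \tau = (I + \lambda D'D)^{-1} y with D the second-difference matrix. The series below is simulated purely to illustrate how a larger \lambda forces a smoother trend.

```python
# Minimal HP filter: the trend solves (I + lambda * D'D) tau = y,
# where D is the (n-2) x n second-difference matrix.
import numpy as np

def hp_trend(y, lam):
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # rows apply a second difference
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# Illustration: a larger lambda penalizes trend curvature more heavily,
# so the fitted trend has smaller second differences (it is smoother)
t = np.arange(200)
y = 0.01 * t + np.sin(t / 8.0)
smooth = hp_trend(y, lam=1600)
rough = hp_trend(y, lam=10)
print(np.var(np.diff(smooth, 2)) < np.var(np.diff(rough, 2)))  # True
```

Seen this way, fixing \lambda = 1600 for every country fixes the same curvature penalty everywhere, which is exactly the cross-country constraint on trend variability the paper questions.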