16 results for volatility spillovers

at Duke University


Relevance: 20.00%

Abstract:

We develop general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit recent nonparametric asymptotic distributional results, are both easy to implement and highly accurate in empirically realistic situations. We also illustrate that properly accounting for the measurement errors in the volatility forecast evaluations reported in the existing literature can result in markedly higher estimates for the true degree of return volatility predictability.
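As a rough illustration of the measurement-error point, assuming numpy and an invented data-generating process (all parameter values are illustrative), the sketch below regresses a noisy realized-variance proxy on a perfect variance forecast, Mincer-Zarnowitz style; the R² falls short of one even though the forecast is exact:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, m = 500, 78  # hypothetical sample: 500 days of 5-minute returns

# Latent daily integrated variance following a persistent log-AR(1)
log_iv = np.zeros(n_days)
for t in range(1, n_days):
    log_iv[t] = 0.95 * log_iv[t - 1] + 0.2 * rng.standard_normal()
iv = np.exp(log_iv - 2.0)

# Realized variance: sum of squared intraday returns, a noisy proxy for iv
r = rng.standard_normal((n_days, m)) * np.sqrt(iv[:, None] / m)
rv = (r ** 2).sum(axis=1)

# Mincer-Zarnowitz regression of the proxy on a (here infeasible) perfect forecast
X = np.column_stack([np.ones(n_days), iv])
beta, *_ = np.linalg.lstsq(X, rv, rcond=None)
r2 = 1.0 - (rv - X @ beta).var() / rv.var()
print(f"MZ R^2 against the noisy RV proxy: {r2:.3f} (below 1 despite a perfect forecast)")
```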

Relevance: 20.00%

Abstract:

This article examines the behavior of equity trading volume and volatility for the individual firms composing the Standard & Poor's 100 composite index. Using multivariate spectral methods, we find that fractionally integrated processes best describe the long-run temporal dependencies in both series. Consistent with a stylized mixture-of-distributions hypothesis model in which the aggregate "news"-arrival process possesses long-memory characteristics, the long-run hyperbolic decay rates appear to be common across each volume-volatility pair.
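The fractional-integration order d referred to here is commonly estimated with the Geweke-Porter-Hudak log-periodogram regression. A minimal numpy sketch, with the bandwidth choice and the input series purely illustrative:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the fractional integration order d."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)  # a common rule-of-thumb bandwidth
    # Periodogram at the first m Fourier frequencies
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    # Slope of log I(lam) on -2*log(2*sin(lam/2)) estimates d
    reg = -2.0 * np.log(2.0 * np.sin(lam / 2.0))
    reg = reg - reg.mean()
    return (reg @ (np.log(I) - np.log(I).mean())) / (reg @ reg)

# White noise has d = 0, so the estimate should be near zero
rng = np.random.default_rng(1)
print(f"d_hat for white noise: {gph_estimate(np.abs(rng.standard_normal(4096))):.3f}")
```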

Relevance: 20.00%

Abstract:

We exploit the distributional information contained in high-frequency intraday data in constructing a simple conditional moment estimator for stochastic volatility diffusions. The estimator is based on the analytical solutions of the first two conditional moments for the latent integrated volatility, the realization of which is effectively approximated by the sum of the squared high-frequency increments of the process. Our simulation evidence indicates that the resulting GMM estimator is highly reliable and accurate. Our empirical implementation based on high-frequency five-minute foreign exchange returns suggests the presence of multiple latent stochastic volatility factors and possible jumps. © 2002 Elsevier Science B.V. All rights reserved.
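A stylized sketch of the two ingredients described here, assuming numpy and scipy: realized variance computed as the sum of squared high-frequency increments, and a method-of-moments fit of a one-factor square-root variance model. The moment conditions below are rough textbook approximations, not the paper's exact analytical conditional moments:

```python
import numpy as np
from scipy.optimize import minimize

def realized_variance(prices):
    """Sum of squared log-price increments within each day (rows = days)."""
    r = np.diff(np.log(prices), axis=1)
    return (r ** 2).sum(axis=1)

# Crude moment conditions for dV = kappa*(theta - V)dt + sigma*sqrt(V)dW:
# E[RV] ~ theta and corr(RV_t, RV_{t-1}) ~ exp(-kappa) (an approximation).
def gmm_objective(params, rv):
    kappa, theta = params
    g = np.array([rv.mean() - theta,
                  np.corrcoef(rv[1:], rv[:-1])[0, 1] - np.exp(-kappa)])
    return g @ g  # identity weighting, for this sketch only

rng = np.random.default_rng(2)
prices = np.exp(np.cumsum(0.001 * rng.standard_normal((500, 288)), axis=1))
rv = realized_variance(prices)
print(minimize(gmm_objective, x0=[1.0, rv.mean()], args=(rv,),
               method='Nelder-Mead').x)
```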

Relevance: 20.00%

Abstract:

Recent empirical findings suggest that the long-run dependence in U.S. stock market volatility is best described by a slowly mean-reverting fractionally integrated process. The present study complements this existing time-series-based evidence by comparing the risk-neutralized option pricing distributions from various ARCH-type formulations. Utilizing a panel data set consisting of newly created exchange-traded long-term equity anticipation securities, or LEAPS, on the Standard & Poor's 500 stock market index with maturities ranging up to three years, we find that the degree of mean reversion in the volatility process implicit in these prices is best described by a Fractionally Integrated EGARCH (FIEGARCH) model. © 1999 Elsevier Science S.A. All rights reserved.
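For readers who want to experiment, the short-memory EGARCH counterpart can be fitted with the Python `arch` package (which, to our knowledge, does not ship a FIEGARCH variant, so the fractional-integration feature is precisely the piece this sketch omits); the return series is a random placeholder:

```python
import numpy as np
from arch import arch_model  # assumes the `arch` package is installed

rng = np.random.default_rng(3)
returns = rng.standard_normal(2000)  # placeholder for S&P 500 percentage returns

# EGARCH(1,1) with an asymmetry (o) term; FIEGARCH would add fractional
# integration in the log-variance dynamics on top of this.
am = arch_model(returns, mean='Constant', vol='EGARCH', p=1, o=1, q=1)
res = am.fit(disp='off')
print(res.params)
```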

Relevance: 20.00%

Abstract:

This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique is the generalization to a nonlinear process of Sims-style impulse response analysis for linear models. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.
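The flavor of the technique can be reproduced by simulation: shock the conditioning state, propagate shocked and baseline paths forward with common innovations, and average the difference. The nonlinear conditional-mean step below is invented, standing in for the paper's nonparametric conditional density estimate:

```python
import numpy as np

def impulse_response(step, x0, shock, horizon, n_paths, rng):
    """Mean difference between shocked and baseline simulated paths."""
    base = np.full(n_paths, float(x0))
    hit = np.full(n_paths, float(x0) + shock)
    irf = np.zeros(horizon)
    for h in range(horizon):
        eps = rng.standard_normal(n_paths)
        base = step(base, eps)
        hit = step(hit, eps)  # common shocks isolate the impulse
        irf[h] = (hit - base).mean()
    return irf

step = lambda x, e: 0.9 * x - 0.2 * np.tanh(x) + e  # illustrative nonlinearity
rng = np.random.default_rng(4)
print(np.round(impulse_response(step, x0=0.0, shock=1.0,
                                horizon=10, n_paths=5000, rng=rng), 3))
```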

Relevance: 20.00%

Abstract:

Consistent with the implications from a simple asymmetric information model for the bid-ask spread, we present empirical evidence that the size of the bid-ask spread in the foreign exchange market is positively related to the underlying exchange rate uncertainty. The estimation results are based on an ordered probit analysis that captures the discreteness in the spread distribution, with the uncertainty of the spot exchange rate being quantified through a GARCH type model. The data set consists of more than 300,000 continuously recorded Deutschemark/dollar quotes over the period from April 1989 to June 1989. © 1994.
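An ordered probit of this kind, with a volatility regressor driving discrete spread categories, can be written down directly. Everything below is simulated for illustration, including the "volatility" series standing in for a GARCH fit:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def ordered_probit_nll(params, y, x):
    """Negative log-likelihood of an ordered probit with one regressor."""
    beta = params[0]
    cuts = np.concatenate([[-np.inf], np.sort(params[1:]), [np.inf]])
    p = norm.cdf(cuts[y + 1] - beta * x) - norm.cdf(cuts[y] - beta * x)
    return -np.log(np.clip(p, 1e-12, None)).sum()

rng = np.random.default_rng(5)
vol = np.abs(rng.standard_normal(5000))       # placeholder for GARCH-fitted sigma_t
latent = 0.8 * vol + rng.standard_normal(5000)
y = np.digitize(latent, [0.0, 0.8, 1.6])      # four discrete spread categories

fit = minimize(ordered_probit_nll, x0=[0.1, -0.5, 0.5, 1.5],
               args=(y, vol), method='Nelder-Mead')
print("beta and cutpoints:", np.round(fit.x, 2))
```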

Relevance: 10.00%

Abstract:

Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
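The score-matching estimator described here is in the spirit of the efficient method of moments: fit an auxiliary model to the data, then choose structural parameters so that data simulated from the structural model drives the auxiliary score to zero. A compact sketch with an invented structural model (AR(1) with Student-t shocks) and a Gaussian AR(1) auxiliary model:

```python
import numpy as np
from scipy.optimize import minimize

def aux_fit(x):
    """Fit a Gaussian AR(1) auxiliary model by OLS; return (phi, sigma2)."""
    phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
    return phi, np.mean((x[1:] - phi * x[:-1]) ** 2)

def aux_score(x, phi, sigma2):
    """Mean score of the auxiliary log-density evaluated on series x."""
    e = x[1:] - phi * x[:-1]
    return np.array([np.mean(e * x[:-1]) / sigma2,
                     np.mean(e ** 2 / (2 * sigma2 ** 2) - 1 / (2 * sigma2))])

def simulate(theta, shocks):
    """Hypothetical structural model: AR(1) with Student-t shocks."""
    rho, scale = theta
    x = np.zeros(len(shocks))
    for t in range(1, len(x)):
        x[t] = rho * x[t - 1] + scale * shocks[t]
    return x

rng = np.random.default_rng(6)
data = simulate((0.7, 1.0), rng.standard_t(df=5, size=2000))  # stand-in data
phi_hat, sig_hat = aux_fit(data)
sim_shocks = np.random.default_rng(7).standard_t(df=5, size=20000)  # held fixed

def emm_objective(theta):
    g = aux_score(simulate(theta, sim_shocks), phi_hat, sig_hat)
    return g @ g

print(minimize(emm_objective, x0=[0.5, 0.8], method='Nelder-Mead').x)
```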

Relevance: 10.00%

Abstract:

While there is growing interest in measuring the size and scope of local spillovers, it is well understood that such spillovers cannot be distinguished from unobservable local attributes using solely the observed location decisions of individuals or firms. We propose an empirical strategy for recovering estimates of spillovers in the presence of unobserved local attributes for a broadly applicable class of equilibrium sorting models. Our approach relies on an IV strategy derived from the internal logic of the sorting model itself. We show practically how the strategy is implemented, provide intuition for our instruments, discuss the role of effective choice-set variation in identifying the model, and carry out a series of Monte Carlo simulations to demonstrate performance in small samples. © 2007 The Author(s). Journal compilation Royal Economic Society 2007.
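The logic of instrumenting an endogenous spillover measure shows up already in a stripped-down Monte Carlo. The data-generating process below is hypothetical and far simpler than an equilibrium sorting model, but it exhibits the OLS bias from the unobserved local attribute and its removal by IV:

```python
import numpy as np

rng = np.random.default_rng(8)
n_sims, n = 500, 1000
ols, tsls = np.zeros(n_sims), np.zeros(n_sims)
for s in range(n_sims):
    z = rng.standard_normal(n)                            # instrument
    xi = rng.standard_normal(n)                           # unobserved local attribute
    spill = 0.7 * z + 0.6 * xi + rng.standard_normal(n)   # endogenous spillover measure
    y = 1.0 * spill + xi + rng.standard_normal(n)
    ols[s] = (spill @ y) / (spill @ spill)
    tsls[s] = (z @ y) / (z @ spill)  # just-identified 2SLS = IV ratio estimator
print(f"true = 1.00 | OLS mean = {ols.mean():.2f} | 2SLS mean = {tsls.mean():.2f}")
```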

Relevance: 10.00%

Abstract:

We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
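The paper's mixture-based global proposals are the substantive innovation; the surrounding scaffolding, though, resembles any Metropolis-within-Gibbs sampler over latent states. A deliberately crude single-site random-walk version for a stochastic volatility state-space model, with all fixed parameters held known for brevity:

```python
import numpy as np

rng = np.random.default_rng(9)
T, mu, phi, sw = 200, -1.0, 0.95, 0.2

# Simulate data from y_t = exp(h_t/2)*eps_t with AR(1) log-volatility h_t
h_true = np.full(T, mu)
for t in range(1, T):
    h_true[t] = mu + phi * (h_true[t - 1] - mu) + sw * rng.standard_normal()
y = np.exp(h_true / 2) * rng.standard_normal(T)

def log_target(h):
    ll = -0.5 * np.sum(h + y ** 2 * np.exp(-h))  # observation log-density
    e = h[1:] - mu - phi * (h[:-1] - mu)
    return ll - 0.5 * np.sum(e ** 2) / sw ** 2   # state-transition log-density

h, draws = np.full(T, mu), []
for it in range(600):
    for t in range(T):                            # single-site random-walk MH
        prop = h.copy()
        prop[t] += 0.3 * rng.standard_normal()
        if np.log(rng.random()) < log_target(prop) - log_target(h):
            h = prop
    if it >= 300:                                 # discard burn-in
        draws.append(h.copy())
print("posterior mean of the first five states:",
      np.round(np.mean(draws, axis=0)[:5], 2))
```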

Relevance: 10.00%

Abstract:

Using novel data on European firms, this paper investigates the relationship between business groups and innovation. Controlling for various firm characteristics, we find that group affiliates are more innovative than standalones. We examine several hypotheses to explain this finding, focusing on group internal capital markets and knowledge spillovers. We find that group affiliation is particularly important for innovation in industries that rely more on external funding and in groups with more diversified capital sources, consistent with the internal capital markets hypothesis. Our results suggest that knowledge spillovers are not the main driver of innovation in business groups because firms affiliated with the same group do not have a common research focus and are unlikely to cite each other's patents. © 2010 INFORMS.
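The interaction design implied here (innovation on group affiliation, external-finance dependence, and their product, plus controls) reduces to a standard regression. A hypothetical simulated illustration:

```python
import numpy as np

rng = np.random.default_rng(12)
n = 5000
group = rng.integers(0, 2, n).astype(float)   # group-affiliate dummy
extfin = rng.standard_normal(n)               # industry external-finance dependence
size = rng.standard_normal(n)                 # firm-size control
y = (0.3 * group + 0.1 * extfin + 0.25 * group * extfin
     + 0.5 * size + rng.standard_normal(n))   # invented coefficients

X = np.column_stack([np.ones(n), group, extfin, group * extfin, size])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("interaction (group x external finance):", round(beta[3], 3))
```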

Relevance: 10.00%

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online. © 2013 Copyright Taylor and Francis Group, LLC.
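The core latent-thresholding mechanism is compact: a time-varying coefficient is zeroed out whenever its latent value falls inside a threshold band, which switches the corresponding predictor in and out of the model over time. A minimal sketch with an invented latent AR(1) path:

```python
import numpy as np

def latent_threshold(beta, d):
    """Effective coefficient: zero whenever the latent value is inside (-d, d)."""
    return beta * (np.abs(beta) >= d)

rng = np.random.default_rng(10)
T = 300
beta = np.zeros(T)
for t in range(1, T):                       # persistent latent coefficient path
    beta[t] = 0.99 * beta[t - 1] + 0.05 * rng.standard_normal()
b_eff = latent_threshold(beta, d=0.15)
print("fraction of periods the predictor is excluded:",
      round(float(np.mean(b_eff == 0.0)), 2))
```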

Relevance: 10.00%

Abstract:

We assess different policies for reducing carbon dioxide emissions and promoting innovation and diffusion of renewable energy. We evaluate the relative performance of policies according to incentives provided for emissions reduction, efficiency, and other outcomes. We also assess how the nature of technological progress through learning and research and development (R&D) and the degree of knowledge spillovers affect the desirability of different policies. Due to knowledge spillovers, optimal policy involves a portfolio of different instruments targeted at emissions, learning, and R&D. Although the relative cost of individual policies in achieving reductions depends on parameter values and the emissions target, in a numerical application to the U.S. electricity sector, the ranking is roughly as follows: (1) emissions price, (2) emissions performance standard, (3) fossil power tax, (4) renewables share requirement, (5) renewables subsidy, and (6) R&D subsidy. Nonetheless, an optimal portfolio of policies achieves emissions reductions at a significantly lower cost than any single policy. © 2007 Elsevier Inc. All rights reserved.

Relevance: 10.00%

Abstract:

The approach used to model technological change in a climate policy model is a critical determinant of its results in terms of the time path of CO2 prices and costs required to achieve various emission reduction goals. We provide an overview of the different approaches used in the literature, with an emphasis on recent developments regarding endogenous technological change, research and development, and learning. Detailed examination sheds light on the salient features of each approach, including strengths, limitations, and policy implications. Key issues include proper accounting for the opportunity costs of climate-related knowledge generation, treatment of knowledge spillovers and appropriability, and the empirical basis for parameterizing technological relationships. No single approach appears to dominate on all these dimensions, and different approaches may be preferred depending on the purpose of the analysis, be it positive or normative. © 2008 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Market failures associated with environmental pollution interact with market failures associated with the innovation and diffusion of new technologies. These combined market failures provide a strong rationale for a portfolio of public policies that foster emissions reduction as well as the development and adoption of environmentally beneficial technology. Both theory and empirical evidence suggest that the rate and direction of technological advance are influenced by market and regulatory incentives, and can be cost-effectively harnessed through the use of economic-incentive based policy. In the presence of weak or nonexistent environmental policies, investments in the development and diffusion of new environmentally beneficial technologies are very likely to be less than would be socially desirable. Positive knowledge and adoption spillovers and information problems can further weaken innovation incentives. While environmental technology policy is fraught with difficulties, a long-term view suggests a strategy of experimenting with policy approaches and systematically evaluating their success. © 2005 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Quantity-based regulation with banking allows regulated firms to shift obligations across time in response to periods of unexpectedly high or low marginal costs. Despite its wide prevalence in existing and proposed emission trading programs, banking has received limited attention in past welfare analyses of policy choice under uncertainty. We address this gap with a model of banking behavior that captures two key constraints: uncertainty about the future from the firm's perspective and a limit on negative bank values (e.g. borrowing). We show conditions where banking provisions reduce price volatility and lower expected costs compared to quantity policies without banking. For plausible parameter values related to U.S. climate change policy, we find that bankable quantities produce behavior quite similar to price policies for about two decades and, during this period, improve welfare by about $1 billion per year over fixed quantities. © 2012 Elsevier B.V.
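A deliberately simplified two-period version of the banking problem, with quadratic abatement costs and invented parameter values: the firm observes a period-1 cost shock, banks extra abatement when costs are unexpectedly low, and faces the no-borrowing bound when they are high:

```python
import numpy as np

rng = np.random.default_rng(11)
q, Etheta2 = 1.0, 1.0                            # per-period quota, expected cost
theta1 = rng.lognormal(sigma=0.5, size=10000)    # realized period-1 cost shocks

# Banking equalizes period-1 marginal cost with expected period-2 marginal
# cost, truncated at zero by the no-borrowing constraint.
b = np.clip(q * (Etheta2 - theta1) / (theta1 + Etheta2), 0.0, None)
with_bank = 0.5 * theta1 * (q + b) ** 2 + 0.5 * Etheta2 * (q - b) ** 2
no_bank = 0.5 * theta1 * q ** 2 + 0.5 * Etheta2 * q ** 2
print(f"expected cost: banking {with_bank.mean():.3f} vs fixed {no_bank.mean():.3f}")
```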