971 results for exchange-traded fund
Abstract:
This paper presents a theoretical framework analysing the signalling channel of exchange rate interventions as an informational trigger. We develop an implicit target zone framework with learning in order to model the signalling channel. The theoretical premise of the model is that interventions convey signals that communicate information about the exchange rate objectives of the central bank. The model is used to analyse the impact of Japanese FX interventions during the period 1999-2011 on yen/US dollar dynamics.
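For intuition, the learning mechanism described above can be sketched with a simple linear-Gaussian updating rule; the notation and functional form are illustrative assumptions made here, not the paper's exact specification:

\[ \mu_{t+1} = \mu_t + K_t\left(\hat{c}_t - \mu_t\right), \qquad K_t = \frac{\sigma_t^2}{\sigma_t^2 + \sigma_\varepsilon^2}, \]

where \( \mu_t \) is the market's current estimate of the central bank's implicit target for the yen/dollar rate, \( \hat{c}_t \) is the noisy signal of that target extracted from an observed intervention, and \( K_t \) is a Kalman-style gain. Each intervention thus shifts expectations, and through them the exchange rate, toward the perceived target.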
Abstract:
The efficient markets hypothesis implies that arbitrage opportunities in markets such as those for foreign exchange (FX) would be, at most, short-lived. The present paper surveys the fragmented nature of FX markets, revealing that information in these markets is also likely to be fragmented. The “quant” workforce in the hedge fund featured in Robert Harris's novel The Fear Index would have little or no reason to exist in an EMH world. The four-currency combinatorial analysis of arbitrage sequences contained in Cross, Kozyakin, O’Callaghan, Pokrovskii and Pokrovskiy (2012) is then considered. Their results suggest that arbitrage processes, rather than being self-extinguishing, tend to be periodic in nature. This helps explain the fact that arbitrage dealing tends to be endemic in FX markets.
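To make the notion of arbitrage sequences across currency pairs concrete, the sketch below enumerates three- and four-currency cycles and flags those whose product of quoted rates survives a proportional transaction cost. The quotes, the cost level and the function names are illustrative assumptions; this is not the combinatorial method of Cross et al. (2012).

```python
from itertools import permutations

def cycle_return(rates, cycle):
    """Gross return from converting one unit of the first currency around a
    closed cycle; rates[(a, b)] is the price of one unit of a in units of b."""
    value = 1.0
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        value *= rates[(a, b)]
    return value

def find_arbitrage(rates, currencies, cost=0.0005):
    """Enumerate 3- and 4-currency cycles and report those whose gross return
    still exceeds 1 after paying a proportional cost on every leg."""
    opportunities = []
    for length in (3, 4):
        for cycle in permutations(currencies, length):
            gross = cycle_return(rates, list(cycle))
            if gross * (1.0 - cost) ** length > 1.0:
                opportunities.append((cycle, gross))
    return opportunities

# Illustrative (made-up) quotes; reciprocals fill in the reverse direction.
quotes = {("USD", "EUR"): 0.92, ("EUR", "GBP"): 0.86, ("GBP", "JPY"): 190.0,
          ("JPY", "USD"): 0.0068, ("USD", "GBP"): 0.79, ("EUR", "JPY"): 163.0}
rates = dict(quotes)
rates.update({(b, a): 1.0 / r for (a, b), r in quotes.items()})

print(find_arbitrage(rates, ["USD", "EUR", "GBP", "JPY"]))
```

In an EMH world the gross returns of such cycles would sit at or below 1 once costs are paid; the periodicity result cited above concerns how quickly and how often opportunities of this kind reappear.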
Abstract:
We analyze and quantify co-movements in real effective exchange rates while considering the regional location of countries. More specifically, using the dynamic hierarchical factor model of Moench et al. (2011), we decompose exchange rate movements into several latent components: a worldwide factor and two regional factors, as well as country-specific elements. We then provide evidence that the worldwide common factor is closely related to monetary policies in large advanced countries, while regional common factors tend to be captured by those in the rest of the countries in a region. However, a substantial proportion of the variation in real exchange rates is found to be country-specific; even in Europe, country-specific movements exceed the worldwide and regional common factors.
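Schematically, and with notation chosen here rather than taken from the paper, the decomposition above expresses each country's real effective exchange rate as

\[ x_{i,t} = \lambda_i^{G} G_t + \lambda_i^{R} R_{r(i),t} + e_{i,t}, \]

where \( G_t \) is the worldwide factor, \( R_{r(i),t} \) the factor of country \( i \)'s region, and \( e_{i,t} \) the country-specific component; the variance shares of these three terms give the proportions discussed above. The actual model of Moench et al. (2011) is dynamic and hierarchical, with lag polynomials on the loadings.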
Abstract:
Using survey expectations data and Markov-switching models, this paper evaluates the characteristics and evolution of investors' forecast errors about the yen/dollar exchange rate. Since our model is derived from the uncovered interest rate parity (UIRP) condition and our data cover a period of low interest rates, this study is also related to the forward premium puzzle and the currency carry trade strategy. We obtain the following results. First, for the same forecast horizon, exchange rate forecasts are homogeneous across different industry types, but within the same industry, exchange rate forecasts differ when the forecast horizon differs. In particular, investors tend to undervalue the future exchange rate at long-term forecast horizons, whereas in the short run they tend to overvalue it. Second, while forecast errors are found to be partly driven by interest rate spreads, evidence against UIRP is provided regardless of the forecast horizon; the forward premium puzzle is more pronounced in shorter-term forecast errors. Consistent with this finding, our coefficients on interest rate spreads provide indirect evidence of the yen carry trade only over short-term forecast horizons. Furthermore, the carry trade seems to be active when there is a clear indication that interest rates will remain low in the future.
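For reference, the UIRP condition underlying the model can be written in a standard textbook form (notation chosen here) as

\[ E_t[s_{t+k}] - s_t = i_{t,k} - i^{*}_{t,k}, \]

where \( s_t \) is the log yen/dollar rate and \( i_{t,k} \), \( i^{*}_{t,k} \) are the k-period domestic and foreign interest rates. The survey forecast error studied above is \( s^{e}_{t+k|t} - s_{t+k} \), and regressing it (or the realized depreciation) on the interest rate spread is the usual way the forward premium puzzle is detected.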
Abstract:
An expanding literature articulates the view that Taylor rules are helpful in predicting exchange rates. In a changing world, however, Taylor rule parameters may be subject to structural instabilities, for example during the Global Financial Crisis. This paper forecasts exchange rates using such Taylor rules with time-varying parameters (TVP) estimated by Bayesian methods. In core out-of-sample results, we improve upon a random walk benchmark for at least half, and for as many as eight out of ten, of the currencies considered. This contrasts with a constant-parameter Taylor rule model, which yields a more limited improvement upon the benchmark. In further results, Purchasing Power Parity and Uncovered Interest Rate Parity TVP models also beat the random walk benchmark, implying that our methods have some generality in exchange rate prediction.
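A generic version of such a forecasting regression with time-varying parameters (again, the notation is illustrative rather than the paper's) is

\[ \Delta s_{t+1} = x_t' \beta_t + \varepsilon_{t+1}, \qquad \beta_t = \beta_{t-1} + \eta_t, \]

where \( x_t \) collects Taylor-rule fundamentals such as inflation and output-gap differentials between the two countries, and the random-walk law of motion for \( \beta_t \) lets the coefficients drift or shift, which is what the Bayesian estimation exploits; setting \( \eta_t = 0 \) recovers the constant-parameter benchmark.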
Abstract:
This paper employs an unobserved component model that incorporates a set of economic fundamentals to obtain the Euro-Dollar permanent equilibrium exchange rate (PEER) for the period 1975Q1 to 2008Q4. The results show that for most of the sample period, the Euro-Dollar exchange rate closely followed the values implied by the PEER. The only significant deviations from the PEER occurred in the years immediately before and after the introduction of the single European currency. The forecasting exercise shows that incorporating economic fundamentals provides better long-run exchange rate forecasting performance than a random walk process.
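In unobserved component terms, and as a sketch only, the PEER decomposition can be written as

\[ q_t = q_t^{P} + q_t^{T}, \qquad q_t^{P} = q_{t-1}^{P} + \beta' \Delta f_t + \upsilon_t, \]

where \( q_t \) is the observed (log) Euro-Dollar rate, \( q_t^{P} \) is the permanent component tied to the vector of fundamentals \( f_t \), and \( q_t^{T} \) is a stationary transitory component; the deviations mentioned above correspond to estimates of \( q_t^{T} \) around the euro's introduction. The paper's exact state-space specification may differ.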
Abstract:
We analyse the role of time-variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in the estimation of coefficients and uncertainty about the precise degree of coefficient variability as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
Abstract:
This paper proposes a simple model for understanding transaction costs: their composition, their size and their policy implications. We distinguish between investments in institutions that facilitate exchange and the cost of conducting exchange itself. Institutional quality and market size are determined by the decisions of risk-averse agents, and conditions are discussed under which the efficient allocation may be decentralized. We highlight a number of differences with models where transaction costs are exogenous, including the implications for taxation and measurement issues.
Abstract:
In this paper we analyze the persistence of aggregate real exchange rates (RERs) for a group of EU-15 countries by using sectoral data. The tight relation between aggregate and sectoral persistence recently investigated by Mayoral (2008) allows us to decompose aggregate RER persistence into the persistence of its different subcomponents. We show that the distribution of sectoral persistence is highly heterogeneous and very skewed to the right, and that a limited number of sectors are responsible for the high levels of persistence observed at the aggregate level. We use quantile regression to investigate whether the traditional theories proposed to account for the slow reversion to parity (lack of arbitrage due to nontradabilities, or imperfect competition and price stickiness) are able to explain the behavior of the upper quantiles of sectoral persistence. We conclude that pricing to market in the intermediate goods sector, together with price stickiness, has more explanatory power than variables related to the tradability of the goods or their inputs.
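The quantile-regression step described above can be illustrated with a minimal sketch: regress sectoral persistence on candidate explanatory variables at an upper quantile and compare with the median. The data are synthetic and the variable names (persistence, pricing_to_market, price_stickiness, tradability) are placeholders, not the paper's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300  # synthetic "sectors"
df = pd.DataFrame({
    "pricing_to_market": rng.uniform(0, 1, n),
    "price_stickiness": rng.uniform(0, 1, n),
    "tradability": rng.uniform(0, 1, n),
})
# Synthetic persistence: the upper tail responds mostly to the first two regressors.
df["persistence"] = (0.3 + 0.4 * df["pricing_to_market"] + 0.3 * df["price_stickiness"]
                     + rng.gumbel(0, 0.1, n))

X = sm.add_constant(df[["pricing_to_market", "price_stickiness", "tradability"]])
for q in (0.5, 0.9):  # median vs. an upper quantile of sectoral persistence
    res = sm.QuantReg(df["persistence"], X).fit(q=q)
    print(f"q = {q}")
    print(res.params.round(3))
```

Comparing the coefficient vectors across quantiles is what lets the paper ask which explanations matter for the most persistent sectors rather than for the average one.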
Abstract:
Background: Sponsoring of physicians' meetings by life science companies has led to reduced participation fees but might influence physicians' prescription practices. A ban on such sponsoring may increase participation fees. We aimed to evaluate factors associated with physicians' willingness to pay for medical meetings, their position on the sponsoring of medical meetings and their opinion on alternative financing options. Methods: An anonymous web-based questionnaire was sent to 447 general practitioners in one state in Switzerland, identified through their affiliation to a medical association. The questionnaire evaluated physicians' willingness to pay for medical meetings, their perception of a bias in prescription practices induced by commercial support, their opinion on the introduction of binding legislation and alternative financing options, their frequency of exchange with sales representatives and other relevant socioeconomic factors. We built a multivariate logistic regression model to identify predictors of willingness to pay. Results: Of the 115 physicians who responded (response rate 26%), 48% were willing to pay more than what they currently pay for congresses, 79% disagreed that commercial support introduced a bias in their own prescription practices and 61% disagreed that it introduced a bias in their colleagues' prescription practices. Based on the multivariate logistic regression, perception of a bias in peers' prescription practices (OR=7.47, 95% CI 1.65-38.18) and group practice structure (OR=4.62, 95% CI 1.34-22.29) were significantly associated with an increased willingness to pay. Most physicians (76%) did not support the introduction of binding legislation, and 53% were in favour of creating a general fund administered by an independent body. Conclusion: Our results suggest that almost half of the physicians surveyed are willing to pay more than what they currently pay for congresses. Predictors of an increased willingness to pay were perception of a bias in peers' prescription practices and group practice structure. Most responders did not agree that sponsoring introduced a prescribing bias, nor did they support the introduction of binding legislation prohibiting sponsoring, but a majority did agree to an independent body centrally administering a general fund.
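As a sketch of how such odds ratios are obtained, the following fits a logistic regression on synthetic survey data and exponentiates the coefficients; the variable names (wtp_more, bias_in_peers, group_practice) and the data are illustrative assumptions, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 115  # same order of magnitude as the number of respondents
df = pd.DataFrame({
    "bias_in_peers": rng.integers(0, 2, n),    # perceives bias in peers' prescribing
    "group_practice": rng.integers(0, 2, n),   # works in a group practice
})
# Synthetic outcome: willingness to pay more, loosely driven by both predictors.
logit_p = -0.8 + 1.5 * df["bias_in_peers"] + 1.0 * df["group_practice"]
df["wtp_more"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["bias_in_peers", "group_practice"]])
res = sm.Logit(df["wtp_more"], X).fit(disp=0)

# Odds ratios and 95% confidence intervals are the exponentials of the
# fitted coefficients and of their confidence bounds.
print(np.exp(res.params).round(2))
print(np.exp(res.conf_int()).round(2))
```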
Abstract:
We demonstrate that the DNA strand exchange step of the RecA-mediated recombination reaction can occur equally efficiently in the presence or absence of ATP hydrolysis. The polarity of strand exchange is the same when the non-hydrolyzable ATP analog adenosine-5'-O-(3-thiotriphosphate) is used instead of ATP. We show that the ATP dependence of the recombination reaction is limited to the post-exchange stages of the reaction. The low-DNA-affinity state of RecA protomers, induced after ATP hydrolysis, is necessary for the dissociation of RecA-DNA complexes at the end of the reaction. This dissociation of RecA from DNA is necessary for the release of recombinant DNA molecules from the complexes formed with RecA and for the recycling of RecA protomers for another round of the recombination reaction.
Abstract:
The International Molecular Exchange (IMEx) consortium is an international collaboration between major public interaction data providers to share literature-curation efforts and make a nonredundant set of protein interactions available in a single search interface on a common website (http://www.imexconsortium.org/). Common curation rules have been developed, and a central registry is used to manage the selection of articles to enter into the dataset. We discuss the advantages of such a service to the user, our quality-control measures and our data-distribution practices.
Abstract:
Indirect calorimetry based on respiratory exchange measurement has been successfully used since the beginning of the century to obtain an estimate of heat production (energy expenditure) in human subjects and animals. The errors inherent in this classical technique can stem from various sources: 1) the model of calculation and its assumptions, 2) the calorimetric factors used, 3) technical factors and 4) human factors. The physiological and biochemical factors influencing the interpretation of calorimetric data include a change in the size of the bicarbonate and urea pools and the accumulation or loss (via breath, urine or sweat) of intermediary metabolites (gluconeogenesis, ketogenesis). More recently, respiratory gas exchange data have been used to estimate substrate utilization rates in various physiological and metabolic situations (fasting, post-prandial state, etc.). It should be recalled that indirect calorimetry provides an index of overall substrate disappearance rates, which is incorrectly assumed to be equivalent to substrate "oxidation" rates. Unfortunately, there is no adequate gold standard to validate whole-body substrate "oxidation" rates; this contrasts with the "validation" of heat production by indirect calorimetry through the use of direct calorimetry under strict thermal equilibrium conditions. Tracer techniques using stable (or radioactive) isotopes represent an independent way of assessing substrate utilization rates. When carbohydrate metabolism is measured with both techniques, indirect calorimetry generally provides glucose "oxidation" rates consistent with isotopic tracers, but only when certain metabolic processes (such as gluconeogenesis and lipogenesis) are minimal and/or when the respiratory quotients are not at the extremes of the physiological range. However, it is believed that the tracer techniques underestimate true glucose "oxidation" rates because they fail to account for glycogenolysis in the tissues storing glucose, since this glucose escapes the systemic circulation. A major advantage of isotopic techniques is that they are able to estimate (given certain assumptions) various metabolic processes (such as gluconeogenesis) in a noninvasive way. Furthermore, when a fourth substrate (such as ethanol) is administered in addition to the three macronutrients, isotopic quantification of substrate "oxidation" allows one to eliminate the inherent assumptions made by indirect calorimetry. In conclusion, isotopic tracer techniques and indirect calorimetry should be considered complementary techniques, in particular since the tracer techniques require the measurement of carbon dioxide production obtained by indirect calorimetry. However, it should be kept in mind that the assessment of substrate oxidation by indirect calorimetry may involve large errors, in particular over short periods of time. By indirect calorimetry, energy expenditure (heat production) is calculated with substantially less error than substrate oxidation rates.
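For concreteness, a minimal sketch of the classical calculations is given below, using the abbreviated Weir equation for energy expenditure and the Frayn (1983) coefficients for substrate oxidation; these are one commonly used set of calorimetric factors, and (as stressed above) the choice of factors and assumptions is itself a source of error.

```python
def energy_expenditure_kcal_min(vo2_l_min, vco2_l_min, urinary_n_g_min=0.0):
    """Abbreviated Weir equation: kcal/min from O2 consumption and CO2 production
    (both in L/min) and, optionally, urinary nitrogen excretion (g/min)."""
    return 3.941 * vo2_l_min + 1.106 * vco2_l_min - 2.17 * urinary_n_g_min

def substrate_oxidation_g_min(vo2_l_min, vco2_l_min, urinary_n_g_min=0.0):
    """Frayn (1983) equations: whole-body carbohydrate, fat and protein
    'oxidation' rates (g/min); strictly, these are net disappearance rates."""
    cho = 4.55 * vco2_l_min - 3.21 * vo2_l_min - 2.87 * urinary_n_g_min
    fat = 1.67 * vo2_l_min - 1.67 * vco2_l_min - 1.92 * urinary_n_g_min
    protein = 6.25 * urinary_n_g_min
    return {"carbohydrate": cho, "fat": fat, "protein": protein}

# Example: resting values of roughly 0.25 L/min O2 and 0.20 L/min CO2 (RQ = 0.8).
print(energy_expenditure_kcal_min(0.25, 0.20, 0.008))
print(substrate_oxidation_g_min(0.25, 0.20, 0.008))
```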