929 results for Probabilistic Error Correction


Relevance:

80.00%

Abstract:

This paper studies the signalling effect of the consumption-wealth ratio (cay) on German stock returns via vector error correction models (VECMs). The effect of cay on U.S. stock returns has recently been confirmed by Lettau and Ludvigson using a two-stage method. In this paper, the performance of the VECMs and of the two-stage method is compared on both German and U.S. data. The VECMs turn out to be better suited than the two-stage method for studying the effect of cay on stock returns. Using the Conditional-Subset VECM, cay significantly signals real stock returns and excess returns in both data sets. The estimated coefficient on cay for stock returns is about twice as large in U.S. data as in German data. When the two-stage method is used, cay has no significant effect on German stock returns. It is also found that cay significantly signals German wealth growth and U.S. income growth.
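A minimal sketch, under assumed column names and a hypothetical data file, of how a VECM relating cay and stock returns could be estimated with statsmodels; this is illustrative, not the authors' specification.

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

data = pd.read_csv("cay_returns.csv", index_col=0, parse_dates=True)  # hypothetical file
endog = data[["cay", "real_return", "excess_return"]]                  # hypothetical columns

# Choose the cointegration rank with the trace test, then fit the VECM.
rank = select_coint_rank(endog, det_order=0, k_ar_diff=2, method="trace")
res = VECM(endog, k_ar_diff=2, coint_rank=rank.rank, deterministic="ci").fit()

print(res.alpha)  # adjustment coefficients: how each variable responds to disequilibrium
print(res.beta)   # cointegrating vector(s): the long-run "signalling" relation
```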

Relevance:

80.00%

Abstract:

In this paper we investigate the price discovery process in single-name credit spreads obtained from bond, credit default swap (CDS), equity and equity option prices. We analyse short-term price discovery by modelling daily changes in credit spreads in the four markets with a vector autoregressive (VAR) model. We also look at long-run price discovery with a vector error correction model (VECM). We find that in the short term the option market clearly leads the other markets during the subprime crisis (2007-2009). During the less severe sovereign debt crisis (2009-2012) and the pre-crisis period, options remain important but CDSs become more prominent. In the long run, deviations from the equilibrium relationship with the option market still lead to adjustments in the credit spreads observed in or implied from the other markets. However, options no longer dominate price discovery in any of the periods considered. Our findings have implications for traders, credit risk managers and financial regulators.
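A sketch of the short-run step under assumed column names: a VAR on daily credit spread changes across the four markets, with a Granger-causality test to gauge whether the option market leads the others. Data file and variable names are hypothetical.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

spreads = pd.read_csv("credit_spreads.csv", index_col=0, parse_dates=True)  # hypothetical
d_spreads = spreads[["bond", "cds", "equity", "option"]].diff().dropna()     # daily changes

var_res = VAR(d_spreads).fit(maxlags=5, ic="aic")
# Does the option-implied spread help predict the other three markets?
test = var_res.test_causality(caused=["bond", "cds", "equity"], causing="option", kind="f")
print(test.summary())
```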

Relevance:

80.00%

Abstract:

This paper examines the lead–lag relationship between the FTSE 100 index and index futures prices using a number of time series models. Using 10-min observations from June 1996–1997, it is found that lagged changes in the futures price help to predict changes in the spot price. The best forecasting model is of the error correction type, allowing for the theoretical difference between spot and futures prices implied by the cost-of-carry relationship. This predictive ability is in turn used to derive a trading strategy, which is tested under real-world conditions to search for systematically profitable trading opportunities. Although the model forecasts produce significantly higher returns than a passive benchmark, the strategy is unable to outperform the benchmark once transaction costs are allowed for.
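A stylised sketch of turning one-step-ahead forecasts into a long/short rule and netting out a round-trip transaction cost; the forecast series is assumed to come from an error correction model estimated elsewhere, and the cost figure is illustrative.

```python
import numpy as np
import pandas as pd

def trading_returns(forecast_change: pd.Series, realised_return: pd.Series,
                    cost_per_trade: float = 0.001) -> pd.Series:
    """Net returns of a sign-of-forecast trading rule with proportional costs."""
    position = np.sign(forecast_change)        # +1 long, -1 short, 0 flat
    trades = position.diff().abs().fillna(0)   # changing position incurs a cost
    return position * realised_return - trades * cost_per_trade
```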

Relevance:

80.00%

Abstract:

This paper examines the effects of liquidity during the 2007–09 crisis, focussing on the Senior Tranche of the CDX.NA.IG Index and on Moody's AAA Corporate Bond Index. It aims to understand whether the sharp increase in the credit spreads of these AAA-rated credit indices can be explained by worse credit fundamentals alone or whether it also reflects a lack of depth in the relevant markets, the scarcity of risk-capital, and the liquidity preference exhibited by investors. Using cointegration analysis and error correction models, the paper shows that during the crisis lower market and funding liquidity are important drivers of the increase in the credit spread of the AAA-rated structured product, whilst they are less significant in explaining credit spread changes for a portfolio of unstructured credit instruments. Looking at the experience of the subprime crisis, the study shows that when the conditions under which securitisation can work properly (liquidity, transparency and tradability) suddenly disappear, investors are left highly exposed to systemic risk.
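A minimal sketch, with assumed variable names, of the two-step idea described above: test for cointegration between a credit spread and its fundamental and liquidity drivers, then regress spread changes on the lagged equilibrium error (an error correction model). This is not the paper's exact specification.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

df = pd.read_csv("spreads_liquidity.csv", index_col=0, parse_dates=True)  # hypothetical
y = df["aaa_spread"]
x = df[["fundamentals", "market_liquidity", "funding_liquidity"]]

print(coint(y, x))                              # Engle-Granger cointegration test
longrun = sm.OLS(y, sm.add_constant(x)).fit()   # long-run (cointegrating) relation

dy, dX = y.diff(), x.diff()
dX["ecm"] = longrun.resid.shift(1)              # lagged disequilibrium term
ecm = sm.OLS(dy, sm.add_constant(dX), missing="drop").fit()
print(ecm.summary())                            # speed of adjustment and short-run drivers
```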

Relevance:

80.00%

Abstract:

In the present study, to shed light on the roles of the positional error correction mechanism and the prediction mechanism in the proactive control discovered earlier, we carried out a visual tracking experiment in which the region where the target was visible was restricted along a circular orbit. The main results are as follows. Recognition of a time step, obtained from the environmental stimuli, is required for the predictive function. The period of the rhythm in the brain obtained from environmental stimuli shortens by about 10% when the visual information is cut off. This shortening accelerates the hand motion as soon as the visual information is cut off and lets the hand motion precede the target motion. Although the precedence of the hand in the blind region is reset by the environmental information when the target re-enters the visible region, the hand on average precedes the target when the predictive mechanism dominates the error-corrective mechanism.

Relevance:

80.00%

Abstract:

Reading aloud is apparently an indispensable part of teaching. Nevertheless, little is known about reading aloud across the curriculum by students and teachers in high schools. Nor do we understand teachers' attitudes towards issues such as error correction, rehearsal time, and selecting students to read. A survey of 360 teachers in England shows that, although they have little training in reading aloud, they are extremely confident. Reading aloud by students and teachers is strongly related, and serves to further understanding rather than administrative purposes or pupils' enjoyment. Unexpectedly, Modern Language teachers express views that set them apart from teachers of other subjects.

Relevance:

80.00%

Abstract:

Unemployment as an unintended consequence of social assistance recipiency: results from a time-series analysis of aggregated population data. Does the frequency of unemployment tend to increase the number of social assistance recipients, or does the relationship work the other way around? This article uses Swedish annual data on aggregate unemployment and means-tested social assistance recipiency for the period 1946–1990 and proposes a multiple time-series approach based on vector error correction modelling to establish the direction of influence. First, we show that the rate of unemployment and the rate of social assistance receipt are cointegrated. Second, we demonstrate that adjustments to the long-run equilibrium are made through adjustments in unemployment. This indicates that the level of unemployment reacts to changes in rates of social assistance recipiency rather than vice versa. It is also shown that lagged changes in the level of unemployment do not predict short-term changes in rates of social assistance recipiency. Together these findings indicate that the number of social assistance recipients increases the number of unemployed in a period characterized by low unemployment and high employment.
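A sketch, with hypothetical series names, of checking (i) cointegration between unemployment and social assistance recipiency and (ii) which variable does the adjusting, by inspecting the VECM adjustment coefficients; it illustrates the approach rather than reproducing the article's model.

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, coint_johansen

df = pd.read_csv("sweden_1946_1990.csv", index_col=0)   # hypothetical annual data
endog = df[["unemployment_rate", "assistance_rate"]]

joh = coint_johansen(endog, det_order=0, k_ar_diff=1)
print(joh.lr1, joh.cvt)    # trace statistics vs. critical values

res = VECM(endog, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(res.alpha)  # a significant alpha for unemployment (only) would imply that
                  # unemployment adjusts to restore the long-run equilibrium
```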

Relevance:

80.00%

Abstract:

This paper examines the relationships among per capita CO2 emissions, per capita GDP and international trade based on panel data sets spanning the period 1960-2008: one for 150 countries and others for sub-samples comprising OECD and Non-OECD economies. We apply panel unit root and cointegration tests, and estimate a panel error correction model. The results from the error correction model suggest that there are long-term relationships between the variables for the whole sample and for Non-OECD countries. Finally, Granger causality tests show that there is bi-directional short-term causality between per capita GDP and international trade for the whole sample, and between per capita GDP and CO2 emissions for OECD countries.
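A sketch of the pairwise Granger-causality step using statsmodels; variable names, the data file and the lag length are illustrative, and the panel dimension is ignored here for brevity.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("country_series.csv", index_col=0)  # hypothetical single-country slice

# Test whether trade Granger-causes per-capita GDP (second column -> first column).
grangercausalitytests(df[["gdp_pc", "trade"]].dropna(), maxlag=2)
# Reversing the column order tests the opposite direction (bi-directional causality).
grangercausalitytests(df[["trade", "gdp_pc"]].dropna(), maxlag=2)
```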

Relevance:

80.00%

Abstract:

This thesis consists of a summary and four self-contained papers.

Paper [I] Following the 1987 report by the World Commission on Environment and Development, genuine saving has come to play a key role in the context of sustainable development, and the World Bank regularly publishes genuine saving figures on a national basis. However, these numbers are typically calculated as if the tax system were non-distortionary. This paper presents an analogue to genuine saving in a second-best economy, where the government raises revenue by means of distortionary taxation. We show how the social cost of public debt, which depends on the marginal excess burden, ought to be reflected in the genuine saving. We also illustrate by presenting calculations for Greece, Japan, Portugal, the U.K., the U.S. and the OECD average, showing that the numbers published by the World Bank are likely to be biased and may even give incorrect information as to whether the economy is locally sustainable.

Paper [II] This paper examines the relationships among per capita CO2 emissions, per capita GDP and international trade based on panel data spanning the period 1960-2008 for 150 countries. A distinction is also made between OECD and Non-OECD countries to capture the differences in this relationship between developed and developing economies. We apply panel unit root and cointegration tests, and estimate a panel error correction model. The results from the error correction model suggest that there are long-term relationships between the variables for the whole sample and for Non-OECD countries. Finally, Granger causality tests show that there is bi-directional short-term causality between per capita GDP and international trade for the whole sample and between per capita GDP and CO2 emissions for OECD countries.

Paper [III] Fundamental questions in economics are why some regions are richer than others, why their growth rates differ, whether their growth rates tend to converge, and which key factors explain economic growth. This paper deals with average income growth, net migration, and changes in unemployment rates at the municipal level in Sweden. The aim is to explore in depth the effects of possible underlying determinants, with a particular focus on local policy variables. The analysis is based on a three-equation model. Our results show, among other things, that increases in local public expenditure and the income tax rate have negative effects on subsequent income growth. In addition, the results show conditional convergence, i.e. the average income among municipal residents tends to grow more rapidly in relatively poor local jurisdictions than in initially “richer” jurisdictions, conditional on the other explanatory variables.

Paper [IV] This paper explores the relationship between income growth and income inequality using data at the municipal level in Sweden for the period 1992-2007. We estimate a fixed effects panel data growth model in which within-municipality income inequality is one of the explanatory variables. Different inequality measures (the Gini coefficient, top income shares, and measures of inequality in the lower and upper parts of the income distribution) are examined. We find a positive and significant relationship between income growth and income inequality measured as the Gini coefficient and as top income shares, respectively. In addition, while inequality in the upper part of the income distribution is positively associated with the income growth rate, inequality in the lower part of the income distribution seems to be negatively related to income growth. Our findings also suggest that increased income inequality enhances growth more in municipalities with a high level of average income than in municipalities with a low level of average income.
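A minimal sketch of computing the within-municipality Gini coefficient used as an inequality regressor in Paper [IV]; the income figures are invented for illustration.

```python
import numpy as np

def gini(incomes: np.ndarray) -> float:
    """Gini coefficient from individual incomes (standard sorted-sum formula)."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    # G = 2 * sum(i * x_(i)) / (n * sum(x)) - (n + 1) / n, with x sorted ascending
    return (2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum())) - (n + 1.0) / n

print(gini(np.array([10_000, 20_000, 30_000, 100_000])))  # ~0.44 for this toy sample
```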

Relevance:

80.00%

Abstract:

The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modelled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design in which the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats where that is the researcher's preferred access arrangement. By decoupling the data model from data persistence, it becomes much easier to swap in, for instance, a relational database to provide stricter provenance and audit-trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from CF conventions has been designed to handle time series efficiently for SWIFT.
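A sketch of the decoupling idea described above: the in-memory configuration model knows nothing about storage, and a persistence backend (JSON here, a relational database elsewhere) is swapped in behind a small interface. Class and field names are illustrative and are not SWIFT's actual API.

```python
import json
from dataclasses import dataclass, asdict
from typing import Protocol

@dataclass
class SubareaConfig:
    """In-memory data model: no knowledge of how or where it is stored."""
    name: str
    area_km2: float
    routing_node: str

class ConfigStore(Protocol):
    """Persistence interface: any backend implementing save/load can be used."""
    def save(self, cfg: SubareaConfig, target: str) -> None: ...
    def load(self, target: str) -> SubareaConfig: ...

class JsonConfigStore:
    """JSON backend, convenient for research use and third-party tooling."""
    def save(self, cfg: SubareaConfig, target: str) -> None:
        with open(target, "w") as f:
            json.dump(asdict(cfg), f, indent=2)

    def load(self, target: str) -> SubareaConfig:
        with open(target) as f:
            return SubareaConfig(**json.load(f))
```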

Relevance:

80.00%

Abstract:

Reduced-form estimation of multivariate data sets currently takes into account long-run co-movement restrictions by using vector error correction models (VECMs). However, short-run co-movement restrictions are completely ignored. This paper proposes a way of taking into account short- and long-run co-movement restrictions in multivariate data sets, leading to efficient estimation of VECMs. It enables a more precise trend-cycle decomposition of the data which imposes no untested restrictions to recover these two components. The proposed methodology is applied to a multivariate data set containing U.S. per-capita output, consumption and investment. Based on the results of a post-sample forecasting comparison between restricted and unrestricted VECMs, we show that a non-trivial loss of efficiency results whenever short-run co-movement restrictions are ignored. While permanent shocks to consumption still play a very important role in explaining consumption's variation, the improved estimates of the trends and cycles of output, consumption, and investment show evidence of a more important role for transitory shocks than previously suspected. Furthermore, contrary to previous evidence, permanent shocks to output seem to play a much more important role in explaining unemployment fluctuations.
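A stylised sketch of the post-sample comparison only: fit two VECM variants on a training window and compare out-of-sample root mean squared forecast errors. The paper's actual short-run co-movement restrictions are model-specific and not implemented here; as a crude stand-in, the "restricted" variant simply uses fewer short-run lags. Data file, columns and ranks are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

data = pd.read_csv("us_y_c_i.csv", index_col=0, parse_dates=True)  # output, consumption, investment
train, test = data.iloc[:-20], data.iloc[-20:]

def rmse_forecast(k_ar_diff: int) -> float:
    """Out-of-sample RMSE of a VECM forecast over the held-out window."""
    res = VECM(train, k_ar_diff=k_ar_diff, coint_rank=2, deterministic="ci").fit()
    fc = res.predict(steps=len(test))
    return float(np.sqrt(np.mean((fc - test.values) ** 2)))

print(rmse_forecast(k_ar_diff=4), rmse_forecast(k_ar_diff=1))
```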

Relevance:

80.00%

Abstract:

This dissertation proposes a bivariate Markov switching dynamic conditional correlation model for estimating the optimal hedge ratio between spot and futures contracts. It takes into account the cointegration between the series and captures the leverage effect in the return equation. The model is applied to daily data on futures and spot prices for the Bovespa Index and the R$/US$ exchange rate. The results, in terms of variance reduction and utility, show that the bivariate Markov switching model outperforms strategies based on ordinary least squares and error correction models.
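A simple sketch of the benchmark against which the Markov switching model is compared: the constant OLS (minimum-variance) hedge ratio and the resulting variance reduction. Series and file names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

px = pd.read_csv("ibovespa_spot_futures.csv", index_col=0, parse_dates=True)  # hypothetical
r_spot = px["spot"].pct_change().dropna()
r_fut = px["futures"].pct_change().dropna()

ols = sm.OLS(r_spot, sm.add_constant(r_fut)).fit()
h = ols.params["futures"]                      # minimum-variance hedge ratio
hedged = r_spot - h * r_fut                    # returns of the hedged position
print(1 - hedged.var() / r_spot.var())         # proportional variance reduction
```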

Relevance:

80.00%

Abstract:

The law of one price states that the same asset traded in different markets should have equivalent prices. This study examines whether Brazilian sovereign credit risk traded in the international market is priced similarly in the traditional bond market and in the new and growing credit derivatives market. In addition, a price discovery analysis is used to examine which of the two markets moves faster in response to changes in the credit conditions of the Brazilian economy. The empirical analysis relies on time series models, specifically cointegration analysis and a vector error correction model. The results confirm the theoretical prediction of the law of one price that Brazilian credit risk in the bond market and in the credit derivatives market moves together in the long run. Finally, most of the price discovery takes place in the credit derivatives market.
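A sketch of one common way to quantify price discovery in this setting: fit a bivariate VECM on the bond-implied and CDS-implied credit spreads and compute Gonzalo-Granger component shares from the adjustment coefficients. Column names and the data file are hypothetical, and this is illustrative rather than the study's exact procedure.

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

df = pd.read_csv("brazil_bond_cds.csv", index_col=0, parse_dates=True)  # hypothetical
res = VECM(df[["bond_spread", "cds_spread"]], k_ar_diff=1, coint_rank=1,
           deterministic="ci").fit()

a1, a2 = res.alpha[0, 0], res.alpha[1, 0]       # speed-of-adjustment coefficients
gg_bond = a2 / (a2 - a1)                        # Gonzalo-Granger share of the bond market
gg_cds = -a1 / (a2 - a1)                        # share of the credit derivatives market
print(gg_bond, gg_cds)  # the market with the larger share leads price discovery
```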

Relevance:

80.00%

Abstract:

This dissertation addresses the issue of administered prices in Brazil, arguing that they exhibit more pronounced persistence than the other prices in the economy. To this end, several tests of inflation persistence were carried out. Next, a vector error correction (VEC) methodology was used to study the relationship between administered prices and the most important variables of the Brazilian economy, such as output, the exchange rate, free-market prices and the Selic interest rate. Finally, the framework of Mankiw and Reis (2003) was used to determine which price index would be most appropriate for keeping Brazilian economic activity closest to its potential level. The main results are the following: 1) persistence is observed in the administered-price component of the IPCA; 2) the monetary authority responds more strongly to shocks to administered prices than to shocks to free-market prices; 3) the Mankiw and Reis (2003) exercise indicates that the share of administered prices in a stabilizing price index should be smaller than their current share in the IPCA. These findings show that the presence of administered prices markedly complicates the conduct of monetary policy in Brazil.
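A sketch of one simple persistence measure of the kind discussed above: the sum of the AR coefficients from an AR(p) fit to monthly inflation, computed separately for the administered-price and free-price components. Series names, the data file and the lag length are hypothetical.

```python
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

ipca = pd.read_csv("ipca_components.csv", index_col=0, parse_dates=True)  # hypothetical

def persistence(series: pd.Series, lags: int = 12) -> float:
    """Sum of AR coefficients: values near 1 indicate highly persistent inflation."""
    res = AutoReg(series.dropna(), lags=lags, trend="c").fit()
    return float(res.params.iloc[1:].sum())   # exclude the constant term

print(persistence(ipca["administered"]), persistence(ipca["free"]))
```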

Relevance:

80.00%

Abstract:

The paper investigates, on empirical and theoretical grounds, Brazilian exchange rate dynamics under floating exchange rates. The empirical analysis examines the short- and long-term behavior of the exchange rate, interest rates (domestic and foreign) and country risk using econometric techniques such as variance decomposition, Granger causality, cointegration tests, error correction models, and a GARCH model to estimate exchange rate volatility. The empirical findings suggest that one can argue in favor of a certain degree of endogeneity of the exchange rate, and that flexible rates have not been able to insulate the Brazilian economy in the manner predicted by the literature, owing to its own specificities (managed floating with the use of international reserves and domestic interest rates set according to an inflation target) and to externally determined variables such as country risk. Another important finding is the lack of a closer association between domestic and foreign interest rates since the new exchange regime was adopted. That is, from January 1999 to May 2004, US monetary policy had no significant impact on Brazilian exchange rate dynamics, which have been essentially endogenous, primarily when we consider the fiscal dominance expressed by the probability of default.
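A minimal sketch of the GARCH(1,1) volatility-estimation step using the arch package; the exchange rate series and file name are hypothetical, not the authors' data.

```python
import pandas as pd
from arch import arch_model

fx = pd.read_csv("brl_usd_daily.csv", index_col=0, parse_dates=True)["brl_usd"]  # hypothetical
returns = 100 * fx.pct_change().dropna()            # daily returns in percent

garch = arch_model(returns, vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
print(garch.summary())
print(garch.conditional_volatility.tail())          # estimated exchange rate volatility
```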