933 results for Non-linear error correction models
Abstract:
This paper studies the signalling effect of the consumption-wealth ratio (cay) on German stock returns via vector error correction models (VECMs). The effect of cay on U.S. stock returns has recently been confirmed by Lettau and Ludvigson using a two-stage method. In this paper, the performance of the VECMs and the two-stage method is compared on both German and U.S. data. The VECMs are found to be more suitable than the two-stage method for studying the effect of cay on stock returns. Using the Conditional-Subset VECM, cay significantly signals real stock returns and excess returns in both data sets. The estimated coefficient on cay for stock returns turns out to be twice as large in U.S. data as in German data. When the two-stage method is used, cay has no significant effect on German stock returns. In addition, cay is found to significantly signal German wealth growth and U.S. income growth.
Abstract:
This paper examines the effects of liquidity during the 2007–09 crisis, focussing on the Senior Tranche of the CDX.NA.IG Index and on Moody's AAA Corporate Bond Index. It aims to understand whether the sharp increase in the credit spreads of these AAA-rated credit indices can be explained by worse credit fundamentals alone or whether it also reflects a lack of depth in the relevant markets, the scarcity of risk-capital, and the liquidity preference exhibited by investors. Using cointegration analysis and error correction models, the paper shows that during the crisis lower market and funding liquidity are important drivers of the increase in the credit spread of the AAA-rated structured product, whilst they are less significant in explaining credit spread changes for a portfolio of unstructured credit instruments. Looking at the experience of the subprime crisis, the study shows that when the conditions under which securitisation can work properly (liquidity, transparency and tradability) suddenly disappear, investors are left highly exposed to systemic risk.
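The cointegration-plus-error-correction approach used here can be sketched generically with the two-step Engle-Granger procedure: regress the levels on each other, then regress changes on the lagged disequilibrium. This is an illustration on synthetic data, not the paper's actual spread specification.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = np.cumsum(rng.normal(size=n))   # non-stationary driver (random walk)
y = np.empty(n)
y[0] = x[0]
for t in range(1, n):
    # y error-corrects towards x with adjustment speed 0.5
    y[t] = y[t - 1] - 0.5 * (y[t - 1] - x[t - 1]) + rng.normal(scale=0.5)

# Step 1: long-run (levels) regression; keep the residual z.
X1 = np.column_stack([np.ones(n), x])
b_long, *_ = np.linalg.lstsq(X1, y, rcond=None)
z = y - X1 @ b_long

# Step 2: short-run regression of dy on dx and the lagged residual.
dy, dx = np.diff(y), np.diff(x)
X2 = np.column_stack([np.ones(n - 1), dx, z[:-1]])
b_ecm, *_ = np.linalg.lstsq(X2, dy, rcond=None)

speed = b_ecm[2]   # error-correction coefficient, expected near -0.5
print(speed)
```

A significantly negative `speed` is what identifies an error-correcting relationship; in the paper's setting, liquidity measures enter both the long-run and the short-run equations.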
Abstract:
Filter degeneracy is the main obstacle to implementing particle filters in non-linear high-dimensional models. A new scheme, the implicit equal-weights particle filter (IEWPF), is introduced. In this scheme samples are drawn implicitly from proposal densities with a different covariance for each particle, such that all particle weights are equal by construction. We test and explore the properties of the new scheme using a 1,000-dimensional simple linear model and the 1,000-dimensional non-linear Lorenz96 model, and compare its performance to a Local Ensemble Kalman Filter. The experiments show that the new scheme can easily be implemented in high-dimensional systems and is never degenerate, with good convergence properties in both systems.
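The IEWPF itself is not reproduced here, but the degeneracy it is designed to avoid is easy to see in a plain bootstrap particle filter: importance weights collapse onto a few particles, which is monitored via the effective sample size (ESS) and patched by resampling. A minimal 1-D sketch, with all model parameters chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_steps = 1000, 20

# Linear Gaussian state-space model: x_t = 0.9 x_{t-1} + w,  y_t = x_t + v
x_true = 0.0
particles = rng.normal(size=n_particles)
log_w = np.zeros(n_particles)
estimate = 0.0

for _ in range(n_steps):
    x_true = 0.9 * x_true + rng.normal()
    y = x_true + rng.normal(scale=0.5)
    # Propagate particles through the dynamics (bootstrap proposal).
    particles = 0.9 * particles + rng.normal(size=n_particles)
    # Reweight by the observation likelihood (Gaussian, sd 0.5).
    log_w += -0.5 * ((y - particles) / 0.5) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)        # effective sample size
    estimate = np.sum(w * particles)  # weighted posterior-mean estimate
    if ess < n_particles / 2:         # weights degenerate: resample
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        log_w = np.zeros(n_particles) # equal weights after resampling

print(estimate, x_true)
```

In high dimensions the ESS collapses after a single update, which is exactly the failure mode the equal-weights construction above is meant to eliminate by design rather than by resampling.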
Abstract:
This dissertation proposes a bivariate Markov-switching dynamic conditional correlation model for estimating the optimal hedge ratio between spot and futures contracts. It accounts for the cointegration between the series and captures the leverage effect in the return equation. The model is applied to daily futures and spot prices of the Bovespa Index and the R$/US$ exchange rate. The results, in terms of variance reduction and utility, show that the bivariate Markov-switching model outperforms strategies based on ordinary least squares and error correction models.
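The ordinary least squares benchmark mentioned above is the static minimum-variance hedge ratio, h* = Cov(spot, futures)/Var(futures). A sketch on synthetic returns (the Markov-switching DCC model itself is far more involved and is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
f = rng.normal(scale=0.01, size=n)             # futures returns
s = 0.9 * f + rng.normal(scale=0.004, size=n)  # correlated spot returns

# Static OLS hedge ratio: h* = Cov(s, f) / Var(f)
h = np.mean((s - s.mean()) * (f - f.mean())) / np.var(f)
hedged = s - h * f                              # hedged portfolio returns

reduction = 1.0 - np.var(hedged) / np.var(s)    # fraction of variance removed
print(h, reduction)
```

Dynamic models such as the one proposed replace the constant `h` with a time-varying ratio h_t = Cov_t(s, f)/Var_t(f) driven by the conditional covariances in each regime.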
Abstract:
The paper aims to investigate, on empirical and theoretical grounds, Brazilian exchange rate dynamics under floating exchange rates. The empirical analysis examines the short- and long-term behavior of the exchange rate, interest rates (domestic and foreign) and country risk using econometric techniques such as variance decomposition, Granger causality, cointegration tests, error correction models, and a GARCH model to estimate exchange rate volatility. The empirical findings suggest that one can argue in favor of a certain degree of endogeneity of the exchange rate, and that flexible rates have not been able to insulate the Brazilian economy to the extent predicted by the literature, owing to the economy's own specificities (managed floating with the use of international reserves, and domestic interest rates set according to an inflation target) and to externally determined variables such as country risk. Another important outcome is the lack of a closer association between domestic and foreign interest rates since the new exchange regime was adopted. That is, from January 1999 to May 2004, US monetary policy had no significant impact on Brazilian exchange rate dynamics, which have been essentially endogenous, primarily when we consider the fiscal dominance expressed by the probability of default.
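The GARCH volatility model used for the exchange rate follows the recursion σ²_t = ω + α·ε²_{t-1} + β·σ²_{t-1}. A minimal GARCH(1,1) simulation, with parameter values chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
omega, alpha, beta = 0.1, 0.05, 0.90   # illustrative GARCH(1,1) parameters
T = 5000

sigma2 = np.empty(T)                   # conditional variance path
r = np.empty(T)                        # simulated returns
sigma2[0] = omega / (1.0 - alpha - beta)  # start at unconditional variance
r[0] = np.sqrt(sigma2[0]) * rng.normal()

for t in range(1, T):
    # Variance recursion: yesterday's shock and yesterday's variance.
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()
```

With α + β = 0.95, volatility shocks are highly persistent, producing the volatility clustering typical of floating exchange rate data.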
Abstract:
The objective of this study is to use time series econometric models to forecast the behaviour of aggregate loan delinquency using a broad information set, through the FAVAR (Factor-Augmented Vector Autoregressive) method of Bernanke, Boivin and Eliasz (2005) and the FAVECM (Factor-Augmented Error Correction Models) method of Banerjee and Marcellino (2008). Out-of-sample forecasts were then constructed in order to compare the forecasting performance of these models against simpler univariate models: ARIMA (autoregressive integrated moving average) and SARIMA (seasonal autoregressive integrated moving average). Forecast accuracy was assessed with the MCS (Model Confidence Set) methodology of Hansen, Lunde and Nason (2011), which allows the superiority of time series models to be compared vis-à-vis other models.
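The out-of-sample comparison described above can be sketched generically: fit a model on a training window, produce one-step-ahead forecasts, and compare RMSEs against a naive benchmark. The factor models and the MCS test are much richer than this; the sketch below uses a simple AR(1) on synthetic data purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)
T, T_train = 1500, 1000
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):                  # AR(1) data with phi = 0.5
    y[t] = 0.5 * y[t - 1] + rng.normal()

# Fit AR(1) by least squares on the training sample only.
phi = np.dot(y[:T_train - 1], y[1:T_train]) / np.dot(y[:T_train - 1], y[:T_train - 1])

# One-step-ahead out-of-sample forecasts on the holdout.
ar_fc = phi * y[T_train - 1:T - 1]     # AR(1) forecast
naive_fc = y[T_train - 1:T - 1]        # random-walk (last value) benchmark
actual = y[T_train:]

rmse_ar = np.sqrt(np.mean((actual - ar_fc) ** 2))
rmse_naive = np.sqrt(np.mean((actual - naive_fc) ** 2))
print(rmse_ar, rmse_naive)
```

The MCS procedure then asks which models' forecast losses are statistically indistinguishable from the best, rather than simply ranking raw RMSEs as done here.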
Abstract:
This thesis comprises three essays analysing the term structure of interest rates using different data sets and models. Chapter 1 proposes a parametric interest rate model that allows for segmentation and local shocks in the term structure. Using U.S. Treasury data, two versions of this segmented model are implemented. Based on a sequence of 142 forecasting experiments, the proposed models are compared to benchmarks and are found to perform better out of sample, especially for short maturities and for the 12-month forecasting horizon. Chapter 2 adds no-arbitrage restrictions to the estimation of a dynamic Gaussian polynomial term structure model for the Brazilian interest rate market. This essay proposes an important approximation for the time series of term structure risk factors, which allows the interest rate risk premium to be extracted without optimising a fully dynamic model. The methodology has the advantage of being easy to implement and yields a good approximation of the term structure risk premium, which can be used in different applications. Chapter 3 models the joint dynamics of nominal and real rates using a no-arbitrage affine term structure model with macroeconomic variables, in order to decompose the difference between nominal and real rates into the inflation risk premium and inflation expectations in the U.S. market. Versions with and without macroeconomic variables are implemented; the inflation risk premia obtained are small and stable over the period analysed, but differ between the two models.
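The segmented model of Chapter 1 is not specified in the abstract; the classic parametric term structure model in this family, and a common benchmark, is Nelson-Siegel, whose level/slope/curvature factors are easy to write down. Parameter values below are illustrative, not estimates from the thesis.

```python
import numpy as np

def nelson_siegel(tau, b0, b1, b2, lam):
    """Nelson-Siegel yield at maturity tau: level b0, slope b1, curvature b2."""
    x = lam * tau
    loading = (1.0 - np.exp(-x)) / x          # slope factor loading
    return b0 + b1 * loading + b2 * (loading - np.exp(-x))

maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])
curve = nelson_siegel(maturities, b0=0.05, b1=-0.02, b2=0.01, lam=0.5)
print(curve)
```

The short end of the curve approaches b0 + b1 and the long end approaches b0, which is what makes the three parameters interpretable as slope and level; segmented models relax exactly this smooth global structure.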
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn could raise real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate. Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2.
Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role, in line with the assumptions of the Scandinavian model, for any of the sectors. Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "Imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as the outcome of a "battle of mark-ups" between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two.
Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase of consumer prices by one per cent lifts private-sector nominal wages by 0.8 per cent. Furthermore, an increase of private-sector nominal wages by one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" 16 per cent devaluation of the Swedish krona in 1982:4, as well as the move to a floating krona and its substantial depreciation at that time, affected the determination of import prices.
Abstract:
Despite the impact of red blood cell (RBC) life-spans in disease areas such as diabetes or anemia of chronic kidney disease, there is no consensus on how best to describe the process quantitatively. Several models have been proposed to explain the elimination of RBCs: a random destruction process, a homogeneous life-span model, or a series of 4 transit compartments. The aim of this work was to explore the different models that have been proposed in the literature, and modifications to them. The impact of choosing the right model on the prediction of future outcomes in the above-mentioned areas was also investigated. Data from both indirect (clinical data) and direct (biotin-labeled data) life-span measurement methods were analyzed using non-linear mixed-effects models. The analysis showed that: (1) predictions from non-steady-state data depend on the RBC model chosen; (2) the transit compartment model, which allows for variation in life-span across the RBC population, describes RBC survival data better than the random destruction or homogeneous life-span models; and (3) additionally incorporating random destruction patterns, although improving the description of the RBC survival data, does not appear to provide a marked improvement when describing clinical data.
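The competing hypotheses map onto simple life-span distributions: random destruction implies a constant hazard (exponential life-spans), the homogeneous model a fixed life-span, and a chain of n transit compartments an Erlang (gamma) distribution in between. A simulation sketch; the 120-day mean is a typical textbook figure, not a value from this work:

```python
import numpy as np

rng = np.random.default_rng(7)
mean_life, n_cells, n_comp = 120.0, 200_000, 4

# Random destruction: constant hazard -> exponential life-spans.
random_destruction = rng.exponential(mean_life, size=n_cells)

# Homogeneous life-span: every cell lives exactly mean_life days.
homogeneous = np.full(n_cells, mean_life)

# 4 transit compartments: sum of 4 exponential stages -> Erlang
# distribution with the same mean but intermediate spread.
transit = rng.gamma(shape=n_comp, scale=mean_life / n_comp, size=n_cells)

for name, life in [("random", random_destruction),
                   ("fixed", homogeneous),
                   ("transit", transit)]:
    print(name, life.mean(), life.std())
```

All three share the same mean life-span but differ in spread, which is why they fit survival-curve data so differently while being hard to distinguish from steady-state clinical data alone.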
Abstract:
This paper reveals the characteristics of the ITC's decisions on countervailing duties, which have seldom been studied. The empirical evidence, based on time series data, shows that there is a long-run equilibrium relationship between affirmative countervailing decisions and macroeconomic variables such as economic growth rates and import penetration ratios. The error correction models show a unidirectional causality from affirmative countervailing decisions to slower economic growth.
Abstract:
On Wednesday 11 May 2011 at 6:47 pm (local time), a magnitude 5.1 Mw earthquake occurred 6 km northeast of Lorca at a depth of around 5 km. As a consequence of the shallow depth and the small epicentral distance, significant damage was produced in several masonry constructions, and one of them even collapsed. Pieces of the facades of several buildings fell onto the sidewalk, which was one of the causes of the deaths of a total of 9 people. The objective of this paper is to describe and analyze the failure patterns observed in reinforced concrete frame buildings with masonry infill walls ranging from 3 to 8 floors in height. Both structural and non-structural masonry walls suffered significant damage, which led to redistributions of forces, in some cases causing the failure of columns. The importance of the interaction between the structural frames and the infill panels is analyzed by means of non-linear finite element models. The resulting load levels are compared with the member capacities, and the changes in mechanical properties during the seismic event are described and discussed. In the light of the results obtained, the observed failure patterns are explained. Some comments are made concerning the adequacy of the numerical models usually used for seismic analysis during the design phase.
Abstract:
This paper analyses the relationship between the evolution of demand in the Spanish economy as a whole and that of the main components of tourism expenditure, as well as the degree of dependence on the economic performance of the main countries sending tourists to Spain. To investigate the existence of a relationship between these variables, the main cointegration techniques are explained; where such a link is detected, they allow error correction models to be estimated. Specifically, the paper evaluates the behaviour of these variables through unit root tests, in order to develop models capable of detecting these relationships more precisely.
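The unit root tests mentioned reduce to a Dickey-Fuller style regression of Δy_t on y_{t-1}: a strongly negative t-statistic points to stationarity, and the same regression applied to residuals underpins cointegration testing. A minimal sketch (without the tabulated Dickey-Fuller critical values, which a real test needs):

```python
import numpy as np

def df_tstat(y):
    """t-statistic on y[t-1] in the regression dy[t] = a + rho*y[t-1] + e."""
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones(len(ylag)), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(8)
n = 500
random_walk = np.cumsum(rng.normal(size=n))     # unit root: t-stat near 0
ar1 = np.empty(n)
ar1[0] = 0.0
for t in range(1, n):
    ar1[t] = 0.5 * ar1[t - 1] + rng.normal()    # stationary: t-stat << 0

print(df_tstat(random_walk), df_tstat(ar1))
```

Only when both tourism expenditure and aggregate demand are found to be integrated, and a stationary combination exists, is the error correction model of the kind described above well specified.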
Abstract:
Formulas based on the theory of elasticity are widely used to compute foundation settlements, since virtually all geotechnical codes recommend their use. However, these methods do not cover all geotechnically possible situations, as geological conditions are frequently complex. This paper analyses the influence of the presence of an inclined rigid layer on the elastic settlements of a shallow foundation. To this end, 273 non-linear three-dimensional finite element models were solved, varying the key parameters of the problem: the inclination and depth of the rigid layer and the rigidity of the foundation. Finally, a statistical analysis of the model results was performed, and a formula is proposed that can be used in settlement calculations by elastic methods to take into account the presence of an inclined rigid layer at depth.
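The elastic settlement formulas referred to have the general form s = q·B·(1 − ν²)·I/E, where the influence factor I is what a correction for an inclined rigid layer would modify. A sketch with illustrative values; in particular, I = 0.9 is an assumption for illustration, not a value from this paper.

```python
# Elastic settlement of a shallow foundation: s = q * B * (1 - nu^2) * I / E
q = 100.0     # net bearing pressure (kPa)
B = 2.0       # foundation width (m)
nu = 0.3      # Poisson's ratio of the soil
E = 20000.0   # Young's modulus of the soil (kPa)
I = 0.9       # influence factor (illustrative; depends on geometry and rigidity)

s = q * B * (1.0 - nu ** 2) * I / E   # settlement in metres
print(round(s * 1000, 2), "mm")
```

The paper's proposed formula effectively supplies an adjusted influence factor when a rigid layer is present at depth and inclined.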
Abstract:
Optical data communication systems are prone to a variety of processes that modify the transmitted signal and introduce errors in distinguishing 1s from 0s. This is a difficult, and commercially important, problem to solve. Errors must be detected and corrected at high speed, and the classifier must be very accurate; ideally it should also be tunable to the characteristics of individual communication links. We show that simple single-layer neural networks may be used to address these problems, and examine how different input representations affect the accuracy of bit error correction. Our results lead us to conclude that a system based on these principles can perform at least as well as an existing non-trainable error correction system, whilst being tunable to suit the individual characteristics of different communication links.
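A single-layer network of the kind described amounts to a logistic unit trained on received signal samples. A minimal sketch on synthetic noisy bits; the real system's input representations and link characteristics are not modelled here, and all levels and noise figures are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 4000

# Received samples: bit 0 -> level 0.0, bit 1 -> level 1.0, plus channel noise.
bits = rng.integers(0, 2, size=n)
x = bits + rng.normal(scale=0.2, size=n)

# Single-layer network: one logistic unit trained by gradient descent
# on the cross-entropy loss.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))   # sigmoid output
    grad_w = np.mean((p - bits) * x)          # cross-entropy gradient wrt w
    grad_b = np.mean(p - bits)                # cross-entropy gradient wrt b
    w -= lr * grad_w
    b -= lr * grad_b

pred = (1.0 / (1.0 + np.exp(-(w * x + b))) > 0.5).astype(int)
accuracy = np.mean(pred == bits)
print(accuracy)
```

Because the unit's weights are learned from the received samples, the decision threshold adapts to the link, which is the tunability property the abstract emphasises; richer input representations would simply add more features to `x`.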