876 results for dynamic factor models
Abstract:
The predominant knowledge-based approach to automated model construction, compositional modelling, employs a set of models of particular functional components. Its inference mechanism takes a scenario describing the constituent interacting components of a system and translates it into a useful mathematical model. This paper presents a novel compositional modelling approach aimed at building model repositories. It furthers the field in two respects. Firstly, it expands the application domain of compositional modelling to systems that cannot easily be described in terms of interacting functional components, such as ecological systems. Secondly, it enables the incorporation of user preferences into the model selection process. These features are achieved by casting the compositional modelling problem as an activity-based dynamic preference constraint satisfaction problem, where the dynamic constraints describe the restrictions imposed on the composition of partial models and the preferences correspond to those of the user of the automated modeller. In addition, the preference levels are represented by symbolic values that differ in orders of magnitude.
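To make the selection mechanism concrete, here is a minimal Python sketch of choosing partial models under a compatibility constraint with order-of-magnitude preference levels. All fragment names, the constraint, and the preference values are illustrative assumptions, not taken from the paper:

# Hypothetical sketch of model-fragment selection as a preference CSP.
# All names and numbers are illustrative, not from the paper.
from itertools import product

# Candidate partial models per phenomenon (the CSP variables).
fragments = {
    "growth":    ["logistic", "exponential"],
    "predation": ["lotka_volterra", "none"],
}

# Compatibility constraint: predation requires a bounded growth model.
def consistent(assignment):
    return not (assignment["predation"] == "lotka_volterra"
                and assignment["growth"] == "exponential")

# Order-of-magnitude preference levels (larger = more preferred).
preference = {"logistic": 100, "exponential": 10,
              "lotka_volterra": 100, "none": 1}

best = max(
    (dict(zip(fragments, choice)) for choice in product(*fragments.values())),
    key=lambda a: sum(preference[v] for v in a.values()) if consistent(a) else -1,
)
print(best)  # -> {'growth': 'logistic', 'predation': 'lotka_volterra'}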
Abstract:
In this paper, we test a version of the conditional CAPM with respect to a local market portfolio, proxied by the Brazilian stock index, over the period 1976-1992. We also test a conditional APT model that uses the difference between the thirty-day rate (CDB) and the overnight rate as a second factor, in addition to the market portfolio, in order to capture the large inflation risk present during this period. The conditional CAPM and APT models are estimated by the Generalized Method of Moments (GMM) and tested on a set of size portfolios created from individual securities traded on the Brazilian markets. The inclusion of this second factor proves to be important for the appropriate pricing of the portfolios.
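As a rough illustration of the two-factor specification (market plus interest-rate spread), the following Python sketch estimates the factor loadings of one portfolio; with instruments z_t = (1, f_t)', the exactly identified GMM system reduces to this time-series regression. The data are simulated placeholders, not the paper's:

# Illustrative two-factor loading estimation; simulated data only.
import numpy as np

rng = np.random.default_rng(0)
T = 300
mkt = rng.normal(0.01, 0.05, T)      # market excess return
spread = rng.normal(0.0, 0.02, T)    # CDB-overnight spread (inflation-risk proxy)
port = 0.003 + 1.1 * mkt + 0.5 * spread + rng.normal(0, 0.03, T)

X = np.column_stack([np.ones(T), mkt, spread])
alpha, b_mkt, b_spread = np.linalg.lstsq(X, port, rcond=None)[0]
print(f"alpha={alpha:.4f}, beta_mkt={b_mkt:.2f}, beta_spread={b_spread:.2f}")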
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. Dynamic term structure models, in turn, equipped with stronger economic structure, have been adopted mainly to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage affects forecasting. We construct cross-section (allowing arbitrages) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve smaller overall biases and Root Mean Square Errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of bond risk premium.
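The evaluation step can be summarized in a few lines. The Python sketch below computes the bias and Root Mean Square Error of two competing forecast series against realized yields; all numbers are invented placeholders, not the paper's data or models:

# Forecast-evaluation sketch only; inputs are placeholders.
import numpy as np

def bias_rmse(forecast, realized):
    err = forecast - realized
    return err.mean(), np.sqrt((err ** 2).mean())

realized  = np.array([4.10, 4.25, 4.05, 3.90])   # realized yields (%)
cross_sec = np.array([4.30, 4.40, 4.30, 4.10])   # cross-section version
no_arb    = np.array([4.15, 4.30, 4.00, 3.95])   # arbitrage-free version

for name, f in [("cross-section", cross_sec), ("no-arbitrage", no_arb)]:
    b, r = bias_rmse(f, realized)
    print(f"{name}: bias={b:+.3f}, RMSE={r:.3f}")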
Abstract:
Several works in the shopping-time and human-capital literatures, due to the nonconcavity of the underlying Hamiltonian, use first-order conditions in dynamic optimization to characterize necessity, but not sufficiency, in intertemporal problems. In this work I choose one paper from each of these two areas and show that optimality can be characterized by means of a simple application of Arrow's (1968) sufficiency theorem.
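For reference, a standard textbook statement of the sufficiency condition being invoked (notation ours, not the paper's): for the problem of maximizing \int_0^T f(x,u,t)\,dt subject to \dot{x} = g(x,u,t), define the Hamiltonian and its maximized version

H(x,u,\lambda,t) = f(x,u,t) + \lambda\, g(x,u,t), \qquad H^0(x,\lambda,t) = \max_u H(x,u,\lambda,t).

If a candidate path satisfies the first-order necessary conditions and H^0(x,\lambda(t),t) is concave in x for every t, the candidate is optimal, even when H itself is not concave in (x,u).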
Abstract:
This paper confronts the Capital Asset Pricing Model (CAPM) and the 3-factor Fama-French (FF) model using both Brazilian and US stock market data for the same sample period (1999-2007). The US data serve only as a benchmark for comparative purposes. We use two competing econometric methods, the Generalized Method of Moments (GMM) of Hansen (1982) and the Iterative Nonlinear Seemingly Unrelated Regression Estimation (ITNLSUR) of Burmeister and McElroy (1988). Both methods nest other options based on the procedure of Fama and MacBeth (1973). The estimations show that the FF model fits the Brazilian data better than the CAPM, although it is imprecise compared with its US analog. We argue that this is a consequence of the absence of clear-cut anomalies in Brazilian data, especially those related to firm size. The tests of the efficiency of the models - nullity of intercepts and fit of the cross-sectional regressions - presented mixed conclusions. The intercept tests failed to reject the CAPM when Brazilian value-premium-sorted portfolios were used, contrasting with US data, a very well documented conclusion. The ITNLSUR estimates an economically reasonable and statistically significant market risk premium for Brazil of around 6.5% per year without resorting to any particular data set aggregation. However, we could not find the same for the US data over the identical period, or even using a larger data set.
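For intuition about the cross-sectional step that both estimators nest, here is a two-pass, Fama-MacBeth-style sketch in Python: time-series betas first, then a regression of average returns on those betas to recover a market risk premium. All data are simulated placeholders, not the Brazilian or US samples:

# Two-pass cross-sectional sketch; simulated data only.
import numpy as np

rng = np.random.default_rng(1)
T, N = 240, 10
mkt = rng.normal(0.005, 0.05, T)
betas_true = np.linspace(0.6, 1.4, N)
rets = 0.005 * betas_true + mkt[:, None] * betas_true + rng.normal(0, 0.04, (T, N))

# Pass 1: time-series betas for each portfolio.
X = np.column_stack([np.ones(T), mkt])
betas = np.linalg.lstsq(X, rets, rcond=None)[0][1]
# Pass 2: cross-sectional regression of average returns on betas.
Z = np.column_stack([np.ones(N), betas])
intercept, premium = np.linalg.lstsq(Z, rets.mean(axis=0), rcond=None)[0]
print(f"monthly premium={premium:.4f} (~{12 * premium:.1%} per year)")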
Abstract:
This dissertation proposes a bivariate Markov-switching dynamic conditional correlation model for estimating the optimal hedge ratio between spot and futures contracts. It accounts for the cointegration between the series and captures the leverage effect in the return equation. The model is applied to daily futures and spot prices of the Bovespa Index and the R$/US$ exchange rate. The results, in terms of variance reduction and utility, show that the bivariate Markov-switching model outperforms strategies based on ordinary least squares and error correction models.
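The static benchmark such models are measured against is the minimum-variance hedge ratio h* = Cov(spot, futures) / Var(futures), i.e. the OLS slope of spot returns on futures returns. A minimal Python sketch with simulated placeholder returns:

# Static minimum-variance hedge ratio; simulated data only.
import numpy as np

rng = np.random.default_rng(2)
fut = rng.normal(0, 0.015, 500)
spot = 0.95 * fut + rng.normal(0, 0.004, 500)

h_star = np.cov(spot, fut)[0, 1] / np.var(fut, ddof=1)
print(f"static hedge ratio: {h_star:.3f}")  # close to 0.95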
Abstract:
This doctoral dissertation analyzes two novels by the American novelist Robert Coover as examples of hypertextual writing on the book-bound page, as tokens of hyperfiction. The complexity displayed in the novels, John's Wife and The Adventures of Lucky Pierre, integrates the cultural elements that characterize the contemporary condition of capitalism and the technologized practices that have fostered a different subjectivity evidenced in hypertextual writing and reading: the posthuman subjectivity. The models that account for the complexity of each novel are drawn from the concept of strange attractors in Chaos Theory and from the concept of the rhizome in Nomadology. The transformations the characters undergo in the degree of their corporeality set the plane on which to discuss turbulence and posthumanity. The notions of dynamic patterns and strange attractors, along with the concepts of the Body without Organs and the Rhizome, are interpreted, leading to a revision of narratology and to analytical categories appropriate to the study of the novels. The reading exercised throughout this dissertation enacts Daniel Punday's corporeal reading. The changes in the characters' degree of materiality are associated with the stages of order, turbulence and chaos in the story, bearing on the constitution of subjectivity within and along the reading process. Coover's inscription of planes of consistency to counter linearity and accommodate hypertextual features in paper-supported narratives describes the characters' trajectories as rhizomatic. The study led to the conclusion that narrative today stands more as a regime in a rhizomatic relation with other regimes in cultural practice than as an exclusively literary form and genre. Besides this, posthuman subjectivity emerges as a class identity that holds hypertextual novels as its literary form of choice.
Abstract:
We show that Judd's (1982) method can be applied to any finite system, contrary to what he claimed in 1987. An example shows how to employ the technique to study monetary models in the presence of capital accumulation.
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka (1976). For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
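The core IV idea, stripped of the optimal-instrument construction, fits in a few lines of Python: a regressor correlated with the error is instrumented by a variable (here a stand-in for a lagged endogenous variable) that is correlated with the regressor but not with the error. The data-generating process below is an illustrative assumption:

# Simple IV vs. OLS comparison; illustrative DGP only.
import numpy as np

rng = np.random.default_rng(3)
T = 500
z = rng.normal(size=T)            # stand-in for a lagged endogenous IV
e = rng.normal(size=T)
x = 0.8 * z + 0.5 * e             # regressor correlated with the error
y = 1.0 * x + e

beta_ols = (x @ y) / (x @ x)                      # biased upward
beta_iv  = (z @ y) / (z @ x)                      # consistent
print(f"OLS={beta_ols:.2f}, IV={beta_iv:.2f}")    # IV near 1.0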
Abstract:
In price-competition models, a positive consumer search cost alone does not generate an equilibrium with price dispersion. Dynamic switching-cost models, by contrast, consistently generate this phenomenon, which is well documented for retail prices. Although both literatures are vast, few models have attempted to combine the two frictions in a single model. This paper presents a dynamic model of price competition in which identical consumers face both search costs and switching costs. The equilibrium generates price dispersion. Moreover, since consumers must commit to a fixed sample of firms before prices are set, only two prices are considered before each purchase. This result is independent of the size of the consumer's individual search cost.
Abstract:
Given the changes of recent years, in which the economic paradigm has shifted from industrial to knowledge-based, we face a new scenario in which the individual's intellectual capital is perceived as a factor of fundamental importance for the development and growth of the organization. For this evolution to occur, however, the individual's tacit knowledge must be disseminated and shared with the other members of the organization. Companies' attention then turns to devising strategies that help improve their processes. It is also necessary to manage this whole dynamic of knowledge construction and development adequately and effectively, thus fostering the emergence of new values and of competitive advantage. Numerous models to support the organizational learning process have been developed by various authors and scholars; among them we highlight the lessons-learned system, which is built from experiences, positive or negative, lived within a context, ordered by its own cultural patterns, with real and significant impact. Based on processes and procedures established by the coordination of one of the products offered by FGV and its distribution network, this work aims to analyze, in light of knowledge management theory and, more specifically, of the management of lessons learned, how knowledge management is being carried out in the Melhores Práticas ("Best Practices") project created by that coordination. It also seeks to determine whether the acquisition, development, and dissemination phases in this setting are being carried out effectively, and whether the results achieved can serve as a basis for assessing the effective sharing of knowledge.
Abstract:
Our main goal is to investigate the question of which interest-rate option valuation models are better suited to support the management of interest-rate risk. We use the German market to test seven spot-rate and forward-rate models with one and two factors on interest-rate warrants for the period from 1990 to 1993. We identify a one-factor forward-rate model and two spot-rate models with two factors that are not significantly outperformed by any of the other four models. Further rankings are possible if additional criteria are applied.
Abstract:
There is strong empirical evidence that risk premia in long-term interest rates are time-varying. These risk premia critically depend on interest rate volatility, yet existing research has not examined the impact of time-varying volatility on excess returns for long-term bonds. To address this issue, we incorporate interest rate option prices, which are very sensitive to interest rate volatility, into a dynamic model for the term structure of interest rates. We estimate three-factor affine term structure models using both swap rates and interest rate cap prices. When we incorporate option prices, the model better captures interest rate volatility and is better able to predict excess returns for long-term swaps over short-term swaps, both in- and out-of-sample. Our results indicate that interest rate options contain valuable information about risk premia and interest rate dynamics that cannot be extracted from interest rates alone.
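For orientation, the generic form of a three-factor affine model of the kind estimated here (the paper's exact parameterization may differ): the short rate is r_t = \delta_0 + \delta_1' X_t, with state dynamics dX_t = \mathcal{K}(\theta - X_t)\,dt + \Sigma \sqrt{S_t}\,dW_t, and zero-coupon bond prices are exponential-affine, P(t,\tau) = \exp\big(A(\tau) + B(\tau)' X_t\big), where A and B solve Riccati-type ordinary differential equations. Cap prices then help pin down the volatility parameters that bond yields alone identify poorly.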
Abstract:
The implications of technical change that directly alters factor shares are examined. Such change can lower the income of some factors of production even when it raises total output, thus offering a possible explanation for episodes of social conflict such as the Luddite uprisings in 19th-century England and the recent divergence in the U.S. between wages for skilled and unskilled labor. It also suggests an explanation for why underdeveloped countries do not adopt the latest technology but continue to use outmoded production methods. Total factor productivity is shown to be a misleading measure of technical progress. Share-altering technical change calls into question the plausibility of a wide class of endogenous growth models.
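A textbook illustration of the mechanism (ours, not the paper's model): with Cobb-Douglas production Y = A K^{\alpha} L^{1-\alpha}, labor's share is 1-\alpha and the competitive wage is w = (1-\alpha)\,Y/L. Take K = L = 1, A = 1, \alpha = 0.3, so Y = 1 and w = 0.7. A share-altering innovation that moves \alpha to 0.5 and A to 1.2 raises output to Y = 1.2 yet lowers the wage to w = 0.5 \times 1.2 = 0.6: total output grows by 20% while labor income falls.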
Abstract:
Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models, which incorporate the dynamic structure of volatility and are capable of forecasting the future behavior of risk, should perform better than constant, rolling-window volatility models. For the same asset, the model that is 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms the others over a class of re-sampled time-series data. The test is based on re-sampling the data using the stationary bootstrap. For each re-sample we identify the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk management measure as comparison criteria. No model consistently outperforms the benchmark.
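A compressed Python sketch of the comparison loop: stationary-bootstrap resamples of a return series, with rolling-window and EWMA variance forecasts scored by the Gaussian quasi-likelihood (QLIKE) loss. It illustrates the resampling idea only and is not White's full reality check statistic; the return series and all tuning constants are placeholders:

# Stationary bootstrap + volatility-forecast comparison; placeholder data.
import numpy as np

rng = np.random.default_rng(4)
r = rng.standard_t(df=5, size=1000) * 0.01       # placeholder daily returns

def stationary_bootstrap(x, mean_block=20):
    # Politis-Romano: geometric block lengths, wrap-around indexing.
    n = len(x)
    out = np.empty(n)
    i = rng.integers(n)
    for t in range(n):
        out[t] = x[i]
        i = rng.integers(n) if rng.random() < 1 / mean_block else (i + 1) % n
    return out

def qlike(r2, h):
    # Gaussian quasi-likelihood loss; smaller is better.
    return np.mean(np.log(h) + r2 / h)

def losses(x, win=50, lam=0.94):
    roll = np.array([x[t - win:t].var() for t in range(win, len(x))])
    ewma = np.empty(len(x))
    ewma[0] = x[:win].var()
    for t in range(1, len(x)):
        ewma[t] = lam * ewma[t - 1] + (1 - lam) * x[t - 1] ** 2
    r2 = x[win:] ** 2
    return qlike(r2, roll), qlike(r2, ewma[win:])

wins = 0
for _ in range(100):
    l_roll, l_ewma = losses(stationary_bootstrap(r))
    wins += l_ewma < l_roll
print(f"EWMA beat the rolling window in {wins}/100 resamples")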