918 results for dynamic factor models
Abstract:
A new algorithm, the parameterized expectations approach (PEA), for solving dynamic stochastic models under rational expectations is developed, and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. Also, the algorithm works from the Euler equations, so that the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables cause the computation costs not to go up exponentially when the number of state variables or exogenous shocks in the economy increases. As an application, we analyze an asset pricing model with endogenous production. We analyze its implications for the time dependence of the volatility of stock returns and for the term structure of interest rates. We argue that this model can generate hump-shaped term structures.
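The PEA loop (parameterize the conditional expectation, simulate, regress the realized Euler-equation integrand on the states, iterate to a fixed point) can be sketched on a textbook model. This is a minimal illustration on the Brock-Mirman growth model with log utility and full depreciation, not the paper's asset pricing application; all parameter values are illustrative, and the closed-form policy c = (1 - alpha*beta)*theta*k^alpha is used only as a benchmark.

```python
import numpy as np

# Minimal PEA sketch on Brock-Mirman (log utility, full depreciation).
# Euler equation: 1/c_t = beta * E_t[ alpha * theta_{t+1} * k_{t+1}^(alpha-1) / c_{t+1} ].
# The expectation is parameterized as exp(psi . (1, log k_t, log theta_t));
# Monte Carlo integration: simulate one long path, no grid on the states.
rng = np.random.default_rng(0)
alpha, beta, rho, sigma = 0.33, 0.95, 0.90, 0.02
T, n_iter, damp = 2000, 200, 0.3

eps = rng.normal(0.0, sigma, T + 1)        # same shocks every iteration
psi = np.array([0.5, 0.0, 0.0])            # initial guess for the rule

for _ in range(n_iter):
    ks = np.empty(T + 1); lths = np.empty(T + 1); cs = np.empty(T + 1)
    k, lth = 0.2, 0.0
    for t in range(T + 1):
        y = np.exp(lth) * k ** alpha
        E = np.exp(psi[0] + psi[1] * np.log(k) + psi[2] * lth)
        c = min(max(1.0 / (beta * E), 0.01 * y), 0.99 * y)  # keep c feasible
        ks[t], lths[t], cs[t] = k, lth, c
        k = y - c
        lth = rho * lth + eps[t]
    # realized Euler-equation integrand, dated t+1
    e = alpha * np.exp(lths[1:]) * ks[1:] ** (alpha - 1.0) / cs[1:]
    X = np.column_stack([np.ones(T), np.log(ks[:-1]), lths[:-1]])
    psi_hat, *_ = np.linalg.lstsq(X, np.log(e), rcond=None)
    psi = (1 - damp) * psi + damp * psi_hat  # damped fixed-point update

c_pea = 1.0 / (beta * np.exp(psi[0]))       # policy at k = 1, theta = 1
c_true = 1.0 - alpha * beta                 # closed-form benchmark
print(psi, c_pea, c_true)
```

For this model the expectation is exactly log-linear in the states, so the damped iteration converges to the closed-form policy; in richer models the exponentiated-polynomial family only approximates it, which is where the "arbitrarily well" claim comes in.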
Abstract:
This paper presents several applications to interest rate risk management based on a two-factor continuous-time model of the term structure of interest rates previously presented in Moreno (1996). This model assumes that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Several new measures of "generalized duration" are presented and applied in different situations in order to manage market risk and yield curve risk. By means of these measures, we are able to compute the hedging ratios that allow us to immunize a bond portfolio by means of options on bonds. Focusing on the hedging problem, it is shown that these new measures allow us to immunize a bond portfolio against changes (parallel and/or in the slope) in the yield curve. Finally, a proposed solution to the limitations of conventional duration based on these new measures is presented and illustrated numerically.
Abstract:
This paper presents a two-factor model of the term structure of interest rates. We assume that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Assuming that both factors follow a joint Ornstein-Uhlenbeck process, a general bond pricing equation is derived. We obtain a closed-form expression for bond prices and examine its implications for the term structure of interest rates. We also derive a closed-form solution for interest rate derivative prices. This expression is applied to price European options on discount bonds and more complex types of options. Finally, empirical evidence of the model's performance is presented.
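The factor setup described here can be illustrated numerically. The sketch below prices a zero-coupon bond by Monte Carlo under a joint Ornstein-Uhlenbeck model for the long rate L and the spread s, with the short rate r = L - s; it is not the paper's closed-form solution, all parameter values are invented, and the dynamics are assumed to be stated directly under the pricing measure (market prices of risk are ignored).

```python
import numpy as np

# Monte Carlo zero-coupon bond price under a two-factor Gaussian model:
# long rate L and spread s follow correlated Ornstein-Uhlenbeck processes,
# and the short rate is r = L - s. Parameters are illustrative.
rng = np.random.default_rng(1)
kL, thL, volL = 0.2, 0.06, 0.010   # mean reversion, level, vol of L
kS, thS, volS = 0.5, 0.01, 0.008   # ... of the spread s
corr = -0.3                        # instantaneous shock correlation
dt, n_paths = 1 / 252, 10000

def bond_price(maturity, L0=0.06, s0=0.01):
    n_steps = int(round(maturity / dt))
    L = np.full(n_paths, L0)
    s = np.full(n_paths, s0)
    integral = np.zeros(n_paths)           # accumulates  int r dt  per path
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = corr * z1 + np.sqrt(1 - corr**2) * rng.standard_normal(n_paths)
        integral += (L - s) * dt           # short rate r = L - s
        L += kL * (thL - L) * dt + volL * np.sqrt(dt) * z1
        s += kS * (thS - s) * dt + volS * np.sqrt(dt) * z2
    return np.exp(-integral).mean()        # P(0, T) = E[exp(-int r dt)]

p1, p5 = bond_price(1.0), bond_price(5.0)
print(p1, p5)
```

Because the factors are jointly Gaussian, the model actually admits the closed-form bond prices the abstract refers to; the simulation is just the most transparent way to see the two-factor mechanics.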
Abstract:
The study of human motion, which relies on mathematical and computational models in general, and on multibody dynamic biomechanical models in particular, has become the subject of much recent research. A human body model can be applied to different physical exercises, and many important quantities, such as muscle forces, which are difficult to measure in practical experiments, can be obtained easily. In this work, a human skeletal lower-limb model consisting of three bodies is built using the flexible multibody dynamics simulation approach. The floating frame of reference formulation is used to account for the flexibility of the bones in the lower-limb model. The main reason for considering flexibility in the human bones is to measure the strains in the bone resulting from different physical exercises. It has been observed that bone under strain becomes stronger in order to cope with the exercise; bone strength, in turn, is considered an important factor in reducing bone fractures. The simulation approach and model developed in this work are used to measure the bone strains resulting from a raising-the-sole-of-the-foot exercise. The simulation results are compared with results available in the literature, and the comparison shows good agreement. This study sheds light on the importance of using the flexible multibody dynamic simulation approach to build human biomechanical models, which can be used to develop exercises that achieve optimal bone strength.
Abstract:
Understanding the dynamics of interest rates and the term structure has important implications for issues as diverse as real economic activity, monetary policy, pricing of interest rate derivative securities and public debt financing. Our paper follows a longstanding tradition of using factor models of interest rates but proposes a semi-parametric procedure to model interest rates.
Abstract:
The aim of this thesis is to extend bootstrap theory to panel data models. Panel data are obtained by observing several statistical units over several time periods. Their double dimension, individual and temporal, makes it possible to control for unobservable heterogeneity across individuals and across time periods, and hence to carry out richer studies than with time series or cross-sectional data. The advantage of the bootstrap is that it yields inference that is more accurate than classical asymptotic theory, or inference that would otherwise be impossible in the presence of nuisance parameters. The method consists of drawing random samples that resemble the sample under analysis as closely as possible. The statistical object of interest is estimated on each of these random samples, and the set of estimated values is used for inference. The literature contains some applications of the bootstrap to panel data without rigorous theoretical justification, or under strong assumptions. This thesis proposes a bootstrap method better suited to panel data. Its three chapters analyze its validity and application. The first chapter postulates a simple model with a single parameter and tackles the theoretical properties of the estimator of the mean. We show that the double resampling we propose, which accounts for both the individual and the temporal dimension, is valid in these models. Resampling only in the individual dimension is not valid in the presence of temporal heterogeneity, and resampling only in the temporal dimension is not valid in the presence of individual heterogeneity. The second chapter extends the first to the linear panel regression model.
Three types of regressors are considered: individual characteristics, temporal characteristics, and regressors that vary over both time and individuals. Using a two-way error-components model, the ordinary least squares estimator, and the residual bootstrap, we show that resampling in the individual dimension alone is valid for inference on the coefficients associated with regressors that vary only across individuals. Resampling in the temporal dimension alone is valid only for the subvector of parameters associated with regressors that vary only over time. Double resampling, in turn, is valid for inference on the full parameter vector. The third chapter re-examines the difference-in-differences exercise of Bertrand, Duflo and Mullainathan (2004). This estimator is commonly used in the literature to evaluate the impact of public policies. The empirical exercise uses panel data from the Current Population Survey on women's wages in the 50 states of the United States of America from 1979 to 1999. Pseudo-intervention variables are generated at the state level, and the tests are expected to conclude that these placebo policies have no effect on women's wages. Bertrand, Duflo and Mullainathan (2004) show that failing to account for heterogeneity and temporal dependence causes severe size distortions of tests when evaluating the impact of public policies with panel data. One of the recommended solutions is the bootstrap. The double resampling method developed in this thesis corrects the test-size problem and thus allows the impact of public policies to be evaluated correctly.
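The double-resampling idea is simple to sketch: draw individuals and time periods independently, then use the cross-product of the two index draws. Below is a minimal illustration for the mean of a panel with both individual and time effects; the data-generating process and all sizes are invented for the example, not taken from the thesis.

```python
import numpy as np

# Double-resampling bootstrap sketch for panel data: resample the
# individual and the time dimensions independently, then take the
# cross-product sample. Target: the overall mean of y[i, t].
rng = np.random.default_rng(2)
N, T, B = 50, 30, 999
alpha_i = rng.normal(0, 1, N)            # individual heterogeneity
gamma_t = rng.normal(0, 1, T)            # temporal heterogeneity
y = alpha_i[:, None] + gamma_t[None, :] + rng.normal(0, 1, (N, T))

theta_hat = y.mean()
boot = np.empty(B)
for b in range(B):
    ii = rng.integers(0, N, N)           # resample individuals
    tt = rng.integers(0, T, T)           # resample time periods
    boot[b] = y[np.ix_(ii, tt)].mean()   # cross-product resample

se = boot.std(ddof=1)
ci = (theta_hat - 1.96 * se, theta_hat + 1.96 * se)
print(theta_hat, se, ci)
```

Resampling `ii` alone would miss the variability coming from `gamma_t`, and resampling `tt` alone would miss that coming from `alpha_i`, which is the invalidity result the first chapter formalizes.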
Abstract:
This paper considers an overlapping generations model in which capital investment is financed in a credit market with adverse selection. Lenders' inability to commit ex-ante not to bail out ex-post, together with a wealthy position of entrepreneurs, gives rise to the soft budget constraint syndrome, i.e. the absence of liquidation of poorly performing firms on a regular basis. The problem arises endogenously as a result of the interaction between the economic behavior of agents, without relying on political economy explanations. We find that the problem is more binding along the business cycle, providing an explanation for creditors' leniency during booms in some Latin American countries in the late seventies and early nineties.
Abstract:
In this paper we review models of volatility for a group of five Latin American countries, motivated mainly by the recent periods of financial turbulence. Our results, based on high-frequency data, suggest that dynamic multivariate models are more powerful for studying the volatilities of asset returns than constant conditional correlation models. For the group of countries included, we find that the domestic volatilities of asset markets have been increasing, but the co-volatility of the region is still moderate.
Abstract:
Financial integration has been pursued aggressively across the globe over the last fifty years; however, there is no conclusive evidence on the diversification gains (or losses) of such efforts. These gains (or losses) are related to the degree of comovement and synchronization among increasingly integrated global markets. We quantify the degree of comovement within the integrated Latin American market (MILA). We use dynamic correlation models to quantify comovements across securities, as well as a direct integration measure. Our results show an increase in comovements when we look at the country indexes; however, the increase in the trend of correlation predates the institutional efforts to establish an integrated market in the region. On the other hand, when we look at sector indexes and an integration measure, we find a decrease in comovements among a representative sample of securities from the integrated market.
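Time-varying comovement of the kind measured here can be illustrated with an exponentially weighted (RiskMetrics-style) correlation, a simple stand-in for the dynamic conditional correlation models the paper uses; the two return series and the decay parameter below are simulated and illustrative.

```python
import numpy as np

# EWMA time-varying correlation between two simulated return series whose
# true correlation rises mid-sample, mimicking increasing comovement.
rng = np.random.default_rng(3)
T, lam = 500, 0.94
z = rng.standard_normal((T, 2))
rho_true = np.where(np.arange(T) < T // 2, 0.2, 0.7)
r1 = z[:, 0]
r2 = rho_true * z[:, 0] + np.sqrt(1 - rho_true**2) * z[:, 1]

s11 = s22 = 1.0
s12 = 0.0
corr = np.empty(T)
for t in range(T):
    # exponentially weighted second moments
    s11 = lam * s11 + (1 - lam) * r1[t] ** 2
    s22 = lam * s22 + (1 - lam) * r2[t] ** 2
    s12 = lam * s12 + (1 - lam) * r1[t] * r2[t]
    corr[t] = s12 / np.sqrt(s11 * s22)
print(corr[:3], corr[-3:])
```

The estimated correlation path tracks the jump from 0.2 to 0.7, which is the kind of trending comovement the country-index results describe; a full DCC model would add GARCH dynamics and estimated decay parameters.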
Abstract:
Inverse problems for dynamical system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three tier top-down approach where cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest a Tikhonov-Hebbian learning method as regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
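In a discretized setting, the Tikhonov-regularized Hebbian construction amounts to ridge regression: given snapshots of current states X and desired transformed states Y, the weight matrix is W = Y Xᵀ (X Xᵀ + λI)⁻¹. The sketch below is a finite-dimensional stand-in for the continuous kernel-construction problem in the Amari equation; sizes, data, and λ are illustrative.

```python
import numpy as np

# Tikhonov-Hebbian weight construction for a discretized field:
# solve W = Y X^T (X X^T + lambda I)^(-1), i.e. ridge regression of the
# desired states Y on the current states X. With fewer snapshots than
# units the problem is underdetermined (ill-posed), and lambda selects
# a stable solution.
rng = np.random.default_rng(4)
n, n_snap, lam = 40, 25, 1e-3          # 25 snapshots, 40 units: ill-posed
W_true = rng.normal(0, 1 / np.sqrt(n), (n, n))
X = rng.normal(0, 1, (n, n_snap))      # prescribed trajectory samples
Y = W_true @ X                         # desired transformed states

W = Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n))

# W reproduces the prescribed transitions even though W_true itself is
# not identified from only 25 snapshots.
err = np.linalg.norm(W @ X - Y) / np.linalg.norm(Y)
print(err)
```

The Hebbian flavor is visible in the numerator Y Xᵀ (an outer-product correlation of post- and pre-activity); the λI term is the Tikhonov regularization that makes the ill-posed inversion robust.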
Abstract:
Factor forecasting models are shown to deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
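The factor-versus-AR comparison follows the standard diffusion-index recipe: extract principal-component factors from a large panel, then forecast with an autoregression augmented by the lagged factor. The sketch below runs that horse race on simulated data (not the US series used in the paper); the data-generating process and sample split are invented for the example.

```python
import numpy as np

# Diffusion-index sketch: first principal component of a large panel as
# an estimated factor, used to augment an AR(1) one-step-ahead forecast.
rng = np.random.default_rng(5)
T, N, split = 200, 50, 150
f = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.normal()                 # common factor
    y[t] = 0.5 * y[t - 1] + 0.8 * f[t - 1] + rng.normal(0, 0.5)
loads = rng.normal(1, 0.5, N)
X = f[:, None] * loads[None, :] + rng.normal(0, 1, (T, N))  # large panel

Xs = (X - X.mean(0)) / X.std(0)                          # standardize
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
fhat = Xs @ Vt[0]                                        # first PC

def oos_mse(Z):
    """Out-of-sample MSE of a one-step forecast with predictors Z."""
    Z = np.column_stack([np.ones(T - 1), Z])
    tgt = y[1:]
    b, *_ = np.linalg.lstsq(Z[:split], tgt[:split], rcond=None)
    return np.mean((tgt[split:] - Z[split:] @ b) ** 2)

mse_ar = oos_mse(y[:-1])                                 # AR(1) benchmark
mse_fac = oos_mse(np.column_stack([y[:-1], fhat[:-1]]))  # factor-augmented
print(mse_ar, mse_fac)
```

With a factor that genuinely drives the target, the augmented forecast beats the AR benchmark out of sample; the paper's point is that in real US data the margin depends heavily on the variable, the horizon, and the estimation window.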
Abstract:
In this paper we construct common-factor portfolios using a novel linear transformation of standard factor models extracted from large data sets of asset returns. The simple transformation proposed here keeps the basic properties of the usual factor transformations, while attaching some new and interesting properties to them. Some theoretical advantages are shown to be present, and their practical importance is confirmed in two applications: the performance of common-factor portfolios is shown to be superior to that of asset returns and of factors commonly employed in the finance literature.
Abstract:
This paper analyzes the importance of common factors in the recent evolution of metal prices over the period 1995-2013. To that end, cointegrated VAR models and a Bayesian dynamic factor model are estimated. Given the effect of the financialization of commodities, the DFM can capture dynamic effects common to all commodities. In addition, panel data are used to exploit the full heterogeneity across commodities over the period of analysis. Our results show that the interest rate, the effective US dollar exchange rate, and consumption data have a permanent effect on commodity prices. We also find a common dynamic factor that is significant for most metal commodity prices and that has recently become more important in the evolution of commodity prices.
Abstract:
This paper constructs new business cycle indices for Argentina, Brazil, Chile, and Mexico based on common dynamic factors extracted from a comprehensive set of sectoral output, external trade, fiscal and financial variables. The analysis spans the 135 years since the insertion of these economies into the global economy in the 1870s. The constructed indices are used to derive a business cycle chronology for these countries and characterize a set of new stylized facts. In particular, we show that all four countries have historically displayed a striking combination of high business cycle volatility and persistence relative to advanced country benchmarks. Volatility changed considerably over time, however, being very high during the early formative decades through the Great Depression, and again during the 1970s and early 1980s, before declining sharply in three of the four countries. We also identify a sizeable common factor across the four economies, which variance decompositions ascribe mostly to foreign interest rates and shocks to the commodity terms of trade.
Abstract:
The approach proposed here explores the hierarchical nature of item-level data on price changes. On the one hand, price data are naturally organized around a regional structure, with variations observed in separate cities. On the other hand, the items that make up the natural structure of CPIs are also normally interpreted in terms of groups that have economic interpretations, such as tradables and non-tradables, energy-related items, raw foodstuffs, monitored prices, etc. The hierarchical dynamic factor model allows the estimation of multiple factors that are naturally interpreted as relating to each of these regional and economic levels.
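A crude two-level version of this idea can be sketched with principal components: the first PC of the full panel proxies the aggregate factor, and region-level factors are then extracted from each regional block of residuals. This is a PCA stand-in for the hierarchical dynamic factor model, not the model itself; the panel, region count, and item counts below are simulated and illustrative.

```python
import numpy as np

# Two-level factor sketch for item-level price-change data:
# level 1: aggregate factor (first PC of the whole panel);
# level 2: one regional factor per block of the residuals.
rng = np.random.default_rng(6)
T, n_regions, n_items = 120, 4, 10
g = rng.normal(0, 1, T)                         # aggregate factor
blocks = []
for r in range(n_regions):
    fr = rng.normal(0, 1, T)                    # regional factor
    lg = rng.normal(1, 0.3, n_items)            # loadings on g
    lr = rng.normal(1, 0.3, n_items)            # loadings on fr
    blocks.append(g[:, None] * lg + fr[:, None] * lr
                  + rng.normal(0, 1, (T, n_items)))
X = np.hstack(blocks)                           # T x (regions * items)

def first_pc(M):
    Ms = (M - M.mean(0)) / M.std(0)
    _, _, Vt = np.linalg.svd(Ms, full_matrices=False)
    return Ms @ Vt[0]

g_hat = first_pc(X)                             # level 1: aggregate
coef = np.linalg.lstsq(g_hat[:, None], X, rcond=None)[0].ravel()
resid = X - np.outer(g_hat, coef)               # project out the aggregate
regional = [first_pc(resid[:, r * n_items:(r + 1) * n_items])
            for r in range(n_regions)]          # level 2: per region
print(abs(np.corrcoef(g, g_hat)[0, 1]))
```

Because the aggregate factor loads on every series while each regional factor loads on only one block, the first PC of the full panel lines up with the aggregate factor; the hierarchical DFM estimates both levels jointly instead of sequentially.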