9 results for discrete-time assumption
in the Repositório Digital da Fundação Getúlio Vargas (FGV)
Abstract:
Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation; for discrete-time processes, however, he appears not to do so. This note derives the spectral density function of temporally aggregated long-memory discrete-time processes in light of the aliasing effect. The results differ from those in Chambers (1998) and are supported by a small simulation exercise. As a consequence, the order of integration may not be invariant to temporal aggregation, specifically when d is negative and the aggregation is of the stock type.
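For reference, the standard frequency-folding (aliasing) relation for stock aggregation, i.e. observing only every m-th value Y_\tau = X_{m\tau} of a discrete-time process X_t with spectral density f_X, is (in our notation, which need not match Chambers's or the note's):

f_Y(\lambda) = \frac{1}{m} \sum_{k=0}^{m-1} f_X\!\left(\frac{\lambda + 2\pi k}{m}\right), \qquad \lambda \in [-\pi, \pi].

For a long-memory process with f_X(\lambda) \sim c\,|\lambda|^{-2d} as \lambda \to 0, the aliased terms with k \neq 0 are what capture the aliasing effect the note emphasizes.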
Abstract:
Using McKenzie's taxonomy of optimal accumulation in the long run, we report a "uniform turnpike" theorem of the third kind in a model originally due to Robinson, Solow and Srinivasan (RSS), and further studied by Stiglitz. Our results are presented in the undiscounted, discrete-time setting emphasized in the recent work of Khan-Mitra, and they rely on the importance of strictly concave felicity functions or, alternatively, on the value of a "marginal rate of transformation", ξσ, from one period to the next not being unity. Our results, despite their specificity, contribute to the methodology of intertemporal optimization theory, as developed in economics by Ramsey, von Neumann and their followers.
Abstract:
In 1991 Gary S. Becker presented "A Note on Restaurant Pricing and Other Examples of Social Influences on Price", explaining why many successful restaurants, plays, sporting events, and other activities do not raise their prices even with persistent excess demand. The main reason is the discontinuity of stable demands, as explained in Becker's (1991) analysis. In the present paper we construct a discrete-time stochastic model of socially interacting consumers choosing between two establishments. With this model we show that the discontinuity of stable demands proposed by Gary S. Becker depends crucially on an additional factor: the dispersion of the consumers' intrinsic preferences for the establishments.
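As a purely illustrative sketch (not the authors' model), a mean-field, discrete-time simulation of this kind of binary choice with social interaction can be written as follows; the parameter names J (interaction strength) and sigma (dispersion of intrinsic preferences) are our own assumptions:

import numpy as np

def simulate_share(n_consumers=10_000, J=3.0, sigma=0.2, price_gap=0.0,
                   n_steps=200, seed=0):
    """Each period, consumer i chooses establishment A iff
    theta_i + J*(2*share_A - 1) - price_gap + noise_i > 0."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, sigma, n_consumers)   # intrinsic preferences, dispersion sigma
    share_a = 0.5                                 # initial market share of A
    for _ in range(n_steps):
        noise = rng.logistic(0.0, 1.0, n_consumers)
        share_a = np.mean(theta + J * (2.0 * share_a - 1.0) - price_gap + noise > 0.0)
    return share_a

# With small sigma the long-run share of A jumps discontinuously as the price gap
# crosses zero; with large sigma the response becomes smooth.
for sigma in (0.2, 3.0):
    shares = [simulate_share(sigma=sigma, price_gap=g, seed=1)
              for g in np.linspace(-1.0, 1.0, 9)]
    print(sigma, np.round(shares, 2))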
Abstract:
This work proposes a new discrete-time model for valuing the rebalancing discount in highway concession contracts, drawing on concepts from classical finance theory and real options theory. The model makes it possible to incorporate flexibilities arising from uncertainty in real-world situations, such as managerial decisions, behavioral biases and political components, which are commonly present in highway concession contracts. The results obtained, using the BR-262 highway as a case study, indicate that there is room for better regulatory intervention with respect to the rebalancing-discount mechanism, so as to provide better incentives to concessionaires.
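The abstract does not detail the valuation mechanics; as a minimal, hypothetical sketch of discrete-time real-options valuation of the kind referenced (a standard Cox-Ross-Rubinstein binomial lattice, not the thesis's actual model), one could write:

import numpy as np

def binomial_real_option(v0, strike, r, sigma, T, n_steps):
    """Value an American-style expansion option on a project worth V
    using a Cox-Ross-Rubinstein binomial lattice (discrete time)."""
    dt = T / n_steps
    u = np.exp(sigma * np.sqrt(dt))          # up factor
    d = 1.0 / u                              # down factor
    q = (np.exp(r * dt) - d) / (u - d)       # risk-neutral probability
    disc = np.exp(-r * dt)
    j = np.arange(n_steps + 1)
    values = v0 * u**j * d**(n_steps - j)    # terminal project values
    payoff = np.maximum(values - strike, 0.0)
    for step in range(n_steps - 1, -1, -1):  # backward induction with early exercise
        j = np.arange(step + 1)
        values = v0 * u**j * d**(step - j)
        payoff = np.maximum(disc * (q * payoff[1:] + (1 - q) * payoff[:-1]),
                            values - strike)
    return payoff[0]

print(binomial_real_option(v0=100.0, strike=100.0, r=0.05, sigma=0.3, T=5.0, n_steps=60))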
Abstract:
This Thesis is the result of my Master's Degree studies at the Graduate School of Economics, Getúlio Vargas Foundation, from January 2004 to August 2006. I am indebted to my Thesis Advisor, Professor Luiz Renato Lima, who introduced me to the world of Econometrics. In this Thesis, we study time-varying quantile processes and develop two applications, presented here as Part I and Part II. Each of these parts was turned into a paper, and both papers were submitted. Part I shows that asymmetric persistence induces ARCH effects, but the LM-ARCH test has power against it. On the other hand, the test for asymmetric dynamics proposed by Koenker and Xiao (2004) has correct size under the presence of ARCH errors. These results suggest that the LM-ARCH and the Koenker-Xiao tests may be used in applied research as complementary tools. In Part II, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with ARCH effects dominates other methods that require a distributional assumption. In particular, we show that the non-robust methodologies have a higher probability of predicting VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the least number of violations.
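As an illustration of the general idea (not the thesis's exact specification), a one-day 5% VaR can be estimated by quantile regression of returns on an ARCH-type volatility proxy such as the lagged absolute return; here is a minimal sketch using statsmodels with simulated data standing in for IBOVESPA returns:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
eps = rng.standard_normal(n)
h = np.empty(n)
h[0] = 1.0
for t in range(1, n):                        # volatility clustering, GARCH(1,1)-style
    h[t] = 0.1 + 0.8 * h[t - 1] + 0.15 * (np.sqrt(h[t - 1]) * eps[t - 1])**2
returns = np.sqrt(h) * eps

# 5% conditional quantile: q_0.05(r_t) = a + b * |r_{t-1}|
y = returns[1:]
X = sm.add_constant(np.abs(returns[:-1]))
fit = sm.QuantReg(y, X).fit(q=0.05)
var_forecast = fit.predict(X)                # conditional 5% quantile of returns
violations = np.mean(y < var_forecast)       # should be close to 0.05 if well calibrated
print(fit.params, violations)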
Abstract:
We argue that it is possible to adapt the approach of imposing restrictions on available plans through finitely effective debt constraints, introduced by Levine and Zame (1996), to encompass models with default and collateral. Along this line, we introduce in the setting of Araujo, Páscoa and Torres-Martínez (2002) and Páscoa and Seghir (2008) the concept of almost finite-time solvency. We show that the conditions imposed in these two papers to rule out Ponzi schemes implicitly restrict actions to be almost finite-time solvent. We define the notion of equilibrium with almost finite-time solvency and look for sufficient conditions for its existence. Under a mild assumption on default penalties, namely that agents are myopic with respect to default penalties, we prove that existence is guaranteed (and Ponzi schemes are ruled out) when actions are restricted to be almost finite-time solvent. The proof is very simple and intuitive. In particular, the main existence results in Araujo et al. (2002) and Páscoa and Seghir (2008) are simple corollaries of our existence result.
Abstract:
This paper investigates which properties money-demand functions have to satisfy to be consistent with multidimensional extensions of Lucas's (2000) versions of the Sidrauski (1967) and the shopping-time models. We also investigate how such classes of models relate to each other regarding the rationalization of money demands. We conclude that money-demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
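For concreteness, the two functional forms cited as counterexamples are conventionally written as follows (our notation, not necessarily the paper's: m is real money demand, r the nominal interest rate, \eta the interest-rate elasticity, \xi the semi-elasticity):

m(r) = A\,r^{-\eta} \quad (\text{log-log: } \ln m = \ln A - \eta \ln r), \qquad m(r) = B\,e^{-\xi r} \quad (\text{semi-log: } \ln m = \ln B - \xi r).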
Abstract:
This paper develops a framework to test whether discrete-valued, irregularly-spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling scheme in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data, but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether or not bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed in the sense that the Markov assumption is rejected for three out of the five stocks we have analyzed.
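A back-of-the-envelope version of such a check (a Spearman rank-correlation screen within price levels, which simplifies the paper's nonparametric test; variable names are hypothetical) would be:

import numpy as np
from scipy.stats import spearmanr

def conditional_independence_check(durations, prices):
    """Within each current price level, test association between the current and
    the previous price duration; under the subordinated-Markov null, durations
    should be conditionally independent given the current price."""
    durations = np.asarray(durations, dtype=float)
    prices = np.asarray(prices)
    results = {}
    for level in np.unique(prices[1:]):
        mask = prices[1:] == level
        curr = durations[1:][mask]
        prev = durations[:-1][mask]
        if curr.size >= 10:                   # skip sparsely observed price levels
            rho, pval = spearmanr(curr, prev)
            results[level] = (rho, pval)
    return results

# hypothetical usage with simulated data:
rng = np.random.default_rng(0)
prices = rng.choice([10.00, 10.01, 10.02], size=500)
durations = rng.exponential(scale=5.0, size=500)
print(conditional_independence_check(durations, prices))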
Abstract:
When estimating policy parameters, also known as treatment effects, the assignment-to-treatment mechanism almost always causes endogeneity and thus biases many estimates of these policy parameters. Additionally, heterogeneity in program impacts is more likely to be the norm than the exception for most social programs. In situations where these issues are present, estimation of the Marginal Treatment Effect (MTE) parameter makes use of an instrument to avoid assignment bias and, simultaneously, to account for effects that are heterogeneous across individuals. Although this parameter is point identified in the literature, the assumptions required for identification may be strong. Given that, we use weaker assumptions in order to partially identify the MTE, i.e. to establish a methodology for estimating bounds on the MTE, implementing it computationally and presenting results from Monte Carlo simulations. The partial identification we perform requires the MTE to be a monotone function of the propensity score, which is a reasonable assumption in several economic examples, and the simulation results show that it is possible to obtain informative bounds even in restricted cases where point identification is lost. Additionally, in situations where the estimated bounds are not informative and traditional point identification is lost, we suggest a more generic method to point estimate the MTE using the Moore-Penrose pseudoinverse matrix, achieving better results than traditional methods.
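To fix ideas (this is a generic local-IV sketch, not the dissertation's estimator), the MTE can be recovered as the derivative of E[Y | P = p] with respect to the propensity score p; fitting that conditional mean with a polynomial via the Moore-Penrose pseudoinverse (numpy.linalg.pinv) and differentiating gives:

import numpy as np

def mte_polynomial(y, pscore, degree=3):
    """Local-IV style estimate: fit E[Y | P = p] with a polynomial in p via the
    Moore-Penrose pseudoinverse, then return MTE(u) = d E[Y | P = p] / dp at u."""
    p = np.asarray(pscore, dtype=float)
    design = np.column_stack([p**k for k in range(degree + 1)])   # [1, p, p^2, ...]
    beta = np.linalg.pinv(design) @ np.asarray(y, dtype=float)    # least squares via pinv
    def mte(u):
        u = np.asarray(u, dtype=float)
        return sum(k * beta[k] * u**(k - 1) for k in range(1, degree + 1))
    return mte

# hypothetical usage with simulated data in which the true MTE(u) = 2 - 2u:
rng = np.random.default_rng(0)
n = 5_000
z = rng.uniform(size=n)                       # instrument
u = rng.uniform(size=n)                       # unobserved resistance to treatment
d = (z > u).astype(float)                     # treatment take-up, so P(D=1 | Z=z) = z
y = d * (2.0 - 2.0 * u) + rng.normal(scale=0.1, size=n)   # heterogeneous gains
mte_hat = mte_polynomial(y, pscore=z, degree=3)
print(mte_hat(np.array([0.2, 0.5, 0.8])))     # should be roughly 1.6, 1.0, 0.4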