969 results for [JEL:C10] Mathematical and Quantitative Methods - Econometric and Statistical Methods
Abstract:
In this paper, we use identification-robust methods to assess the empirical adequacy of a New Keynesian Phillips Curve (NKPC) equation. We focus on Galí and Gertler's (1999) specification, on both U.S. and Canadian data. Two variants of the model are studied: one based on a rational-expectations assumption, and a modification to the latter which consists in using survey data on inflation expectations. The results based on these two specifications exhibit sharp differences concerning: (i) identification difficulties, (ii) backward-looking behavior, and (iii) the frequency of price adjustments. Overall, we find that there is some support for the hybrid NKPC for the U.S., whereas the model is not suited to Canada. Our findings underscore the need for employing identification-robust inference methods in the estimation of expectations-based dynamic macroeconomic relations.
Abstract:
Understanding the dynamics of interest rates and the term structure has important implications for issues as diverse as real economic activity, monetary policy, pricing of interest rate derivative securities and public debt financing. Our paper follows a longstanding tradition of using factor models of interest rates but proposes a semi-parametric procedure to model interest rates.
Abstract:
It is well-known that non-cooperative and cooperative game theory may yield different solutions to games. These differences are particularly dramatic in the case of truels, or three-person duels, in which the players may fire sequentially or simultaneously, and the games may be one-round or n-round. Thus, it is never a Nash equilibrium for all players to hold their fire in any of these games, whereas in simultaneous one-round and n-round truels such cooperation, wherein everybody survives, is in both the α-core and β-core. On the other hand, both cores may be empty, indicating a lack of stability, when the unique Nash equilibrium is one survivor. Conditions under which each approach seems most applicable are discussed. Although it might be desirable to subsume the two approaches within a unified framework, such unification seems unlikely since the two approaches are grounded in fundamentally different notions of stability.
Abstract:
The rationalizability of a choice function by means of a transitive relation has been analyzed thoroughly in the literature. However, not much seems to be known when transitivity is weakened to quasi-transitivity or acyclicity. We describe the logical relationships between the different notions of rationalizability involving, for example, the transitivity, quasi-transitivity, or acyclicity of the rationalizing relation. Furthermore, we discuss sufficient conditions and necessary conditions for rational choice on arbitrary domains. Transitive, quasi-transitive, and acyclical rationalizability are fully characterized for domains that contain all singletons and all two-element subsets of the universal set.
Abstract:
A contingent contract in a transferable utility game under uncertainty specifies an outcome for each possible state. It is assumed that coalitions evaluate these contracts by considering the minimal possible excesses. A main question of the paper concerns the existence and characterization of efficient contracts. It is shown that they exist if and only if the set of possible coalitions contains a balanced subset. Moreover, a characterization of values that result in efficient contracts in the case of minimally balanced collections is provided.
Abstract:
Presently, conditions ensuring the validity of bootstrap methods for the sample mean of (possibly heterogeneous) near epoch dependent (NED) functions of mixing processes are unknown. Here we establish the validity of the bootstrap in this context, extending the applicability of bootstrap methods to a class of processes broadly relevant for applications in economics and finance. Our results apply to two block bootstrap methods: the moving blocks bootstrap of Künsch (1989) and Liu and Singh (1992), and the stationary bootstrap of Politis and Romano (1994). In particular, the consistency of the bootstrap variance estimator for the sample mean is shown to be robust against heteroskedasticity and dependence of unknown form. The first order asymptotic validity of the bootstrap approximation to the actual distribution of the sample mean is also established in this heterogeneous NED context.
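The moving blocks bootstrap referred to above can be illustrated with a minimal sketch: overlapping blocks of consecutive observations are resampled and concatenated so that short-range dependence is preserved within blocks. The function name, block length, and AR(1) test series below are illustrative choices, not taken from the paper.

```python
import numpy as np

def moving_blocks_bootstrap(x, block_len, n_boot, seed=None):
    """Moving blocks bootstrap for the sample mean (in the spirit of
    Künsch 1989 / Liu and Singh 1992): draw overlapping blocks of length
    block_len with replacement and concatenate them to length len(x)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    starts_max = n - block_len + 1          # number of overlapping blocks
    k = int(np.ceil(n / block_len))         # blocks per bootstrap series
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, starts_max, size=k)
        series = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = series.mean()
    return means

# usage: bootstrap distribution of the mean of a dependent series
rng = np.random.default_rng(0)
e = rng.standard_normal(500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + e[t]            # AR(1): a simple dependent process
boot_means = moving_blocks_bootstrap(x, block_len=10, n_boot=2000, seed=1)
print("bootstrap variance of the sample mean:", boot_means.var())
```

The block length trades off bias (too short: dependence is broken) against variance (too long: few effective blocks); the paper's NED setting concerns when such schemes are asymptotically valid, which this sketch does not address.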
Abstract:
In this paper, we study several tests for the equality of two unknown distributions. Two are based on empirical distribution functions, three others on nonparametric probability density estimates, and the remaining ones on differences between sample moments. We suggest controlling the size of such tests (under nonparametric assumptions) by using permutational versions of the tests jointly with the method of Monte Carlo tests properly adjusted to deal with discrete distributions. We also propose a combined test procedure, whose level is again perfectly controlled through the Monte Carlo test technique and which has better power properties than the individual tests that are combined. Finally, in a simulation experiment, we show that the technique suggested provides perfect control of test size and that the new tests proposed can yield sizeable power improvements.
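A bare-bones version of the permutational idea above can be sketched as follows: under the null that both samples come from the same distribution, group labels are exchangeable, so the observed statistic is compared to its distribution over random relabelings. The statistic (absolute difference of means) and sample sizes are illustrative; the paper's discrete-distribution adjustment and combined procedure are not shown.

```python
import numpy as np

def mc_permutation_test(x, y, stat, n_perm=999, seed=None):
    """Monte Carlo permutation test for equality of two distributions:
    pool the samples, reshuffle the split, and recompute the statistic."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n = len(x)
    obs = stat(x, y)
    perm = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)                     # random relabeling
        perm[i] = stat(pooled[:n], pooled[n:])
    # Monte Carlo p-value: +1 in numerator and denominator keeps the
    # test exact at conventional levels when n_perm is chosen suitably
    p = (1 + np.sum(perm >= obs)) / (n_perm + 1)
    return obs, p

# usage with an illustrative moment-based statistic
diff_means = lambda a, b: abs(a.mean() - b.mean())
rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, 50)
y = rng.normal(0.0, 1.0, 60)
obs, p = mc_permutation_test(x, y, diff_means, seed=0)
print(f"statistic={obs:.3f}, p-value={p:.3f}")
```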
Abstract:
In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has, as special examples, log-normal, affine, and GARCH diffusion models. Using results from previous empirical work, we show that the standard deviation of the noise is not negligible with respect to the mean and the standard deviation of the integrated volatility, even if one considers returns at five minutes. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
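The gap between realized and integrated volatility can be made concrete with a minimal simulation sketch: simulate a square-root variance diffusion (one of the special cases named above) on a fine grid, then compare the integrated variance to the realized variance built from five-minute returns. All parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1.0                     # one "day"
n_fine = 23400              # one-second grid over a 6.5-hour day
dt = T / n_fine
kappa, theta, sigma_v = 0.1, 0.02, 0.05   # illustrative CIR-type parameters

# Euler scheme for dv = kappa*(theta - v)dt + sigma_v*sqrt(v)dW
v = np.empty(n_fine + 1)
v[0] = theta
r = np.empty(n_fine)
for t in range(n_fine):
    v[t + 1] = max(v[t] + kappa * (theta - v[t]) * dt
                   + sigma_v * np.sqrt(v[t] * dt) * rng.standard_normal(),
                   1e-12)                  # keep variance nonnegative
    r[t] = np.sqrt(v[t] * dt) * rng.standard_normal()   # return increment

iv = np.sum(v[:-1] * dt)                   # integrated variance (fine grid)
m = 78                                     # 78 five-minute intervals per day
coarse = r.reshape(m, -1).sum(axis=1)      # five-minute returns
rv = np.sum(coarse ** 2)                   # realized variance
print(f"IV={iv:.5f}  RV={rv:.5f}  noise RV-IV={rv - iv:+.5f}")
```

The difference RV − IV is the measurement noise whose mean, variance, and correlation with IV the paper characterizes analytically; a single simulated path only illustrates that the noise is non-negligible at a fixed sampling frequency.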
Abstract:
In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials respectively. The eigenfunction approach has at least six advantages: (i) it is general, since any square integrable function may be written as a linear combination of the eigenfunctions; (ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal components analysis; (iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; (iv) more importantly, this generates fat tails for the variance and returns processes; (v) in contrast to popular models, the variance of the variance is a flexible function of the variance; (vi) these models are closed under temporal aggregation.
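The Hermite case mentioned above rests on a classical eigenfunction property that is easy to verify numerically: for a Gaussian AR(1) state with correlation ρ, the probabilists' Hermite polynomials satisfy E[He_n(x_{t+1}) | x_t = x] = ρ^n He_n(x), so each He_n is an eigenfunction of the conditional expectation operator with eigenvalue ρ^n. The sketch below checks this by Gauss-Hermite quadrature; the value of ρ and the evaluation point are arbitrary.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss      # nodes for weight exp(-x^2)
from numpy.polynomial.hermite_e import hermeval     # probabilists' Hermite He_n

rho = 0.7
nodes, weights = hermgauss(40)                      # exact for these polynomials

def cond_expectation(n, x):
    """E[He_n(rho*x + sqrt(1-rho^2)*Z)] for Z ~ N(0,1), via quadrature:
    E[f(Z)] = (1/sqrt(pi)) * sum_i w_i f(sqrt(2)*x_i)."""
    z = np.sqrt(2.0) * nodes
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                                 # selects He_n
    vals = hermeval(rho * x + np.sqrt(1 - rho**2) * z, coeffs)
    return np.sum(weights * vals) / np.sqrt(np.pi)

x0 = 1.3
for n in (1, 2, 3):
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    lhs = cond_expectation(n, x0)                   # conditional expectation
    rhs = rho**n * hermeval(x0, coeffs)             # eigenvalue * eigenfunction
    print(f"n={n}: E[He_n | x]={lhs:.6f}  rho^n He_n(x)={rhs:.6f}")
```

Because the conditional expectation of He_n is proportional to He_n itself, a variance specified as a linear combination of these eigenfunctions inherits the ARMA-type dynamics the abstract points to.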
Abstract:
This paper proposes a systematic framework for analyzing the dynamic effects of permanent and transitory shocks on a system of "n" economic variables.