997 results for Individual investors--Attitudes--Econometric models
Abstract:
Granger causality is usually defined in terms of the predictability of one vector of variables by another one period ahead. Recently, Lütkepohl (1990) proposed defining non-causality between two variables (or vectors) as non-predictability at all horizons in the future. When more than two vectors are considered (i.e., when the information set contains auxiliary variables), these two notions are not equivalent. In this paper, we first generalize the earlier notions of causality by considering causality at a given arbitrary horizon h, finite or infinite. We then derive necessary and sufficient conditions for non-causality between two vectors of variables (within a larger vector) up to a given horizon h. The models considered include vector autoregressions, possibly of infinite order, and multivariate ARIMA models. In particular, we give separability and rank conditions for non-causality up to horizon h that are relatively simple to verify.
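To illustrate the distinction (a hypothetical three-variable VAR(1) in numpy, not an example from the paper): with an auxiliary variable in the information set, a variable may be non-causal one period ahead yet causal at horizon 2, because the h-step prediction coefficients are governed by powers of the autoregressive matrix.

```python
import numpy as np

# Hypothetical VAR(1): W_t = A W_{t-1} + u_t, with W = (x, z, y).
# y feeds z and z feeds x, but y does not enter the x equation directly,
# so y does not cause x at horizon 1.
A = np.array([
    [0.5, 0.4, 0.0],   # x_t <- x, z (no direct y term)
    [0.0, 0.5, 0.4],   # z_t <- z, y (y acts through the auxiliary variable z)
    [0.0, 0.0, 0.5],   # y_t <- y
])

# The h-step linear predictor depends on A^h: the (x, y) entry of A^h
# governs whether past y helps predict x at horizon h.
for h in (1, 2, 3):
    coef = np.linalg.matrix_power(A, h)[0, 2]
    print(f"h={h}: coefficient of y in the {h}-step predictor of x = {coef:.3f}")
# h=1 gives 0.000 (non-causality one step ahead), h=2 gives 0.160 != 0:
# causality appears at horizon 2, so the two notions indeed differ.
```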
Abstract:
This paper develops a general stochastic framework and an equilibrium asset pricing model that make clear how attitudes towards intertemporal substitution and risk matter for option pricing. In particular, we show under which statistical conditions option pricing formulas are not preference-free, in other words, when preferences are not hidden in the stock and bond prices as they are in the standard Black and Scholes (BS) or Hull and White (HW) pricing formulas. The dependence of option prices on preference parameters comes from several instantaneous causality effects, such as the so-called leverage effect. We also emphasize that the most standard asset pricing models (CAPM for the stock and BS or HW preference-free option pricing) are valid under the same stochastic setting (typically the absence of a leverage effect), regardless of preference parameter values. Even though we propose a general non-preference-free option pricing formula, we always keep in mind that the BS formula is dominant both as a theoretical reference model and as a tool for practitioners. Another contribution of the paper is to characterize why the BS formula is such a benchmark. We show that, as soon as we are ready to accept a basic property of option prices, namely their homogeneity of degree one with respect to the pair formed by the underlying stock price and the strike price, the statistical hypotheses necessary for homogeneity yield BS-shaped option prices in equilibrium. This BS-shaped option pricing formula allows us to derive interesting characterizations of the volatility smile, that is, the pattern of BS implicit volatilities as a function of option moneyness. First, the asymmetry of the smile is shown to be equivalent to a particular form of asymmetry of the equivalent martingale measure. Second, this asymmetry appears precisely when there is a premium on an instantaneous interest rate risk, on a generalized leverage effect, or both; in other words, whenever the option pricing formula is not preference-free. Therefore, the main conclusion of our analysis for practitioners should be that an asymmetric smile is indicative of the relevance of preference parameters for pricing options.
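A minimal numerical check (the standard Black-Scholes call formula via scipy; parameter values are illustrative) of the homogeneity property the argument rests on: scaling the stock price and the strike by a common factor λ scales the call price by λ.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, r, sigma, T):
    """Standard Black-Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S, K, r, sigma, T, lam = 100.0, 95.0, 0.03, 0.2, 0.5, 2.7
# Homogeneity of degree one in (S, K): BS(lam*S, lam*K) = lam * BS(S, K).
# d1 and d2 depend on S and K only through the ratio S/K, so scaling both
# leaves them unchanged and the price scales linearly.
lhs = bs_call(lam * S, lam * K, r, sigma, T)
rhs = lam * bs_call(S, K, r, sigma, T)
print(lhs, rhs)                    # identical up to floating-point error
assert abs(lhs - rhs) < 1e-10
```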
Abstract:
This paper exploits the term structure of interest rates to develop testable economic restrictions on the joint process of long-term interest rates and inflation when the latter is subject to a targeting policy by the Central Bank. Two competing models that econometrically describe agents' inferences about inflation targets are developed and shown to generate distinct predictions about the behavior of interest rates. In an empirical application to the Canadian inflation target zone, the results indicate that agents perceive the band to be substantially narrower than officially announced and asymmetric around the stated mid-point. The latter result (i) suggests that the monetary authority attaches different weights to positive and negative deviations from the central target, and (ii) challenges on empirical grounds the assumption, frequently made in the literature, that the policy maker's loss function is symmetric (usually a quadratic function) around a desired inflation value.
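One simple way to formalize the asymmetry these results point to (an illustrative parameterization, not the paper's model) is a piecewise-quadratic loss that weights overshoots and undershoots of the target differently:

```latex
% Illustrative asymmetric loss around the inflation target pi^*:
% lambda_+ != lambda_- breaks the symmetry of the usual quadratic case
% (which corresponds to lambda_+ = lambda_-).
L(\pi) =
\begin{cases}
\lambda_{+}\,(\pi - \pi^{*})^{2}, & \pi \ge \pi^{*},\\[2pt]
\lambda_{-}\,(\pi - \pi^{*})^{2}, & \pi < \pi^{*}.
\end{cases}
```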
Abstract:
In the context of multivariate linear regression (MLR) and seemingly unrelated regressions (SURE) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose finite- and large-sample likelihood-based test procedures for possibly non-linear hypotheses on the coefficients of MLR and SURE systems.
Abstract:
Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes the framework applicable to linear models with expectation variables that are estimated non-parametrically; two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange Multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, in excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
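As a rough sketch of weak-instrument-robust inference (simulated data in numpy/scipy; the statistic shown is the classical Anderson-Rubin F, a close relative of the LM tests studied here, and the design is hypothetical): test H0: β = β0 by regressing y − β0·x on the instruments and checking their joint significance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 500, 3                                # observations, instruments

# Simulated linear IV model with a weak first stage (hypothetical data).
Z = rng.standard_normal((n, k))              # instruments
v = rng.standard_normal(n)
u = 0.8 * v + rng.standard_normal(n)         # endogeneity: corr(u, v) > 0
x = Z @ np.array([0.1, 0.05, 0.0]) + v       # weak first-stage coefficients
beta = 1.0
y = beta * x + u

def ar_test(y, x, Z, beta0):
    """Anderson-Rubin-type test of H0: beta = beta0 (weak-instrument robust)."""
    e = y - beta0 * x                        # residual implied by H0
    # F-test that Z has no explanatory power for e (no constant here).
    fitted = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)
    rss0 = e @ e
    rss1 = (e - fitted) @ (e - fitted)
    F = ((rss0 - rss1) / Z.shape[1]) / (rss1 / (len(y) - Z.shape[1]))
    return F, stats.f.sf(F, Z.shape[1], len(y) - Z.shape[1])

F, pval = ar_test(y, x, Z, beta0=1.0)        # true value: should not reject
print(f"AR F = {F:.2f}, p = {pval:.3f}")
# Inverting the test (collecting the beta0 values not rejected) yields a
# weak-instrument-robust confidence set, the LM analogue of which the
# paper uses for its empirical applications.
```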
Abstract:
In this paper, we look at how labor market conditions at different points during individuals' tenure with their firms are correlated with current earnings. Using data on individuals from the German Socioeconomic Panel for the 1985-1994 period, we find that both the contemporaneous unemployment rate and prior values of the unemployment rate are significantly correlated with current earnings, contrary to results for the American labor market. The estimated elasticity of earnings with respect to the current unemployment rate varies between 9 and 15 percent, and between 6 and 10 percent with respect to the unemployment rate at the start of the current firm tenure. Moreover, whereas local unemployment rates determine levels of earnings, national rates influence contemporaneous variations in earnings. We interpret this result as evidence that German unions do, in fact, bargain over wages and employment, but that models of individualistic contracts, such as the implicit contract model, may explain some of the observed wage drift and longer-term wage movements reasonably well. Furthermore, we explore the heterogeneity of contracts over a variety of worker and job characteristics. In particular, we find evidence that contracts differ across firm sizes and worker types: workers at large firms are remarkably more insulated from the job market than workers at any other type of firm, indicating the importance of internal job markets.
Abstract:
It is well known that standard asymptotic theory is not valid, or is extremely unreliable, in models with identification problems or weak instruments [Dufour (1997, Econometrica), Staiger and Stock (1997, Econometrica), Wang and Zivot (1998, Econometrica), Stock and Wright (2000, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. One possible way out consists in using a variant of the Anderson-Rubin (1949, Ann. Math. Stat.) procedure. The latter, however, allows one to build exact tests and confidence sets only for the full vector of the coefficients of the endogenous explanatory variables in a structural equation, not for individual coefficients. This problem may in principle be overcome by using projection techniques [Dufour (1997, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. AR-type procedures are emphasized because they are robust to both weak instruments and instrument exclusion. However, these projection techniques could so far be implemented only through costly numerical methods. In this paper, we provide a complete analytic solution to the problem of building projection-based confidence sets from Anderson-Rubin-type confidence sets. The solution exploits the geometric properties of “quadrics”, and the resulting confidence sets can be viewed as extensions of the usual confidence intervals and ellipsoids. Only least squares techniques are required to build the confidence intervals. We also study by simulation how “conservative” projection-based confidence sets are. Finally, we illustrate the proposed methods by applying them to three different examples: the relationship between trade and growth in a cross-section of countries, returns to education, and a study of production functions in the U.S. economy.
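A small numerical sketch of the projection idea (numpy, with a hypothetical quadric; this covers only the case where the AR-type confidence set is a bounded ellipsoid, whereas the paper treats general quadrics): for E = {β : (β − c)′Q(β − c) ≤ r} with Q positive definite, the projection onto coordinate i is the closed-form interval c_i ± sqrt(r·(Q⁻¹)_ii), so only linear algebra is needed.

```python
import numpy as np

# Hypothetical bounded AR-type confidence set: an ellipsoid
#   E = { b : (b - c)' Q (b - c) <= r },  Q positive definite.
c = np.array([1.0, -0.5])
Q = np.array([[2.0, 0.6],
              [0.6, 1.0]])
r = 3.84                         # e.g. a chi-square(1) 95% critical value

def projection_interval(c, Q, r, i):
    """Closed-form projection of the ellipsoid onto coordinate i:
    the extrema of b_i over E are c_i +/- sqrt(r * (Q^{-1})_{ii})."""
    Qinv_ii = np.linalg.inv(Q)[i, i]
    half = np.sqrt(r * Qinv_ii)
    return c[i] - half, c[i] + half

for i in range(len(c)):
    lo, hi = projection_interval(c, Q, r, i)
    print(f"beta_{i}: [{lo:.3f}, {hi:.3f}]")
# Unbounded or empty quadrics (part of the paper's general analysis)
# require the full quadric treatment; this sketch handles only the
# bounded ellipsoidal case.
```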
Abstract:
We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
Abstract:
McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset X̄ of X, and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on X̄. We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).
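A schematic sketch (numpy; the regularity predicate below is a placeholder, monotonicity of a cubic on a grid, and not the paper's L-utility regularity conditions) of one standard way to build a prior that assigns zero probability to an irregular set: rejection sampling from a flexible base prior.

```python
import numpy as np

rng = np.random.default_rng(1)

def is_regular(theta):
    """Placeholder regularity check on a finite grid standing in for
    bar(X); the paper's actual regularity conditions are richer."""
    grid = np.linspace(0.0, 1.0, 50)
    vals = np.polynomial.polynomial.polyval(grid, theta)
    return bool(np.all(np.diff(vals) > 0))   # e.g. monotonicity on the grid

# Truncated prior by rejection: draw from a flexible base prior over the
# coefficients of a functional form, keep only regular draws.  By
# construction the kept draws place zero prior mass on the irregular set
# while remaining flexible within the regular one.
draws = []
while len(draws) < 100:
    theta = rng.standard_normal(4)           # illustrative base prior: N(0, I)
    if is_regular(theta):
        draws.append(theta)
print(f"kept {len(draws)} regular draws from the base prior")
```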
Abstract:
Conseil canadien de la magistrature
Abstract:
Research report