Abstract:
Suzumura shows that a binary relation has a weak-order extension if and only if it is consistent. However, consistency is demonstrably not sufficient to extend an upper semicontinuous binary relation to an upper semicontinuous weak order. Jaffray proves that any asymmetric (or reflexive), transitive, and upper semicontinuous binary relation has an upper semicontinuous strict (or weak) order extension. We provide sufficient conditions for the existence of upper semicontinuous extensions of consistent, rather than transitive, relations. For asymmetric relations, consistency and upper semicontinuity suffice. For more general relations, we prove one theorem using a further consistency property and another using an additional continuity requirement.
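For reference, the consistency property invoked above can be stated in its standard textbook form (this formulation is not quoted from the paper itself): a relation R on a set X, with asymmetric part P, is Suzumura-consistent when no R-cycle contains a strict preference.

```latex
% Suzumura consistency: no cycle through R closes with a strict preference.
\forall\, x_0, x_1, \ldots, x_n \in X:\quad
\bigl( x_0 \mathrel{R} x_1,\; x_1 \mathrel{R} x_2,\; \ldots,\; x_{n-1} \mathrel{R} x_n \bigr)
\;\Longrightarrow\; \neg\,\bigl( x_n \mathrel{P} x_0 \bigr)
```

Transitivity implies consistency, but not conversely, which is what makes the weakening above non-trivial.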
Abstract:
This paper characterizes welfarist social evaluation in a multi-profile setting where, in addition to multiple utility profiles, it is assumed that there are several profiles of non-welfare information. We prove new versions of the welfarism theorems in this alternative framework, and we illustrate that a very plausible and weak anonymity property is sufficient to generate anonymous social-evaluation orderings.
Abstract:
This study analyzes the dynamic effects of the devaluation of the CFA franc using an intertemporal, multisector monetary general-equilibrium model. Particular emphasis is placed on the interactions between devaluation and capital accumulation. In the model, the effects of the parity change operate through the labor market, which is characterized by nominal-wage inertia. The results show that the devaluation stimulates investment, with expansionary effects on economic activity. The monetary shock had only a limited impact on the fiscal and trade balances. An accompanying measure such as a reduction in public-sector wages improves both balances but triggers a recessionary process.
Abstract:
The GARCH and stochastic volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: the conditional variance given past values of the same series, or the conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about anything like a structural level of disaggregation, a well-specified volatility model should allow one to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. We put forward a square-root autoregressive stochastic volatility (SR-SARV) model that remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the robustness properties required with respect to various kinds of aggregation. We show that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and the conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other. By relaxing these assumptions through a state-space setting, we obtain aggregation results without giving up the conditional variance concept (and the related leverage effects), as is the case for the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations.
Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher-order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we characterize the exact relationships between our SR-SARV models (including higher-order dynamics, leverage effects and in-mean effects), the usual GARCH models, and continuous-time stochastic volatility models, so that previous results on the aggregation of weak GARCH and on continuous-time GARCH modeling can be recovered in our framework.
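As a rough illustration of the state-space idea discussed above (not the paper's exact specification), the sketch below simulates a discrete-time stochastic volatility process in which the latent variance factor follows its own autoregressive dynamics with a square-root-style innovation; all parameter values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen only for illustration.
omega, gamma = 0.05, 0.9   # factor dynamics: f_t = omega + gamma * f_{t-1} + noise
T = 200_000

eps = np.empty(T)               # observed returns
f_prev = omega / (1.0 - gamma)  # start the latent factor at its unconditional mean

for t in range(T):
    # Return drawn with conditional variance given by the lagged factor.
    eps[t] = np.sqrt(f_prev) * rng.standard_normal()
    # Square-root-style innovation keeps the variance factor nonnegative.
    f_prev = max(omega + gamma * f_prev
                 + 0.3 * np.sqrt(f_prev) * rng.standard_normal(), 1e-8)

def acf1(x):
    x = x - x.mean()
    return (x[:-1] * x[1:]).mean() / (x * x).mean()

# Returns are serially uncorrelated, while squared returns are persistent,
# i.e., they display the ARMA-type autocorrelation discussed above.
print(acf1(eps), acf1(eps ** 2))
```

The key feature is that the squared-return dynamics come from an unobserved state variable rather than from an exact deterministic function of past returns, which is what gives state-space volatility models their robustness to information reduction.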
Abstract:
Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes the framework applicable to linear models with expectation variables that are estimated nonparametrically. Two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, in excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
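The weak-instrument problem referred to above can be made concrete with a toy simulation (illustrative values only, unrelated to the paper's applications): when the first-stage coefficient is near zero and the errors are highly correlated, the IV estimator is pulled toward the inconsistent OLS probability limit instead of the true parameter.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data-generating process (illustrative values, not from the paper):
# y = beta*x + u,  x = pi*z + v,  with corr(u, v) high and pi near zero.
beta, pi, n, n_sim = 0.0, 0.05, 100, 2000
est_iv = []
for _ in range(n_sim):
    z = rng.standard_normal(n)
    u = rng.standard_normal(n)
    v = 0.9 * u + np.sqrt(1 - 0.81) * rng.standard_normal(n)
    x = pi * z + v
    y = beta * x + u
    # Just-identified IV estimator: beta_hat = (z'y) / (z'x)
    est_iv.append(z @ y / (z @ x))

est_iv = np.array(est_iv)
# With such a weak instrument, the estimates concentrate well away from the
# true beta = 0, toward the OLS probability limit (about 0.9 here).
print(np.median(est_iv))
```

This is the kind of distortion that makes Wald-type inference unreliable and motivates the more robust LM-based procedures.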
Abstract:
This report addresses the issues raised by the insistence of U.S. financial circles and the U.S. Treasury on liberalizing and globalizing international capital movements. In this perspective, it attempts to answer three questions, namely: 1) are international finance in general, and national and international monetary and financial institutions, sources of financial and economic instability for countries? 2) is it possible to prevent the benefits derived from increased access to international capital markets from being diminished, or even reversed, by international and national monetary and financial crises? and, as a corollary, 3) is the current system of national currencies obsolete?
Abstract:
This paper studies the proposition that an inflation bias can arise in a setup where a central banker with asymmetric preferences targets the natural unemployment rate. Preferences are asymmetric in the sense that positive unemployment deviations from the natural rate are weighted more (or less) severely than negative deviations in the central banker's loss function. The bias is proportional to the conditional variance of unemployment. The time-series predictions of the model are evaluated using data from G7 countries. Econometric estimates support the prediction that the conditional variance of unemployment and the rate of inflation are positively related.
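The mechanism can be sketched with the linex loss function commonly used to model such asymmetric preferences (a textbook derivation under conditional normality, not reproduced from the paper):

```latex
L(u_t) \;=\; \frac{e^{\gamma (u_t - u^n)} \;-\; \gamma (u_t - u^n) \;-\; 1}{\gamma^2},
\qquad \gamma \neq 0,
```

where $u_t$ is unemployment, $u^n$ the natural rate, and $\gamma$ governs the asymmetry (positive deviations are penalized more heavily when $\gamma > 0$). Because the first-order condition involves $E_t\!\left[e^{\gamma (u_t - u^n)}\right] = e^{\gamma E_t(u_t - u^n) + \gamma^2 \sigma_t^2 / 2}$ under conditional normality, equilibrium inflation acquires a term proportional to $(\gamma/2)\,\sigma_t^2$, the conditional variance of unemployment, which is the bias the abstract refers to.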
Abstract:
This paper derives the ARMA representation of integrated and realized variances when the spot variance depends linearly on two autoregressive factors, i.e., in SR-SARV(2) models. This class of processes includes affine models, the GARCH diffusion, and CEV models, as well as the eigenfunction stochastic volatility and positive Ornstein-Uhlenbeck models. We also study the leverage-effect case and the relationship between the weak GARCH representation of returns and the ARMA representation of realized variances. Finally, various empirical implications of these ARMA representations are considered. We find that some parameters of the ARMA representation may be negative; hence, the positiveness of the expected values of integrated or realized variances is not guaranteed. We also find that, for some observation frequencies, the continuous-time model parameters may be weakly identified, or not identified at all, through the ARMA representation of realized variances.
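A small numerical check of the generic time-series fact underlying such results (this is standard algebra, not the paper's derivation): the autocovariances of the sum of two independent AR(1) factors satisfy an AR(2) recursion, which is what produces ARMA-type representations for variances driven by two autoregressive factors. The parameter values below are arbitrary.

```python
import numpy as np

# Hypothetical persistence parameters for the two variance factors.
phi1, phi2 = 0.95, 0.5
s1, s2 = 1.0, 0.7          # innovation standard deviations

# Theoretical autocovariance of the sum of two independent AR(1)s:
# gamma_k = v1 * phi1**k + v2 * phi2**k, with v_i the factor variances.
v1 = s1**2 / (1 - phi1**2)
v2 = s2**2 / (1 - phi2**2)
gamma = lambda k: v1 * phi1**k + v2 * phi2**k

# AR(2) recursion implied by the two autoregressive roots:
# gamma_k = (phi1 + phi2) * gamma_{k-1} - phi1*phi2 * gamma_{k-2}, k >= 2.
for k in range(2, 12):
    lhs = gamma(k)
    rhs = (phi1 + phi2) * gamma(k - 1) - phi1 * phi2 * gamma(k - 2)
    assert abs(lhs - rhs) < 1e-9
print("AR(2) recursion holds for the summed factors")
```

The recursion holds exactly because each term $v_i \phi_i^k$ separately satisfies it, so any linear combination of the two factors inherits second-order AR(2) dynamics.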
Abstract:
This paper proposes exact inference methods (tests and confidence regions) for linear regression models with errors that follow a possibly nonstationary second-order autoregressive [AR(2)] process. The proposed approach generalizes the one described in Dufour (1990) for a regression model with AR(1) errors and proceeds in three steps. First, an exact confidence region is built for the vector of autoregressive coefficients (φ). This region is obtained by inverting tests of the independence of the errors, applied to a transformed form of the model, against alternatives of dependence at lags one and two. Second, exploiting the duality between tests and confidence regions (test inversion), a joint confidence region is determined for the vector φ and a vector M of linear combinations of the regression coefficients of interest. Third, by a projection method, "marginal" confidence intervals as well as exact bounds tests are obtained for the components of M. These methods are applied to models of the U.S. money stock (M2) and the U.S. price level (GNP implicit deflator).
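A stripped-down sketch of the first step, test inversion for the autoregressive coefficient, is given below. It simplifies in every direction: an AR(1) error instead of the AR(2) case treated in the paper, an asymptotic rather than exact independence test, and an arbitrary simulated data set. Everything in it is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated regression with AR(1) errors: y_t = b0 + b1*x_t + u_t,
# u_t = phi*u_{t-1} + e_t. (Illustrative DGP, not the paper's data.)
T, phi_true = 300, 0.6
x = rng.standard_normal(T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = phi_true * u[t - 1] + rng.standard_normal()
y = 1.0 + 2.0 * x + u

def residuals(yy, xx):
    X = np.column_stack([np.ones(len(xx)), xx])
    b, *_ = np.linalg.lstsq(X, yy, rcond=None)
    return yy - X @ b

region = []
for phi0 in np.linspace(-0.99, 0.99, 199):
    # Quasi-difference the model at the candidate value phi0 ...
    yq = y[1:] - phi0 * y[:-1]
    xq = x[1:] - phi0 * x[:-1]
    r = residuals(yq, xq)
    # ... and keep phi0 if the transformed errors look serially uncorrelated.
    rho = (r[:-1] @ r[1:]) / (r @ r)
    if abs(rho) * np.sqrt(len(r)) < 1.96:   # asymptotic 5% test
        region.append(phi0)

print(min(region), max(region))   # approximate confidence region for phi
```

The paper's contribution is to make each inversion step exact in finite samples (and valid under nonstationarity), and then to propagate the region to the regression coefficients by projection; the sketch only conveys the inversion logic.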
Abstract:
We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
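The Monte Carlo test idea listed among these techniques can be illustrated in a few lines (a generic textbook version with an arbitrary statistic, not code from the paper): simulate the statistic under the null hypothesis and use the simulated draws themselves as the finite-sample reference distribution.

```python
import numpy as np

rng = np.random.default_rng(7)

def mc_pvalue(stat_obs, simulate_stat, n_rep=99):
    """Monte Carlo p-value: rank of the observed statistic among
    n_rep draws simulated under the null distribution."""
    sims = np.array([simulate_stat() for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

# Example: testing H0: mean = 0 for an i.i.d. normal sample via |t|.
n = 30
def tstat(x):
    return abs(x.mean()) / (x.std(ddof=1) / np.sqrt(n))

def null_draw():
    return tstat(rng.standard_normal(n))

# Data generated far from the null: the MC p-value hits its minimum 1/(N+1).
x_far = rng.standard_normal(n) + 5.0
p_far = mc_pvalue(tstat(x_far), null_draw)
print(p_far)   # 0.01 with n_rep = 99
```

When the statistic is pivotal under the null, this construction delivers an exactly level-correct test for any finite number of replications, which is precisely the finite-sample validity the abstract emphasizes.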
Abstract:
This paper studies the interdependence between fiscal and monetary policies, and their joint role in the determination of the price level. The government is characterized by a long-run fiscal policy rule whereby a given fraction of the outstanding debt, say d, is backed by the present discounted value of current and future primary surpluses. The remaining debt is backed by seigniorage revenue. The parameter d characterizes the interdependence between fiscal and monetary authorities. It is shown that in a standard monetary economy, this policy rule implies that the price level depends not only on the money stock, but also on the proportion of debt that is backed with money. Empirical estimates of d are obtained for OECD countries using data on nominal consumption, monetary base, and debt. Results indicate that debt plays only a minor role in the determination of the price level in these economies. Estimates of d correlate well with institutional measures of central bank independence.
Abstract:
This paper derives optimal monetary policy rules in setups where certainty equivalence does not hold because central bank preferences are not quadratic and/or the aggregate supply relation is nonlinear. Analytical results show that these features lead to sign and size asymmetries, and to nonlinearities, in the policy rule. Reduced-form estimates indicate that US monetary policy can be characterized by a nonlinear policy rule after 1983, but not before 1979. This finding is consistent with the view that the Fed's inflation preferences during the Volcker-Greenspan regime differed considerably from those during the Burns-Miller regime.
Abstract:
Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to control completely the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
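The maximized Monte Carlo idea can be sketched generically (again a toy, not the paper's VAR implementation): compute a Monte Carlo p-value at each point of a grid of nuisance-parameter values, and reject only if the largest such p-value falls below the level. The statistic, grid, and parameter values below are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_pvalue(stat_obs, simulate_stat, n_rep=99):
    sims = np.array([simulate_stat() for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

# Toy problem: test H0: mu = 0 for x_i ~ N(mu, sigma^2), treating the
# unknown sigma as a nuisance parameter (the grid below is an assumption).
n = 40
x = rng.standard_normal(n) * 2.0 + 3.0          # data with mu = 3, sigma = 2
stat_obs = abs(x.mean()) * np.sqrt(n)           # deliberately non-pivotal statistic

sigma_grid = np.linspace(0.5, 4.0, 8)
p_max = max(
    mc_pvalue(stat_obs,
              lambda s=s: abs((s * rng.standard_normal(n)).mean()) * np.sqrt(n))
    for s in sigma_grid
)

# MMC decision: reject H0 at level 5% only if the sup-p-value <= 0.05.
print(p_max, p_max <= 0.05)
```

Taking the supremum over the nuisance-parameter space is what guarantees that the rejection probability never exceeds the nominal level, whatever the true nuisance value, which is how the procedure controls the level of VAR tests exactly, whether the system is stationary or integrated.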
Abstract:
This paper constructs and estimates a sticky-price, Dynamic Stochastic General Equilibrium model with heterogeneous production sectors. Sectors differ in price stickiness, capital-adjustment costs and production technology, and use output from each other as material and investment inputs, following an Input-Output Matrix and Capital Flow Table that represent the U.S. economy. By relaxing the standard assumption of symmetry, this model allows different sectoral dynamics in response to monetary policy shocks. The model is estimated by Simulated Method of Moments using sectoral and aggregate U.S. time series. Results indicate 1) substantial heterogeneity in price stickiness across sectors, with quantitatively larger differences between services and goods than previously found in micro studies that focus on final goods alone, 2) a strong sensitivity to monetary policy shocks on the part of construction and durable manufacturing, and 3) similar quantitative predictions at the aggregate level by the multi-sector model and a standard model that assumes symmetry across sectors.
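The estimation method named above, Simulated Method of Moments, can be illustrated on a toy problem (generic SMM, nothing model-specific; the DGP and moment choices are assumptions): choose the parameter whose simulated moments best match the data moments, holding the simulation draws fixed across candidates.

```python
import numpy as np

rng = np.random.default_rng(11)

# "Data": an AR(1) series with unknown persistence rho_true.
T, rho_true = 2000, 0.7
def simulate_ar1(rho, shocks):
    x = np.zeros(len(shocks))
    for t in range(1, len(shocks)):
        x[t] = rho * x[t - 1] + shocks[t]
    return x

data = simulate_ar1(rho_true, rng.standard_normal(T))

def moments(x):
    xc = x - x.mean()
    return np.array([x.var(), (xc[:-1] * xc[1:]).mean() / x.var()])

m_data = moments(data)

# SMM: reuse the SAME simulation shocks for every candidate parameter
# (common random numbers), and minimize the moment distance on a grid.
sim_shocks = rng.standard_normal(10 * T)
grid = np.linspace(0.0, 0.95, 96)
obj = [np.sum((moments(simulate_ar1(r, sim_shocks)) - m_data) ** 2) for r in grid]
rho_hat = grid[int(np.argmin(obj))]
print(rho_hat)
```

Holding the shocks fixed makes the objective a smooth function of the parameter alone, so the minimizer reflects the moment match rather than simulation noise; the same logic carries over when the "model" is a full DSGE and the moments are sectoral time-series statistics.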
Abstract:
This paper develops a model where the value of the monetary policy instrument is selected by a heterogeneous committee engaged in a dynamic voting game. Committee members differ in their institutional power and, in certain states of nature, they also differ in their preferred instrument value. Preference heterogeneity and concern for the future interact to generate decisions that are dynamically inefficient and inertial around the previously agreed instrument value. This model endogenously generates autocorrelation in the policy variable and provides an explanation for the empirical observation that the nominal interest rate under the central bank's control is infrequently adjusted.