Abstract:
This paper addresses the question of whether R&D should be carried out by an independent research unit or be produced in-house by the firm marketing the innovation. We define two organizational structures. In an integrated structure, the firm that markets the innovation also carries out and finances research leading to the innovation. In an independent structure, the firm that markets the innovation buys it from an independent research unit which is financed externally. We compare the two structures under the assumption that the research unit has some private information about the real cost of developing the new product. When development costs are negatively correlated with revenues from the innovation, the integrated structure dominates. The independent structure dominates in the opposite case.
Abstract:
In this paper, we analyze recent developments in econometrics in light of the theory of statistical tests. We first review some fundamental principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of testing theory as a formalization of the falsification principle for probabilistic models, and the logical justification of the basic notions of testing theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze several particular cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality, or dynamic specification. We point out that these difficulties often stem from the ambition to weaken the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we stress the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties can be established in finite samples.
Abstract:
This paper proves a new representation theorem for domains with both discrete and continuous variables. The result generalizes Debreu's well-known representation theorem on connected domains. A strengthening of the standard continuity axiom is used in order to guarantee the existence of a representation. A generalization of the main theorem and an application of the more general result are also presented.
Abstract:
This paper proposes an explanation for why efficient reforms are not carried out when losers have the power to block their implementation, even though compensating them is feasible. We construct a signaling model with two-sided incomplete information in which a government faces the task of sequentially implementing two reforms by bargaining with interest groups. The organization of interest groups is endogenous. Compensations are distortionary, and government types differ in their concern about distortions. We show that, when compensations are allowed to be informative about the government’s type, there is a bias against the payment of compensations and the implementation of reforms. This is because paying high compensations today provides incentives for some interest groups to organize and oppose subsequent reforms with the sole purpose of receiving a transfer. By paying lower compensations, governments attempt to prevent such interest groups from organizing. However, this comes at the cost of reforms being blocked by interest groups with relatively high losses.
Abstract:
In this paper, we study several tests for the equality of two unknown distributions. Two are based on empirical distribution functions, three others on nonparametric probability density estimates, and the remaining ones on differences between sample moments. We suggest controlling the size of such tests (under nonparametric assumptions) by using permutational versions of the tests jointly with the method of Monte Carlo tests, properly adjusted to deal with discrete distributions. We also propose a combined test procedure, whose level is again perfectly controlled through the Monte Carlo test technique and which has better power properties than the individual tests being combined. Finally, in a simulation experiment, we show that the suggested technique provides perfect control of test size and that the new tests proposed can yield sizeable power improvements.
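As an illustration of the kind of procedure described above, the sketch below implements a Monte Carlo permutation version of a two-sample test, using the Kolmogorov-Smirnov statistic as the (assumed) test statistic. The function names, the choice of statistic, and the number of permutations are illustrative assumptions rather than details taken from the paper, and the randomized tie-breaking needed for exact size control with discrete data is omitted.

import numpy as np

def ks_statistic(x, y):
    # Two-sample Kolmogorov-Smirnov statistic: largest gap between the two
    # empirical distribution functions, evaluated over the pooled sample.
    pooled = np.concatenate([x, y])
    cdf_x = np.searchsorted(np.sort(x), pooled, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), pooled, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

def permutation_mc_pvalue(x, y, statistic=ks_statistic, n_perm=999, seed=12345):
    # Under the null hypothesis that both samples come from the same
    # distribution, the pooled observations are exchangeable, so reshuffling
    # the group labels reproduces the null distribution of the statistic.
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    observed = statistic(x, y)
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if statistic(perm[:n_x], perm[n_x:]) >= observed:
            exceed += 1
    # Monte Carlo p-value; reject at level alpha when it is <= alpha.
    return (exceed + 1) / (n_perm + 1)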
Abstract:
In this paper, we provide both qualitative and quantitative measures of the cost of measuring integrated volatility by realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has, as special examples, log-normal, affine, and GARCH diffusion models. Drawing on previous empirical work, we show that the standard deviation of the noise is not negligible with respect to the mean and the standard deviation of the integrated volatility, even when one considers five-minute returns. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
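For concreteness, the objects being compared can be written as follows (the notation is ours and is meant only to fix ideas, not to reproduce the paper's exact definitions; h denotes the fixed sampling interval within day t):

\[
RV_t(h) \;=\; \sum_{i=1}^{1/h} r_{t-1+ih}^{\,2},
\qquad
IV_t \;=\; \int_{t-1}^{t} \sigma_s^2 \, ds,
\qquad
N_t(h) \;=\; RV_t(h) - IV_t,
\]

so the qualitative and quantitative measures discussed in the abstract concern the mean and variance of the measurement noise N_t(h) and its correlation with IV_t.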
Abstract:
In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of specifying the loading function in an ad hoc way (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials, respectively. The eigenfunction approach has at least six advantages: i) it is general, since any square-integrable function may be written as a linear combination of the eigenfunctions; ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal component analysis; iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; iv) more importantly, the approach generates fat tails for the variance and return processes; v) in contrast to popular models, the variance of the variance is a flexible function of the variance; vi) these models are closed under temporal aggregation.
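A minimal way to write the loading function described above is the following sketch (the notation is illustrative: f_t denotes the state variable, E_i the eigenfunctions of its conditional-expectation operator, and lambda_i the associated eigenvalues):

\[
\sigma_t^2 \;=\; a_0 + \sum_{i=1}^{p} a_i \, E_i(f_t),
\qquad
\mathrm{E}\!\left[ E_i(f_{t+1}) \mid f_t \right] \;=\; \lambda_i \, E_i(f_t),
\]

so each eigenfunction component decays geometrically at rate lambda_i, which is what delivers the ARMA dynamics for the variance and squared returns mentioned in point iii); in the log-normal and square-root examples the E_i are, respectively, Hermite and Laguerre polynomials in f_t.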
Abstract:
In an economy where cash can be stored costlessly (in nominal terms), the nominal interest rate is bounded below by zero. This paper derives the implications of this nonnegativity constraint for the term structure and shows that it induces a nonlinear and convex relation between short- and long-term interest rates. As a result, the long-term rate responds asymmetrically to changes in the short-term rate, and by less than predicted by a benchmark linear model. In particular, a decrease in the short-term rate leads to a decrease in the long-term rate that is smaller in magnitude than the increase in the long-term rate associated with an increase in the short-term rate of the same size. To the extent that monetary policy acts by affecting long-term rates through the term structure, its power is considerably reduced at low interest rates. The empirical predictions of the model are examined using data from Japan.
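One way to see the mechanism, under the expectations hypothesis and with a censored ("shadow") short rate, is the following sketch; the shadow-rate notation is ours and is meant only to illustrate the source of the convexity, not to restate the paper's model:

\[
i_t \;=\; \max\!\left(0,\; i_t^{*}\right),
\qquad
I_t^{(n)} \;=\; \frac{1}{n} \sum_{j=0}^{n-1} \mathrm{E}_t\!\left[ i_{t+j} \right].
\]

Because the max operator is convex, expected future short rates, and hence the long rate I_t^{(n)}, respond less to a cut in i_t^{*} than to an increase of the same size when the short rate is close to zero.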
Abstract:
This paper tests the predictions of the Barro-Gordon model using US data on inflation and unemployment. To that end, it constructs a general game-theoretical model with asymmetric preferences that nests the Barro-Gordon model and a version of Cukierman’s model as special cases. Likelihood Ratio tests indicate that the restriction imposed by the Barro-Gordon model is rejected by the data, but the one imposed by the version of Cukierman’s model is not. Reduced-form estimates are consistent with the view that the Federal Reserve weighs positive unemployment deviations from the expected natural rate more heavily than negative ones.
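A standard way to nest a quadratic (Barro-Gordon) objective and an asymmetric (Cukierman-type) one in a single parametric family is a linex term in unemployment deviations, as sketched below; whether this is the paper's exact specification is an assumption on our part:

\[
L_t \;=\; \frac{\lambda}{2}\left(\pi_t - \pi^{*}\right)^2
\;+\;
\frac{\exp\!\big(\gamma\,(u_t - u_t^{n})\big) - \gamma\,(u_t - u_t^{n}) - 1}{\gamma^{2}},
\]

which reduces to a quadratic loss as gamma approaches zero and penalizes positive deviations from the natural rate u_t^{n} more heavily than negative ones when gamma is positive, so the restrictions examined by the Likelihood Ratio tests correspond to parameter restrictions within such a family.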
Abstract:
This paper studies the persistent effects of monetary shocks on output. Previous empirical literature documents this persistence, but standard general equilibrium models with sticky prices fail to generate output responses beyond the duration of nominal contracts. This paper constructs and estimates a general equilibrium model with price rigidities, habit formation, and costly capital adjustment. The model is estimated via Maximum Likelihood using US data on output, the real money stock, and the nominal interest rate. Econometric results suggest that habit formation and adjustment costs to capital play an important role in explaining the output effects of monetary policy. In particular, impulse response analysis indicates that the model generates persistent, hump-shaped output responses to monetary shocks.
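For reference, habit formation of the kind described above is usually introduced through preferences in which current utility depends on consumption relative to a fraction of its own past, for example (an illustrative specification, not necessarily the one estimated in the paper):

\[
\mathrm{E}_0 \sum_{t=0}^{\infty} \beta^{t}\,
u\!\left(C_t - \gamma\, C_{t-1},\; \frac{M_t}{P_t},\; L_t\right),
\qquad 0 \le \gamma < 1,
\]

where the habit parameter gamma makes households smooth the growth rather than the level of consumption, slowing the response of spending, and hence output, to a monetary shock.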
Abstract:
The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid. The alternative solution proposed here is based on the constraint of a common general profit rate in both spaces and a money wage level that is determined simultaneously with prices.
Abstract:
The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid.
Abstract:
By reporting his satisfaction with his job or any other experience, an individual does not communicate the number of utils that he feels. Instead, he expresses his posterior preference over available alternatives conditional on acquired knowledge of the past. This new interpretation of reported job satisfaction restores the power of microeconomic theory without denying the essential role of discrepancies between one’s situation and available opportunities. Posterior human wealth discrepancies are found to be the best predictor of reported job satisfaction. Static models of relative utility and other subjective well-being assumptions are all unambiguously rejected by the data, as well as an "economic" model in which job satisfaction is a measure of posterior human wealth. The "posterior choice" model readily explains why so many people usually report themselves as happy or satisfied, why both younger and older age groups are insensitive to current earning discrepancies, and why the past weighs more heavily than the present and the future.
Abstract:
This paper extends the Competitive Storage Model by incorporating prominent features of the production process and financial markets. A major limitation of this basic model is that it cannot successfully explain the degree of serial correlation observed in actual data. The proposed extensions build on the observation that, in order to generate a high degree of price persistence, a model must incorporate features such that agents are willing to hold stocks more often than predicted by the basic model. We therefore allow unique characteristics of the production and trading mechanisms to provide the required incentives. Specifically, the proposed models introduce (i) gestation lags in production with heteroskedastic supply shocks, (ii) multiperiod forward contracts, and (iii) a convenience return to inventory holding. The rational expectations solutions for twelve commodities are solved numerically. Simulations are then employed to assess the effects of the above extensions on the time series properties of commodity prices. Results indicate that each of the features above partially accounts for the persistence and occasional spikes observed in actual data. Evidence is presented that the precautionary demand for stocks might play a substantial role in the dynamics of commodity prices.
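The basic model referred to above is typically closed by a no-arbitrage condition for competitive, risk-neutral stockholders, sketched here in a generic form (the notation is ours; S_t denotes stocks, P_t the price, k the per-period storage cost, and beta the discount factor):

\[
S_t \ge 0,
\qquad
\beta\, \mathrm{E}_t\!\left[ P_{t+1} \right] - k \;\le\; P_t,
\qquad
S_t \left( P_t - \beta\, \mathrm{E}_t\!\left[ P_{t+1} \right] + k \right) = 0,
\]

so stocks are held only when the expected discounted resale price covers the current price plus carrying costs. The extensions listed above (gestation lags, forward contracts, a convenience return) all relax this condition in ways that make storage, and therefore price persistence, more likely.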
Abstract:
This paper develops a model of money demand in which the opportunity cost of holding money is subject to regime changes. The regimes are fully characterized by the mean and variance of inflation and are assumed to be the result of alternative government policies. Agents are unable to directly observe whether government actions are indeed consistent with the inflation rate targeted as part of a stabilization program, but they can construct probability inferences on the basis of available observations of inflation and money growth. Government announcements are assumed to provide agents with additional, possibly truthful, information regarding the regime. This specification is estimated and tested using data from the Israeli and Argentine high-inflation periods. Results indicate that the successful stabilization program implemented in Israel in July 1985 was more credible than either the earlier Israeli attempt in November 1984 or the Argentine programs. The government’s signaling might substantially simplify the inference problem and increase the speed of learning on the part of the agents. However, under certain conditions, it might increase the volatility of inflation. After the introduction of an inflation stabilization plan, the welfare gains from a temporary increase in real balances might be high enough to induce agents to raise their real balances in the short term, even if they are uncertain about the nature of government policy and the eventual outcome of the stabilization attempt. Statistically, the model restrictions cannot be rejected at the 1% significance level.
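The inference problem described above can be summarized by a Bayesian updating rule of the following generic form (the notation is ours, with two regimes for concreteness):

\[
\Pr\!\left( R = H \mid y_{1:t} \right)
\;=\;
\frac{ f\!\left( y_t \mid R = H \right) \Pr\!\left( R = H \mid y_{1:t-1} \right) }
     { \sum_{r \in \{H,\,L\}} f\!\left( y_t \mid R = r \right) \Pr\!\left( R = r \mid y_{1:t-1} \right) },
\]

where y_t stacks the observed inflation and money growth, each regime r in {H, L} is characterized by the mean and variance of inflation, and a (possibly truthful) government announcement enters by shifting the prior Pr(R = r | y_{1:t-1}). The speed at which this posterior converges to the true regime is what the analysis relates to the credibility of the stabilization programs.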