848 results for [JEL:J41] Labor and Demographic Economics - Particular Labor Markets - Contracts: Specific Human Capital, Matching Models, Efficiency Wage Models, and Internal Labor Markets


Relevance: 100.00%

Abstract:

This paper studies monetary policy in an economy where the central banker's preferences are asymmetric around optimal inflation. In particular, positive deviations from the optimum can be weighted more, or less, severely than negative deviations in the policy maker's loss function. It is shown that under asymmetric preferences, uncertainty can induce a prudent behavior on the part of the central banker. Since the prudence motive can be large enough to override the inflation bias, optimal monetary policy could be implemented even in the absence of rules, reputation, or contractual mechanisms. For certain parameter values, a deflationary bias can arise in equilibrium.
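A concrete asymmetric specification consistent with this description is the linex loss function, a standard choice in this literature; the notation below (the asymmetry parameter a, inflation π, optimal inflation π*) is introduced here for illustration, not taken from the paper:

```latex
L(\pi) \;=\; \frac{\exp\{a(\pi-\pi^{*})\} \;-\; a(\pi-\pi^{*}) \;-\; 1}{a^{2}}
```

For a > 0, positive deviations from π* are penalized more heavily than negative ones; as a → 0 the function reduces to the symmetric quadratic (π − π*)²/2.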

Relevance: 100.00%

Abstract:

A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. A number of recent studies seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet these remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problems raised by the sup-type and combined test statistics, as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with: (i) one exogenous variable, and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
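The Monte Carlo test technique mentioned above can be sketched generically: simulate the statistic under the null hypothesis and use the rank of the observed value among the replicates. This is a minimal illustration assuming i.i.d. Gaussian errors, not the authors' implementation; the Goldfeld-Quandt-type variance ratio is only one of the criteria they consider.

```python
import numpy as np

def mc_p_value(stat_fn, observed, simulate_null, n_rep=99, seed=0):
    """Exact Monte Carlo p-value: with a pivotal statistic and n_rep
    null replicates, the test has exact size at levels that are
    multiples of 1/(n_rep + 1), regardless of sample size."""
    rng = np.random.default_rng(seed)
    reps = np.array([stat_fn(simulate_null(rng)) for _ in range(n_rep)])
    # count replicates at least as extreme as the observed statistic
    return (1 + np.sum(reps >= observed)) / (n_rep + 1)

# Illustration: a Goldfeld-Quandt-type variance-ratio statistic,
# comparing the second half of the errors to the first half.
n = 40
def gq_stat(e):
    return np.var(e[n // 2:], ddof=1) / np.var(e[:n // 2], ddof=1)

obs = gq_stat(np.random.default_rng(1).normal(size=n))  # homoskedastic data
p = mc_p_value(gq_stat, obs, lambda rng: rng.normal(size=n))
```

Because the statistic is pivotal under Gaussian errors, the p-value above is exact even in this small sample, which is the property the abstract emphasizes.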

Relevance: 100.00%

Abstract:

In this paper, we characterize the asymmetries of the smile through multiple leverage effects in a stochastic dynamic asset pricing framework. The dependence between price movements and future volatility is introduced through a set of latent state variables. These latent variables can capture not only the volatility risk and the interest rate risk which potentially affect option prices, but also any kind of correlation risk and jump risk. The standard financial leverage effect is produced by a cross-correlation effect between the state variables which enter into the stochastic volatility process of the stock price and the stock price process itself. However, we provide a more general framework where asymmetric implied volatility curves result from any source of instantaneous correlation between the state variables and either the return on the stock or the stochastic discount factor. In order to draw the shapes of the implied volatility curves generated by a model with latent variables, we specify an equilibrium-based stochastic discount factor with time non-separable preferences. When we calibrate this model to empirically reasonable values of the parameters, we are able to reproduce the various types of implied volatility curves inferred from option market data.

Relevance: 100.00%

Abstract:

Presently, conditions ensuring the validity of bootstrap methods for the sample mean of (possibly heterogeneous) near epoch dependent (NED) functions of mixing processes are unknown. Here we establish the validity of the bootstrap in this context, extending the applicability of bootstrap methods to a class of processes broadly relevant for applications in economics and finance. Our results apply to two block bootstrap methods: the moving blocks bootstrap of Künsch (1989) and Liu and Singh (1992), and the stationary bootstrap of Politis and Romano (1994). In particular, the consistency of the bootstrap variance estimator for the sample mean is shown to be robust against heteroskedasticity and dependence of unknown form. The first order asymptotic validity of the bootstrap approximation to the actual distribution of the sample mean is also established in this heterogeneous NED context.
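A minimal sketch of the moving blocks bootstrap for the variance of the sample mean, assuming a plain NumPy setting (the block length and replication count below are illustrative choices, not values from the paper):

```python
import numpy as np

def moving_blocks_bootstrap_var(x, block_len, n_boot=500, seed=0):
    """Bootstrap variance of the sample mean via the moving blocks
    bootstrap (Kunsch 1989; Liu and Singh 1992): draw overlapping
    blocks of consecutive observations, concatenate, and truncate
    to the original sample length."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts_max = n - block_len + 1  # admissible block starting points
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, starts_max, size=n_blocks)
        sample = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = sample.mean()
    return means.var(ddof=1)
```

With positively autocorrelated data, this blocked estimate exceeds the naive i.i.d. estimate var(x)/n, which ignores serial dependence; robustness of the estimator to dependence and heteroskedasticity of unknown form is the property the abstract establishes.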

Relevance: 100.00%

Abstract:

We consider a probabilistic approach to the problem of assigning k indivisible identical objects to a set of agents with single-peaked preferences. Using the ordinal extension of preferences, we characterize the class of uniform probabilistic rules by Pareto efficiency, strategy-proofness, and no-envy. We also show that in this characterization no-envy cannot be replaced by anonymity. When agents are strictly risk averse von-Neumann-Morgenstern utility maximizers, then we reduce the problem of assigning k identical objects to a problem of allocating the amount k of an infinitely divisible commodity.
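The divisible-commodity problem that the assignment reduces to is solved by the classic uniform rule for single-peaked preferences (Sprumont, 1991); the bisection below is an illustrative implementation of that rule, not code from the paper:

```python
def uniform_rule(peaks, k, tol=1e-9):
    """Uniform rule for dividing amount k of a divisible good among
    agents with single-peaked preferences.  In excess demand each
    agent receives min(peak_i, lam); in excess supply, max(peak_i, lam);
    the common bound lam is set by bisection so shares sum to k."""
    total = sum(peaks)
    if abs(total - k) < tol:
        return list(peaks)              # everyone gets their peak
    if total > k:                        # excess demand: cap shares
        lo, hi = 0.0, max(peaks)
        share = lambda lam: [min(p, lam) for p in peaks]
    else:                                # excess supply: floor shares
        lo, hi = 0.0, k
        share = lambda lam: [max(p, lam) for p in peaks]
    while hi - lo > tol:                 # total share is monotone in lam
        mid = (lo + hi) / 2
        if sum(share(mid)) < k:
            lo = mid
        else:
            hi = mid
    return share((lo + hi) / 2)
```

For example, with peaks (3, 1, 2) and k = 4 the rule gives (1.5, 1, 1.5): the agent demanding 1 is fully served, and the remainder is split evenly, which is the envy-free feature the characterization turns on.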

Relevance: 100.00%

Abstract:

It is often thought that a tariff reduction, by opening up the domestic market to foreign firms, should lessen the need for a policy aimed at discouraging domestic mergers. This implicitly assumes that the tariff in question is sufficiently high to prevent foreign firms from selling in the domestic market. However, not all tariffs are prohibitive, so foreign firms may be present in the domestic market before the tariff is abolished. Furthermore, even if the tariff is prohibitive, a merger of domestic firms may render it nonprohibitive, thus inviting foreign firms to penetrate the domestic market. In this paper, we show, using a simple example, that in the latter two cases, abolishing the tariff may in fact make the domestic merger more profitable. Hence, trade liberalization will not necessarily reduce the profitability of domestic mergers.

Relevance: 100.00%

Abstract:

The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid.

Relevance: 100.00%

Abstract:

We characterize the solution to a model of consumption smoothing using financing under non-commitment and savings. We show that, under certain conditions, these two different instruments complement each other perfectly. If the rate of time preference is equal to the interest rate on savings, perfect smoothing can be achieved in finite time. We also show that, when random revenues are generated by periodic investments in capital through a concave production function, the level of smoothing achieved through financial contracts can influence the productive investment efficiency. As long as financial contracts cannot achieve perfect smoothing, productive investment will be used as a complementary smoothing device.

Relevance: 100.00%

Abstract:

We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general, possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.
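The Anderson-Rubin idea behind the instrument substitution method can be sketched in a stripped-down setting with no included exogenous regressors (an illustration of the classical statistic, not the authors' generalized procedure):

```python
import numpy as np

def anderson_rubin_stat(y, X, Z, beta0):
    """Anderson-Rubin (1949) statistic for H0: beta = beta0 in
    y = X beta + u, with an n x k instrument matrix Z.  Under
    Gaussian errors and exogenous instruments it is exactly
    F(k, n - k); its validity does not depend on instrument strength."""
    n, k = Z.shape
    u0 = y - X @ beta0                               # restricted residuals
    Pu = Z @ np.linalg.lstsq(Z, u0, rcond=None)[0]   # projection onto span(Z)
    ssr_fit = Pu @ Pu                                # part explained by Z
    ssr_res = u0 @ u0 - ssr_fit                      # part orthogonal to Z
    return (ssr_fit / k) / (ssr_res / (n - k))
```

Inverting the statistic (collecting all beta0 that are not rejected) yields the finite-sample confidence sets described above; with weak instruments those sets can be unbounded, which is exactly why no pre-test for identification is needed.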

Relevance: 100.00%

Abstract:

Recent work suggests that the conditional variance of financial returns may exhibit sudden jumps. This paper extends to higher conditional moments, in particular the conditional variance, a non-parametric procedure developed by Delgado and Hidalgo (1996) for detecting discontinuities in otherwise continuous functions of a random variable. Simulation results show that the procedure provides reasonable estimates of the number and location of jumps. The procedure detects several jumps in the conditional variance of daily returns on the S&P 500 index.
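A much simplified version of the detection idea, contrasting local averages of squared returns on either side of each point, can be sketched as follows; the uniform window here stands in for the paper's kernel-based estimator and is only an illustration:

```python
import numpy as np

def variance_jump_scan(returns, h):
    """Simplified variance-discontinuity scan: at each interior point,
    compare the mean of squared returns over the h observations to the
    right with the mean over the h observations to the left.  A large
    absolute contrast suggests a jump in the conditional variance."""
    x2 = np.asarray(returns, dtype=float) ** 2
    n = len(x2)
    contrast = np.full(n, np.nan)      # edges left undefined
    for t in range(h, n - h):
        contrast[t] = x2[t:t + h].mean() - x2[t - h:t].mean()
    return contrast
```

On a series whose volatility shifts once, the largest absolute contrast lands near the shift point, mimicking how the full procedure locates jumps.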

Relevance: 100.00%

Abstract:

We propose two axiomatic theories of cost sharing with the common premise that agents demand comparable (though perhaps different) commodities and are responsible for their own demand. Under partial responsibility the agents are not responsible for the asymmetries of the cost function: two agents consuming the same amount of output always pay the same price; this holds true under full responsibility only if the cost function is symmetric in all individual demands. If the cost function is additively separable, each agent pays her stand-alone cost under full responsibility; this holds true under partial responsibility only if, in addition, the cost function is symmetric. By generalizing Moulin and Shenker's (1999) Distributivity axiom to cost-sharing methods for heterogeneous goods, we identify in each of our two theories a different serial method. The subsidy-free serial method (Moulin, 1995) is essentially the only distributive method meeting Ranking and Dummy. The cross-subsidizing serial method (Sprumont, 1998) is the only distributive method satisfying Separability and Strong Ranking. Finally, we propose an alternative characterization of the latter method based on a strengthening of Distributivity.

Relevance: 100.00%

Abstract:

We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to simulation-based estimates of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995.
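The multivariate skewness and kurtosis criteria referred to above are Mardia's statistics; the sketch below computes the generic versions on a residual matrix (the paper additionally standardizes the residuals to achieve invariance to the error covariance matrix, a step omitted here):

```python
import numpy as np

def mardia_statistics(E):
    """Mardia's multivariate skewness b_{1,p} and kurtosis b_{2,p}
    for an n x p matrix of residuals.  Under multivariate normality,
    b_{1,p} tends to 0 and b_{2,p} tends to p(p + 2)."""
    n, p = E.shape
    C = E - E.mean(axis=0)              # centered residuals
    S_inv = np.linalg.inv(C.T @ C / n)  # inverse ML covariance estimate
    D = C @ S_inv @ C.T                 # Mahalanobis cross-products
    skew = (D ** 3).sum() / n ** 2      # b_{1,p}
    kurt = np.mean(np.diag(D) ** 2)     # b_{2,p}
    return skew, kurt
```

In the Monte Carlo test framework of the paper, these empirical criteria would be compared with replicates simulated under the hypothesized error distribution rather than with their asymptotic reference values.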

Relevance: 100.00%

Abstract:

Traditional explanations for Western Europe's demographic growth in the High Middle Ages are unable to explain the rise in per-capita income that accompanied observed population changes. Here, we examine the hypothesis that an innovation in information technology changed the optimal structure of contracts and raised the productivity of human capital. We present historical evidence for this thesis, offer a theoretical explanation based on transaction costs, and test the theory's predictions with data on urban demographic growth. We find that the information-technology hypothesis significantly increases the capacity of the neoclassical growth model to explain European economic expansion between 1000 and 1300.

Relevance: 100.00%

Abstract:

This paper studies the interdependence between fiscal and monetary policies, and their joint role in the determination of the price level. The government is characterized by a long-run fiscal policy rule whereby a given fraction of the outstanding debt, say d, is backed by the present discounted value of current and future primary surpluses. The remaining debt is backed by seigniorage revenue. The parameter d characterizes the interdependence between fiscal and monetary authorities. It is shown that in a standard monetary economy, this policy rule implies that the price level depends not only on the money stock, but also on the proportion of debt that is backed with money. Empirical estimates of d are obtained for OECD countries using data on nominal consumption, monetary base, and debt. Results indicate that debt plays only a minor role in the determination of the price level in these economies. Estimates of d correlate well with institutional measures of central bank independence.

Relevance: 100.00%

Abstract:

A group of agents participate in a cooperative enterprise producing a single good. Each participant contributes a particular type of input; output is nondecreasing in these contributions. How should it be shared? We analyze the implications of the axiom of Group Monotonicity: if a group of agents simultaneously decrease their input contributions, not all of them should receive a higher share of output. We show that in combination with other more familiar axioms, this condition pins down a very small class of methods, which we dub nearly serial.