339 results for Fibre-reinforced concrete--Mechanical properties--Mathematical models


Relevance:

20.00%

Publisher:

Abstract:

This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
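
A minimal single-equation sketch of the leads-and-lags augmentation, in the spirit of dynamic OLS (the paper's estimator applies feasible GLS to the whole system using the long-run covariance matrix, which is not reproduced here); the data, the truncation k, and the function name are hypothetical.

```python
import numpy as np

def leads_lags_estimator(y, x, k=2):
    """Estimate a cointegrating slope by regressing y_t on x_t and on
    leads/lags of the first differences of x_t (single-equation sketch)."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)                      # dx[t-1] = x_t - x_{t-1}
    T = len(y)
    rows = range(k + 1, T - k)           # drop observations lost to leads/lags
    X = []
    for t in rows:
        row = [1.0, x[t]]                # intercept and level of the regressor
        row += [dx[t - 1 + j] for j in range(-k, k + 1)]   # leads and lags of dx
        X.append(row)
    X = np.array(X)
    beta, *_ = np.linalg.lstsq(X, y[list(rows)], rcond=None)
    return beta[1]                       # coefficient on x_t: the cointegrating slope

# Example with simulated data: x is a random walk, y = 2*x + stationary noise.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=500))
y = 2.0 * x + rng.normal(size=500)
print(leads_lags_estimator(y, x, k=2))   # should be close to 2
```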

Relevance:

20.00%

Publisher:

Abstract:

A group of agents located along a river have quasi-linear preferences over water and money. We ask how the water should be allocated and what money transfers should be performed. We are interested in efficiency, stability (in the sense of the core), and fairness (in a sense to be defined). We first show that the cooperative game associated with our problem is convex: its core is therefore large and easily described. Next, we propose the following fairness requirement: no group of agents should enjoy welfare higher than what it could achieve in the absence of the remaining agents. We prove that only one welfare vector in the core satisfies this condition: it is the marginal contribution vector corresponding to the ordering of the agents along the river. We discuss how it could be decentralized or implemented.
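
A small sketch of the marginal contribution vector for a given ordering of the players, together with a brute-force core check; the characteristic function below is a toy supermodular (convex) example, not the paper's river model.

```python
from itertools import chain, combinations

def marginal_contribution_vector(players, v):
    """Welfare vector giving each player her marginal contribution when
    players enter in the given order (here: upstream to downstream)."""
    payoffs, coalition = {}, ()
    for p in players:
        new_coalition = coalition + (p,)
        payoffs[p] = v(frozenset(new_coalition)) - v(frozenset(coalition))
        coalition = new_coalition
    return payoffs

def in_core(players, v, payoffs):
    """Check the core constraints: every coalition gets at least v(S)."""
    def subsets(s):
        return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))
    if abs(sum(payoffs.values()) - v(frozenset(players))) > 1e-9:
        return False
    return all(sum(payoffs[p] for p in S) >= v(frozenset(S)) - 1e-9
               for S in subsets(players))

# Toy convex characteristic function (purely illustrative, not the river model):
# v(S) = (sum of the members' indices)^2 / 10, which is supermodular.
players = (1, 2, 3)
v = lambda S: sum(S) ** 2 / 10.0
phi = marginal_contribution_vector(players, v)
print(phi, in_core(players, v, phi))
```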

Relevance:

20.00%

Publisher:

Abstract:

In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
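
The exact tests rely on the generic Monte Carlo test technique: when a statistic is pivotal under the null, its p-value can be obtained exactly from a finite number of simulated draws. A minimal sketch, with a chi-square draw standing in for whichever criterion (LR, trace, maximum root) would actually be simulated:

```python
import numpy as np

def monte_carlo_pvalue(stat_obs, simulate_stat, n_rep=99, rng=None):
    """Monte Carlo p-value: when the statistic is pivotal under the null,
    p = (1 + #{simulated >= observed}) / (n_rep + 1) gives a size-correct test."""
    rng = rng or np.random.default_rng()
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

# Illustration only: a hypothetical observed value and a chi-square(3) draw as a
# stand-in for the nuisance-parameter-free null distribution of the criterion.
rng = np.random.default_rng(1)
stat_obs = 7.3
pval = monte_carlo_pvalue(stat_obs, lambda g: g.chisquare(3), n_rep=999, rng=rng)
print(pval)
```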

Relevance:

20.00%

Publisher:

Abstract:

A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods. Yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the new tests suggested. We show that the MC test procedure conveniently solves the intractable null distribution problems raised in particular by the sup-type and combined test statistics, as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
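
One concrete instance of the approach, sketched under simplifying assumptions: a Goldfeld-Quandt-type variance-ratio statistic (without the usual omission of central observations) whose null distribution is simulated exactly, since under i.i.d. Gaussian errors the statistic does not depend on the regression coefficients or the error variance. Data and tuning values are hypothetical.

```python
import numpy as np

def goldfeld_quandt_stat(y, X, split):
    """Ratio of residual variances from separate OLS fits on the two subsamples."""
    def s2(y, X):
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        e = y - X @ b
        return e @ e / (len(y) - X.shape[1])
    return s2(y[split:], X[split:]) / s2(y[:split], X[:split])

def mc_pvalue(stat_obs, X, split, n_rep=99, rng=None):
    """Under i.i.d. Gaussian errors the statistic is invariant to the regression
    coefficients and the error variance, so drawing standard normal disturbances
    simulates its null distribution exactly."""
    rng = rng or np.random.default_rng()
    sims = [goldfeld_quandt_stat(rng.standard_normal(len(X)), X, split)
            for _ in range(n_rep)]
    return (1 + sum(s >= stat_obs for s in sims)) / (n_rep + 1)

# Hypothetical data: the error standard deviation increases with the regressor.
rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), np.sort(rng.uniform(0, 1, n))])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5 + 2.0 * X[:, 1], size=n)
s = goldfeld_quandt_stat(y, X, split=n // 2)
print(s, mc_pvalue(s, X, split=n // 2, n_rep=199, rng=rng))
```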

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the question of whether R&D should be carried out by an independent research unit or be produced in-house by the firm marketing the innovation. We define two organizational structures. In an integrated structure, the firm that markets the innovation also carries out and finances research leading to the innovation. In an independent structure, the firm that markets the innovation buys it from an independent research unit which is financed externally. We compare the two structures under the assumption that the research unit has some private information about the real cost of developing the new product. When development costs are negatively correlated with revenues from the innovation, the integrated structure dominates. The independent structure dominates in the opposite case.

Relevance:

20.00%

Publisher:

Abstract:

This paper proves a new representation theorem for domains with both discrete and continuous variables. The result generalizes Debreu's well-known representation theorem on connected domains. A strengthening of the standard continuity axiom is used in order to guarantee the existence of a representation. A generalization of the main theorem and an application of the more general result are also presented.
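
For readers less familiar with the terminology, a (numerical) representation of a preference relation ≿ on a domain X is a function that ranks alternatives exactly as the relation does; this is the object whose existence the theorem guarantees (standard statement only; the paper's strengthened continuity axiom is not reproduced here).

```latex
\[
  u : X \to \mathbb{R}
  \quad \text{such that} \quad
  x \succsim y \;\Longleftrightarrow\; u(x) \ge u(y)
  \quad \text{for all } x, y \in X .
\]
```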

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has, as special examples, log-normal, affine, and GARCH diffusion models. Using previous empirical work, we show that the standard deviation of the noise is not negligible with respect to the mean and the standard deviation of the integrated volatility, even when one considers five-minute returns. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
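
In the notation commonly used for this problem (not necessarily the paper's), the quantities compared are the realized volatility built from m intraday returns, the integrated volatility it is meant to measure, and their difference, which is the noise whose mean, variance and correlation with integrated volatility the paper studies at a fixed sampling frequency.

```latex
\[
  RV_t^{(m)} = \sum_{i=1}^{m} r_{i,t}^2 ,
  \qquad
  IV_t = \int_{t-1}^{t} \sigma^2(s)\, ds ,
  \qquad
  u_t^{(m)} = RV_t^{(m)} - IV_t .
\]
```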

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials, respectively. The eigenfunction approach has at least six advantages: (i) it is general, since any square integrable function may be written as a linear combination of the eigenfunctions; (ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal components analysis; (iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; (iv) more importantly, this generates fat tails for the variance and returns processes; (v) in contrast to popular models, the variance of the variance is a flexible function of the variance; (vi) these models are closed under temporal aggregation.
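
A schematic statement of the eigenfunction construction, written for the discrete-time case and with notation that may differ from the paper's: the P_i are eigenfunctions of the conditional expectation operator of the state f_t, and the variance loads linearly on finitely many of them.

```latex
\[
  E\!\left[\, P_i(f_{t+1}) \mid f_t \,\right] = \lambda_i\, P_i(f_t),
  \qquad
  E\!\left[\, P_i(f_t)\, P_j(f_t) \,\right] = 0 \quad (i \neq j),
\]
\[
  \sigma_t^2 = a_0 + \sum_{i=1}^{p} a_i\, P_i(f_t),
\]
% with P_i given by Hermite polynomials for an AR(1) Gaussian (log-normal)
% state and by Laguerre polynomials for a square-root state.
```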

Relevance:

20.00%

Publisher:

Abstract:

Granger causality is usually defined as the predictability of one vector of variables by another one period ahead. Recently, Lutkepohl (1990) proposed defining non-causality between two variables (or vectors) as non-predictability at all horizons in the future. When more than two vectors are considered (i.e., when the information set contains auxiliary variables), these two notions are not equivalent. In this paper, we first generalize the earlier notions of causality by considering causality at a given arbitrary horizon h, finite or infinite. We then derive necessary and sufficient conditions for non-causality between two vectors of variables (within a larger vector) up to a given horizon h. The models considered include vector autoregressions, possibly of infinite order, and multivariate ARIMA models. In particular, we give separability and rank conditions for non-causality up to horizon h which are relatively simple to verify.
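
One common way to formalize these notions (in the spirit of the definitions above; the paper's exact conditions are more general): write I_t for an information set containing the histories of X, Y and the auxiliary variables up to time t, I_t^{-Y} for the same set with the history of Y removed, and P[· | ·] for the best linear forecast.

```latex
% Y does not cause X at horizon h given the information set:
\[
  P\!\left[ X_{t+h} \mid I_t \right] \;=\; P\!\left[ X_{t+h} \mid I_t^{-Y} \right]
  \qquad \text{for all } t,
\]
% and Y does not cause X up to horizon h when this equality holds for every
% horizon 1, 2, \dots, h. With only two vectors, non-causality one step ahead
% and non-causality at all horizons coincide; with auxiliary variables they
% need not, which is what motivates the horizon-h conditions.
```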

Relevance:

20.00%

Publisher:

Abstract:

This paper extends the Competitive Storage Model by incorporating prominent features of the production process and financial markets. A major limitation of this basic model is that it cannot successfully explain the degree of serial correlation observed in actual data. The proposed extensions build on the observation that, in order to generate a high degree of price persistence, a model must incorporate features such that agents are willing to hold stocks more often than predicted by the basic model. We therefore allow unique characteristics of the production and trading mechanisms to provide the required incentives. Specifically, the proposed models introduce (i) gestation lags in production with heteroskedastic supply shocks, (ii) multiperiod forward contracts, and (iii) a convenience return to inventory holding. The rational expectations equilibrium is solved numerically for twelve commodities. Simulations are then employed to assess the effects of the above extensions on the time series properties of commodity prices. Results indicate that each of the features above partially accounts for the persistence and occasional spikes observed in actual data. Evidence is presented that the precautionary demand for stocks might play a substantial role in the dynamics of commodity prices.
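
For context, the basic competitive storage model that the paper extends is usually summarized by a speculative no-arbitrage condition of roughly the following form (notation and timing conventions may differ from the paper's): price p_t, inverse demand P(·), demand D(·), availability x_t, non-negative stocks s_t, depreciation δ, interest rate r, and harvest shock ω_{t+1}.

```latex
\[
  p_t = \max\!\left\{ P(x_t),\; \frac{1-\delta}{1+r}\, E_t\!\left[ p_{t+1} \right] \right\},
  \qquad
  s_t = x_t - D(p_t),
  \qquad
  x_{t+1} = (1-\delta)\, s_t + \omega_{t+1} .
\]
% Stocks are held only when the discounted expected resale price covers the
% current price; the extensions above (gestation lags, forward contracts,
% convenience yield) make stock-holding attractive more often, which is the
% channel through which they raise price persistence.
```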

Relevance:

20.00%

Publisher:

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
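
A crude numerical sketch of the projection idea: if a confidence set for the free parameters has level 1 − α, the range of an endogenous variable over that set is a confidence interval with level at least 1 − α, regardless of how nonlinear the model is. Here the "model" is a stand-in function, the parameter set is an ellipsoid, and the min and max are approximated by sampling rather than by the methods a real CGE application would use.

```python
import numpy as np

def projection_interval(g, center, cov, radius2, n_draws=20000, rng=None):
    """Approximate the projection-based interval [min g, max g] over the
    ellipsoid {theta : (theta - center)' cov^{-1} (theta - center) <= radius2}
    by evaluating g at points drawn uniformly from the ellipsoid."""
    rng = rng or np.random.default_rng()
    L = np.linalg.cholesky(cov)
    k = len(center)
    lo, hi = np.inf, -np.inf
    for _ in range(n_draws):
        z = rng.standard_normal(k)
        u = rng.uniform() ** (1.0 / k) * z / np.linalg.norm(z)  # uniform in unit ball
        theta = center + np.sqrt(radius2) * (L @ u)             # map into ellipsoid
        val = g(theta)
        lo, hi = min(lo, val), max(hi, val)
    return lo, hi

# Hypothetical "model": an endogenous variable that is a nonlinear function of
# two calibrated parameters (stand-in for solving the CGE model at theta).
g = lambda th: th[0] * np.exp(th[1])
center = np.array([1.0, 0.5])
cov = np.array([[0.04, 0.01], [0.01, 0.02]])
r2 = 5.99    # chi-square(2) 95% critical value for a Wald-type parameter set
print(projection_interval(g, center, cov, r2, rng=np.random.default_rng(3)))
```

The sampling step only approximates the exact extrema; optimizing g over the set would give sharper, still conservative, endpoints.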

Relevance:

20.00%

Publisher:

Abstract:

A full understanding of public affairs requires the ability to distinguish between the policies that voters would like the government to adopt, and the influence that different voters or groups of voters actually exert in the democratic process. We consider the properties of a computable equilibrium model of a competitive political economy in which the economic interests of groups of voters and their effective influence on equilibrium policy outcomes can be explicitly distinguished and computed. The model incorporates an amended version of the GEMTAP tax model, and is calibrated to data for the United States for 1973 and 1983. Emphasis is placed on how the aggregation of GEMTAP households into groups, within which economic and political behaviour is assumed homogeneous, affects the numerical representation of interests and influence for representative members of each group. Experiments with the model suggest that changes in both interests and influence are important parts of the story behind the evolution of U.S. tax policy in the decade after 1973.

Relevance:

20.00%

Publisher:

Abstract:

In a recent paper, Bai and Perron (1998) considered theoretical issues related to the limiting distribution of estimators and test statistics in the linear model with multiple structural changes. In this companion paper, we consider practical issues for the empirical applications of the procedures. We first address the problem of estimation of the break dates and present an efficient algorithm to obtain global minimizers of the sum of squared residuals. This algorithm is based on the principle of dynamic programming and requires at most least-squares operations of order O(T²) for any number of breaks. Our method can be applied to both pure and partial structural-change models. Second, we consider the problem of forming confidence intervals for the break dates under various hypotheses about the structure of the data and the errors across segments. Third, we address the issue of testing for structural changes under very general conditions on the data and the errors. Fourth, we address the issue of estimating the number of breaks. We present simulation results pertaining to the behavior of the estimators and tests in finite samples. Finally, a few empirical applications are presented to illustrate the usefulness of the procedures. All methods discussed are implemented in a GAUSS program available upon request for non-profit academic use.
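
A compact sketch of the dynamic-programming idea behind the break-date estimation (not the authors' GAUSS program): compute the sum of squared residuals for every admissible segment, at most O(T²) least-squares fits, then combine them recursively to obtain the global minimizer for a given number of breaks. The simulated series and the minimal segment length h are hypothetical.

```python
import numpy as np

def segment_ssr(y, X, h):
    """SSR of an OLS fit on every admissible segment [i, j] (0-based, inclusive)
    with minimal length h: O(T^2) least-squares fits."""
    T = len(y)
    ssr = np.full((T, T), np.inf)
    for i in range(T):
        for j in range(i + h - 1, T):
            Xi, yi = X[i:j + 1], y[i:j + 1]
            b, *_ = np.linalg.lstsq(Xi, yi, rcond=None)
            e = yi - Xi @ b
            ssr[i, j] = e @ e
    return ssr

def optimal_breaks(y, X, m, h):
    """Global minimizer of the total SSR with m breaks, by dynamic programming:
    opt[k, t] = minimal SSR when splitting y[0..t] into k+1 segments."""
    T = len(y)
    ssr = segment_ssr(y, X, h)
    opt = np.full((m + 1, T), np.inf)
    arg = np.zeros((m + 1, T), dtype=int)
    opt[0] = ssr[0]                                  # no break: one segment [0, t]
    for k in range(1, m + 1):
        for t in range((k + 1) * h - 1, T):
            cand = [opt[k - 1, b] + ssr[b + 1, t] for b in range(k * h - 1, t - h + 1)]
            best = int(np.argmin(cand))
            opt[k, t] = cand[best]
            arg[k, t] = k * h - 1 + best             # last break date before t
    breaks, t = [], T - 1                            # backtrack the break dates
    for k in range(m, 0, -1):
        b = arg[k, t]
        breaks.append(b)
        t = b
    return sorted(breaks), opt[m, T - 1]

# Hypothetical series with two shifts in the mean (pure structural change).
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0, 1, 80), rng.normal(3, 1, 70), rng.normal(-2, 1, 90)])
X = np.ones((len(y), 1))
print(optimal_breaks(y, X, m=2, h=15))               # break dates near 79 and 149
```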

Relevance:

20.00%

Publisher:

Abstract:

We extend the class of M-tests for a unit root analyzed by Perron and Ng (1996) and Ng and Perron (1997) to the case where a change in the trend function is allowed to occur at an unknown time. These tests, denoted M(GLS), adopt the GLS detrending approach of Dufour and King (1991) and Elliott, Rothenberg and Stock (1996) (ERS). Following Perron (1989), we consider two models: one allowing for a change in slope and the other for both a change in intercept and slope. We derive the asymptotic distribution of the tests as well as that of the feasible point optimal tests PT(GLS) suggested by ERS. The asymptotic critical values of the tests are tabulated. Also, we compute the non-centrality parameter used for the local GLS detrending that permits the tests to have 50% asymptotic power at that value. We show that the M(GLS) and PT(GLS) tests have an asymptotic power function close to the power envelope. An extensive simulation study analyzes the size and power in finite samples under various methods to select the truncation lag for the autoregressive spectral density estimator. An empirical application is also provided.
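
For reference, the GLS detrending step and two of the M statistics in their standard no-break form (the paper's versions allow a change in the trend function at an unknown date and use different critical values); s² denotes an autoregressive estimate of the spectral density at frequency zero, and notation may differ from the paper's.

```latex
% Quasi-difference the data and the deterministic terms z_t at
% \bar{\alpha} = 1 + \bar{c}/T, regress the former on the latter to get
% \hat{\psi}, and detrend:
\[
  (y_1,\; y_2 - \bar{\alpha} y_1,\; \dots,\; y_T - \bar{\alpha} y_{T-1}),
  \qquad
  \tilde{y}_t = y_t - \hat{\psi}' z_t .
\]
% M statistics computed from the GLS-detrended series:
\[
  MZ_{\alpha}^{GLS}
  = \frac{T^{-1} \tilde{y}_T^{\,2} - s^2}{2\, T^{-2} \sum_{t=1}^{T} \tilde{y}_{t-1}^{\,2}},
  \qquad
  MSB^{GLS}
  = \left( \frac{T^{-2} \sum_{t=1}^{T} \tilde{y}_{t-1}^{\,2}}{s^2} \right)^{1/2} .
\]
```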