578 results for "Modélisation phénoménologique"


Abstract:

The textural division of a mineral into pyramids, with their apices located at the centre of the mineral and their bases corresponding to the mineral faces, is called textural sector zoning. Textural sector zoning is observed in many metamorphic minerals, such as andalusite and garnet. Garnets found in the graphite-rich black shales of the Mesozoic cover of the Gotthard Massif display textural sector zoning, and its morphology is not the same in the different types of black shales observed in the Nufenen pass area. Garnets in foliated black shales display well-developed sector zoning, while garnets found in cm-scale layered black shales display well-developed sectors only in the direction of the schistosity plane. This sector zoning is always associated with birefringent lamellae, up to 30 μm wide, emanating radially from the sector boundaries and alternating with isotropic lamellae. The garnet-forming reaction was determined using a singular value decomposition approach, and the results were compared with thermodynamic calculations. It is of the form chl + mu + cc + cld = bt + fds + ank + gt + czo and is similar in both layered and foliated black shales. The calculated X(O) is close to 0.36 and does not vary significantly during the metamorphic history of the rock; this corresponds to essentially constant X(CO2), X(CH4), and X(H2O). BSE imaging of garnets on oriented cuts revealed that the orientation of the lamellae found within the sectors is crystallographically controlled. BSE imaging and electron microprobe analyses also revealed that the birefringent lamellae are calcium-rich compared with the isotropic lamellae. The addition of Ca to an almandine-rich garnet causes a small distortion of the X site and, potentially, ordering. Ordered and disordered garnet might have very similar free energies at this composition, so two garnets of different composition can be precipitated with only minor overstepping of the reaction. It suffices that continued nucleation of a new garnet layer slightly favours the same structure to ensure fibre-like growth of the two garnet compositions side by side. This hypothesis is in agreement with the thermodynamic properties of the garnet solid solution described in the literature and could explain the textures observed in garnets of these compositions. To understand the differences in sector zoning morphology and in crystal growth kinetics, crystal size distributions were determined in several samples using 2D spatial analysis of slab surfaces. The same nucleation rate law was chosen in all cases, but different growth rate laws were used for layered and non-layered black shales. Garnet in layered black shales grew according to a growth rate law of the form R = kt^(1/2): transport of nutrients is the limiting factor, and transport occurs preferentially along the schistosity planes. The garnets in such rocks are therefore ovoid, with the longest axis parallel to the schistosity planes, and sector zoning is less developed, with sectors present only parallel to the schistosity planes. Garnet in non-layered black shales grew according to a growth rate law of the form R = kt: the limiting factor is attachment at the garnet surface, and garnets in these rocks display well-developed sector zoning in all directions. The growth rate law is thus influenced by the texture of the rock, which favours or hinders the transport of nutrients to the mineral surface.
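
As a rough illustration of how the two growth laws shape crystal size distributions, the sketch below simulates continuous nucleation at a constant rate and grows each crystal under either R = kt (interface/attachment-controlled) or R = kt^(1/2) (transport-controlled). It is a minimal toy model, not the thesis's calibration; all parameter values are arbitrary.

```python
import numpy as np

# Illustrative parameters (not fitted values from the study)
t_end = 1.0          # total growth duration (arbitrary units)
n_crystals = 10000   # number of nucleation events
k = 1.0              # growth-rate constant

rng = np.random.default_rng(0)
# Constant nucleation rate: nucleation times uniform on [0, t_end]
t_nuc = rng.uniform(0.0, t_end, n_crystals)
dt = t_end - t_nuc   # time each crystal has been growing

# Interface (attachment) controlled growth: R = k * t
r_interface = k * dt
# Transport (diffusion) controlled growth: R = k * t^(1/2)
r_transport = k * np.sqrt(dt)

for name, r in [("R = k*t", r_interface), ("R = k*t^(1/2)", r_transport)]:
    hist, edges = np.histogram(r, bins=20, range=(0, r.max()))
    print(name, "mean size:", r.mean().round(3),
          "largest-bin share:", (hist.max() / n_crystals).round(3))
```

The concave transport-limited law front-loads growth, so late nucleators catch up quickly and the size distribution piles up near the large end, whereas the linear law spreads sizes evenly over nucleation times.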

Abstract:

In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we highlight the particular light that econometrics can shed on some general themes of methodology and philosophy of science, such as falsifiability as a criterion for the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we emphasize the contrast between two styles of modeling, the parsimonious approach and the statistical-descriptive approach, and we discuss the links between the theory of statistical tests and the philosophy of science.

Abstract:

In this paper, we study several tests for the equality of two unknown distributions. Two are based on empirical distribution functions, three others on nonparametric probability density estimates, and the last ones on differences between sample moments. We suggest controlling the size of such tests (under nonparametric assumptions) by using permutational versions of the tests jointly with the method of Monte Carlo tests, properly adjusted to deal with discrete distributions. We also propose a combined test procedure whose level is again perfectly controlled through the Monte Carlo test technique and which has better power properties than the individual tests being combined. Finally, in a simulation experiment, we show that the suggested technique provides perfect control of test size and that the new tests proposed can yield sizeable power improvements.
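
As a hedged illustration of the core idea, the sketch below computes a permutational Monte Carlo p-value for a two-sample Kolmogorov-Smirnov statistic (one of several statistics the paper considers). It omits the paper's adjustment for discrete distributions, and all names and parameters are illustrative.

```python
import numpy as np
from scipy.stats import ks_2samp

def perm_mc_pvalue(x, y, stat, n_perm=999, seed=0):
    """Monte Carlo p-value from the permutation distribution of a
    two-sample statistic; with a finite number of permutations the
    test has exact level under exchangeability."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    s_obs = stat(x, y)
    s_perm = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)                      # random relabelling
        s_perm[i] = stat(pooled[:len(x)], pooled[len(x):])
    # p-value based on the rank of the observed statistic
    return (1 + np.sum(s_perm >= s_obs)) / (n_perm + 1)

ks_stat = lambda a, b: ks_2samp(a, b).statistic
x = np.random.default_rng(1).normal(0.0, 1.0, 50)
y = np.random.default_rng(2).normal(0.5, 1.0, 50)
print(perm_mc_pvalue(x, y, ks_stat))
```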

Abstract:

In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise, and the correlation between the noise and the integrated volatility, in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model includes, as special cases, log-normal, affine, and GARCH diffusion models. Drawing on previous empirical work, we show that the standard deviation of the noise is not negligible relative to the mean and the standard deviation of the integrated volatility, even when five-minute returns are used. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
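
To make the realized-versus-integrated distinction concrete, here is a minimal simulation sketch: a log-normal stochastic volatility diffusion is discretized on a fine grid, integrated volatility is accumulated on that grid, and realized volatility is computed from coarser five-minute returns. The dynamics and parameter values are illustrative placeholders, not Meddahi's ESV specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, fine, coarse = 250, 2880, 288       # 288 five-minute returns per day
dt = 1.0 / fine

# Log-normal SV: d log(v) = kappa * (theta - log v) dt + xi dW  (illustrative)
kappa, theta, xi = 4.0, np.log(0.1), 0.5
log_v = theta
noise = []
for _ in range(n_days):
    iv, acc, rets = 0.0, 0.0, []
    for i in range(fine):
        v = np.exp(log_v)
        iv += v * dt                                     # integrated volatility
        acc += np.sqrt(v * dt) * rng.standard_normal()   # fine-grid return
        if (i + 1) % (fine // coarse) == 0:              # close a 5-min interval
            rets.append(acc)
            acc = 0.0
        log_v += kappa * (theta - log_v) * dt + xi * np.sqrt(dt) * rng.standard_normal()
    rv = np.sum(np.square(rets))                         # realized volatility
    noise.append(rv - iv)

noise = np.array(noise)
print("mean(RV - IV):", noise.mean(), " sd(RV - IV):", noise.std())
```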

Abstract:

In this paper, we introduce a new approach to volatility modeling in discrete and continuous time. We follow the stochastic volatility literature in assuming that the variance is a function of a state variable. However, instead of choosing the loading function ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials respectively. The eigenfunction approach has at least six advantages: (i) it is general, since any square-integrable function may be written as a linear combination of the eigenfunctions; (ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal component analysis; (iii) the implied dynamics of the variance and squared-return processes are ARMA and, hence, simple for forecasting and inference purposes; (iv) more importantly, this generates fat tails for the variance and return processes; (v) in contrast to popular models, the variance of the variance is a flexible function of the variance; (vi) these models are closed under temporal aggregation.
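
A small numerical check of the eigenfunction property in the discrete-time log-normal case: for a stationary Gaussian AR(1) state, the probabilists' Hermite polynomials He_n satisfy E[He_n(f_t) | f_{t-1}] = rho^n He_n(f_{t-1}), so He_n(f_t) has first-order autocorrelation rho^n. The sketch below verifies this by simulation; it demonstrates the building block only, not the full variance model, and the parameter values are arbitrary.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)
rho, n_obs = 0.95, 200_000

# Gaussian AR(1) state with N(0, 1) stationary distribution
f = np.empty(n_obs)
f[0] = rng.standard_normal()
shocks = rng.standard_normal(n_obs)
for t in range(1, n_obs):
    f[t] = rho * f[t - 1] + np.sqrt(1 - rho**2) * shocks[t]

# He_n are eigenfunctions of the conditional expectation operator,
# with eigenvalues rho^n: corr(He_n(f_t), He_n(f_{t-1})) = rho^n
for n in (1, 2, 3):
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    h = hermeval(f, coeffs)               # evaluate He_n along the state path
    acf1 = np.mean(h[1:] * h[:-1]) / np.mean(h * h)
    print(f"He_{n}: sample autocorrelation {acf1:.3f} vs rho^{n} = {rho**n:.3f}")
```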

Abstract:

This paper develops a model of money demand where the opportunity cost of holding money is subject to regime changes. The regimes are fully characterized by the mean and variance of inflation and are assumed to result from alternative government policies. Agents are unable to directly observe whether government actions are indeed consistent with the inflation rate targeted as part of a stabilization program, but they can construct probability inferences from available observations of inflation and money growth. Government announcements are assumed to provide agents with additional, possibly truthful, information regarding the regime. This specification is estimated and tested using data from the Israeli and Argentine high-inflation periods. Results indicate that the successful stabilization program implemented in Israel in July 1985 was more credible than either the earlier Israeli attempt of November 1984 or the Argentine programs. Government signaling might substantially simplify the inference problem and increase the speed of learning on the part of the agents; however, under certain conditions, it might increase the volatility of inflation. After the introduction of an inflation stabilization plan, the welfare gains from a temporary increase in real balances might be high enough to induce agents to raise their real balances in the short term, even if they are uncertain about the nature of government policy and the eventual outcome of the stabilization attempt. Statistically, the model restrictions cannot be rejected at the 1% significance level.
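
The agents' inference problem can be illustrated with a two-regime Bayesian updating sketch: each period, the posterior probability of the high-inflation regime is revised with the likelihood of the observed inflation rate under each regime. All numbers are invented for illustration; the paper's specification also uses money growth and announcement signals.

```python
import numpy as np
from scipy.stats import norm

# Two inflation regimes, each characterized by (mean, std) -- invented values
regimes = {"high": (0.12, 0.04), "low": (0.02, 0.01)}
p_high = 0.5                            # prior probability of the high regime
rng = np.random.default_rng(0)

# True regime switches to "low" after a stabilization program at t = 12
path = ["high"] * 12 + ["low"] * 12
for t, reg in enumerate(path):
    pi_obs = rng.normal(*regimes[reg])  # observed inflation this period
    lik_h = norm.pdf(pi_obs, *regimes["high"])
    lik_l = norm.pdf(pi_obs, *regimes["low"])
    # Bayes update of the regime probability
    p_high = lik_h * p_high / (lik_h * p_high + lik_l * (1.0 - p_high))
    print(f"t={t:2d}  inflation={pi_obs: .3f}  P(high)={p_high:.3f}")
```

A credible announcement would enter as an extra likelihood term shifting the posterior faster, which is the signaling channel the paper estimates.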

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first uses a standard Wald-type statistic. The second assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets on the endogenous variables are derived by a projection technique. The latter approach has two advantages: first, the validity of the confidence set is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show that they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases in transfers from Moroccan expatriates.
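
A toy sketch of the projection technique: given a confidence box for the free parameters, each endogenous variable is minimized and maximized over the box by solving the model at grid points. The "model" below is a stand-in nonlinear map, not an actual CGE system, and all values are illustrative; the point is that coverage is inherited from the parameter confidence set regardless of nonlinearity, and the resulting intervals are simultaneous across variables.

```python
import numpy as np

# Stand-in "model solve": endogenous variables as nonlinear functions of
# two free parameters (purely illustrative, not a CGE system)
def solve_model(theta):
    a, b = theta
    return np.array([np.exp(a) * (1 + b), a * b ** 2])

# Suppose a 95% confidence box for the free parameters is available
box = [(0.8, 1.2), (0.4, 0.6)]

# Projection: min/max each endogenous variable over the parameter set
grid = np.meshgrid(*[np.linspace(lo, hi, 200) for lo, hi in box])
thetas = np.column_stack([g.ravel() for g in grid])
vals = np.array([solve_model(th) for th in thetas])
for j in range(vals.shape[1]):
    print(f"endogenous variable {j}: "
          f"[{vals[:, j].min():.3f}, {vals[:, j].max():.3f}]")
```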

Abstract:

This paper develops a general stochastic framework and an equilibrium asset pricing model that make clear how attitudes towards intertemporal substitution and risk matter for option pricing. In particular, we show under which statistical conditions option pricing formulas are not preference-free; in other words, when preferences are not hidden in the stock and bond prices, as they are in the standard Black and Scholes (BS) or Hull and White (HW) pricing formulas. The dependence of option prices on preference parameters comes from several instantaneous causality effects, such as the so-called leverage effect. We also emphasize that the most standard asset pricing models (CAPM for the stock and BS or HW preference-free option pricing) are valid under the same stochastic setting (typically the absence of a leverage effect), regardless of preference parameter values. Even though we propose a general non-preference-free option pricing formula, we always keep in mind that the BS formula is dominant both as a theoretical reference model and as a tool for practitioners. Another contribution of the paper is to characterize why the BS formula is such a benchmark. We show that, as soon as one accepts a basic property of option prices, namely their homogeneity of degree one with respect to the pair formed by the underlying stock price and the strike price, the statistical hypotheses necessary for homogeneity yield BS-shaped option prices in equilibrium. This BS-shaped option pricing formula allows us to derive interesting characterizations of the volatility smile, that is, the pattern of BS implied volatilities as a function of the option's moneyness. First, the asymmetry of the smile is shown to be equivalent to a particular form of asymmetry of the equivalent martingale measure. Second, this asymmetry appears precisely when there is a premium on instantaneous interest rate risk, on a generalized leverage effect, or on both; in other words, whenever the option pricing formula is not preference-free. The main conclusion of our analysis for practitioners should therefore be that an asymmetric smile is indicative of the relevance of preference parameters for pricing options.
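
The homogeneity property at the heart of the argument is easy to check numerically for the BS formula itself: scaling the stock price and the strike by the same factor scales the call price by that factor. A minimal sketch with the standard BS call formula and illustrative inputs:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S, K, T, r, sigma = 100.0, 95.0, 0.5, 0.03, 0.2
lam = 2.5
# Homogeneity of degree one in (S, K): the two lines print the same number
print(bs_call(lam * S, lam * K, T, r, sigma))
print(lam * bs_call(S, K, T, r, sigma))
```

The smile is then the implied sigma obtained by inverting this formula across strikes; with BS-generated prices it is flat, so any asymmetric pattern in market prices signals the departures the paper characterizes.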

Abstract:

A full understanding of public affairs requires the ability to distinguish between the policies that voters would like the government to adopt and the influence that different voters or groups of voters actually exert in the democratic process. We consider the properties of a computable equilibrium model of a competitive political economy in which the economic interests of groups of voters and their effective influence on equilibrium policy outcomes can be explicitly distinguished and computed. The model incorporates an amended version of the GEMTAP tax model and is calibrated to data for the United States for 1973 and 1983. Emphasis is placed on how the aggregation of GEMTAP households into groups, within which economic and political behaviour is assumed homogeneous, affects the numerical representation of interests and influence for representative members of each group. Experiments with the model suggest that changes in both interests and influence are important parts of the story behind the evolution of U.S. tax policy in the decade after 1973.

Abstract:

In the literature on tests of normality, much concern has been expressed over the problems associated with residual-based procedures. Indeed, the specialized tables of critical points needed to perform the tests have been derived for the location-scale model; hence reliance on the available significance points in the context of regression models may cause size distortions. We propose a general solution to the problem of controlling the size of normality tests for the disturbances of a standard linear regression, based on the technique of Monte Carlo tests.
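
A minimal sketch of the idea, using the Jarque-Bera statistic as a stand-in for whichever normality statistic is chosen: OLS residuals do not depend on the regression coefficients, and the statistic is scale-invariant, so its null distribution depends only on the known design matrix and can be simulated exactly.

```python
import numpy as np
from scipy.stats import jarque_bera

def mc_normality_pvalue(X, y, n_rep=999, seed=0):
    """Monte Carlo p-value for normality of the disturbances of y = X b + u,
    based on a residual statistic whose null distribution is simulable."""
    rng = np.random.default_rng(seed)
    H = X @ np.linalg.solve(X.T @ X, X.T)       # hat matrix
    M = np.eye(len(y)) - H                      # residual maker: M y = M u
    s_obs = jarque_bera(M @ y).statistic
    # Invariance to b and to the error scale lets us simulate the null
    # distribution exactly with N(0, 1) errors and the same design
    s_sim = np.array([jarque_bera(M @ rng.standard_normal(len(y))).statistic
                      for _ in range(n_rep)])
    return (1 + np.sum(s_sim >= s_obs)) / (n_rep + 1)

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(60), rng.standard_normal((60, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.standard_normal(60)
print(mc_normality_pvalue(X, y))
```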

Abstract:

We propose finite-sample tests and confidence sets for models with unobserved and generated regressors, as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, together with a sample split technique. For inference about general, possibly nonlinear, transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the proposed tests are examined in simulation experiments; in general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the suggested techniques are applied to a model of Tobin's q and to a model of academic performance.
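
A hedged sketch of the Anderson-Rubin-style instrument substitution idea in its simplest form (no included exogenous regressors): to test beta = beta0, regress y - X beta0 on the instruments and F-test that all instrument coefficients are zero; a confidence set is obtained by collecting the values of beta0 that are not rejected. Data and names below are invented.

```python
import numpy as np
from scipy.stats import f as f_dist

def anderson_rubin(y, X, Z, beta0):
    """AR test of H0: beta = beta0; exact under Gaussian errors and
    strictly exogenous instruments Z (no included exogenous regressors)."""
    n, k = Z.shape
    e0 = y - X @ beta0
    P = Z @ np.linalg.solve(Z.T @ Z, Z.T)        # projection onto span(Z)
    ess = e0 @ P @ e0                            # explained by the instruments
    rss = e0 @ e0 - ess
    F = (ess / k) / (rss / (n - k))
    return F, f_dist.sf(F, k, n - k)

rng = np.random.default_rng(0)
n = 200
Z = rng.standard_normal((n, 3))
v = rng.standard_normal(n)
x = Z @ np.array([0.1, 0.05, 0.02]) + v          # weakly identified regressor
u = 0.8 * v + 0.6 * rng.standard_normal(n)       # endogeneity via corr(u, v)
y = 0.5 * x + u
F, p = anderson_rubin(y, x[:, None], Z, np.array([0.5]))
print(F, p)   # at the true beta0, rejection stays at the nominal level
```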

Abstract:

Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes the framework applicable to linear models with expectation variables that are estimated non-parametrically; two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange Multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, in excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
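
The robustness contrast can be illustrated by building a confidence set through test inversion: a stylized LM statistic is computed for each candidate beta0 on a grid, and the set keeps the values not rejected. Unlike a Wald interval, such a set can be wide or even unbounded when instruments are weak, which is the honest answer in that case. This is a simplified statistic; the paper's setting with estimated instruments involves additional correction terms.

```python
import numpy as np
from scipy.stats import chi2

def lm_confidence_set(y, x, Z, grid, level=0.95):
    """Confidence set for beta by inverting a (stylized) LM test of
    H0: beta = beta0 in y = beta * x + u, with instruments Z."""
    n, k = Z.shape
    P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    crit = chi2.ppf(level, k)
    kept = []
    for b0 in grid:
        e = y - b0 * x
        lm = n * (e @ P @ e) / (e @ e)   # n * R^2 of e regressed on Z
        if lm <= crit:
            kept.append(b0)
    return np.array(kept)

rng = np.random.default_rng(0)
n = 200
Z = rng.standard_normal((n, 2))
x = Z @ np.array([0.05, 0.02]) + rng.standard_normal(n)   # weak instruments
y = 0.5 * x + rng.standard_normal(n)
cs = lm_confidence_set(y, x, Z, np.linspace(-5, 5, 1001))
print("grid points retained:", len(cs), "of", 1001)
```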

Abstract:

We characterize Paretian quasi-orders in the two-agent continuous case.

Abstract:

Recent work suggests that the conditional variance of financial returns may exhibit sudden jumps. This paper extends the non-parametric procedure of Delgado and Hidalgo (1996) for detecting discontinuities in otherwise continuous functions of a random variable to higher conditional moments, in particular the conditional variance. Simulation results show that the procedure provides reasonable estimates of the number and location of jumps. Applied to daily returns on the S&P 500 index, the procedure detects several jumps in the conditional variance.
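
The flavour of such a procedure can be conveyed with a minimal sketch: estimate the conditional second moment just to the left and just to the right of each candidate point with one-sided kernel smoothers and flag large gaps. For simplicity the conditioning variable here is the time index and the kernel is uniform; this is a toy illustration, not the Delgado-Hidalgo statistic.

```python
import numpy as np

def one_sided_nw(x0, x, y, h, side):
    """One-sided Nadaraya-Watson estimate at x0 with a uniform kernel."""
    m = (x > x0 - h) & (x < x0) if side == "left" else (x > x0) & (x < x0 + h)
    return y[m].mean() if m.any() else np.nan

# Simulated returns whose conditional variance jumps at t = 500
rng = np.random.default_rng(0)
n, h = 1000, 50
sigma = np.where(np.arange(n) < 500, 1.0, 2.0)
r = sigma * rng.standard_normal(n)

t = np.arange(n, dtype=float)
y = r ** 2                         # squared returns proxy the second moment
gap = np.array([one_sided_nw(t0, t, y, h, "right")
                - one_sided_nw(t0, t, y, h, "left")
                for t0 in t[h:n - h]])
print("largest variance discontinuity near t =", h + int(np.argmax(np.abs(gap))))
```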

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, emphasizing the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions, a condition not satisfied by standard Wald-type methods based on standard errors, and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.