919 results for Process control -- Statistical methods


Relevance:

100.00%

Publisher:

Abstract:

This work contributes to a better understanding of the trophic ecology of important predators in the Northern Humboldt Current System, the jack mackerel (Trachurus murphyi), the chub mackerel (Scomber japonicus) and the jumbo squid (Dosidicus gigas), by characterizing the highly variable feeding patterns of these species at different spatiotemporal scales. We provide new knowledge on the comparative trophic behaviour of these species, defined as opportunistic in previous investigations. For that purpose we applied a variety of statistical methods to an extensive dataset of 27,188 non-empty stomachs. We defined the spatial organization of the forage fauna of these predators and documented changes in prey composition according to predator size and the spatiotemporal features of the environment. Our results highlight the key role played by dissolved oxygen. We also addressed an important paradox of the jumbo squid diet: why does it hardly forage on the huge anchovy (Engraulis ringens) biomass distributed off coastal Peru? We showed that the shallow oxygen minimum zone present off coastal Peru could hamper the co-occurrence of jumbo squids and anchovies. In addition, we proposed a conceptual model of jumbo squid trophic ecology incorporating the ontogenetic cycle, oxygen and prey availability. Moreover, we showed that the trophic behaviour of jack mackerel and chub mackerel is adapted to foraging on more accessible species such as the squat lobster Pleuroncodes monodon and zoea larvae. Both predators also present a trophic overlap, but jack mackerel was not as voracious as chub mackerel, contrary to what other authors have observed. Fish diet presented high spatiotemporal variability, and the shelf break appeared as a strong biogeographical frontier. Diet composition of our fish predators was not necessarily a consistent indicator of changes in prey biomass.
El Niño events had a weak effect on the stomach fullness and diet composition of chub mackerel and jack mackerel. Moreover, decadal changes in diet diversity challenged the classic paradigm of a positive correlation between species richness and temperature. Finally, the global patterns described in this work illustrate the opportunistic foraging behaviour, life strategies and high degree of plasticity of these species. Such behaviour allows these predators to adapt to changes in their environment.


Industrial services have been recognized as a potential source of additional revenue. In the dynamic world of industrial services, customization and the ability to act quickly are critical components of customer satisfaction and of the process of creating competitive advantage. Reducing the time spent in the supply chain can yield both better response times and lower total costs. The aim of this thesis is to describe the dynamic environment of industrial services: the customer need, and the opportunities to narrow the gap between the requested and the achieved delivery time. This is pursued mainly by means of strategic lead-time management. Operators of wireless telecommunication networks want to reduce the capital tied up in activities outside their core competence, such as maintenance. In the case section of the thesis, the demand, material and information flows of a spare-part service supply chain are analyzed using qualitative interviews, internal documents and quantitative statistical methods. The findings are mirrored against the prevailing supply chain and time management paradigm. The results show that adopting a strong service culture and measuring supply chain efficiency holistically are the starting points for time management in industrial services.


Cognitive interviews were used to evaluate two draft versions of a financial survey in Jamaica. The qualitative version used a few open-ended questions, and the quantitative version used numerous closed-ended questions. A secondary analysis based on the cognitive interview literature was used to guide a content analysis of the aggregate data from both surveys. The cognitive interview analysis found that the long survey produced fewer respondent errors than the open-ended questions on the short survey. A grounded theory analysis then examined the aggregate cognitive data, showing that the respondents attached complex meanings to their financial information. The main limitation of this study was that the standard assessments of quantitative and qualitative reliability and validity were not utilized. Further research should utilize statistical methods to compare and contrast aggregated cognitive interview probe responses on open- and closed-ended surveys.


This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
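The leads-and-lags augmentation described above can be sketched on simulated data. This is a minimal illustration of the idea only — a plain-OLS, DOLS-style version on hypothetical data, not the paper's feasible GLS estimator with a long-run covariance matrix; all variable names and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, p, beta = 600, 2, 1.5

# x_t is a random walk (integrated regressor); the stationary error u_t is
# correlated with the innovations of x, so static OLS has second-order bias.
e = rng.normal(size=T)
x = np.cumsum(e)
u = 0.5 * e + rng.normal(scale=0.5, size=T)
y = beta * x + u

# First differences of the regressor, then p leads and p lags of them.
dx = np.diff(x, prepend=0.0)
rows = np.arange(p, T - p)
lead_lags = np.column_stack([dx[rows + j] for j in range(-p, p + 1)])

# Augmented regression y_t = beta*x_t + sum_j c_j * dx_{t+j} + error,
# estimated here by plain OLS for simplicity (the paper uses feasible GLS).
X = np.column_stack([x[rows], lead_lags])
coef, *_ = np.linalg.lstsq(X, y[rows], rcond=None)
beta_hat = coef[0]
```

The leads and lags of the differenced regressor absorb the correlation between the error and the regressor's innovations, which is what delivers the mixed normal limit in the paper's setting.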


In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
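The Monte Carlo test technique invoked above can be sketched for a simple pivotal statistic. This is a generic illustration on hypothetical data, not the paper's MLR test criteria; the choice of statistic and all names are assumptions.

```python
import numpy as np

def mc_pvalue(stat, data, n_rep=999, seed=1):
    """Exact Monte Carlo p-value for a statistic that is pivotal under H0.

    Simulates the null distribution of `stat` and counts how often the
    simulated values reach the observed one; with n_rep replications the
    resulting test has exact level for any sample size.
    """
    rng = np.random.default_rng(seed)
    s0 = stat(data)
    sims = np.array([stat(rng.normal(size=len(data))) for _ in range(n_rep)])
    return (1 + np.sum(sims >= s0)) / (n_rep + 1)

def abs_excess_kurtosis(x):
    # Invariant to location and scale, hence pivotal under H0: i.i.d.
    # normal data -- its null distribution has no nuisance parameters.
    z = (x - x.mean()) / x.std()
    return abs(np.mean(z**4) - 3.0)

rng = np.random.default_rng(0)
p_normal = mc_pvalue(abs_excess_kurtosis, rng.normal(size=200))
p_heavy = mc_pvalue(abs_excess_kurtosis, rng.standard_t(df=3, size=200))
```

The same counting scheme applies to any of the criteria discussed in the abstract once their nuisance-parameter invariance is established.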


In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we underscore the particular light that econometrics can shed on certain general themes in methodology and the philosophy of science, such as falsifiability as a criterion of the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we emphasize the contrast between two styles of modelling - the parsimonious approach and the statistical-descriptive approach - and we discuss the links between the theory of statistical tests and the philosophy of science.


This paper assesses the empirical performance of an intertemporal option pricing model with latent variables which generalizes the Hull-White stochastic volatility formula. Using this generalized formula in an ad hoc fashion to extract two implicit parameters and forecast next-day S&P 500 option prices, we obtain pricing errors similar to those produced by implied volatility alone, as in the Hull-White case. When we specialize this model to an equilibrium recursive utility model, we show through simulations that option prices are more informative than stock prices about the structural parameters of the model. We also show that a simple method of moments with a panel of option prices provides good estimates of the parameters of the model. This lays the ground for an empirical assessment of this equilibrium model with S&P 500 option prices in terms of pricing errors.


In this paper, we analyze recent developments in econometrics in the light of the theory of statistical tests. We first review some fundamental principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of test theory as a formalization of the falsification principle for probabilistic models, and the logical justification of the basic notions of test theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which test procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze several particular cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality or dynamic specification. We point out that these difficulties often stem from the ambition of weakening the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we stress the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties can be established in finite samples.


Presently, conditions ensuring the validity of bootstrap methods for the sample mean of (possibly heterogeneous) near epoch dependent (NED) functions of mixing processes are unknown. Here we establish the validity of the bootstrap in this context, extending the applicability of bootstrap methods to a class of processes broadly relevant for applications in economics and finance. Our results apply to two block bootstrap methods: the moving blocks bootstrap of Künsch (1989) and Liu and Singh (1992), and the stationary bootstrap of Politis and Romano (1994). In particular, the consistency of the bootstrap variance estimator for the sample mean is shown to be robust against heteroskedasticity and dependence of unknown form. The first order asymptotic validity of the bootstrap approximation to the actual distribution of the sample mean is also established in this heterogeneous NED context.
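The moving blocks bootstrap mentioned above can be sketched for the sample mean of a dependent series. A minimal sketch on simulated AR(1) data; the block length and all names are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def moving_blocks_bootstrap_mean(x, block_len, n_boot=2000, seed=0):
    """Moving blocks bootstrap distribution of the sample mean.

    Resamples overlapping blocks of length `block_len` with replacement
    and concatenates them, preserving dependence within each block.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, n_blocks))
    idx = (starts[:, :, None] + np.arange(block_len)).reshape(n_boot, -1)[:, :n]
    return x[idx].mean(axis=1)

# AR(1) data: dependent, so an i.i.d. bootstrap would understate the
# variance of the sample mean; moderate blocks capture the dependence.
rng = np.random.default_rng(1)
eps = rng.normal(size=1000)
x = np.empty(1000)
x[0] = eps[0]
for t in range(1, 1000):
    x[t] = 0.5 * x[t - 1] + eps[t]

boot_means = moving_blocks_bootstrap_mean(x, block_len=20)
```

The bootstrap variance of the mean comes out well above the naive `x.var()/n` estimate, which is the robustness-to-dependence property the abstract establishes in a far more general NED setting.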


We consider a probabilistic approach to the problem of assigning k indivisible identical objects to a set of agents with single-peaked preferences. Using the ordinal extension of preferences, we characterize the class of uniform probabilistic rules by Pareto efficiency, strategy-proofness, and no-envy. We also show that in this characterization no-envy cannot be replaced by anonymity. When agents are strictly risk-averse von Neumann-Morgenstern utility maximizers, the problem of assigning k identical objects reduces to the problem of allocating the amount k of an infinitely divisible commodity.
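The reduction to an infinitely divisible commodity invites a sketch of the classical uniform rule for dividing an amount k among agents who each report a most-preferred amount (their peak). This is a standard textbook version for illustration, with the bisection scheme and names as assumptions.

```python
def uniform_rule(peaks, k):
    """Uniform rule for dividing amount k among single-peaked agents.

    Excess demand (sum of peaks > k): agent i receives min(peak_i, lam);
    excess supply: agent i receives max(peak_i, lam). The common bound
    lam is found by bisection so the shares sum exactly to k.
    """
    total = sum(peaks)
    if abs(total - k) < 1e-12:
        return list(peaks)
    if total > k:
        def share(lam):  # ration high demanders down to a common cap
            return sum(min(p, lam) for p in peaks)
        lo, hi = 0.0, max(peaks)
    else:
        def share(lam):  # top low demanders up to a common floor
            return sum(max(p, lam) for p in peaks)
        lo, hi = 0.0, k
    for _ in range(100):   # share(lam) is monotone increasing in lam
        mid = 0.5 * (lo + hi)
        if share(mid) < k:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    bound = min if total > k else max
    return [bound(p, lam) for p in peaks]

shares = uniform_rule([3.0, 1.0, 4.0], k=6.0)
```

With peaks (3, 1, 4) and k = 6 demand exceeds supply, so the two high demanders are rationed to a common cap of 2.5 while the low demander keeps its peak, illustrating why no agent can gain by misreporting.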


Suzumura shows that a binary relation has a weak order extension if and only if it is consistent. However, consistency is demonstrably not sufficient to extend an upper semicontinuous binary relation to an upper semicontinuous weak order. Jaffray proves that any asymmetric (or reflexive), transitive and upper semicontinuous binary relation has an upper semicontinuous strict (or weak) order extension. We provide sufficient conditions for the existence of upper semicontinuous extensions of consistent, rather than transitive, relations. For asymmetric relations, consistency and upper semicontinuity suffice. For more general relations, we prove one theorem using a further consistency property and another using an additional continuity requirement.
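Suzumura consistency — the absence of any chain of weak preferences that loops back against a strict preference — can be checked mechanically on a finite relation. A minimal sketch under that definition (function names are assumptions):

```python
def transitive_closure(pairs):
    """Transitive closure of a finite binary relation given as pairs."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

def is_consistent(pairs):
    """Suzumura consistency: there is no x, y with x related to y through
    a chain while y is strictly preferred to x (y R x but not x R y)."""
    r = set(pairs)
    for (x, y) in transitive_closure(pairs):
        if (y, x) in r and (x, y) not in r:
            return False
    return True
```

The relation {(a,b), (b,c)} is consistent but not transitive, which is exactly the gap between Suzumura's condition and transitivity that the abstract exploits; adding the strict link (c,a) creates a forbidden cycle.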


This paper develops a general stochastic framework and an equilibrium asset pricing model that make clear how attitudes towards intertemporal substitution and risk matter for option pricing. In particular, we show under which statistical conditions option pricing formulas are not preference-free, in other words, when preferences are not hidden in the stock and bond prices as they are in the standard Black and Scholes (BS) or Hull and White (HW) pricing formulas. The dependence of option prices on preference parameters comes from several instantaneous causality effects such as the so-called leverage effect. We also emphasize that the most standard asset pricing models (CAPM for the stock and BS or HW preference-free option pricing) are valid under the same stochastic setting (typically the absence of leverage effect), regardless of preference parameter values. Even though we propose a general non-preference-free option pricing formula, we always keep in mind that the BS formula is dominant both as a theoretical reference model and as a tool for practitioners. Another contribution of the paper is to characterize why the BS formula is such a benchmark. We show that, as soon as we are ready to accept a basic property of option prices, namely their homogeneity of degree one with respect to the pair formed by the underlying stock price and the strike price, the necessary statistical hypotheses for homogeneity provide BS-shaped option prices in equilibrium. This BS-shaped option-pricing formula allows us to derive interesting characterizations of the volatility smile, that is, the pattern of BS implicit volatilities as a function of the option moneyness. First, the asymmetry of the smile is shown to be equivalent to a particular form of asymmetry of the equivalent martingale measure. 
Second, this asymmetry appears precisely when there is either a premium on an instantaneous interest rate risk or on a generalized leverage effect or both, in other words, whenever the option pricing formula is not preference-free. Therefore, the main conclusion of our analysis for practitioners should be that an asymmetric smile is indicative of the relevance of preference parameters to price options.
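The BS implicit volatilities that make up the smile are obtained by inverting the Black-Scholes formula at each strike. A minimal sketch of that inversion — bisection works because the BS price is increasing in volatility; the parameter names are assumptions:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    """Black-Scholes price of a European call (spot s, strike k)."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, r, t, lo=1e-6, hi=5.0):
    """Invert the BS formula by bisection over sigma."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, r, mid, t) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round trip: price a call at sigma = 0.25, then recover the volatility.
p = bs_call(100.0, 105.0, 0.02, 0.25, 0.5)
iv = implied_vol(p, 100.0, 105.0, 0.02, 0.5)
```

Plotting `implied_vol` across strikes for market prices would trace out the smile; the abstract's point is that an asymmetric trace signals option prices that are not preference-free.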


We consider the problem of assessing the uncertainty of calibrated parameters in computable general equilibrium (CGE) models through the construction of confidence sets (or intervals) for these parameters. We study two different setups under which this can be done.


In the literature on tests of normality, much concern has been expressed over the problems associated with residual-based procedures. Indeed, the specialized tables of critical points needed to perform the tests have been derived for the location-scale model; hence reliance on available significance points in the context of regression models may cause size distortions. We propose a general solution to the problem of controlling the size of normality tests for the disturbances of the standard linear regression, based on the technique of Monte Carlo tests.
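The Monte Carlo approach rests on the fact that, for a given design matrix, the OLS residuals of normal errors do not depend on the unknown coefficients or error scale, so a location-scale-invariant residual statistic is pivotal. A minimal sketch under that setup — the choice of statistic and all names are assumptions, not the paper's exact procedure:

```python
import numpy as np

def skew_kurt_stat(res):
    """Jarque-Bera-type statistic from standardized residuals."""
    z = (res - res.mean()) / res.std()
    n = len(z)
    skew = np.mean(z**3)
    kurt = np.mean(z**4) - 3.0
    return n * (skew**2 / 6.0 + kurt**2 / 24.0)

def mc_normality_test(y, X, n_rep=999, seed=0):
    """Monte Carlo normality test for OLS residuals.

    Residuals are M @ u with M the annihilator of X, so under normal
    errors the standardized statistic has no nuisance parameters and its
    null distribution can be simulated exactly for this design matrix.
    """
    rng = np.random.default_rng(seed)
    M = np.eye(len(y)) - X @ np.linalg.solve(X.T @ X, X.T)
    s0 = skew_kurt_stat(M @ y)
    sims = np.array([skew_kurt_stat(M @ rng.normal(size=len(y)))
                     for _ in range(n_rep)])
    return (1 + np.sum(sims >= s0)) / (n_rep + 1)

# Strongly skewed chi-squared disturbances should be rejected.
rng = np.random.default_rng(42)
n = 150
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.chisquare(3, size=n)
p_value = mc_normality_test(y, X)
```

Because the null distribution is simulated with the actual design matrix, the test has the correct size regardless of the regressors, which is the size-distortion problem the abstract addresses.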


This paper addresses the issue of estimating semiparametric time series models specified by their conditional mean and conditional variance. We stress the importance of using joint restrictions on the mean and variance. This leads us to take into account the covariance between the mean and the variance and the variance of the variance, that is, the skewness and the kurtosis. We establish direct links between the usual parametric estimation methods, namely QMLE, GMM and M-estimation. The usual univariate QMLE is, under non-normality, less efficient than the optimal GMM estimator. However, the bivariate QMLE based on the dependent variable and its square is as efficient as the optimal GMM estimator. A Monte Carlo analysis confirms the relevance of our approach, in particular the importance of skewness.
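The Gaussian QMLE for a model specified by its conditional mean and variance can be sketched on simulated data with non-normal errors. The specification below is hypothetical and for illustration only — all functional forms, parameter values and names are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)

# Hypothetical model: mean a + b*x, variance exp(c + d*x); the errors are
# scaled t(8) draws, i.e. non-normal, which is the setting where the
# univariate QMLE loses efficiency relative to optimal GMM.
a, b, c, d = 0.5, 1.0, -0.5, 0.6
eps = rng.standard_t(df=8, size=n) / np.sqrt(8.0 / 6.0)  # unit variance
y = a + b * x + np.exp(0.5 * (c + d * x)) * eps

def neg_quasi_loglik(theta):
    """Gaussian quasi-log-likelihood, sign flipped for minimization."""
    a_, b_, c_, d_ = theta
    mu = a_ + b_ * x
    var = np.exp(c_ + d_ * x)
    return 0.5 * np.sum(np.log(var) + (y - mu) ** 2 / var)

fit = minimize(neg_quasi_loglik, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 10000, "xatol": 1e-8, "fatol": 1e-10})
a_hat, b_hat, c_hat, d_hat = fit.x
```

Even with misspecified (Gaussian) likelihood the QMLE consistently recovers the mean and variance parameters; the abstract's point is that exploiting skewness and kurtosis through GMM, or a bivariate QMLE on the pair (y, y²), can do strictly better.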