63 results for Hypothesis test
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
When the behaviour of a specific hypothesis test statistic is studied by a Monte Carlo experiment, the usual way to describe its quality is by giving the empirical level of the test. As an alternative to this procedure, we use the empirical distribution of the obtained p-values and exploit its information both graphically and numerically.
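The idea behind using the empirical distribution of p-values is that, under a true null hypothesis, a p-value computed from a continuous test statistic is uniformly distributed on [0, 1]. A minimal Monte Carlo sketch of this (an illustration, not the authors' code), using a two-sample z-test with known variance:

```python
import math
import random
from statistics import NormalDist, mean

random.seed(0)
n_rep, n = 5000, 30
Z = NormalDist()
pvals = []
for _ in range(n_rep):
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [random.gauss(0, 1) for _ in range(n)]  # null is true: equal means
    z = (mean(x) - mean(y)) / math.sqrt(2 / n)  # known-variance z statistic
    pvals.append(2 * (1 - Z.cdf(abs(z))))

# Empirical level: fraction of p-values below the nominal alpha.
alpha = 0.05
emp_level = sum(p < alpha for p in pvals) / n_rep
print(round(emp_level, 3))  # close to 0.05 for an exact test
```

The whole empirical CDF of `pvals` should track the Uniform(0, 1) CDF; departures from uniformity (not just the rejection rate at one alpha) are what the abstract proposes to exploit.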
Abstract:
This paper discusses inference in self-exciting threshold autoregressive (SETAR) models. Of main interest is inference for the threshold parameter. It is well-known that the asymptotics of the corresponding estimator depend upon whether the SETAR model is continuous or not. In the continuous case, the limiting distribution is normal and standard inference is possible. In the discontinuous case, the limiting distribution is non-normal and cannot be estimated consistently. We show that valid inference can be drawn by the use of the subsampling method. Moreover, the method can even be extended to situations where the (dis)continuity of the model is unknown. In this case, inference for the regression parameters of the model also becomes difficult, and subsampling can be used advantageously there as well. In addition, we consider a hypothesis test for the continuity of the SETAR model. A simulation study examines small-sample performance.
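The subsampling method referred to is the general Politis-Romano approach: recompute the estimator on many blocks of size b much smaller than n, and use the empirical distribution of the rescaled subsample estimates to approximate the unknown sampling distribution. A hypothetical sketch for a simple mean with sqrt(n) rate (not the SETAR threshold estimator, whose rate differs in the discontinuous case):

```python
import math
import random
from statistics import mean

random.seed(1)
n, b = 500, 50                      # sample size and subsample size, b << n
data = [random.gauss(0.0, 1.0) for _ in range(n)]
theta_hat = mean(data)

# Statistic on every contiguous block of length b, centred at the
# full-sample estimate and rescaled by the sqrt(b) convergence rate.
roots = [math.sqrt(b) * (mean(data[s:s + b]) - theta_hat)
         for s in range(n - b + 1)]
roots.sort()

# Subsampling 95% confidence interval for the true mean.
q_lo = roots[int(0.025 * len(roots))]
q_hi = roots[int(0.975 * len(roots))]
ci = (theta_hat - q_hi / math.sqrt(n), theta_hat - q_lo / math.sqrt(n))
print(ci)
```

The appeal of the method, and the reason it works in the discontinuous SETAR case, is that it only requires the existence of a limiting distribution, not knowledge of its form; only the convergence rate used for rescaling must be adapted.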
Abstract:
We prove the algebraic equality between Jennrich's (1970) asymptotic $\chi^2$ test for equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
Abstract:
Small-sample properties are of fundamental interest when only limited data are available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by the lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses.
Abstract:
This paper analyzes whether standard covariance matrix tests work when dimensionality is large, and in particular larger than sample size. In the latter case, the singularity of the sample covariance matrix makes likelihood ratio tests degenerate, but other tests based on quadratic forms of sample covariance matrix eigenvalues remain well-defined. We study the consistency property and limiting distribution of these tests as dimensionality and sample size go to infinity together, with their ratio converging to a finite non-zero limit. We find that the existing test for sphericity is robust against high dimensionality, but not the test for equality of the covariance matrix to a given matrix. For the latter test, we develop a new correction to the existing test statistic that makes it robust against high dimensionality.
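A quadratic form of sample eigenvalues of the kind described remains computable even when p > n. A plain-Python sketch of one standard sphericity statistic of this type, John's U (an illustrative computation with assumed identity covariance, not the paper's code):

```python
import random

random.seed(0)
n, p = 40, 60   # dimension deliberately larger than sample size
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]

# Sample covariance S = X'X / n (true covariance here is the identity).
S = [[sum(X[k][i] * X[k][j] for k in range(n)) / n for j in range(p)]
     for i in range(p)]

m = sum(S[i][i] for i in range(p)) / p          # tr(S) / p
# John's U statistic: (1/p) * tr[(S/m - I)^2].  Since S is symmetric,
# the trace of the square equals the sum of squared entries.
U = sum((S[i][j] / m - (1.0 if i == j else 0.0)) ** 2
        for i in range(p) for j in range(p)) / p
print(round(U, 3))
```

Even though S is singular here, U is well-defined; under sphericity with p/n converging to a constant c, U concentrates near c rather than near 0, which is why the high-dimensional analysis requires recentring the classical fixed-p theory.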
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work; a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type I error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
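For contrast with the resampling-based methods the abstract discusses, Holm's (1979) stepdown procedure itself fits in a few lines. A sketch of the standard textbook algorithm (not this paper's method):

```python
def holm_rejections(pvals, alpha=0.05):
    """Holm's (1979) stepdown procedure: controls the FWE at level alpha.

    Sort the p-values; at step j (0-based) compare the j-th smallest
    p-value with alpha / (k - j).  Stop at the first failure and reject
    only the hypotheses ordered before it.
    """
    k = len(pvals)
    order = sorted(range(k), key=lambda i: pvals[i])
    rejected = [False] * k
    for j, i in enumerate(order):
        if pvals[i] <= alpha / (k - j):
            rejected[i] = True
        else:
            break  # stepdown: once one comparison fails, stop rejecting
    return rejected

print(holm_rejections([0.01, 0.3, 0.02]))  # → [True, False, True]
```

Plain Bonferroni (compare every p-value with alpha/k, here 0.05/3) would reject only the first hypothesis in this example; the stepdown structure with increasing thresholds is exactly what the paper generalizes to estimated, resampling-based critical values.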
Abstract:
Although it is commonly accepted that most macroeconomic variables are non-stationary, it is often difficult to identify the source of the non-stationarity. In particular, it is well-known that integrated and short-memory models containing trending components that may display sudden changes in their parameters share some statistical properties that make their identification a hard task. The goal of this paper is to extend the classical testing framework for I(1) versus I(0) plus breaks by considering a more general class of models under the null hypothesis: non-stationary fractionally integrated (FI) processes. A similar identification problem holds in this broader setting, which is shown to be a relevant issue from both a statistical and an economic perspective. The proposed test is developed in the time domain and is very simple to compute. The asymptotic properties of the new technique are derived, and it is shown by simulation that it is very well-behaved in finite samples. To illustrate the usefulness of the proposed technique, an application using inflation data is also provided.
Abstract:
The hypotheses put forward to explain the decision to migrate for work to a developed country are numerous and, at the same time, difficult to test due to the multiplicity of factors involved. This paper attempts to disentangle the socio-economic factors that explain the differences in immigrant numbers across OECD countries. We present empirical evidence on the determinants of migratory flows to 17 OECD countries from 65 origin countries over the 1980-2000 period. Our results reveal the importance of differentiating the composition of inflows at least by income level in the origin countries. Regarding inflows from non-high-income countries, the results suggest that there is a pull effect from monetary rather than real income, so the welfare magnets hypothesis should be rejected. This group reacts more to migration policy than inflows coming from high-income countries, although the policies designed to slow down inflows have not been able, in the aggregate, to reduce them.
Abstract:
Several unit root tests in panel data have recently been proposed. The test developed by Harris and Tzavalis (1999 JoE) performs particularly well when the time dimension is moderate in relation to the cross-section dimension. However, in common with the traditional tests designed for the unidimensional case, it was found to perform poorly when there is a structural break in the time series under the alternative. Here we derive the asymptotic distribution of the test allowing for a shift in the mean, and assess the small sample performance. We apply this new test to show how the hypothesis of (perfect) hysteresis in Spanish unemployment is rejected in favour of the alternative of the natural unemployment rate, when the possibility of a change in the latter is considered.
Abstract:
A change in paradigm is needed in the prevention of toxic effects on the nervous system, moving from its present reliance solely on data from animal testing to a prediction model mostly based on in vitro toxicity testing and in silico modeling. According to the report published by the National Research Council (NRC) of the US National Academies of Science, high-throughput in vitro tests will provide evidence for alterations in "toxicity pathways" as the best possible method of large-scale toxicity prediction. The challenges to implementing this proposal are enormous, and provide much room for debate. While many efforts address the technical aspects of implementing the vision, many questions around it also need to be addressed. Is the overall strategy the only one to be pursued? How can we move from current to future paradigms? Will we ever be able to reliably model chronic and developmental neurotoxicity in vitro? This paper summarizes four presentations from a symposium held at the International Neurotoxicology Conference in Xi'an, China, in June 2011. A. Li reviewed the current guidelines for neurotoxicity and developmental neurotoxicity testing, and discussed the major challenges to realizing the NRC vision for toxicity testing. J. Llorens reviewed the biology of mammalian toxic avoidance in view of present knowledge of the physiology and molecular biology of the chemical senses, taste and smell. This background information supports the hypothesis that relating in vivo toxicity to chemical epitope descriptors that mimic the chemical encoding performed by the olfactory system may provide a way to the long-term future of complete in silico toxicity prediction. S. Ceccatelli reviewed the implementation of rodent and human neural stem cells (NSCs) as models for in vitro toxicity testing that measures parameters such as cell proliferation, differentiation and migration. These appear to be sensitive endpoints that can identify substances with developmental neurotoxic potential. C. Suñol reviewed the use of primary neuronal cultures in testing for the neurotoxicity of environmental pollutants, including the study of the effects of persistent exposures and/or effects in differentiating cells, which allow recording of effects that can be extrapolated to human developmental neurotoxicity.
Abstract:
Many studies have shown that IQs have been increasing over the last half century. These increases have come to be known as «the Flynn effect». The «Flynn effect» represents a difference in ability level between groups of the same age but different cohorts. The ability-level differentiation hypothesis posits a difference in the relevance of cognitive factors between groups of high and low ability. Hence, it should be possible to imitate the ability-level differentiation effect by comparing groups of the same age but different cohorts. The indifferentiation hypothesis posits no differences in the relevance of cognitive abilities across age groups within the same cohort. The aim of the present study is to test the relationships between these phenomena. For this purpose we analyzed the American standardisation samples of the WISC, WISC-R and WISC-III. Results support the link between the Flynn effect and the differentiation hypothesis. The reported evidence also replicates previous findings supporting the indifferentiation hypothesis. Implications for the assessment of intelligence are discussed.
Abstract:
Material throughput is a means of measuring the so-called social metabolism, or physical dimensions of a society’s consumption, and can be taken as an indirect and approximate indicator of sustainability. Material flow accounting can be used to test the dematerialisation hypothesis, the idea that technological progress causes a decrease in total material used (strong dematerialisation) or material used per monetary unit of output (weak dematerialisation). This paper sets out the results of a material flow analysis for Spain for the period from 1980 to 2000. The analysis reveals that neither strong nor weak dematerialisation took place during the period analysed. Although the population did not increase considerably, materials mobilised by the Spanish economy (DMI) increased by 85% in absolute terms, surpassing GDP growth. In addition, Spain became more dependent on external trade in physical terms. In fact, its imports are more than twice the amount of its exports in terms of weight.
Abstract:
See the abstract at the beginning of the document in the attached file.