951 results for Statistical hypothesis testing
Abstract:
In this paper, we show that widely used stationarity tests such as the KPSS test have power close to size in the presence of time-varying unconditional variance. We propose a new test as a complement to the existing tests. Monte Carlo experiments show that the proposed test possesses the following characteristics: (i) in the presence of a unit root or a structural change in the mean, the proposed test is as powerful as the KPSS and other tests; (ii) in the presence of a changing variance, the traditional tests perform badly whereas the proposed test has high power compared to the existing tests; (iii) the proposed test has the same size as traditional stationarity tests under the null hypothesis of covariance stationarity. An application to daily observations of the return on the US Dollar/Euro exchange rate reveals the existence of instability in the unconditional variance when the entire sample is considered, but stability is found in subsamples.
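A minimal sketch of the phenomenon this abstract describes, assuming the statsmodels implementation of the KPSS test; the simulated series, sample size, and variance-break point are illustrative choices, not the authors' design:

```python
# Sketch: KPSS applied to a series that is stationary in mean but whose
# unconditional variance doubles mid-sample. Under such time-varying
# variance the rejection rate stays near nominal size, as the abstract
# reports ("power close to size").
import numpy as np
from statsmodels.tsa.stattools import kpss

rng = np.random.default_rng(0)
n, reps, rejections = 500, 200, 0
for _ in range(reps):
    e = rng.standard_normal(n)
    e[n // 2:] *= 2.0                 # structural break in the variance
    stat, pvalue, _, _ = kpss(e, regression="c", nlags="auto")
    rejections += pvalue < 0.05
print(f"KPSS rejection rate under a variance break: {rejections / reps:.2f}")
```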
Abstract:
One of the main advantages of pairs trading strategies is their low correlation with market returns. By taking long and short positions, these strategies are able to control the magnitude of the market beta, keeping it practically zero or statistically non-significant. The idea is to perform statistical arbitrage, exploiting deviations from long-run equilibrium prices. As such, these strategies involve equilibrium-correction models for the pairs of asset returns. We show how to build a pairs trading strategy that benefits not only from the long-run equilibrium relationship between the pairs of asset prices in the portfolio, but also from the speed at which prices adjust deviations back to equilibrium. Until now, the vast majority of pairs trading strategies have relied on the hypothesis that positive returns come from the mean reversion characterized by the cointegrating relationship of the pairs, while ignoring the possibility of selecting pairs by testing the adjustment speed in the Error Correction Vector of this relationship. The results of this work indicate low levels of correlation with the market and neutrality of the strategies, together with annualized net financial returns of 15.05% and an annualized Sharpe Ratio of 1.96.
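A minimal sketch of the two screening steps the abstract combines, cointegration of a price pair plus the speed-of-adjustment coefficient of the error-correction term; the synthetic prices are placeholders, and the strategy's actual pair-selection rules are not reproduced here:

```python
# Sketch: Engle-Granger cointegration test plus the error-correction
# speed-of-adjustment coefficient used to rank candidate pairs.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(1000))          # synthetic log-price
y = 0.8 * x + rng.standard_normal(1000) * 0.5     # cointegrated partner

t_stat, pvalue, _ = coint(y, x)                   # Engle-Granger test
beta = sm.OLS(y, sm.add_constant(x)).fit().params[1]
z = y - beta * x                                  # equilibrium deviation
dy = np.diff(y)
ecm = sm.OLS(dy, sm.add_constant(z[:-1])).fit()
alpha = ecm.params[1]                             # speed of adjustment (< 0)
print(f"coint p-value: {pvalue:.3f}, adjustment speed alpha: {alpha:.3f}")
```

A more negative alpha means faster reversion to equilibrium, which is the property the abstract proposes to exploit when selecting pairs.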
Abstract:
Extreme rainfall events have triggered a significant number of flash floods on Madeira Island throughout its past and recent history. Madeira is a volcanic island where the spatial rainfall distribution is strongly affected by its rugged topography. In this thesis, annual maxima of daily rainfall data from 25 rain gauge stations located on Madeira Island were modelled by the generalised extreme value distribution. The hypothesis of a Gumbel distribution was also tested by two methods, and the existence of a linear trend in the parameters of both distributions was analysed. Estimates for the 50- and 100-year return levels were also obtained. Still in a univariate context, the assumption that a distribution function belongs to the domain of attraction of an extreme value distribution was tested for monthly maximum rainfall data in the rainy season. The available data were then analysed in order to find the most suitable domain of attraction for the sampled distribution. In a different approach, a search for thresholds was also performed for daily rainfall values through a graphical analysis. In a multivariate context, a study was made of the dependence between extreme rainfall values from the considered stations based on Kendall's τ measure. This study suggests the influence of factors such as altitude, slope orientation, distance between stations and their proximity to the sea on the spatial distribution of extreme rainfall. Groups of three pairwise-associated stations were also obtained and a fit was made to a family of extreme value copulas involving the Marshall–Olkin family, whose parameters can be written as a function of the Kendall's τ association measures of the obtained pairs.
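A minimal sketch of the univariate fitting step, assuming scipy's GEV parameterisation (its shape c is the negative of the usual GEV shape ξ, and c = 0 recovers the Gumbel case); the synthetic annual maxima stand in for the 25-station records:

```python
# Sketch: fit a GEV to annual maxima and read off T-year return levels;
# the Gumbel hypothesis corresponds to fixing the shape at zero.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_max = stats.genextreme.rvs(c=-0.1, loc=80, scale=25,
                                  size=60, random_state=rng)

c, loc, scale = stats.genextreme.fit(annual_max)
for T in (50, 100):
    level = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f} mm")

# Gumbel hypothesis: refit with the shape fixed at zero and compare,
# e.g. via a likelihood-ratio statistic (not shown).
c0, loc0, scale0 = stats.genextreme.fit(annual_max, f0=0)
```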
Abstract:
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures: the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.
Program summary
Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: microcomputer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers such as g77, f77, ifort and NAG95 are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 MB of RAM and 100 MB of hard disk space
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Of the pathways by which radionuclides enter the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Those radionuclides which are able to enter living cells by either metabolic or other processes give rise to localized doses which can be very high. The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and radiological protection. The time behavior of trace concentration in organs is the principal input for the prediction of internal doses after acute or chronic exposure. The general multiple-compartment model (GMCM) is a powerful and widely accepted method for biokinetic studies, which allows the calculation of the concentration of trace elements in organs as a function of time, when the flow parameters of the model are known. However, few biokinetic data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restrictions on the complexity of the problem: This version of the code works with the constant-volume approximation, which is valid for many situations where the biological half-life of a trace element is shorter than the volume rise time. Another restriction is related to the central-flux model: the model considered in the code assumes that there is one central compartment (e.g., blood) that connects the flow with all other compartments, and flow between the other compartments is not included.
Typical running time: Depends on the choice of calculation. Using the Derivative Method the running time is very short (a few minutes) for any number of compartments. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when about 15 compartments are considered. (C) 2006 Elsevier B.V. All rights reserved.
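STATFLUX itself is Fortran-77 and fits flow parameters with Derivative and Gauss-Marquardt least squares; what follows is only an illustrative Python sketch of the central-flux, constant-volume compartment model described above, with invented rate values and a generic least-squares fit in place of the program's own routines:

```python
# Sketch: one central compartment (blood) exchanging tracer with N
# peripheral compartments; flow rates are recovered by least squares
# against simulated concentration curves. All rates are illustrative.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def central_flux_rhs(t, c, k):
    # c[0] = central compartment; c[1:] = peripheral compartments.
    k_out, k_in = k[0::2], k[1::2]      # central->i and i->central rates
    dc = np.empty_like(c)
    dc[1:] = k_out * c[0] - k_in * c[1:]
    dc[0] = (k_in * c[1:]).sum() - k_out.sum() * c[0]
    return dc

def residuals(k, t_obs, c_obs, c0):
    sol = solve_ivp(central_flux_rhs, (0, t_obs[-1]), c0,
                    t_eval=t_obs, args=(k,))
    return (sol.y - c_obs).ravel()

t_obs = np.linspace(0, 10, 20)
c0 = np.array([1.0, 0.0, 0.0])          # tracer starts in the blood
true_k = np.array([0.5, 0.2, 0.3, 0.1]) # two peripheral compartments
c_obs = solve_ivp(central_flux_rhs, (0, 10), c0,
                  t_eval=t_obs, args=(true_k,)).y
fit = least_squares(residuals, x0=np.full(4, 0.25),
                    args=(t_obs, c_obs, c0), bounds=(0, np.inf))
print("recovered rates:", fit.x.round(3))
```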
Abstract:
The disturbance vicariance hypothesis (DV) has been proposed to explain speciation in Amazonia, especially in its edge regions, e.g. in eastern Guiana Shield harlequin frogs (Atelopus), which are suggested to have derived from a cool-adapted Andean ancestor. In concordance with DV predictions, we examined whether (i) these amphibians display a natural distribution gap in central Amazonia; (ii) east of this gap they constitute a monophyletic lineage which is nested within a pre-Andean/western clade; (iii) climate envelopes of Atelopus west and east of the distribution gap show some macroclimatic divergence due to a regional climate envelope shift; (iv) geographic distributions of climate envelopes of western and eastern Atelopus range into central Amazonia but with limited spatial overlap. We tested whether presence and apparent absence data points of Atelopus were homogeneously distributed with Ripley's K function. A molecular phylogeny (mitochondrial 16S rRNA gene) was reconstructed using Maximum Likelihood and Bayesian Inference to study whether Guianan Atelopus constitute a clade nested within a larger genus phylogeny. We focused on climate envelope divergence and geographic distribution by computing climate envelope models with MaxEnt based on macroscale bioclimatic parameters and testing them using Schoener's index and a modified Hellinger distance. We corroborated existing DV predictions and, for the first time, formulated new DV predictions aimed at species' climate envelope change. Our results suggest that cool-adapted Andean Atelopus ancestors dispersed into the Amazon basin and further onto the eastern Guiana Shield where, under warm conditions, they were forced to change climate envelopes. © 2010 The Author(s).
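The two overlap statistics named here have standard closed forms (Schoener's D and the Hellinger-based I of Warren et al.); a minimal sketch over two suitability surfaces, with the surfaces themselves as random placeholders rather than MaxEnt output:

```python
# Sketch: Schoener's D and the Hellinger-based I statistic computed
# from two habitat-suitability surfaces normalised to sum to 1.
import numpy as np

def overlap_stats(s1, s2):
    p1 = s1 / s1.sum()
    p2 = s2 / s2.sum()
    D = 1.0 - 0.5 * np.abs(p1 - p2).sum()                     # Schoener's D
    I = 1.0 - 0.5 * ((np.sqrt(p1) - np.sqrt(p2)) ** 2).sum()  # Hellinger-based I
    return D, I

rng = np.random.default_rng(3)
west, east = rng.random((100, 100)), rng.random((100, 100))
print("D = %.3f, I = %.3f" % overlap_stats(west, east))
```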
Abstract:
We review the basic hypotheses which motivate the statistical framework used to analyze the cosmic microwave background, and how that framework can be enlarged as we relax those hypotheses. In particular, we try to separate as much as possible the questions of gaussianity, homogeneity, and isotropy from each other. We focus both on isotropic estimators of nongaussianity as well as statistically anisotropic estimators of gaussianity, with particular emphasis on their signatures and the enhanced cosmic variances that become increasingly important as our putative Universe becomes less symmetric. After reviewing the formalism behind some simple model-independent tests, we discuss how these tests can be applied to CMB data when searching for large-scale anomalies. Copyright © 2010 L. Raul Abramo and Thiago S. Pereira.
Abstract:
The allelic frequencies of 12 short tandem repeat loci were obtained from a sample of 307 unrelated individuals living in Macapá, a city in the northern Amazon region, Brazil. These loci are the most commonly used in forensics and paternity testing. Based on the allele frequencies obtained for the population of Macapá, we estimated an interethnic admixture for the three parental groups (European, Native American and African) of, respectively, 46%, 35% and 19%. Comparing these allele frequencies with those of other Brazilian populations and of the Iberian Peninsula population, no significant distances were observed. The interpopulation genetic distances (FST coefficients) relative to the present database ranged from FST = 0.0016 between Macapá and Belém to FST = 0.0036 between Macapá and the Iberian Peninsula.
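A minimal sketch of the kind of calculation behind the reported distances, using Nei's GST (an FST analogue) for a single multi-allelic locus; the frequency vectors are invented, not the Macapá data:

```python
# Sketch: Nei's G_ST for one STR locus from the allele-frequency
# vectors of two populations (each vector sums to 1).
import numpy as np

def nei_gst(p1, p2):
    hs = 1.0 - 0.5 * ((p1 ** 2).sum() + (p2 ** 2).sum())  # mean within-pop heterozygosity
    pbar = (p1 + p2) / 2.0
    ht = 1.0 - (pbar ** 2).sum()                          # total expected heterozygosity
    return (ht - hs) / ht

pop_a = np.array([0.30, 0.25, 0.25, 0.20])   # hypothetical allele frequencies
pop_b = np.array([0.28, 0.27, 0.24, 0.21])
print(f"G_ST = {nei_gst(pop_a, pop_b):.4f}")
```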
Abstract:
Objectives. To verify the hypothesis that crack analysis and a mechanical test would rank a series of composites in a similar order with respect to polymerization stress, and that both tests would show similar relationships between stress and composite elastic modulus and/or shrinkage. Methods. Soda-lime glass discs (2 mm thick) with a central perforation (3.5 mm diameter) received four Vickers indentations 500 μm from the cavity margin. The indent cracks were measured (500×) prior to and 10 min after the cavity was restored with one of six materials (Kalore/KL, Gradia/GR, Ice/IC, Wave/WV, Majesty Flow/MF, and Majesty Posterior/MP). Stresses at the indent site were calculated based on the glass fracture toughness and the increase in crack length. Stress at the bonded interface was calculated using the equation for an internally pressurized cylinder. The mechanical test used a universal testing machine and glass rods (5 mm diameter) as substrate. An extensometer monitored specimen height (2 mm). Nominal stress was calculated by dividing the maximum shrinkage force by the specimen cross-sectional area. Composite elastic modulus was determined by nanoindentation and post-gel shrinkage was measured using strain gages. Data were subjected to one-way ANOVA/Tukey or Kruskal-Wallis/Mann-Whitney tests (alpha: 5%). Results. Both tests grouped the composites into three statistical subsets, with small differences in overlapping between the intermediate subset (MF, WV) and the highest (MP, IC) or the lowest stress materials (KL, GR). Higher stresses were developed by composites with high modulus and/or high shrinkage. Significance. Crack analysis proved to be as effective as the mechanical test for ranking composites with regard to polymerization stress. (c) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
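The "equation for an internally pressurized cylinder" is presumably the standard Lamé solution; a sketch of that solution for inner radius a, outer radius b, and internal pressure p (the paper's exact variant is not given in the abstract):

```latex
% Lamé solution for a thick-walled cylinder under internal pressure p:
\sigma_r(r)      = \frac{p\,a^2}{b^2 - a^2}\left(1 - \frac{b^2}{r^2}\right), \qquad
\sigma_\theta(r) = \frac{p\,a^2}{b^2 - a^2}\left(1 + \frac{b^2}{r^2}\right),
\qquad a \le r \le b .
```

The hoop stress σθ is largest at the cavity wall (r = a), which is why the indent cracks placed near the margin are sensitive to the interfacial stress.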
Abstract:
Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test that is used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem of the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected, but the allelic hypothesis is. As we argue, this is logically impossible. Some methods that were recently proposed implicitly rely on the idea that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test to test both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
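A minimal sketch of the two traditional tests the paper criticises, on an invented 2×3 case-control table; the genotypic test uses the contingency table as-is, while the allelic test doubles the counts into alleles, which is what ties its validity to Hardy-Weinberg equilibrium:

```python
# Sketch: traditional chi-squared tests for genotypic and allelic
# homogeneity in a case-control study (rows: cases/controls;
# columns: genotype counts AA, Aa, aa). Counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

genotypes = np.array([[120, 60, 20],     # cases:    AA, Aa, aa
                      [100, 80, 20]])    # controls: AA, Aa, aa

# Genotypic homogeneity: chi-squared test on the 2x3 table directly.
chi2_g, p_g, _, _ = chi2_contingency(genotypes)

# Allelic homogeneity: collapse genotypes into allele counts
# (each individual contributes two alleles).
alleles = np.column_stack([2 * genotypes[:, 0] + genotypes[:, 1],
                           2 * genotypes[:, 2] + genotypes[:, 1]])
chi2_a, p_a, _, _ = chi2_contingency(alleles)

print(f"genotypic p = {p_g:.3f}, allelic p = {p_a:.3f}")
```

The incoherence the paper describes is the case where the allelic p-value falls below the significance level while the genotypic one does not.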
Abstract:
Objectives. The null hypothesis was that mechanical testing systems used to determine polymerization stress (sigma(pol)) would rank a series of composites similarly. Methods. Two series of composites were tested in the following systems: a universal testing machine (UTM) using glass rods as the bonding substrate, a UTM using acrylic rods, a "low-compliance device", and a single cantilever device ("Bioman"). One series comprised five experimental composites containing BisGMA:TEGDMA in equimolar concentrations and 60, 65, 70, 75 or 80 wt% of filler. The other series comprised five commercial composites: Filtek Z250 (3M ESPE), Filtek A110 (3M ESPE), Tetric Ceram (Ivoclar), Heliomolar (Ivoclar) and Point 4 (Kerr). Specimen geometry, dimensions and curing conditions were similar in all systems. sigma(pol) was monitored for 10 min. Volumetric shrinkage (VS) was measured in a mercury dilatometer and elastic modulus (E) was determined by three-point bending. Shrinkage rate was used as a measure of reaction kinetics. ANOVA/Tukey tests were performed for each variable, separately for each series. Results. For the experimental composites, sigma(pol) decreased with filler content in all systems, following the variation in VS. For the commercial materials, sigma(pol) did not vary in the UTM/acrylic system and showed very few similarities in rankings among the other testing systems. Also, no clear relationships were observed between sigma(pol) and VS or E. Significance. The testing systems showed good agreement for the experimental composites, but very few similarities for the commercial composites. Therefore, comparison of polymerization stress results from different devices must be done carefully. (c) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Abstract:
The objective of the present work was to propose a method for testing the contribution of each level of the factors in a genotypes x environments (GxE) interaction in multi-environment trial analyses by means of an F test. The study evaluated a data set with twenty genotypes and thirty-four environments in a block design with four replications. The sums of squares within rows (genotypes) and columns (environments) of the GxE matrix were simulated, generating 10,000 experiments to verify the empirical distribution. Results indicate a noncentral chi-square distribution for rows and columns of the GxE interaction matrix, which was also verified by the Kolmogorov-Smirnov test and a Q-Q plot. Application of the F test identified the genotypes and environments that contributed the most to the GxE interaction. In this way, geneticists can select good genotypes in their studies.
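A minimal sketch of the simulation-and-check step described, comparing simulated row sums of squares against a chi-square reference with a Kolmogorov-Smirnov test; the dimensions match the abstract, but the simple central chi-square null used here is an illustrative simplification of the paper's noncentral result:

```python
# Sketch: simulate row sums of squares of a standardized GxE matrix
# under the null and compare their empirical distribution with a
# chi-square reference via the Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_gen, n_env, n_sim = 20, 34, 10_000

row_ss = np.empty(n_sim)
for i in range(n_sim):
    ge = rng.standard_normal((n_gen, n_env))   # null GxE effects
    row_ss[i] = (ge[0] ** 2).sum()             # SS for one genotype row

ks_stat, ks_p = stats.kstest(row_ss, stats.chi2(df=n_env).cdf)
print(f"KS statistic: {ks_stat:.4f}, p-value: {ks_p:.3f}")
```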
Abstract:
Despite favourable gravitational instability and ridge-push, elastic and frictional forces prevent subduction initiation from arising spontaneously at passive margins. Here, we argue that forces arising from large continental topographic gradients are required to initiate subduction at passive margins. In order to test this hypothesis, we use 2D numerical models to assess the influence of the Andean Plateau on stress magnitudes and deformation patterns at the Brazilian passive margin. The numerical results indicate that "plateau-push" in this region is a necessary additional force to initiate subduction. As the SE Brazilian margin currently shows no signs of self-sustained subduction, we examined geological and geophysical data to determine if the margin is in the preliminary stages of subduction initiation. The compiled data indicate that the margin is presently undergoing tectonic inversion, which we infer as part of the continental–oceanic overthrusting stage of subduction initiation. We refer to this early subduction stage as the "Brazilian Stage", which is characterized by >10 km deep reverse-fault seismicity at the margin, recent topographic uplift on the continental side, thick continental crust at the margin, and bulging on the oceanic side due to loading by the overthrusting continent. The combined results of the numerical simulations and passive margin analysis indicate that the SE Brazilian margin is a prototype candidate for subduction initiation.