26 results for Statistical hypothesis testing.
in CentAUR: Central Archive, University of Reading - UK
Abstract:
This article reviews recent work on hypothesis testing in the American Journal of Agricultural Economics and its predecessor journal, the Journal of Farm Economics.
Abstract:
Version 1 of the Global Charcoal Database is now available for regional fire history reconstructions, data exploration, hypothesis testing, and evaluation of coupled climate–vegetation–fire model simulations. The charcoal database contains over 400 radiocarbon-dated records that document changes in charcoal abundance during the Late Quaternary. The aim of this public database is to stimulate cross-disciplinary research in fire sciences targeted at an increased understanding of the controls and impacts of natural and anthropogenic fire regimes on centennial-to-orbital timescales. We describe here the data standardization techniques for comparing multiple types of sedimentary charcoal records. Version 1 of the Global Charcoal Database has been used to characterize global and regional patterns in fire activity since the last glacial maximum. Recent studies using the charcoal database have explored the relation between climate and fire during periods of rapid climate change, including evidence of fire activity during the Younger Dryas Chronozone, and during the past two millennia.
Abstract:
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.
Abstract:
The evolutionary history of gains and losses of vegetative reproductive propagules (soredia) in Porpidia s.l., a group of lichen-forming ascomycetes, was clarified using Bayesian Markov chain Monte Carlo (MCMC) approaches to monophyly tests and a combined MCMC and maximum likelihood approach to ancestral character state reconstructions. The MCMC framework provided confidence estimates for the reconstructions of relationships and ancestral character states, which formed the basis for tests of evolutionary hypotheses. Monophyly tests rejected all hypotheses that predicted any clustering of reproductive modes in extant taxa. In addition, a nearest-neighbor statistic could not reject the hypothesis that the vegetative reproductive mode is randomly distributed throughout the group. These results show that transitions between presence and absence of the vegetative reproductive mode within Porpidia s.l. occurred several times and independently of each other. Likelihood reconstructions of ancestral character states at selected nodes suggest that - contrary to previous thought - the ancestor to Porpidia s.l. already possessed the vegetative reproductive mode. Furthermore, transition rates are reconstructed asymmetrically with the vegetative reproductive mode being gained at a much lower rate than it is lost. A cautious note has to be added, because a simulation study showed that the ancestral character state reconstructions were highly dependent on taxon sampling. However, our central conclusions, particularly the higher rate of change from vegetative reproductive mode present to absent than vice versa within Porpidia s.l., were found to be broadly independent of taxon sampling. [Ancestral character state reconstructions; Ascomycota, Bayesian inference; hypothesis testing; likelihood; MCMC; Porpidia; reproductive systems]
Abstract:
Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the Hemodynamic Response Function (HRF) and to locate activated regions in the human brain when specific tasks are performed. This paper develops new methodologies that are important improvements not only to parametric but also to nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
An improved understanding of present-day climate variability and change relies on high-quality data sets from the past 2 millennia. Global efforts to model regional climate modes are in the process of being validated against, and integrated with, records of past vegetation change. For South America, however, the full potential of vegetation records for evaluating and improving climate models has hitherto not been sufficiently acknowledged due to an absence of information on the spatial and temporal coverage of study sites. This paper therefore serves as a guide to high-quality pollen records that capture environmental variability during the last 2 millennia. We identify 60 vegetation (pollen) records from across South America which satisfy geochronological requirements set out for climate modelling, and we discuss their sensitivity to the spatial signature of climate modes throughout the continent. Diverse patterns of vegetation response to climate change are observed, with more similar patterns of change in the lowlands and varying intensity and direction of responses in the highlands. Pollen records display local-scale responses to climate modes; thus, it is necessary to understand how vegetation–climate interactions might diverge under variable settings. We provide a qualitative translation from pollen metrics to climate variables. Additionally, pollen is an excellent indicator of human impact through time. We discuss evidence for human land use in pollen records and provide an overview considered useful for archaeological hypothesis testing and important in distinguishing natural from anthropogenically driven vegetation change. We stress the need for the palynological community to be more familiar with climate variability patterns to correctly attribute the potential causes of observed vegetation dynamics. 
This manuscript forms part of the wider LOng-Term multi-proxy climate REconstructions and Dynamics in South America – 2k initiative that provides the ideal framework for the integration of the various palaeoclimatic subdisciplines and palaeo-science, thereby jump-starting and fostering multidisciplinary research into environmental change on centennial and millennial timescales.
Abstract:
Life-history theories of the early programming of human reproductive strategy stipulate that early rearing experience, including that reflected in infant-parent attachment security, regulates psychological, behavioral, and reproductive development. We tested the hypothesis that infant attachment insecurity, compared with infant attachment security, at the age of 15 months predicts earlier pubertal maturation. Focusing on 373 White females enrolled in the National Institute of Child Health and Human Development Study of Early Child Care and Youth Development, we gathered data from annual physical exams from the ages of 9½ years to 15½ years and from self-reported age of menarche. Results revealed that individuals who had been insecure infants initiated and completed pubertal development earlier and had an earlier age of menarche compared with individuals who had been secure infants, even after accounting for age of menarche in the infants’ mothers. These results support a conditional-adaptational view of individual differences in attachment security and raise questions about the biological mechanisms responsible for the attachment effects we discerned.
Abstract:
We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obeys K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results set in a more general theoretical framework previous findings obtained for optical systems and simple mechanical models, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady-state cases, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research.
In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
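For reference, the first-order (linear) case of the dispersion relations discussed in this abstract takes the familiar Kramers-Kronig form, relating the real and imaginary parts of a causal susceptibility \(\chi(\omega) = \chi'(\omega) + i\chi''(\omega)\) through principal-value integrals:

```latex
\chi'(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\chi''(\omega')}{\omega' - \omega}\,d\omega',
\qquad
\chi''(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\chi'(\omega')}{\omega' - \omega}\,d\omega'.
```

The higher-order and harmonic-response generalizations treated in the paper extend this same causality constraint beyond the linear regime.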
Abstract:
The question as to whether it is better to diversify a real estate portfolio within a property type across the regions or within a region across the property types is one of continuing interest for academics and practitioners alike. The current study, however, is somewhat different from the usual sector/regional analysis in taking account of the fact that holdings in the UK real estate market are heavily concentrated in a single region, London. As a result, this study is designed to investigate whether a real estate fund manager can obtain a statistically significant improvement in risk/return performance by extending out of a London-based portfolio into firstly the rest of the South East of England and then into the remainder of the UK, or whether the manager would be better off staying within London and diversifying across the various property types. The results indicate that staying within London and diversifying across the various property types may offer performance comparable with regional diversification, although this conclusion largely depends on the time period and the fund manager's ability to diversify efficiently.
Abstract:
Heinz recently completed a comprehensive experiment in self-play using the FRITZ chess engine to establish the ‘decreasing returns’ hypothesis with specific levels of statistical confidence. This note revisits the results and recalculates the confidence levels of this and other hypotheses. These appear to be better than Heinz’ initial analysis suggests.
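The recalculated confidence levels referred to above come from standard binomial statistics on self-play scores. As a minimal sketch (the game counts and score rates below are hypothetical illustrations, not Heinz's actual data), a one-sided confidence that one score rate exceeds another can be computed with a two-proportion z-test under a normal approximation:

```python
import math

def z_confidence(wins_a, games_a, wins_b, games_b):
    """One-sided confidence that score rate A exceeds score rate B,
    via a normal approximation to the two-proportion z-test."""
    p_a, p_b = wins_a / games_a, wins_b / games_b
    pooled = (wins_a + wins_b) / (games_a + games_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / games_a + 1 / games_b))
    z = (p_a - p_b) / se
    # Standard normal CDF expressed through the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical 'decreasing returns' comparison: a depth increase scores
# 60% at one level but only 55% at the next, over 1000 games each.
conf = z_confidence(600, 1000, 550, 1000)
```

With equal score rates the function returns 0.5 (no evidence either way); as the gap or the sample size grows, the confidence approaches 1.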
Abstract:
The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
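The statistical modelling approach can be illustrated with a deliberately simplified Monte Carlo sketch: assume a (hypothetical) log-logistic dose-response curve for the probability of clear toxic signs, and dose sequentially at fixed levels until toxicity is evident. The dose levels and one-observation-per-step design below are illustrative assumptions, not the actual OECD protocol:

```python
import math
import random

def p_toxic(dose, ld50, slope):
    """Probability of clear toxic signs at a given dose, assuming a
    hypothetical log-logistic dose-response curve centred on the LD50."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log10(dose) - math.log10(ld50))))

def run_fdp(ld50, slope, doses=(5, 50, 300, 2000), rng=random):
    """Simplified sequential fixed-dose test: dose upward from the lowest
    fixed level until clear signs of toxicity are observed, then report
    the dose that produced them (None if no dose caused toxicity)."""
    for dose in doses:
        if rng.random() < p_toxic(dose, ld50, slope):
            return dose   # evident (nonlethal) toxicity at this fixed dose
    return None           # no toxicity observed at any fixed dose
```

Repeating `run_fdp` many times for a grid of LD50 values and slopes yields the probability of each classification outcome, which is the kind of quantity the paper's statistical evaluation reports.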
Abstract:
The conventional method for the assessment of acute dermal toxicity (OECD Test Guideline 402, 1987) uses death of animals as an endpoint to identify the median lethal dose (LD50). A new OECD Testing Guideline called the dermal fixed dose procedure (dermal FDP) is being prepared to provide an alternative to Test Guideline 402. In contrast to Test Guideline 402, the dermal FDP does not provide a point estimate of the LD50, but aims to identify that dose of the substance under investigation that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonised System of Classification and Labelling scheme (GHS). The dermal FDP has been validated using statistical modelling rather than by in vivo testing. The statistical modelling approach enables calculation of the probability of each GHS classification and the expected numbers of deaths and animals used in the test for imaginary substances with a range of LD50 values and dose-response curve slopes. This paper describes the dermal FDP and reports the results from the statistical evaluation. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 402, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LD50 value.
Statistical evaluation of the fixed concentration procedure for acute inhalation toxicity assessment
Abstract:
The conventional method for the assessment of acute inhalation toxicity (OECD Test Guideline 403, 1981) uses death of animals as an endpoint to identify the median lethal concentration (LC50). A new OECD Testing Guideline called the Fixed Concentration Procedure (FCP) is being prepared to provide an alternative to Test Guideline 403. Unlike Test Guideline 403, the FCP does not provide a point estimate of the LC50, but aims to identify an airborne exposure level that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonized System of Classification and Labelling scheme (GHS). The FCP has been validated using statistical simulation rather than by in vivo testing. The statistical simulation approach predicts the GHS classification outcome and the numbers of deaths and animals used in the test for imaginary substances with a range of LC50 values and dose-response curve slopes. This paper describes the FCP and reports the results from the statistical simulation study assessing its properties. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 403, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LC50 value.