65 results for Allele frequency data
Abstract:
Many macroeconomic series, such as U.S. real output growth, are sampled quarterly, although potentially useful predictors are often observed at a higher frequency. We examine whether a mixed data sampling (MIDAS) approach can improve forecasts of output growth. The MIDAS specification used in the comparison incorporates an autoregressive term in a novel way. We find that using monthly data on the current quarter leads to significant improvement in forecasting current and next quarter output growth, and that MIDAS is an effective way to exploit monthly data compared with alternative methods.
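As an illustration of the general MIDAS idea (not this paper's exact specification), the sketch below aggregates one quarter's monthly indicator readings into a single regressor using exponential Almon lag weights, a common MIDAS weighting scheme. The function names, parameter values, and example data are all hypothetical.

```python
import math

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag weights, w_k proportional to
    exp(theta1*k + theta2*k**2), normalised to sum to one."""
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(1, n_lags + 1)]
    total = sum(raw)
    return [r / total for r in raw]

def midas_aggregate(monthly, theta1=0.1, theta2=-0.05):
    """Collapse one quarter's monthly readings into a single weighted
    regressor for a quarterly-frequency regression."""
    weights = exp_almon_weights(len(monthly), theta1, theta2)
    return sum(w * x for w, x in zip(weights, monthly))

# Hypothetical monthly indicator readings for one quarter
x_quarter = midas_aggregate([0.2, 0.5, 0.3])
```

The aggregated value `x_quarter` would then enter an ordinary quarterly regression alongside the autoregressive term.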
Abstract:
A simple physical model of the atmospheric effects of large explosive volcanic eruptions is developed. Using only one input parameter - the initial amount of sulphur dioxide injected into the stratosphere - the global-average stratospheric optical-depth perturbation and surface temperature response are modelled. The simplicity of this model avoids issues of incomplete data (applicable to more comprehensive models), making it a powerful and useful tool for atmospheric diagnostics of this climate forcing mechanism. It may also provide a computationally inexpensive and accurate way of introducing volcanic activity into larger climate models. The modelled surface temperature response for an initial sulphur-dioxide injection, coupled with emission-history statistics, is used to demonstrate that the most climatically significant volcanic eruptions are those of sufficient explosivity to just reach into the stratosphere (and achieve longevity). This study also highlights the fact that this measure of significance is highly sensitive to the representation of the climatic response and the frequency data used, and that we are far from producing a definitive history of explosive volcanism for at least the past 1000 years. Given this high degree of uncertainty, these results suggest that eruptions that release around and above 0.1 Mt SO2 into the stratosphere have the maximum climatic impact.
Abstract:
Lactase persistence (LP) is common among people of European ancestry, but with the exception of some African, Middle Eastern and southern Asian groups, is rare or absent elsewhere in the world. Lactase gene haplotype conservation around a polymorphism strongly associated with LP in Europeans (-13,910 C/T) indicates that the derived allele is recent in origin and has been subject to strong positive selection. Furthermore, ancient DNA work has shown that the -13,910*T (derived) allele was very rare or absent in early Neolithic central Europeans. It is unlikely that LP would provide a selective advantage without a supply of fresh milk, and this has led to a gene-culture coevolutionary model where lactase persistence is only favoured in cultures practising dairying, and dairying is more favoured in lactase persistent populations. We have developed a flexible demic computer simulation model to explore the spread of lactase persistence, dairying, other subsistence practices and unlinked genetic markers across the geographic space of Europe and western Asia. Using data on -13,910*T allele frequency and farming arrival dates across Europe, and approximate Bayesian computation to estimate parameters of interest, we infer that the -13,910*T allele first underwent selection among dairying farmers around 7,500 years ago in a region between the central Balkans and central Europe, possibly in association with the dissemination of the Neolithic Linearbandkeramik culture over central Europe. Furthermore, our results suggest that natural selection favouring a lactase persistence allele was not higher in northern latitudes through an increased requirement for dietary vitamin D. Our results provide a coherent and spatially explicit picture of the coevolution of lactase persistence and dairying in Europe.
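To illustrate rejection-based approximate Bayesian computation (ABC) on allele-frequency data, the sketch below draws a selection coefficient from a uniform prior, simulates drift plus selection in a small Wright-Fisher population, and keeps draws whose simulated final frequency matches an observation. It is a toy version of this style of inference, not the paper's demic simulation; all names and parameter values are hypothetical.

```python
import random

def simulate_freq(s, n_gen=60, pop=200, p0=0.05, rng=random):
    """Toy Wright-Fisher simulation with additive selection: the allele's
    expected frequency is boosted by s each generation, then binomial
    drift is applied by resampling the population."""
    p = p0
    for _ in range(n_gen):
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        p = sum(rng.random() < p_sel for _ in range(pop)) / pop
        if p in (0.0, 1.0):
            break
    return p

def abc_rejection(observed, n_draws=100, tol=0.1, seed=1):
    """Rejection ABC: draw s from a Uniform(0, 0.1) prior and keep draws
    whose simulated final frequency falls within tol of the observation."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        s = rng.uniform(0.0, 0.1)
        if abs(simulate_freq(s, rng=rng) - observed) < tol:
            kept.append(s)
    return kept

posterior_draws = abc_rejection(observed=0.6)
```

The retained draws approximate the posterior of the selection coefficient given the observed frequency.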
Abstract:
This paper examines two hydrochemical time-series derived from stream samples taken in the Upper Hafren catchment, Plynlimon, Wales. One time-series comprises data collected at 7-hour intervals over 22 months (Neal et al., submitted, this issue), while the other is based on weekly sampling over 20 years. A subset of determinands: aluminium, calcium, chloride, conductivity, dissolved organic carbon, iron, nitrate, pH, silicon and sulphate are examined within a framework of non-stationary time-series analysis to identify determinand trends, seasonality and short-term dynamics. The results demonstrate that both long-term and high-frequency monitoring provide valuable and unique insights into the hydrochemistry of a catchment. The long-term data allowed analysis of long-term trends, demonstrating continued increases in DOC concentrations accompanied by declining SO4 concentrations within the stream, and provided new insights into the changing amplitude and phase of the seasonality of determinands such as DOC and Al. Additionally, these data proved invaluable for placing the short-term variability demonstrated within the high-frequency data in context. The 7-hour data highlighted complex diurnal cycles for NO3, Ca and Fe, with cycles displaying changes in phase and amplitude on a seasonal basis. The high-frequency data also demonstrated the need to consider the impact that the time of sample collection can have on the summary statistics of the data, and that sampling during the hours of darkness provides additional hydrochemical information for determinands which exhibit pronounced diurnal variability. Moving forward, this research demonstrates the need for both long-term and high-frequency monitoring to facilitate a full and accurate understanding of catchment hydrochemical dynamics.
Abstract:
Laboratory measurements of the attenuation and velocity dispersion of compressional and shear waves at appropriate frequencies, pressures, and temperatures can aid interpretation of seismic and well-log surveys as well as indicate absorption mechanisms in rocks. We constructed and calibrated resonant-bar equipment to measure velocities and attenuations of standing shear and extensional waves in copper-jacketed right cylinders of rocks (30 cm in length, 2.54 cm in diameter) in the sonic frequency range and at differential pressures up to 65 MPa. We also measured ultrasonic velocities and attenuations of compressional and shear waves in 50-mm-diameter samples of the rocks at identical pressures. Extensional-mode velocities determined from the resonant bar are systematically too low, yielding unreliable Poisson's ratios. Poisson's ratios determined from the ultrasonic data are frequency corrected and used to calculate the sonic-frequency compressional-wave velocities and attenuations from the shear- and extensional-mode data. We calculate the bulk-modulus loss. The accuracies of attenuation data (expressed as 1000/Q, where Q is the quality factor) are +/- 1 for compressional and shear waves at ultrasonic frequency, +/- 1 for shear waves, and +/- 3 for compressional waves at sonic frequency. Example sonic-frequency data show that the energy absorption in a limestone is small (Q(P) greater than 200 and stress independent) and is primarily due to poroelasticity, whereas that in the two sandstones is variable in magnitude (Q(P) ranges from less than 50 to greater than 300, at reservoir pressures) and arises from a combination of poroelasticity and viscoelasticity. A graph of compressional-wave attenuation versus compressional-wave velocity at reservoir pressures differentiates high-permeability (> 100 mD, 9.87 X 10(-14) m(2)) brine-saturated sandstones from low-permeability (< 100 mD, 9.87 X 10(-14) m(2)) sandstones and shales.
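Two small helpers make the units above concrete, assuming the standard conversion 1 mD ≈ 9.87 × 10⁻¹⁶ m²: one recovers the quality factor Q from the reported attenuation measure 1000/Q, the other converts millidarcies to square metres.

```python
MD_TO_M2 = 9.87e-16   # approximate conversion: 1 millidarcy in square metres

def attenuation_to_q(attenuation):
    """Recover the quality factor Q from the reported measure 1000/Q."""
    return 1000.0 / attenuation

def permeability_m2(millidarcies):
    """Convert a permeability in millidarcies to square metres."""
    return millidarcies * MD_TO_M2
```

For example, a reported attenuation of 5 corresponds to Q = 200, the limestone threshold quoted above.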
Abstract:
Many recent inverse scattering techniques have been designed for single frequency scattered fields in the frequency domain. In practice, however, the data is collected in the time domain. Frequency domain inverse scattering algorithms obviously apply to time-harmonic scattering, or nearly time-harmonic scattering, through application of the Fourier transform. Fourier transform techniques can also be applied to non-time-harmonic scattering from pulses. Our goal here is twofold: first, to establish conditions on the time-dependent waves that provide a correspondence between time domain and frequency domain inverse scattering via Fourier transforms without recourse to the conventional limiting amplitude principle; secondly, we apply the analysis in the first part of this work toward the extension of a particular scattering technique, namely the point source method, to scattering from the requisite pulses. Numerical examples illustrate the method and suggest that reconstructions from admissible pulses deliver superior reconstructions compared to straight averaging of multi-frequency data. Copyright (C) 2006 John Wiley & Sons, Ltd.
Abstract:
Vibration-rotation spectra of HOCl have been measured at a resolution of 0.05 cm−1 to determine vibration-rotation constants and 35Cl–37Cl isotope shifts in the vibrational frequencies. The spectrum of DOCl has also been recorded, and a preliminary analysis of the band origins has been made. The vibrational frequency data and centrifugal distortion constants have been used to determine the harmonic force field in a least-squares refinement; the force field obtained also gives a good fit to data on the vibrational contributions to the inertial defect. The equilibrium rotational constants of HOCl have been obtained, and an equilibrium structure has been estimated.
Abstract:
The complete general harmonic force field of methyl fluoride was recalculated using the most recent literature frequency, Coriolis ζ, and centrifugal distortion data for 12CH3F, 13CH3F, 12CD3F, 12CHD2F and 12CH2DF. The anharmonic corrections applied to the observed frequency data and the adopted molecular geometry are considered to be more realistic than those used hitherto. There is excellent overall agreement between the fitted force constants and the highest quality ab initio force field currently available.
Abstract:
This document provides guidelines for fish stock assessment and fishery management using the software tools and other outputs developed by the United Kingdom's Department for International Development's Fisheries Management Science Programme (FMSP) from 1992 to 2004. It explains some key elements of the precautionary approach to fisheries management and outlines a range of alternative stock assessment approaches that can provide the information needed for such precautionary management. Four FMSP software tools, LFDA (Length Frequency Data Analysis), CEDA (Catch Effort Data Analysis), YIELD and ParFish (Participatory Fisheries Stock Assessment), are described with which intermediary parameters, performance indicators and reference points may be estimated. The document also contains examples of the assessment and management of multispecies fisheries, the use of Bayesian methodologies, the use of empirical modelling approaches for estimating yields and in analysing fishery systems, and the assessment and management of inland fisheries. It also provides a comparison of length- and age-based stock assessment methods. A CD-ROM with the FMSP software packages CEDA, LFDA, YIELD and ParFish is included.
Abstract:
This article introduces a new general method for genealogical inference that samples independent genealogical histories using importance sampling (IS) and then samples other parameters with Markov chain Monte Carlo (MCMC). It is then possible to more easily utilize the advantages of importance sampling in a fully Bayesian framework. The method is applied to the problem of estimating recent changes in effective population size from temporally spaced gene frequency data. The method gives the posterior distribution of effective population size at the time of the oldest sample and at the time of the most recent sample, assuming a model of exponential growth or decline during the interval. The effect of changes in number of alleles, number of loci, and sample size on the accuracy of the method is described using test simulations, and it is concluded that these have an approximately equivalent effect. The method is used on three example data sets and problems in interpreting the posterior densities are highlighted and discussed.
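The drift signal that such temporal methods exploit can be illustrated with a much simpler moment estimator, in the spirit of the classical temporal method rather than the paper's IS/MCMC machinery: the standardised variance F in allele frequency between two sampling times grows roughly as t/(2Ne) under pure drift. The function names and the numbers below are illustrative.

```python
def temporal_f(p0, pt):
    """Standardised variance in allele frequency between two time points
    (a simplified Fc, ignoring finite-sample corrections)."""
    return (p0 - pt) ** 2 / ((p0 + pt) / 2 - p0 * pt)

def estimate_ne(p0, pt, generations):
    """Moment estimator: under pure drift E[F] is roughly t/(2*Ne),
    so Ne is estimated as t/(2*F)."""
    return generations / (2 * temporal_f(p0, pt))

ne_hat = estimate_ne(0.5, 0.6, 10)   # roughly 125 for these numbers
```

A fully Bayesian treatment, as in the article, replaces this point estimate with a posterior distribution over Ne.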
Abstract:
Multi-factor approaches to analysis of real estate returns have, since the pioneering work of Chan, Hendershott and Sanders (1990), emphasised a macro-variables approach in preference to the latent factor approach that formed the original basis of the arbitrage pricing theory. With increasing use of high frequency data and trading strategies and with a growing emphasis on the risks of extreme events, the macro-variable procedure has some deficiencies. This paper explores a third way, with the use of an alternative to the standard principal components approach – independent components analysis (ICA). ICA seeks higher moment independence and maximises in relation to a chosen risk parameter. We apply an ICA based on kurtosis maximisation to weekly US REIT data using a kurtosis maximising algorithm. The results show that ICA is successful in capturing the kurtosis characteristics of REIT returns, offering possibilities for the development of risk management strategies that are sensitive to extreme events and tail distributions.
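A kurtosis-maximising contrast can be sketched without any ICA library: for a two-dimensional (assumed already whitened) return series, grid-search the unit circle for the projection with the largest excess kurtosis. This is a toy stand-in for the algorithm applied in the paper; the data below are synthetic.

```python
import math

def excess_kurtosis(xs):
    """Excess kurtosis of a series (0 for a Gaussian)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

def max_kurtosis_direction(x1, x2, n_angles=180):
    """Grid-search the unit circle for the projection of a two-dimensional
    (assumed whitened) series with the largest excess kurtosis -- the
    contrast that a kurtosis-maximising ICA optimises."""
    best_angle, best_k = 0.0, float("-inf")
    for i in range(n_angles):
        a = math.pi * i / n_angles
        proj = [math.cos(a) * u + math.sin(a) * v for u, v in zip(x1, x2)]
        k = excess_kurtosis(proj)
        if k > best_k:
            best_angle, best_k = a, k
    return best_angle, best_k

# Synthetic example: a spiky (heavy-tailed) series and a flat one
spiky = [5.0, -5.0] + [0.0] * 8
flat = [1.0, -1.0] * 5
angle, kurt = max_kurtosis_direction(spiky, flat)
```

The recovered direction points at the heavy-tailed component, which is exactly the sensitivity to extreme events that motivates the approach.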
Abstract:
This paper compares the performance of artificial neural networks (ANNs) with that of the modified Black model in both pricing and hedging Short Sterling options. Using high frequency data, standard and hybrid ANNs are trained to generate option prices. The hybrid ANN is significantly superior to both the modified Black model and the standard ANN in pricing call and put options. Hedge ratios for hedging Short Sterling options positions using Short Sterling futures are produced using the standard and hybrid ANN pricing models, the modified Black model, and also standard and hybrid ANNs trained directly on the hedge ratios. The performance of hedge ratios from ANNs directly trained on actual hedge ratios is significantly superior to those based on a pricing model, and to the modified Black model.
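For reference, a sketch of the Black (1976) futures-option formula, the family to which the 'modified Black' benchmark belongs, together with its delta (the hedge ratio with respect to the futures price). The Short Sterling-specific modifications used in the paper are not reproduced here.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(f, k, sigma, t, r):
    """Black (1976) price of a European call on a futures contract."""
    sq = sigma * math.sqrt(t)
    d1 = (math.log(f / k) + 0.5 * sigma ** 2 * t) / sq
    d2 = d1 - sq
    return math.exp(-r * t) * (f * norm_cdf(d1) - k * norm_cdf(d2))

def black76_call_delta(f, k, sigma, t, r):
    """Hedge ratio of the call with respect to the futures price."""
    sq = sigma * math.sqrt(t)
    d1 = (math.log(f / k) + 0.5 * sigma ** 2 * t) / sq
    return math.exp(-r * t) * norm_cdf(d1)
```

The hybrid ANNs in the paper are trained either to reproduce such prices or to output the hedge ratio directly.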
Abstract:
This paper assesses the performance of a vocabulary test designed to measure second language productive vocabulary knowledge. The test, Lex30, uses a word association task to elicit vocabulary, and uses word frequency data to measure the vocabulary produced. Here we report firstly on the reliability of the test as measured by a test-retest study, a parallel test forms experiment and an internal consistency measure. We then investigate the construct validity of the test by looking at changes in test performance over time, analyses of correlations with scores on similar tests, and comparison of spoken and written test performance. Last, we examine the theoretical bases of the two main test components: eliciting vocabulary and measuring vocabulary. Interpretations of our findings are discussed in the context of the test validation research literature. We conclude that the findings reported here present a robust argument for the validity of the test as a research tool, and encourage further investigation of its validity in an instructional context.
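The scoring idea, heavily simplified, can be sketched as: elicit responses, then award credit only to words outside a high-frequency band. The tiny band below is a placeholder; Lex30 itself uses established corpus frequency lists.

```python
# Placeholder high-frequency band; the real test uses established
# frequency lists (e.g. the first 1,000 lemmas of a corpus).
HIGH_FREQUENCY = {"the", "be", "to", "of", "and", "a", "in", "that", "have", "it"}

def lex30_style_score(responses):
    """Award one point per response outside the high-frequency band:
    infrequent words are taken as evidence of productive vocabulary."""
    return sum(1 for word in responses if word.lower() not in HIGH_FREQUENCY)
```

Here "galaxy" and "ferment" would score while "the" and "to" would not.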
Abstract:
An efficient market incorporates news into prices immediately and fully. Tests for efficiency in financial markets have been undermined by information leakage. We test for efficiency in sports betting markets – real-world markets where news breaks remarkably cleanly. Applying a novel identification to high-frequency data, we investigate the reaction of prices to goals scored on the ‘cusp’ of half-time. This strategy allows us to separate the market's response to major news (a goal), from its reaction to the continual flow of minor game-time news. On our evidence, prices update swiftly and fully.