24 results for departures
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Many well-established statistical methods in genetics were developed in a climate of severe constraints on computational power. Recent advances in simulation methodology now bring modern, flexible statistical methods within the reach of scientists having access to a desktop workstation. We illustrate the potential advantages now available by considering the problem of assessing departures from Hardy-Weinberg (HW) equilibrium. Several hypothesis tests of HW have been established, as well as a variety of point estimation methods for the inbreeding coefficient f, the parameter which measures departures from HW under the inbreeding model. We propose a computational, Bayesian method for assessing departures from HW, which has a number of important advantages over existing approaches. The method incorporates the effects of uncertainty about the nuisance parameters (the allele frequencies) as well as the boundary constraints on f (which are functions of the nuisance parameters). Results are naturally presented visually, exploiting the graphics capabilities of modern computer environments to allow straightforward interpretation. Perhaps most importantly, the method is founded on a flexible, likelihood-based modelling framework, which can incorporate the inbreeding model if appropriate, but also allows the assumptions of the model to be investigated and, if necessary, relaxed. Under appropriate conditions, information can be shared across loci and, possibly, across populations, leading to more precise estimation. The advantages of the method are illustrated by application both to simulated data and to data analysed by alternative methods in the recent literature.
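As an illustration of the kind of computation the abstract describes, the sketch below (not the authors' implementation) evaluates a grid-based posterior for the inbreeding coefficient f at a single biallelic locus, treating the allele frequency as a nuisance parameter and enforcing the boundary constraints on f. The genotype counts, grid resolution, and flat priors are illustrative assumptions.

```python
# A minimal sketch of a grid-based Bayesian assessment of departures from
# Hardy-Weinberg equilibrium under the inbreeding model, for one biallelic locus.
import numpy as np

n_AA, n_Aa, n_aa = 30, 40, 30              # hypothetical genotype counts

p_grid = np.linspace(0.001, 0.999, 400)    # allele frequency (nuisance parameter)
f_grid = np.linspace(-0.999, 0.999, 400)   # inbreeding coefficient f

P, F = np.meshgrid(p_grid, f_grid)
Q = 1.0 - P

# Genotype probabilities under the inbreeding model
pr_AA = P**2 + F * P * Q
pr_Aa = 2 * P * Q * (1 - F)
pr_aa = Q**2 + F * P * Q

# Boundary constraint: all genotype probabilities must be non-negative,
# i.e. f >= max(-p/q, -q/p); invalid (p, f) pairs get zero prior mass.
valid = (pr_AA >= 0) & (pr_Aa >= 0) & (pr_aa >= 0)

log_lik = np.where(
    valid,
    n_AA * np.log(np.clip(pr_AA, 1e-300, None))
    + n_Aa * np.log(np.clip(pr_Aa, 1e-300, None))
    + n_aa * np.log(np.clip(pr_aa, 1e-300, None)),
    -np.inf,
)

post = np.exp(log_lik - log_lik.max())
post /= post.sum()

# Marginal posterior for f, integrating out the allele frequency p
post_f = post.sum(axis=1)
print("Posterior mean of f:", np.sum(f_grid * post_f))
print("P(f > 0 | data):   ", np.sum(post_f[f_grid > 0]))
```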
Abstract:
This paper reviews the treatment of intellectual property rights in the North American Free Trade Agreement (NAFTA) and considers the welfare-theoretic bases for innovation transfer between member and nonmember states. Specifically, we consider the effects of new technology development from within the union and ask whether it is efficient (in a welfare sense) to transfer that new technology to nonmember states. When the new technology contains stochastic components, the important issue of information exchange arises, and we consider this question in a simple oligopoly model with Bayesian updating. In this context, it is natural to ask at what price such information should optimally be transferred. Some simple, natural conjugate examples are used to motivate the key parameters upon which the answer depends.
Abstract:
Convective equilibrium is a long-standing and useful concept for understanding many aspects of the behaviour of deep moist convection. For example, it is often invoked in developing parameterizations for large-scale models. However, the equilibrium assumption may begin to break down as models are increasingly used with shorter timesteps and finer resolutions. Here we perform idealized cloud-system resolving model simulations of deep convection with imposed time variations in the surface forcing. A range of rapid forcing timescales from 1 to 36 hr is used, in order to induce systematic departures from equilibrium. For the longer forcing timescales, the equilibrium assumption remains valid, at least in the limited sense that cycle-integrated measures of convective activity are very similar from cycle to cycle. For shorter forcing timescales, cycle-integrated convection becomes more variable, with enhanced activity on one cycle being correlated with reduced activity on the next, suggesting a role for convective memory. Further investigation shows that the memory does not appear to be carried by the domain-mean thermodynamic fields but rather by structures on horizontal scales of 5 to 20 km. Such structures are produced by the convective clouds and can persist beyond the lifetime of the cloud, even through to the next forcing cycle.
Abstract:
A method for the detection of O+ ion fluxes from topside soundings is described. The shape of the plasma scale-height profile is altered by such flows only at heights near the F2-peak, where ion-neutral drag is large. Model profiles are used to relate changes in scale height to the ratio (φ/φL), where φ is the field-aligned O+ flux (relative to the neutral air) and φL is the limiting value set by frictional drag. Values of (φ/φL) can then be determined to within a few per cent from experimental soundings, using the plasma temperature and its gradient (as deduced from the observed profile) and the MSIS model neutral temperature. Of the 10,000 topside profiles used to obtain the global morphology of (φ/φL) near sunspot minimum, 3700 show departures from diffusive equilibrium. The results reveal dynamic ion-flow effects such as the transequatorial breeze, and light-ion flows associated with the polar wind and protonospheric replenishment can be inferred.
Abstract:
A long-standing debate in evolutionary biology concerns whether species diverge gradually through time or by punctuational episodes at the time of speciation. We found that approximately 22% of substitutional changes at the DNA level can be attributed to punctuational evolution, with the remainder accumulating through background gradual divergence. Punctuational effects occur in plants and fungi at more than twice the rate observed in animals, but the proportion of total divergence attributable to punctuational change does not vary among these groups. Punctuational changes cause departures from a clock-like tempo of evolution, suggesting that they should be accounted for when deriving dates from phylogenies. Punctuational episodes of evolution may play a larger role in promoting evolutionary divergence than has previously been appreciated.
Abstract:
The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F-ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F-ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F-ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
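For comparison with the frequentist approach mentioned above, here is a minimal sketch of a moment-based per-locus F-ST calculation with simple outlier flagging. The simulated allele counts, the crude estimator (between-population variance of allele frequencies over mean heterozygosity), and the quantile cut-offs are illustrative assumptions, not the hierarchical-Bayesian MCMC method the abstract proposes.

```python
# A minimal sketch of moment-based F_ST estimation and outlier flagging.
import numpy as np

rng = np.random.default_rng(0)

n_loci, n_pops, n_per_pop = 200, 5, 50
# Hypothetical allele counts: rows = loci, columns = subpopulations
p_true = rng.uniform(0.1, 0.9, size=(n_loci, 1))
counts = rng.binomial(2 * n_per_pop, p_true, size=(n_loci, n_pops))
p_hat = counts / (2 * n_per_pop)           # sample allele frequencies

p_bar = p_hat.mean(axis=1)                 # mean frequency across populations
s2 = p_hat.var(axis=1, ddof=1)             # between-population variance
fst = s2 / (p_bar * (1 - p_bar))           # crude moment-based F_ST per locus

# Flag loci in the empirical tails as candidate outliers; since every locus
# here is simulated as neutral, these are simply the extremes of the null
# distribution and illustrate the mechanics only.
hi, lo = np.quantile(fst, [0.99, 0.01])
print("Candidate high (directional-selection-like) outliers:", np.where(fst > hi)[0])
print("Candidate low (balancing-selection-like) outliers:  ", np.where(fst < lo)[0])
```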
Abstract:
We describe a Bayesian approach to analyzing multilocus genotype or haplotype data to assess departures from gametic (linkage) equilibrium. Our approach employs a Markov chain Monte Carlo (MCMC) algorithm to approximate the posterior probability distributions of disequilibrium parameters. The distributions are computed exactly in some simple settings. Among other advantages, posterior distributions can be presented visually, which allows the uncertainties in parameter estimates to be readily assessed. In addition, background knowledge can be incorporated, where available, to improve the precision of inferences. The method is illustrated by application to previously published datasets; implications for multilocus forensic match probabilities and for simple association-based gene mapping are also discussed.
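As a simplified illustration of posterior inference for a disequilibrium parameter (not the authors' MCMC algorithm), the sketch below draws from a Dirichlet posterior on two-locus haplotype frequencies and summarises the induced posterior of D = p_AB - p_A p_B. The haplotype counts and the flat prior are assumptions made for the example.

```python
# A minimal sketch: direct Monte Carlo from a Dirichlet posterior on haplotype
# frequencies, yielding the posterior of the gametic disequilibrium coefficient D.
import numpy as np

counts = np.array([55, 15, 20, 10])        # hypothetical counts of AB, Ab, aB, ab
prior = np.ones(4)                         # flat Dirichlet(1,1,1,1) prior

rng = np.random.default_rng(1)
samples = rng.dirichlet(counts + prior, size=20000)   # posterior draws of (pAB, pAb, paB, pab)

p_AB = samples[:, 0]
p_A = samples[:, 0] + samples[:, 1]        # frequency of allele A at locus 1
p_B = samples[:, 0] + samples[:, 2]        # frequency of allele B at locus 2
D = p_AB - p_A * p_B                       # gametic disequilibrium coefficient

print("Posterior mean of D:", D.mean())
print("95% credible interval:", np.quantile(D, [0.025, 0.975]))
print("P(D > 0 | data):", (D > 0).mean())
```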
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick, efficient, yet simple instruments for the preliminary exploration of a dataset: they reveal its structure and provide insight into influential aspects of inference, such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept derives from the homogeneous Poisson distribution, for which the ratio of neighboring Poisson probabilities, multiplied by the value of the larger neighboring count, is constant. This property extends to the zero-truncated Poisson distribution, which is of fundamental importance in capture-recapture studies. In practice, however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed, and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution, which lead to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
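A minimal sketch of the ratio-plot calculation, under assumed capture-frequency counts: for a Poisson distribution, (x + 1) p(x + 1) / p(x) equals the mean and is constant in x, so the empirical ratios computed from observed frequencies should be roughly flat, while a Gamma mixture produces an approximately linear increase.

```python
# A minimal sketch of the ratio plot for hypothetical capture-frequency data.
import numpy as np

# x[i] is a capture count, f[i] the number of individuals captured exactly x[i]
# times (zero-truncated data: the never-captured class is unobserved, but the
# ratios for x >= 1 do not involve it).
x = np.arange(1, 7)                    # capture counts 1..6
f = np.array([120, 70, 35, 18, 8, 3])  # hypothetical observed frequencies

# Empirical ratio r_x = (x + 1) * f_{x+1} / f_x for x = 1..5
r = (x[:-1] + 1) * f[1:] / f[:-1]

for xi, ri in zip(x[:-1], r):
    print(f"x = {xi}:  ratio = {ri:.2f}")

# Under a homogeneous Poisson model the ratios are roughly constant; a
# systematic, approximately linear upward trend is the pattern expected under
# a Gamma mixture of Poissons ("structured heterogeneity").
```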
Abstract:
Gardner's popular model of perfect competition in the marketing sector is extended to a conjectural-variations oligopoly with endogenous entry. By revising Gardner's comparative statics on the "farm-retail price ratio," tests of hypotheses about food-industry conduct are derived. Using data from a recent article by Wohlgenant, which employs Gardner's framework, we test the validity of his maintained hypothesis that the food industries are perfectly competitive. No evidence is found of departures from competition in the output markets of the food industries of eight commodity groups: (a) beef and veal, (b) pork, (c) poultry, (d) eggs, (e) dairy, (f) processed fruits and vegetables, (g) fresh fruit, and (h) fresh vegetables.
Abstract:
Assimilation of temperature observations into an ocean model near the equator often results in a dynamically unbalanced state with unrealistic overturning circulations. The way in which these circulations arise from systematic errors in the model or its forcing is discussed. A scheme is proposed, based on the theory of state augmentation, which uses the departures of the model state from the observations to update slowly evolving bias fields. Results are summarized from an experiment applying this bias correction scheme to an ocean general circulation model. They show that the method produces more balanced analyses and a better fit to the temperature observations.
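The following toy sketch illustrates the state-augmentation idea in the abstract with a scalar model (it is not the operational ocean scheme): a slowly evolving bias estimate is updated each cycle from the observation-minus-model departures and used to correct the model. The gains, observation noise, and "true" state are illustrative assumptions.

```python
# A schematic sketch of bias correction via state augmentation for a scalar state.
import numpy as np

rng = np.random.default_rng(2)

truth = 20.0                     # true (constant) temperature
model_bias = -1.5                # systematic error in the model forcing
n_cycles = 50

x = 18.0                         # model state
b = 0.0                          # augmented, slowly evolving bias estimate
gain_state, gain_bias = 0.5, 0.1 # analysis gain and (smaller) bias-update gain

for _ in range(n_cycles):
    # Forecast step: the model drifts because of its systematic error,
    # compensated by the current bias estimate.
    x = x + model_bias - b
    # Observation and departure (innovation)
    y = truth + rng.normal(0.0, 0.2)
    d = y - x
    # Analysis: correct the state strongly, nudge the bias estimate slowly
    x = x + gain_state * d
    b = b - gain_bias * d

print(f"Final bias estimate: {b:.2f} (systematic model error was {model_bias})")
```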
Abstract:
The validity of approximating radiative heating rates in the middle atmosphere by a local linear relaxation to a reference temperature state (i.e., "Newtonian cooling") is investigated. Using radiative heating rate and temperature output from a chemistry–climate model with realistic spatiotemporal variability and realistic chemical and radiative parameterizations, it is found that a linear regression model can capture more than 80% of the variance in longwave heating rates throughout most of the stratosphere and mesosphere, provided that the damping rate is allowed to vary with height, latitude, and season. The linear model describes departures from the climatological mean, not from radiative equilibrium. Photochemical damping rates in the upper stratosphere are similarly diagnosed. Three important exceptions, however, are found. The approximation of linearity breaks down near the edges of the polar vortices in both hemispheres. This nonlinearity can be well captured by including a quadratic term. The use of a scale-independent damping rate is not well justified in the lower tropical stratosphere because of the presence of a broad spectrum of vertical scales. The local assumption fails entirely during the breakup of the Antarctic vortex, where large fluctuations in temperature near the top of the vortex influence longwave heating rates within the quiescent region below. These results are relevant for mechanistic modeling studies of the middle atmosphere, particularly those investigating the final Antarctic warming.
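A minimal sketch of the diagnostic the abstract describes, using synthetic series in place of the chemistry–climate model output: longwave heating-rate anomalies are regressed on temperature anomalies to estimate a local damping rate and the fraction of variance the linear model explains.

```python
# A minimal sketch of estimating a local Newtonian-cooling damping rate by
# linear regression of heating-rate anomalies on temperature anomalies.
import numpy as np

rng = np.random.default_rng(3)

n = 1000
T_anom = rng.normal(0.0, 5.0, n)                         # temperature anomaly [K]
alpha_true = 1.0 / 10.0                                  # "true" damping rate [1/day]
Q_anom = -alpha_true * T_anom + rng.normal(0.0, 0.1, n)  # heating-rate anomaly [K/day]

# Least-squares fit of Q' = -alpha * T' (regression through the origin)
alpha_hat = -np.sum(T_anom * Q_anom) / np.sum(T_anom**2)

# Fraction of heating-rate variance captured by the linear model
resid = Q_anom + alpha_hat * T_anom
r2 = 1.0 - resid.var() / Q_anom.var()

print(f"Estimated damping rate: {alpha_hat:.3f} per day (true {alpha_true})")
print(f"Variance explained: {100 * r2:.1f}%")
```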
Abstract:
This paper analyses the 53 managerial sackings and resignations from 16 stock exchange listed English football clubs during the nine seasons between 2000/01 and 2008/09. The results demonstrate that, on average, a managerial sacking results in a post-announcement day market-adjusted share price rise of 0.3%, whilst a resignation leads to a drop in share price of 1% that continues for a trading month thereafter, cumulating to a negative abnormal return of over 8% measured from the trading day before the event. These findings are intuitive, and suggest that sacking a poorly performing manager may be welcomed by the markets as a possible route to better future match performance, while losing a capable manager to resignation, one who typically progresses to a superior job, will result in a drop in a club's share price. The paper also reveals that, while the impact of managerial departures on stock price volatilities is less clear-cut, speculation in the newspapers is rife in the build-up to such an event.
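For readers unfamiliar with the event-study arithmetic behind these figures, the sketch below computes market-adjusted abnormal returns and their cumulative sum over an assumed event window. The return series are synthetic and the window is illustrative; none of this is the paper's data.

```python
# A minimal sketch of market-adjusted abnormal returns (AR) and their
# cumulative sum (CAR) over a hypothetical event window.
import numpy as np

rng = np.random.default_rng(4)

days = np.arange(-1, 21)                         # event window: day -1 to +20
club_ret = rng.normal(-0.004, 0.01, days.size)   # hypothetical club daily returns
market_ret = rng.normal(0.0, 0.008, days.size)   # hypothetical market daily returns

abnormal = club_ret - market_ret                 # market-adjusted abnormal return
car = np.cumsum(abnormal)                        # cumulative abnormal return

for d, a, c in zip(days, abnormal, car):
    if d in (-1, 0, 1, 20):
        print(f"day {d:+3d}: AR = {a:+.3%}, CAR = {c:+.3%}")
```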
Abstract:
The optimal utilisation of hyper-spectral satellite observations in numerical weather prediction is often inhibited by incorrectly assuming independent interchannel observation errors. However, in order to represent these observation-error covariance structures, an accurate knowledge of the true variances and correlations is needed. This structure is likely to vary with observation type and assimilation system. The work in this article presents the initial results for the estimation of IASI interchannel observation-error correlations when the data are processed in the Met Office one-dimensional (1D-Var) and four-dimensional (4D-Var) variational assimilation systems. The method used to calculate the observation errors is a post-analysis diagnostic which utilises the background and analysis departures from the two systems. The results show significant differences in the source and structure of the observation errors when processed in the two different assimilation systems, but also highlight some common features. When the observations are processed in 1D-Var, the diagnosed error variances are approximately half the size of the error variances used in the current operational system and are very close in size to the instrument noise, suggesting that this is the main source of error. The errors contain no consistent correlations, with the exception of a handful of spectrally close channels. When the observations are processed in 4D-Var, we again find that the observation errors are being overestimated operationally, but the overestimation is significantly larger for many channels. In contrast to 1D-Var, the diagnosed error variances are often larger than the instrument noise in 4D-Var. It is postulated that horizontal errors of representation, not seen in 1D-Var, are a significant contributor to the overall error here. Finally, observation errors diagnosed from 4D-Var are found to contain strong, consistent correlation structures for channels sensitive to water vapour and surface properties.
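A minimal sketch of a departure-based post-analysis diagnostic of the kind described (a formulation commonly attributed to Desroziers and co-authors; the abstract does not name its exact form, so treat this as an assumption): the observation-error covariance is estimated from paired background and analysis departures, here generated synthetically from an idealised optimal analysis in observation space.

```python
# A minimal sketch: estimate the observation-error covariance from the sample
# cross-covariance of analysis and background departures, using synthetic data.
import numpy as np

rng = np.random.default_rng(5)

n_obs, n_chan = 20000, 4
# Synthetic "true" observation-error covariance with interchannel correlations
L = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.6, 0.8, 0.0, 0.0],
              [0.0, 0.5, 0.9, 0.0],
              [0.0, 0.0, 0.4, 0.9]])
R_true = L @ L.T
B_true = 0.5 * np.eye(n_chan)            # background-error covariance (obs space)

obs_err = rng.multivariate_normal(np.zeros(n_chan), R_true, n_obs)
bkg_err = rng.multivariate_normal(np.zeros(n_chan), B_true, n_obs)

d_b = obs_err - bkg_err                                 # background departures y - H(x_b)
d_a = d_b @ np.linalg.solve(R_true + B_true, R_true)    # analysis departures of an optimal analysis

# Diagnosed covariance: sample mean of outer(d_a, d_b), which recovers R
R_est = (d_a.T @ d_b) / n_obs
corr = R_est / np.sqrt(np.outer(np.diag(R_est), np.diag(R_est)))

print("True error variances:     ", np.round(np.diag(R_true), 2))
print("Diagnosed error variances:", np.round(np.diag(R_est), 2))
print("Diagnosed correlations:\n", np.round(corr, 2))
```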