948 results for Bayesian
Abstract:
The antiretroviral protein TRIM5alpha is known to have evolved different restriction capacities against various retroviruses, driven by positive Darwinian selection. However, how these different specificities have evolved in the primate lineages is not fully understood. Here we used ancestral protein resurrection to estimate the evolution of antiviral restriction specificities of TRIM5alpha on the primate lineage leading to humans. We used TRIM5alpha coding sequences from 24 primates for the reconstruction of ancestral TRIM5alpha sequences using maximum-likelihood and Bayesian approaches. Ancestral sequences were transduced into HeLa and CRFK cells. Stable cell lines were generated and used to test restriction of a panel of extant retroviruses (human immunodeficiency virus type 1 [HIV-1] and HIV-2, simian immunodeficiency virus [SIV] variants SIV(mac) and SIV(agm), and murine leukemia virus [MLV] variants N-MLV and B-MLV). The resurrected TRIM5alpha variant from the common ancestor of Old World primates (Old World monkeys and apes, approximately 25 million years before present) was effective against present day HIV-1. In contrast to the HIV-1 restriction pattern, we show that the restriction efficacy against other retroviruses, such as a murine oncoretrovirus (N-MLV), is higher for more recent resurrected hominoid variants. Ancestral TRIM5alpha variants have generally limited efficacy against HIV-2, SIV(agm), and SIV(mac). Our study sheds new light on the evolution of the intrinsic antiviral defense machinery and illustrates the utility of functional evolutionary reconstruction for characterizing recently emerged protein differences.
Abstract:
Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study. Location: World-wide. Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years. Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) in the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information, and (2) in the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node. Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.
Abstract:
Genome-wide scans of genetic differentiation between hybridizing taxa can identify genome regions with unusual rates of introgression. Regions of high differentiation might represent barriers to gene flow, while regions of low differentiation might indicate adaptive introgression-the spread of selectively beneficial alleles between reproductively isolated genetic backgrounds. Here we conduct a scan for unusual patterns of differentiation in a mosaic hybrid zone between two mussel species, Mytilus edulis and M. galloprovincialis. One outlying locus, mac-1, showed a characteristic footprint of local introgression, with abnormally high frequency of edulis-derived alleles in a patch of M. galloprovincialis enclosed within the mosaic zone, but low frequencies outside of the zone. Further analysis of DNA sequences showed that almost all of the edulis allelic diversity had introgressed into the M. galloprovincialis background in this patch. We then used a variety of approaches to test the hypothesis that there had been adaptive introgression at mac-1. Simulations and model fitting with maximum-likelihood and approximate Bayesian computation approaches suggested that adaptive introgression could generate a "soft sweep," which was qualitatively consistent with our data. Although the migration rate required was high, it was compatible with the functioning of an effective barrier to gene flow as revealed by demographic inferences. As such, adaptive introgression could explain both the reduced intraspecific differentiation around mac-1 and the high diversity of introgressed alleles, although a localized change in barrier strength may also be invoked. Together, our results emphasize the need to account for the complex history of secondary contacts in interpreting outlier loci.
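The approximate Bayesian computation step mentioned above can be illustrated with a minimal rejection sampler: draw a parameter from the prior, simulate data, and keep the draw only if the simulated summary statistic lands close to the observed one. The toy model below (a single allele-frequency parameter with a uniform prior and binomial sampling) is a hypothetical stand-in, not the authors' actual demographic model:

```python
import random

def rejection_abc(observed_freq, n_samples, n_accept, eps, seed=0):
    """Rejection-sampling ABC: accept a prior draw when the simulated
    summary statistic is within eps of the observed statistic."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_accept:
        p = rng.random()  # uniform prior on the allele frequency
        hits = sum(rng.random() < p for _ in range(n_samples))
        if abs(hits / n_samples - observed_freq) < eps:
            accepted.append(p)
    return accepted

# Toy run: an observed introgressed-allele frequency of 0.8 in 100 genotypes
post = rejection_abc(observed_freq=0.8, n_samples=100, n_accept=500, eps=0.05)
post_mean = sum(post) / len(post)
```

The accepted draws approximate the posterior; tightening `eps` improves the approximation at the cost of a lower acceptance rate.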
Abstract:
The statistical properties of inflation and, in particular, its degree of persistence and stability over time are a subject of intense debate, and no consensus has been reached yet. The goal of this paper is to analyze this controversy using a general approach, with the aim of providing a plausible explanation for the existing contradictory results. We consider the inflation rates of 21 OECD countries, which are modelled as fractionally integrated (FI) processes. First, we show analytically that FI can appear in inflation rates after aggregating individual prices from firms that face different costs of adjusting their prices. Then, we provide robust empirical evidence supporting the FI hypothesis using both classical and Bayesian techniques. Next, we estimate impulse response functions and other scalar measures of persistence, achieving an accurate picture of this property and its variation across countries. It is shown that the application of some popular tools for measuring persistence, such as the sum of the AR coefficients, could lead to erroneous conclusions if fractional integration is present. Finally, we explore the existence of changes in inflation inertia using a novel approach. We conclude that the persistence of inflation is very high (although non-permanent) in most post-industrial countries and that it has remained basically unchanged over the last four decades.
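The fractional integration referred to above rests on the binomial expansion of the differencing operator (1 - L)^d, whose weights decay hyperbolically rather than geometrically, which is the long-memory signature. A generic sketch of the filter (an illustration of FI processes, not the paper's estimation procedure):

```python
def frac_diff_weights(d, n):
    """Expansion weights of (1 - L)^d:
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(series, d):
    """Apply (1 - L)^d to a series, truncating the filter at the start."""
    w = frac_diff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1))
            for t in range(len(series))]

w = frac_diff_weights(0.4, 5)  # 1.0, -0.4, -0.12, -0.064, ...
```

Setting d = 1 recovers ordinary first differences, while 0 < d < 1 interpolates between stationary short-memory and unit-root behaviour.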
Abstract:
Mathematical methods combined with measurements of single-cell dynamics provide a means to reconstruct intracellular processes that are only partly or indirectly accessible experimentally. To obtain reliable reconstructions, the pooling of measurements from several cells of a clonal population is mandatory. However, cell-to-cell variability originating from diverse sources poses computational challenges for such process reconstruction. We introduce a scalable Bayesian inference framework that properly accounts for population heterogeneity. The method allows inference of inaccessible molecular states and kinetic parameters; computation of Bayes factors for model selection; and dissection of intrinsic, extrinsic and technical noise. We show how additional single-cell readouts such as morphological features can be included in the analysis. We use the method to reconstruct the expression dynamics of a gene under an inducible promoter in yeast from time-lapse microscopy data.
Abstract:
This paper proposes a method to conduct inference in panel VAR models with cross-unit interdependencies and time variations in the coefficients. The approach can be used to obtain multi-unit forecasts and leading indicators and to conduct policy analysis in multi-unit setups. The framework of analysis is Bayesian, and MCMC methods are used to estimate the posterior distribution of the features of interest. The model is reparametrized to resemble an observable index model, and specification searches are discussed. As an example, we construct leading indicators for inflation and GDP growth in the Euro area using G-7 information.
Abstract:
This paper investigates the relationship between monetary policy and the changes experienced by the US economy using a small-scale New-Keynesian model. The model is estimated with Bayesian techniques, and the stability of policy parameter estimates and of the transmission of policy shocks is examined. The model fits the data well and produces forecasts comparable or superior to those of alternative specifications. The parameters of the policy rule, the variance and the transmission of policy shocks have been remarkably stable. The parameters of the Phillips curve and of the Euler equations are time-varying.
Abstract:
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter offers an opinion on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al. (2011), published recently in this journal. Contrary to Budowle et al. (2011), we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation and (iii) does not require new guidelines edited by the forensic DNA community, as long as probability is properly considered as an expression of personal belief. Please see related article: http://www.investigativegenetics.com/content/3/1/3
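The role of prior probabilities in relatedness testing can be made concrete with Bayes' rule in odds form: posterior odds equal prior odds times the likelihood ratio reported by the DNA analysis. A minimal sketch, with purely illustrative numbers:

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * LR,
    then convert the posterior odds back to a probability."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical case: a personal prior of 0.5 and a DNA likelihood
# ratio of 10,000 in favour of the claimed relationship
p = posterior_probability(0.5, 10_000)
```

The computation makes the letter's point tangible: the same likelihood ratio yields different posterior probabilities under different personal priors, so the prior is an expression of belief rather than an objectively computable quantity.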
Abstract:
Background: Imatinib has revolutionized the treatment of chronic myeloid leukemia (CML) and gastrointestinal stromal tumors (GIST). Considering the large inter-individual differences in the function of the systems involved in its disposition, exposure to imatinib can be expected to vary widely among patients. This observational study aimed at describing imatinib pharmacokinetic variability and its relationship with various biological covariates, especially plasma alpha1-acid glycoprotein (AGP), and at exploring the concentration-response relationship in patients. Methods: A population pharmacokinetic model (NONMEM) including 321 plasma samples from 59 patients was built and used to derive individual post-hoc Bayesian estimates of drug exposure (AUC; area under the curve). Associations between AUC and therapeutic response or tolerability were explored by ordered logistic regression. The influence of the target genotype (i.e. KIT mutation profile) on response was also assessed in GIST patients. Results: A one-compartment model with first-order absorption appropriately described the data, with an average oral clearance (CL) of 14.3 L/h and volume of distribution (Vd) of 347 L. A large inter-individual variability remained unexplained, both on CL (36%) and Vd (63%), but AGP levels proved to have a marked impact on total imatinib disposition. Moreover, both total and free AUC correlated with the occurrence and number of side effects (e.g. OR 2.9±0.6 for a 2-fold free AUC increase; p<0.001). Furthermore, in GIST patients, higher free AUC predicted a higher probability of therapeutic response (OR 1.9±0.5; p<0.05), notably in patients with tumors harboring an exon 9 mutation or wild-type KIT, known to decrease tumor sensitivity towards imatinib. Conclusion: The large pharmacokinetic variability, together with the pharmacokinetic-pharmacodynamic relationships uncovered, argues for further investigating the usefulness of individualizing imatinib prescription based on therapeutic drug monitoring (TDM). For this type of drug, such monitoring should ideally take into consideration either circulating AGP concentrations or free drug levels, as well as the KIT genotype for GIST.
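A one-compartment model with first-order absorption, as fitted above, has a closed-form concentration-time curve. The sketch below uses the reported population values (CL = 14.3 L/h, Vd = 347 L); the absorption rate constant, the 400 mg dose and the assumption of complete bioavailability are illustrative, not study estimates:

```python
import math

def conc_one_compartment(dose_mg, ka, cl, vd, t):
    """Plasma concentration (mg/L) at time t (h) for a one-compartment
    model with first-order absorption, assuming complete bioavailability:
    C(t) = D*ka / (Vd*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))."""
    ke = cl / vd  # first-order elimination rate constant (1/h)
    return (dose_mg * ka / (vd * (ka - ke))) * (math.exp(-ke * t) - math.exp(-ka * t))

# Population values from the abstract: CL = 14.3 L/h, Vd = 347 L.
# ka = 0.6/h and the 400 mg dose are hypothetical illustration values.
auc = 400 / 14.3  # single-dose AUC = dose / CL (mg*h/L), given F = 1
c6h = conc_one_compartment(400, ka=0.6, cl=14.3, vd=347, t=6.0)
```

Individual post-hoc Bayesian AUC estimates, as in the study, would shrink each patient's parameters toward these population values according to that patient's observed concentrations.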
Abstract:
The vast territories that have been radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure leading to informative contamination maps. Exact measurements ('hard' data) are combined with secondary information on local uncertainties (treated as 'soft' data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. 
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
Abstract:
We analyze crash data collected by the Iowa Department of Transportation using Bayesian methods. The data set includes monthly crash numbers, estimated monthly traffic volumes, site length and other information collected at 30 paired sites in Iowa over more than 20 years during which an intervention experiment was set up. The intervention consisted of transforming 15 undivided road segments from four lanes to three, while an additional 15 segments, thought to be comparable in terms of traffic safety-related characteristics, were not converted. The main objective of this work is to find out whether the intervention reduces the number of crashes and the crash rates at the treated sites. We fitted a hierarchical Poisson regression model with a change-point to the number of monthly crashes per mile at each of the sites. Explanatory variables in the model included estimated monthly traffic volume, time, an indicator for intervention reflecting whether the site was a “treatment” or a “control” site, and various interactions. We accounted for seasonal effects in the number of crashes at a site by including smooth trigonometric functions with three different periods to reflect the four seasons of the year. A change-point at the month and year in which the intervention was completed for treated sites was also included. The number of crashes at a site can be thought to follow a Poisson distribution. To estimate the association between crashes and the explanatory variables, we used a log link function and added a random effect to account for overdispersion and for autocorrelation among observations obtained at the same site. We used proper but non-informative priors for all parameters in the model, and carried out all calculations using Markov chain Monte Carlo methods implemented in WinBUGS. 
We evaluated the effect of the four-to-three-lane conversion by comparing the expected number of crashes per year per mile during the years preceding the conversion and following the conversion for treatment and control sites. We estimated this difference using the observed traffic volumes at each site and also on a per-100,000,000-vehicle basis. We also conducted a prospective analysis to forecast the expected number of crashes per mile at each site in the study one year, three years and five years following the four-to-three-lane conversion. Posterior predictive distributions of the number of crashes, the crash rate and the percent reduction in crashes per mile were obtained for each site for the months of January and June one, three and five years after completion of the intervention. The model appears to fit the data well. We found that in most sites, the intervention was effective and reduced the number of crashes. Overall, and for the observed traffic volumes, the reduction in the expected number of crashes per year and mile at converted sites was 32.3% (31.4% to 33.5% with 95% probability) while at the control sites, the reduction was estimated to be 7.1% (5.7% to 8.2% with 95% probability). When the reduction in the expected number of crashes per year, mile and 100,000,000 AADT was computed, the estimates were 44.3% (43.9% to 44.6%) and 25.5% (24.6% to 26.0%) for converted and control sites, respectively. In both cases, the difference in the percent reduction in the expected number of crashes during the years following the conversion was significantly larger at converted sites than at control sites, even though the number of crashes appears to decline over time at all sites. Results indicate that the reduction in the expected number of crashes per mile has a steeper negative slope at converted than at control sites. 
Consistent with this, the forecasted reduction in the number of crashes per year and mile during the years after completion of the conversion at converted sites is more pronounced than at control sites. Seasonal effects on the number of crashes have been well-documented. In this dataset we found that, as expected, the expected number of monthly crashes per mile tends to be higher during winter months than during the rest of the year. Perhaps more interestingly, we found that there is an interaction between the four-to-three-lane conversion and season; the reduction in the number of crashes appears to be more pronounced during months when the weather is nice than during other times of the year, even though a reduction was estimated for the entire year. Thus, it appears that the four-to-three-lane conversion, while effective year-round, is particularly effective in reducing the expected number of crashes in nice weather.
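The structure of the model described above (log link, traffic volume, trigonometric seasonal terms and a change-point shift once the conversion is completed) can be sketched as a linear predictor on the log scale. The parameter values below are illustrative placeholders, not the fitted posterior estimates:

```python
import math

def log_crash_rate(t_month, log_volume, treated, converted_month, params):
    """Log expected monthly crashes per mile: intercept + traffic volume
    + three seasonal harmonics + a change-point effect after conversion.
    All parameter values here are hypothetical illustrations."""
    b0, b_vol, b_cp, amps = params
    eta = b0 + b_vol * log_volume
    # smooth trigonometric seasonal terms with three different periods
    for amp, period in zip(amps, (12, 6, 4)):
        eta += amp * math.cos(2 * math.pi * t_month / period)
    if treated and t_month >= converted_month:
        eta += b_cp  # change-point shift at the conversion month
    return eta

params = (-2.0, 0.3, -0.4, (0.25, 0.1, 0.05))
before = math.exp(log_crash_rate(10, 9.0, True, 24, params))
after = math.exp(log_crash_rate(34, 9.0, True, 24, params))  # post-conversion
```

In the actual study the expected rate feeds a Poisson likelihood with a site-level random effect, with posteriors obtained by MCMC in WinBUGS; the sketch only shows the deterministic part of that specification.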
Abstract:
The Culex pipiens complex includes two widespread mosquito vector species, Cx. pipiens and Cx. quinquefasciatus. The distribution of these species varies in latitude, with the former being present in temperate regions and the latter in tropical and subtropical regions. However, their distribution range overlaps in certain areas and interspecific hybridization has been documented. Genetic introgression between these species may have epidemiological repercussions for West Nile virus (WNV) transmission. Bayesian clustering analysis based on multilocus genotypes of 12 microsatellites was used to determine levels of hybridization between these two species in Macaronesian islands, the only contact zone described in West Africa. The distribution of the two species reflects both the islands’ biogeography and historical aspects of human colonization. Madeira Island displayed a homogenous population of Cx. pipiens, whereas Cape Verde showed a more intriguing scenario with extensive hybridization. In the islands of Brava and Santiago, only Cx. quinquefasciatus was found, while in Fogo and Maio high hybrid rates (~40%) between the two species were detected. Within the admixed populations, second-generation hybrids (~50%) were identified suggesting a lack of isolation mechanisms. The observed levels of hybridization may locally potentiate the transmission to humans of zoonotic arboviruses such as WNV.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), along with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating, the combination of likelihoods, and robust M-estimation functions, are simple additions/perturbations in A2(P_prior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(P_prior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
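The compositional-data notions invoked above, the centered log-ratio (clr) transform and the Aitchison distance, are straightforward to compute; a minimal sketch:

```python
import math

def clr(x):
    """Centered log-ratio transform of a composition: log parts centred
    by their mean log (i.e. divided by the geometric mean), mapping the
    simplex into a Euclidean vector space."""
    logs = [math.log(v) for v in x]
    m = sum(logs) / len(logs)
    return [v - m for v in logs]

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr images."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(x), clr(y))))

u = clr([0.2, 0.3, 0.5])
d = aitchison_distance([0.2, 0.3, 0.5], [0.2, 0.3, 0.5])  # 0.0
```

The clr components sum to zero, and the transform is invariant to rescaling of the composition, which is exactly the scale-free behaviour the abstract exploits when likelihoods are treated as elements of A2(P).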
Abstract:
Monetary policy is conducted in an environment of uncertainty. This paper sets up a model where the central bank uses real-time data from the bond market together with standard macroeconomic indicators to estimate the current state of the economy more efficiently, while taking into account that its own actions influence what it observes. The timeliness of bond market data allows for quicker responses of monetary policy to disturbances compared to the case when the central bank has to rely solely on collected aggregate data. The information content of the term structure creates a link between the bond market and the macroeconomy that is novel to the literature. To quantify the importance of the bond market as a source of information, the model is estimated on data for the United States and Australia using Bayesian methods. The empirical exercise suggests that there is some information in the US term structure that helps the Federal Reserve to identify shocks to the economy on a timely basis. Australian bond prices seem to be less informative than their US counterparts, perhaps because Australia is a relatively small and open economy.
Abstract:
We investigate the relationship between monetary policy and inflation dynamics in the US using a medium-scale structural model. The specification is estimated with Bayesian techniques and fits the data reasonably well. Policy shocks account for a part of the decline in inflation volatility; they have been less effective in triggering inflation responses over time and qualitatively account for the rise and fall in the level of inflation. A number of structural parameter variations contribute to these patterns.