969 results for bayesian methods
Abstract:
Mathematical methods combined with measurements of single-cell dynamics provide a means to reconstruct intracellular processes that are only partly or indirectly accessible experimentally. To obtain reliable reconstructions, the pooling of measurements from several cells of a clonal population is mandatory. However, cell-to-cell variability originating from diverse sources poses computational challenges for such process reconstruction. We introduce a scalable Bayesian inference framework that properly accounts for population heterogeneity. The method allows inference of inaccessible molecular states and kinetic parameters; computation of Bayes factors for model selection; and dissection of intrinsic, extrinsic and technical noise. We show how additional single-cell readouts such as morphological features can be included in the analysis. We use the method to reconstruct the expression dynamics of a gene under an inducible promoter in yeast from time-lapse microscopy data.
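The abstract mentions computing Bayes factors for model selection. As a minimal sketch of that idea only (the toy counts, the Poisson likelihood and the exponential priors below are assumptions for illustration, not the paper's single-cell model), a Bayes factor can be obtained by numerically integrating likelihood times prior for each candidate model:

```python
import numpy as np
from math import lgamma

def poisson_logpmf(k, rate):
    """log P(k | rate) for a Poisson distribution, vectorized over rate."""
    return k * np.log(rate) - rate - lgamma(k + 1)

def log_marginal(data, log_prior, grid):
    """log of the marginal likelihood, integrated numerically on a grid."""
    log_like = np.sum([poisson_logpmf(k, grid) for k in data], axis=0)
    dx = grid[1] - grid[0]
    return np.log(np.sum(np.exp(log_like + log_prior(grid))) * dx)

data = np.array([4, 6, 5, 7, 5])           # toy per-cell molecule counts
grid = np.linspace(0.01, 30.0, 3000)       # grid over the unknown rate

# Model A: Exponential(mean 1) prior on the rate; Model B: Exponential(mean 5)
log_m_a = log_marginal(data, lambda th: -th, grid)
log_m_b = log_marginal(data, lambda th: np.log(0.2) - 0.2 * th, grid)
bayes_factor_b_over_a = np.exp(log_m_b - log_m_a)
```

Here the data average is close to Model B's prior mean, so the Bayes factor favors Model B; the paper's framework applies the same principle to full dynamic models rather than a single rate.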
Abstract:
Highway noise is one of the most pressing of the surface characteristics issues facing the concrete paving industry. This is particularly true in urban areas, where not only is there a higher population density near major thoroughfares, but also a greater volume of commuter traffic (Sandberg and Ejsmont 2002; van Keulen 2004). To help address this issue, the National Concrete Pavement Technology Center (CP Tech Center) at Iowa State University (ISU), the Federal Highway Administration (FHWA), the American Concrete Pavement Association (ACPA), and other organizations have partnered to conduct a multi-part, seven-year Concrete Pavement Surface Characteristics Project. This document contains the results of Part 1, Task 2, of the ISU-FHWA project, addressing the noise issue by evaluating conventional and innovative concrete pavement noise reduction methods. The first objective of this task was to determine which, if any, concrete surface textures currently constructed in the United States or Europe were considered quiet, had long-term friction characteristics, could be consistently built, and were cost effective. Any specifications of such concrete textures would be included in this report. The second objective was to determine whether any promising new concrete pavement surfaces to control tire-pavement noise and friction were in the development stage and, if so, what further research was necessary. The final objective was to identify measurement techniques used in the evaluation.
Abstract:
Integrative review (IR) has an international reputation in nursing research and evidence-based practice. This IR aimed at identifying and analyzing the concepts and methods recommended for undertaking IR in nursing. Nine information resources, including electronic databases and grey literature, were searched. Seventeen studies were included. The results indicate that: primary studies were mostly from the USA; it is possible to have several research questions or hypotheses and to include primary studies from different theoretical and methodological approaches; it is a type of review that can go beyond the analysis and synthesis of findings from primary studies, allowing other research dimensions to be explored; and it offers potential for the development of new theories and new research problems. Conclusion: IR is understood as a very complex type of review, and it is expected to be developed using standardized and systematic methods to ensure the required rigor of scientific research and therefore the legitimacy of the established evidence.
Abstract:
Background: Imatinib has revolutionized the treatment of chronic myeloid leukemia (CML) and gastrointestinal stromal tumors (GIST). Considering the large inter-individual differences in the function of the systems involved in its disposition, exposure to imatinib can be expected to vary widely among patients. This observational study aimed at describing imatinib pharmacokinetic variability and its relationship with various biological covariates, especially plasma alpha1-acid glycoprotein (AGP), and at exploring the concentration-response relationship in patients. Methods: A population pharmacokinetic model (NONMEM) including 321 plasma samples from 59 patients was built and used to derive individual post-hoc Bayesian estimates of drug exposure (AUC; area under the curve). Associations between AUC and therapeutic response or tolerability were explored by ordered logistic regression. The influence of the target genotype (i.e. KIT mutation profile) on response was also assessed in GIST patients. Results: A one-compartment model with first-order absorption appropriately described the data, with an average oral clearance (CL) of 14.3 L/h and a volume of distribution (Vd) of 347 L. A large inter-individual variability remained unexplained, both in CL (36%) and Vd (63%), but AGP levels proved to have a marked impact on total imatinib disposition. Moreover, both total and free AUC correlated with the occurrence and number of side effects (e.g. OR 2.9±0.6 for a 2-fold free AUC increase; p<0.001). Furthermore, in GIST patients, higher free AUC predicted a higher probability of therapeutic response (OR 1.9±0.5; p<0.05), notably in patients with tumors harboring an exon 9 mutation or wild-type KIT, known to decrease tumor sensitivity towards imatinib. Conclusion: The large pharmacokinetic variability, together with the pharmacokinetic-pharmacodynamic relationships uncovered, argues for further investigation of the usefulness of individualizing imatinib prescription based on TDM.
For this type of drug, it should ideally take into consideration either circulating AGP concentrations or free drug levels, as well as KIT genotype for GIST.
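The one-compartment oral model quoted above (CL ≈ 14.3 L/h, Vd ≈ 347 L) has a closed-form concentration-time curve. As a sketch under stated assumptions (the 400 mg dose, the absorption rate constant ka and full bioavailability F are assumed for illustration; they are not reported in the abstract):

```python
import numpy as np

def conc_oral_1cpt(t, dose, ka, CL, V, F=1.0):
    """Single oral dose, first-order absorption, one-compartment model."""
    ke = CL / V                                 # elimination rate constant (1/h)
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 241)                 # hours after the dose
c = conc_oral_1cpt(t, dose=400.0, ka=0.6, CL=14.3, V=347.0)  # mg dose -> mg/L

auc_inf = 400.0 / 14.3                          # AUC_0-inf = F*dose/CL (mg*h/L)
```

The analytic identity AUC = F·dose/CL is what lets the study link Bayesian AUC estimates to clearance; the NONMEM model itself additionally handles repeated dosing and inter-individual random effects.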
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
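The core BME idea of fusing exact ("hard") measurements with uncertain ("soft") knowledge into a posterior distribution can be caricatured in one dimension. All numbers below are invented for illustration, and this omits the spatial-correlation machinery that makes BME a mapping method:

```python
import numpy as np

grid = np.linspace(0.0, 10.0, 1001)          # candidate 137Cs activity levels
dx = grid[1] - grid[0]

prior = np.exp(-0.5 * ((grid - 5.0) / 3.0) ** 2)    # vague regional prior

# hard datum: an exact measurement of 4.2 with small instrument noise
hard = np.exp(-0.5 * ((grid - 4.2) / 0.2) ** 2)

# soft datum: local knowledge only bounds the value to the interval [3, 6]
soft = ((grid >= 3.0) & (grid <= 6.0)).astype(float)

posterior = prior * hard * soft
posterior /= posterior.sum() * dx            # normalize to a density
estimate = grid[np.argmax(posterior)]        # posterior mode as point estimate
```

The full posterior, not just the point estimate, is what BME reports, which is why no Gaussian assumption is needed and why uncertainty maps come out of the same computation.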
Abstract:
Flow cytometry (FCM) is emerging as an important tool in environmental microbiology. Although flow cytometry applications have to date largely been restricted to certain specialized fields of microbiology, such as the bacterial cell cycle and marine phytoplankton communities, technical advances in instrumentation and methodology are leading to its increased popularity and extending its range of applications. Here we will focus on a number of recent flow cytometry developments important for addressing questions in environmental microbiology. These include (i) the study of microbial physiology under environmentally relevant conditions, (ii) new methods to identify active microbial populations and to isolate previously uncultured microorganisms, and (iii) the development of high-throughput autofluorescence bioreporter assays.
Abstract:
Elucidating the molecular and neural basis of complex social behaviors such as communal living, division of labor and warfare requires model organisms that exhibit these multi-faceted behavioral phenotypes. Social insects, such as ants, bees, wasps and termites, are attractive models to address this problem, with rich ecological and ethological foundations. However, their atypical systems of reproduction have hindered application of classical genetic approaches. In this review, we discuss how recent advances in social insect genomics, transcriptomics, and functional manipulations have enhanced our ability to observe and perturb gene expression, physiology and behavior in these species. Such developments begin to provide an integrated view of the molecular and cellular underpinnings of complex social behavior.
Abstract:
The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions, and in this way, produce a global vision of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made on the unknown parameters in this problem.
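The single-trace building block behind the two-trace problem is the likelihood-ratio calculation that a Bayesian network encodes. The sketch below shows only that building block with invented numbers (gamma and the prior are assumptions); Evett's two-trace scenario adds further nodes for which of the two traces the suspect could have left, and the different propositions correspond to different network structures:

```python
# H: "the suspect is the source of the trace"; E: "the trace matches his profile"
gamma = 0.01          # assumed random-match probability, P(E | not H)
prior_h = 0.5         # illustrative prior for the prosecution proposition

p_e_given_h = 1.0     # a true source always matches (no typing error modeled)
p_e_given_not_h = gamma

# value of the evidence (likelihood ratio) and posterior by Bayes' theorem
likelihood_ratio = p_e_given_h / p_e_given_not_h
posterior_h = (prior_h * p_e_given_h) / (
    prior_h * p_e_given_h + (1.0 - prior_h) * p_e_given_not_h
)
```

Keeping the likelihood ratio separate from the prior is exactly the division of labor the paper's Bayesian networks make explicit: the forensic scientist reports the ratio, while the prior belongs to the court.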
Abstract:
Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. This paper describes the use of kernel methods to approach the processing of large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
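A representative kernel method for the mapping tasks described here is kernel ridge regression with a Gaussian (RBF) kernel. The sketch below uses invented monitoring data (the coordinates, signal, length scale and regularization are assumptions, not the paper's datasets):

```python
import numpy as np

def rbf_kernel(A, B, length_scale):
    """Gaussian (RBF) kernel matrix between two coordinate arrays."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

# toy "monitoring network": 50 sampling coordinates with a noisy signal
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

# kernel ridge regression: solve (K + lam*I) alpha = y, predict with k(x, X) @ alpha
lam = 1e-2
K = rbf_kernel(X, X, length_scale=1.5)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

x_new = np.array([[5.0, 5.0]])              # an unsampled location
pred = rbf_kernel(x_new, X, length_scale=1.5) @ alpha
```

The same fit-then-interpolate pattern underlies kernel classifiers (e.g. support vector machines) for the categorical soil-type problem; only the loss function changes.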
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterative conditional modes algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as based on ground truth segmentations, using various quantitative metrics.
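Of the four optimizers compared, iterated conditional modes (ICM) is the simplest to sketch. Below is a minimal 2-label, 2-D version of the Gaussian-mixture MRF idea on synthetic data (the image, class means and smoothness weight beta are invented; the paper works on 3-D volumes with more tissue classes):

```python
import numpy as np

def icm_segment(img, means, beta, n_iter=5):
    """Iterated conditional modes for a 2-label Gaussian-mixture MRF."""
    labels = np.argmin([(img - m) ** 2 for m in means], axis=0)  # ML start
    h, w = img.shape
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                best_k, best_e = labels[i, j], np.inf
                for k in range(len(means)):
                    e = (img[i, j] - means[k]) ** 2            # data term
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            e += beta * (labels[ni, nj] != k)  # Potts prior
                    if e < best_e:
                        best_k, best_e = k, e
                labels[i, j] = best_k
    return labels

rng = np.random.default_rng(1)
truth = np.zeros((20, 20), dtype=int)
truth[5:15, 5:15] = 1                          # a bright square "tissue"
noisy = truth + 0.4 * rng.standard_normal(truth.shape)
seg = icm_segment(noisy, means=[0.0, 1.0], beta=1.5)
```

ICM greedily minimizes the same data-plus-Potts energy that graph cuts, loopy belief propagation and tree-reweighted message passing attack with stronger guarantees, which is why it serves as the baseline in the comparison.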
Abstract:
The fight against doping is mainly focused on direct detection, using analytical methods for the detection of doping agents in biological samples. However, the World Anti-Doping Code also defines doping as possession, administration or attempted administration of prohibited substances or methods, and trafficking or attempted trafficking in any prohibited substance or method. As these issues correspond to criminal investigation, a forensic approach can help in assessing potential violations of these rules. In the context of a rowing competition, genetic analyses were conducted on biological samples collected from infusion apparatus, bags and tubing in order to obtain DNA profiles. As no database of athletes' DNA profiles was available, information from the locations where the items were found, as well as contextual information, was key to defining a population of suspected athletes and obtaining reference DNA profiles for comparison. Analysis of samples from the infusion systems provided 8 different DNA profiles, which could not be distinguished from the 8 reference profiles of the suspected athletes. This case study is one of the first in which a forensic approach was applied for anti-doping purposes. Based on this investigation, the International Rowing Federation authorities decided to ban not only the incriminated athletes, but also the coaches and officials, for 2 years.