903 results for Bayesian inference on precipitation
Abstract:
Changepoint regression models were originally developed in connection with applications in quality control, where a change from the in-control to the out-of-control state has to be detected on the basis of the available random observations. Various changepoint models have since been suggested for different applications such as reliability, econometrics, and medicine. In many practical situations the covariate cannot be measured precisely, and an alternative is the errors-in-variables regression model. In this paper we study the errors-in-variables regression model with a changepoint from a Bayesian approach. A simulation study shows that the proposed procedure produces suitable estimates of the changepoint and of all other model parameters.
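The abstract does not spell out the model's Bayesian machinery, but the structure it describes — a regression whose slope changes at an unknown point, with the covariate observed only through noise — can be sketched in a few lines. Everything below (the parameter values, the assumed known noise standard deviations, and the naive grid "posterior" over the changepoint that ignores the measurement-error correction the paper actually develops) is illustrative, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(5)
n, k_true = 100, 60                                 # k_true: index of the true changepoint
x_true = rng.uniform(0, 10, n)                      # latent covariate
beta = np.where(np.arange(n) < k_true, 1.0, 3.0)    # slope changes at k_true
y = 2.0 + beta * x_true + rng.normal(0, 0.5, n)     # response
x_obs = x_true + rng.normal(0, 0.5, n)              # covariate observed with error

def segment_rss(x, yseg):
    """Residual sum of squares of an OLS line fitted to one segment."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, yseg, rcond=None)
    r = yseg - X @ coef
    return r @ r

# Naive grid "posterior" over the changepoint (flat prior, noise sd 0.5 taken
# as known), fitting OLS to the error-prone x_obs in each candidate split.
ks = np.arange(5, n - 5)
logpost = np.array([-(segment_rss(x_obs[:k], y[:k]) +
                      segment_rss(x_obs[k:], y[k:])) / (2 * 0.5**2) for k in ks])
post = np.exp(logpost - logpost.max())
post /= post.sum()
print("MAP changepoint index:", ks[np.argmax(post)])
```

Because the slope change here is large, even this naive profile recovers the changepoint well; the paper's point is that the regression coefficients themselves are biased unless the measurement error is modeled.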
Abstract:
Aims: Guided tissue regeneration (GTR) and enamel matrix derivatives (EMD) are two popular regenerative treatments for periodontal infrabony lesions. Both have been used in conjunction with other regenerative materials. We conducted a Bayesian network meta-analysis of randomized controlled trials on the treatment effects of GTR, EMD and their combination therapies. Material and Methods: A systematic literature search was conducted using the Medline, EMBASE, LILACS and CENTRAL databases up to and including June 2011. Treatment outcomes were changes in probing pocket depth (PPD), clinical attachment level (CAL) and infrabony defect depth. Different types of bone grafts (BG) were treated as one group, as were barrier membranes. Results: A total of 53 studies were included in this review, and we found small differences between regenerative therapies that were neither statistically nor clinically significant. GTR and GTR-related combination therapies achieved greater PPD reduction than EMD and EMD-related combination therapies. Combination therapies achieved slightly greater CAL gain than the use of EMD or GTR alone. GTR with BG achieved the greatest defect fill. Conclusion: Combination therapies performed better than single therapies, but the additional benefits were small. Bayesian network meta-analysis is a promising technique for comparing multiple treatments. Further analysis of methodological characteristics will be required before clinical recommendations can be made.
Abstract:
We measured polycyclic aromatic hydrocarbons (PAHs) in bulk precipitation in the Fortaleza metropolitan area, Ceará, Brazil, for the first time. Because little information is available concerning PAHs in tropical climatic regions, we assessed their spatial distribution and possible sources and the influence of urban activities on the depositional fluxes of PAHs in bulk precipitation. The concentrations of individual and total PAHs (ΣPAHs) in bulk precipitation ranged from undetectable to 133.9 ng L⁻¹ and from 202.6 to 674.8 ng L⁻¹, respectively. The plume of highest concentrations was most intense in a zone with heavy automobile traffic and topography favorable to the concentration of emitted pollutants. The depositional fluxes of PAHs in bulk precipitation calculated in this study (undetectable to 0.87 μg m⁻² month⁻¹) are 4 to 27 times smaller than those reported from tourist sites and industrial and urban areas in the Northern Hemisphere. Diagnostic ratio analyses of the PAH samples showed that the major emission source is gasoline exhaust, with a small percentage originating from diesel fuel. Contributions from coal and wood combustion were also found. Major economic activities appear to contribute to pollutant emissions.
Abstract:
A common interest in gene expression data analysis is to identify, from a large pool of candidate genes, the genes that present significant changes in expression levels between a treatment and a control biological condition. Usually this is done using a statistic and a cutoff value that separate the differentially and non-differentially expressed genes. In this paper, we propose a Bayesian approach to identify differentially expressed genes by sequentially calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect from all genes in the study, excluding the treatment effects of genes previously identified as showing statistical evidence for a difference. We compare our Bayesian approach with standard ones based on the t-test and modified t-tests via a simulation study, using the small sample sizes that are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially in cases with mean differences and increases in treatment variance relative to the control variance. We also apply the methodologies to a well-known publicly available data set on the bacterium Escherichia coli.
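A minimal numpy sketch of the sequential idea described above: build an interval from the genes not yet flagged, flag the most extreme gene falling outside it, and rebuild the interval without that gene. The interval here is a crude mean ± z·sd band rather than the authors' predictive credibility interval, and all names and constants are hypothetical:

```python
import numpy as np

def flag_differential(diffs, z=2.576, max_iter=None):
    """Iteratively flag genes whose treatment-control difference falls
    outside an interval built from the genes not yet flagged
    (illustrative stand-in for the sequential credibility intervals)."""
    diffs = np.asarray(diffs, dtype=float)
    active = np.ones(diffs.size, dtype=bool)   # genes still treated as "null"
    flagged = []
    max_iter = max_iter or diffs.size
    for _ in range(max_iter):
        pool = diffs[active]
        mu, sd = pool.mean(), pool.std(ddof=1)
        lo, hi = mu - z * sd, mu + z * sd
        outside = active & ((diffs < lo) | (diffs > hi))
        if not outside.any():
            break
        # flag the most extreme remaining gene, then rebuild the interval
        idx = int(np.argmax(np.where(outside, np.abs(diffs - mu), -np.inf)))
        flagged.append(idx)
        active[idx] = False
    return flagged

rng = np.random.default_rng(0)
null_diffs = rng.normal(0.0, 1.0, 200)   # non-differential genes
null_diffs[:3] += 8.0                    # three clearly shifted genes
print(sorted(flag_differential(null_diffs))[:3])
```

Removing each flagged gene before recomputing the interval is what keeps strongly expressed genes from inflating the null spread — the same motivation the abstract gives for excluding previously identified treatment effects.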
Abstract:
Objectives: Stimulation of salivary flow is considered a preventive strategy for dental erosion. Alternatively, products containing calcium phosphate, such as a complex of casein phosphopeptide–amorphous calcium phosphate (CPP–ACP), have also been tested against dental erosion. Therefore, this in situ study analyzed the effect of chewing gum containing CPP–ACP on the mineral precipitation of initial bovine enamel erosion lesions. Methods: Twelve healthy adult subjects wore palatal appliances with two eroded bovine enamel samples. The erosion lesions were produced by immersion in 0.1% citric acid (pH 2.5) for 7 min. During three experimental crossover in situ phases (1 day each), the subjects chewed one type of gum, 3 times for 30 min, in each phase: with CPP–ACP (Trident Total), without CPP–ACP (Trident), or no chewing gum (control). Knoop surface microhardness was measured at baseline, after in vitro erosion, and after in situ mineral precipitation. The differences in the degree of mineral precipitation were analyzed using repeated-measures (RM-) ANOVA and a post hoc Tukey's test (p < 0.05). Results: Significant differences were found among the remineralizing treatments (p < 0.0001). Chewing gum (19% microhardness recovery) improved mineral precipitation compared to the control (10%), and the addition of CPP–ACP to the gum promoted the best mineral precipitation effect (30%). Conclusions: Under this protocol, CPP–ACP chewing gum improved the mineral precipitation of eroded enamel. Clinical significance: Since the prevalence of dental erosion is steadily increasing, CPP–ACP chewing gum might be an important strategy to reduce the progression of initial erosion lesions.
Abstract:
The effects of cryogenic and stress relief treatments on temper carbide precipitation in the cold work tool steel AISI D2 were studied. For the cryogenic treatment the temperature was −196°C and the holding time was 2, 24 or 30 h. The stress relief heat treatment, when applied, was carried out at 130°C for 90 min. All specimens were compared to a standard thermal cycle. Specimens were studied using metallographic characterisation, X-ray diffraction and thermoelectric power measurements. The metallographic characterisation used SEM (scanning electron microscopy) and SEM-FEG (SEM with a field emission gun), in addition to optical microscopy (OM). No variation in the precipitation of the secondary carbides (micrometre-sized) was found. The temper secondary carbides (nano-sized) were found to be more finely dispersed in the matrix of the specimens with cryogenic treatment and without stress relief. The refinement of the temper secondary carbides was attributed to a possible in situ carbide precipitation during tempering.
Abstract:
Precipitation retrieval over high latitudes, particularly snowfall retrieval over ice and snow using satellite-based passive microwave spectrometers, is currently an unsolved problem. The challenge results from the large variability of microwave emissivity spectra for snow and ice surfaces, which can mimic, to some degree, the spectral characteristics of snowfall. This work investigates a new snowfall detection algorithm specific to high-latitude regions, based on a combination of active and passive sensors able to discriminate between snowing and non-snowing areas. The space-borne Cloud Profiling Radar (on CloudSat), the Advanced Microwave Sounding Units A and B (on NOAA-16) and the infrared spectrometer MODIS (on Aqua) were co-located for 365 days, from October 1st, 2006 to September 30th, 2007. CloudSat products were used as the truth to calibrate and validate all the proposed algorithms. The methodological approach can be summarised in two steps. In the first step, an empirical search for a threshold aimed at discriminating the no-snow case was performed, following Kongoli et al. [2003]. Since this single-channel approach did not produce adequate results, a more statistically sound approach was attempted. Two different techniques, which allow the probability above and below a brightness temperature (BT) threshold to be computed, were applied to the available data. The first technique is based on a logistic distribution to represent the probability of snow given the predictors. The second technique, termed the Bayesian Multivariate Binary Predictor (BMBP), is a fully Bayesian technique that requires no hypothesis on the shape of the probabilistic model (such as the logistic) and only requires the estimation of the BT thresholds.
The results show that both proposed methods are able to discriminate snowing and non-snowing conditions over the polar regions with a probability of correct detection larger than 0.5, highlighting the importance of a multispectral approach.
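As a toy illustration of the first of the two techniques — a logistic model for the probability of snow given brightness-temperature predictors — the following sketch fits P(snow | BT) by plain gradient ascent on synthetic two-channel data. The channel means, the snowfall-induced shifts, and the noise levels are invented and bear no relation to the actual AMSU channels:

```python
import numpy as np

# Hypothetical two-channel brightness temperatures (K); label 1 = snowing.
rng = np.random.default_rng(1)
n = 400
snow = rng.integers(0, 2, n)
bt = np.column_stack([
    230 + 10 * rng.standard_normal(n) - 8 * snow,   # channel depressed by scattering
    250 + 10 * rng.standard_normal(n) - 5 * snow,
])

# Fit a logistic model P(snow | BT) by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), (bt - bt.mean(0)) / bt.std(0)])  # standardized + intercept
w = np.zeros(X.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (snow - p) / n     # average log-likelihood gradient

pred = (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(int)
pod = ((pred == 1) & (snow == 1)).sum() / (snow == 1).sum()
print(f"probability of detection: {pod:.2f}")
```

Even on these weakly separated synthetic channels the fitted model detects well over half of the snowing cases, consistent with the abstract's point that a multispectral predictor outperforms a single-channel threshold.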
Abstract:
In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models in which the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation or, in more technical terms, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the inference problem is the posterior distribution of this parameter. A regular version of the posterior distribution in functional spaces is characterized. However, the infinite dimension of the spaces considered causes a problem of non-continuity of the solution and hence a problem of inconsistency of the posterior distribution from a frequentist point of view (i.e. a problem of ill-posedness). The contribution of this thesis is to propose new methods to deal with this problem of ill-posedness. The first consists in adopting a Tikhonov regularization scheme in the construction of the posterior distribution, yielding a new object that I call the regularized posterior distribution and that I propose as a solution to the inverse problem. The second approach consists in specifying a prior distribution of the g-prior type on the parameter of interest. I then identify a class of models for which this prior distribution is able to correct for the ill-posedness even in infinite-dimensional problems. I study the asymptotic properties of these proposed solutions and prove that, under some regularity conditions satisfied by the true value of the parameter of interest, they are consistent in a frequentist sense. Once the general theory is set out, I apply the Bayesian nonparametric methodology to different estimation problems. First, I apply the estimator to deconvolution and to hazard rate, density and regression estimation.
Then, I consider the estimation of an instrumental regression, which is useful in microeconometrics when dealing with problems of endogeneity. Finally, I develop an application in finance: I obtain the Bayesian estimator for the equilibrium asset pricing functional by using the Euler equation defined in the Lucas (1978) tree-type models.
Abstract:
Changepoint analysis is a well-established area of statistical research, but in the context of spatio-temporal point processes it is as yet relatively unexplored. Some substantial differences with respect to standard changepoint analysis have to be taken into account: firstly, at every time point the datum is an irregular pattern of points; secondly, in real situations issues of spatial dependence between points and temporal dependence within time segments arise. Our motivating example concerns the monitoring and recovery of radioactive particles from Sandside beach, in the north of Scotland; there have been two major changes in the equipment used to detect the particles, representing known potential changepoints in the number of retrieved particles. In addition, offshore particle retrieval campaigns are believed to reduce the particle intensity onshore with an unknown temporal lag; in this latter case, the problem concerns multiple unknown changepoints. We therefore propose a Bayesian approach for detecting multiple changepoints in the intensity function of a spatio-temporal point process, allowing for spatial and temporal dependence within segments. We use log-Gaussian Cox processes, a very flexible class of models suitable for environmental applications, which can be implemented using the integrated nested Laplace approximation (INLA), a computationally efficient alternative to Markov chain Monte Carlo methods for approximating the posterior distribution of the parameters. Once the posterior curve is obtained, we propose a few methods for detecting significant changepoints. We present a simulation study, which consists of generating spatio-temporal point pattern series under several scenarios; the performance of the methods is assessed in terms of type I and type II errors, detected changepoint locations and accuracy of the segment intensity estimates.
We finally apply these methods to the motivating dataset and obtain sensible results concerning the presence and nature of changes in the process.
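The full log-Gaussian Cox process with INLA is well beyond a few lines, but the core idea of Bayesian changepoint detection in an intensity can be illustrated on a much simpler model: a single changepoint in a Poisson count series with conjugate Gamma priors on the segment rates, where the changepoint posterior is available by direct enumeration. All numbers below are synthetic and the model is a deliberate simplification, not the paper's method:

```python
import numpy as np
from math import lgamma

def changepoint_posterior(counts, a=1.0, b=1.0):
    """Posterior over a single changepoint in a Poisson count series,
    with independent Gamma(a, b) priors on the rate within each segment."""
    counts = np.asarray(counts)
    n = counts.size

    def seg_logml(seg):
        # Gamma-Poisson marginal likelihood of one segment, dropping
        # terms that are constant across candidate changepoint locations
        s, m = int(seg.sum()), seg.size
        return lgamma(a + s) - (a + s) * np.log(b + m)

    logpost = np.full(n, -np.inf)
    for tau in range(1, n):            # rate changes just before index tau
        logpost[tau] = seg_logml(counts[:tau]) + seg_logml(counts[tau:])
    post = np.exp(logpost - logpost.max())
    return post / post.sum()

rng = np.random.default_rng(2)
counts = np.concatenate([rng.poisson(3.0, 30), rng.poisson(9.0, 30)])  # change at 30
post = changepoint_posterior(counts)
print("MAP changepoint:", int(np.argmax(post)))
```

The conjugacy is what makes enumeration cheap here; with the spatially dependent LGCP intensity of the paper, no such closed form exists, which is why INLA is brought in.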
Abstract:
Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about DNA copy number. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about the genomic alterations from array-CGH data. As increasing amounts of array-CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify copy-number gains and losses based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely used algorithms to illustrate the reliability and success of the technique.
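A sketch of the forward-backward computation that such a hidden Markov approach rests on: posterior state probabilities (loss/neutral/gain) for each log-ratio under Gaussian emissions. The state means, emission spread, and sticky transition matrix below are fixed by hand for illustration; in the paper these quantities are themselves sampled within the Metropolis-within-Gibbs scheme rather than assumed:

```python
import numpy as np

# Three hidden states for the log2 intensity ratio: loss, neutral, gain.
means = np.array([-0.6, 0.0, 0.6])    # assumed state means (illustrative)
sigma = 0.25                          # assumed emission sd
A = np.array([[0.98, 0.01, 0.01],     # sticky transitions keep segments contiguous
              [0.01, 0.98, 0.01],
              [0.01, 0.01, 0.98]])
pi = np.array([1/3, 1/3, 1/3])

def state_posteriors(y):
    """Forward-backward: P(state_t | all ratios), Gaussian emissions."""
    y = np.asarray(y)[:, None]
    em = np.exp(-0.5 * ((y - means) / sigma) ** 2)   # unnormalised emission densities
    T = len(y)
    alpha = np.zeros((T, 3)); beta = np.ones((T, 3))
    alpha[0] = pi * em[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * em[t]
        alpha[t] /= alpha[t].sum()                   # rescale to avoid underflow
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (em[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

rng = np.random.default_rng(3)
ratios = np.concatenate([rng.normal(0.0, 0.2, 40),   # neutral
                         rng.normal(0.6, 0.2, 15),   # amplified region
                         rng.normal(0.0, 0.2, 40)])  # neutral
post = state_posteriors(ratios)
print("mean P(gain) in amplified region:", round(float(post[40:55, 2].mean()), 2))
```

The sticky diagonal of the transition matrix is what lets the model report extended altered regions rather than flagging isolated noisy probes, which is exactly the "trends versus statistical calls" distinction the abstract draws.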
Abstract:
We investigate the changes of extreme European winter (December-February) precipitation back to 1700 and show for various European regions that the return periods of extremely wet and dry winters are subject to significant changes both before and after the onset of anthropogenic influences. Generally, winter precipitation has become more extreme. We also examine the spatial pattern of the changes of the extremes over the last 300 years where data quality is sufficient. Over central and eastern Europe, dry winters occurred more frequently during the 18th and the second part of the 19th century relative to 1951–2000. Dry winters were less frequent during both the 18th and 19th centuries over the British Isles and the Mediterranean. Wet winters have been less abundant during the last three centuries compared to 1951–2000, except during the early 18th century in central Europe. Although winter precipitation extremes are affected by climate change, no obvious connection was found between these changes and solar, volcanic or anthropogenic forcing. However, a physically meaningful interpretation in terms of atmospheric circulation changes was possible.
Abstract:
The present distribution of freshwater fish in the Alpine region has been strongly affected by colonization events occurring after the last glacial maximum (LGM), some 20,000 years ago. We use here a spatially explicit simulation framework to model and better understand their colonization dynamics in the Swiss Rhine basin. This approach is applied to the European bullhead (Cottus gobio), which is an ideal model organism for studying past demographic processes in fish, since it has not been managed by humans. The molecular diversity of eight sampled populations is simulated and compared to observed data at six microsatellite loci under an approximate Bayesian computation framework to estimate the parameters of the colonization process. Our demographic estimates fit well with current knowledge about the biology of this species, but they suggest that the Swiss Rhine basin was colonized very recently, after the Younger Dryas, some 6600 years ago. We discuss the implications of this result, as well as the strengths and limits of the spatially explicit approach coupled with the approximate Bayesian computation framework.
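A minimal sketch of rejection-style approximate Bayesian computation, the inferential engine mentioned above: draw parameters from the prior, simulate data, and keep only draws whose summary statistics fall close to the observed ones. The simulator below is a toy gamma model standing in for the spatially explicit coalescent simulations, and every number (prior range, tolerance, summaries) is invented:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate(theta, n=200):
    # toy simulator standing in for the colonization/coalescent model
    return rng.gamma(shape=theta, scale=1.0, size=n)

# "Observed" data generated with a known parameter, summarised by two statistics.
true_theta = 5.0
obs = simulate(true_theta)
s_obs = np.array([obs.mean(), obs.std()])

# Rejection ABC: sample from the prior, keep draws with close summaries.
prior_draws = rng.uniform(0.1, 15.0, 20000)
kept = []
for theta in prior_draws:
    sim = simulate(theta)
    s_sim = np.array([sim.mean(), sim.std()])
    if np.linalg.norm((s_sim - s_obs) / s_obs) < 0.05:   # tolerance epsilon
        kept.append(theta)

kept = np.array(kept)
print(f"accepted {kept.size} draws; approximate posterior mean {kept.mean():.2f}")
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is the reason ABC is attractive for simulation-only models like the spatially explicit framework of the paper.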
Abstract:
Charcoal has long been known to have the property of recovering gold, silver, and copper from cyanide solutions of these metals. Quantitative data that might shed light on the mechanism of the removal of these metals are very limited, beyond the observation that charcoal in the form known as activated has the power to abstract gold and silver in considerable quantities from such solutions.
Abstract:
We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.
Abstract:
The considerable search for synergistic agents in cancer research is motivated by the therapeutic benefits achieved by combining anti-cancer agents. Synergistic agents make it possible to reduce dosage while maintaining or enhancing a desired effect. Other favorable outcomes of synergistic agents include reduced toxicity and minimized or delayed drug resistance. Dose-response assessment and drug-drug interaction analysis play an important part in the drug discovery process; however, these analyses are often poorly done. This dissertation is an effort to notably improve dose-response assessment and drug-drug interaction analysis. The most commonly used method in published analyses is the Median-Effect Principle/Combination Index method (Chou and Talalay, 1984). This method leads to inefficiency by ignoring important sources of variation inherent in dose-response data and by discarding data points that do not fit the Median-Effect Principle. Previous work has shown that the conventional method yields a high rate of false positives (Boik, Boik, Newman, 2008; Hennessey, Rosner, Bast, Chen, 2010) and, in some cases, low power to detect synergy. There is a great need to improve the current methodology. We developed a Bayesian framework for dose-response modeling and drug-drug interaction analysis. First, we developed a hierarchical meta-regression dose-response model that accounts for various sources of variation and uncertainty and allows one to incorporate knowledge from prior studies into the current analysis, thus offering a more efficient and reliable inference. Second, for cases in which parametric dose-response models do not fit the data, we developed a practical and flexible nonparametric regression method for meta-analysis of independently repeated dose-response experiments.
Third, and lastly, we developed a method, based on Loewe additivity, that allows one to quantitatively assess the interaction between two agents combined at a fixed dose ratio. The proposed method provides a comprehensive and honest accounting of uncertainty in drug interaction assessment. Extensive simulation studies show that the novel methodology improves the screening process for effective/synergistic agents and reduces the incidence of type I error. We consider an ovarian cancer cell line study that investigates the combined effect of DNA methylation inhibitors and histone deacetylation inhibitors in human ovarian cancer cell lines. The hypothesis is that the combination of DNA methylation inhibitors and histone deacetylation inhibitors will enhance antiproliferative activity in human ovarian cancer cell lines compared to treatment with either inhibitor alone. By applying the proposed Bayesian methodology, in vitro synergy was declared for the DNA methylation inhibitor 5-AZA-2'-deoxycytidine combined with one histone deacetylation inhibitor, suberoylanilide hydroxamic acid or trichostatin A, in the cell lines HEY and SKOV3. This suggests potential new epigenetic therapies for the growth inhibition of ovarian cancer cells.
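The Loewe-additivity interaction index that the third contribution builds on can be computed directly once single-agent dose-response (Hill) curves are in hand: for an observed combination effect, compare the combination doses with the single-agent doses that would each produce the same effect alone. The Hill parameters and the observed combination effect below are illustrative, not values from the study, and the dissertation's contribution is the Bayesian uncertainty treatment around this point estimate:

```python
import numpy as np

def hill(d, e0, emax, ec50, h):
    """Hill dose-response curve: effect as a function of dose d."""
    return e0 + (emax - e0) * d**h / (ec50**h + d**h)

def hill_inverse(E, e0, emax, ec50, h):
    """Dose of a single agent producing effect E under its Hill curve."""
    r = (E - e0) / (emax - e0)
    return ec50 * (r / (1 - r)) ** (1 / h)

# Assumed single-agent Hill parameters (illustrative): effect is the
# fraction of surviving cells, falling from 1 toward 0 with dose.
drugA = dict(e0=1.0, emax=0.0, ec50=2.0, h=1.5)
drugB = dict(e0=1.0, emax=0.0, ec50=5.0, h=1.0)

# Hypothetical observed combination: doses (1.0, 2.5) give 40% survival.
d1, d2, E_obs = 1.0, 2.5, 0.4
D1 = hill_inverse(E_obs, **drugA)   # dose of A alone giving E_obs
D2 = hill_inverse(E_obs, **drugB)   # dose of B alone giving E_obs
index = d1 / D1 + d2 / D2           # Loewe interaction index
print(f"interaction index = {index:.2f}  (<1 synergy, =1 additive, >1 antagonism)")
```

Because each combination dose is a smaller fraction of its equi-effective single-agent dose than additivity would require, the index falls below 1 and the (hypothetical) combination would be called synergistic.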