35 results for Bayesian inference on precipitation
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Deuterium (δD) and oxygen (δ18O) isotopes are powerful tracers of the hydrological cycle and have been used extensively for paleoclimate reconstructions, as they provide information on past precipitation, temperature and atmospheric circulation. More recently, 17O excess, derived from precise measurements of δ17O and δ18O, has offered new and additional insights into tracing the hydrological cycle, although uncertainties still surround this proxy. Nevertheless, 17O excess could provide additional information on the atmospheric conditions at the moisture source as well as on fractionations associated with transport and site processes. In this paper we trace water stable isotopes (δD, δ17O and δ18O) along their path from precipitation to cave drip water and finally to speleothem fluid inclusions for Milandre cave in northwestern Switzerland. A two-year-long, daily resolved precipitation isotope record close to the cave site is compared to collected cave drip water (3-month average resolution) and to fluid inclusions of modern and Holocene stalagmites. Amount-weighted mean δD, δ18O and δ17O are -71.0‰, -9.9‰ and -5.2‰ for precipitation, -60.3‰, -8.7‰ and -4.6‰ for cave drip water, and -61.3‰, -8.3‰ and -4.7‰ for recent fluid inclusions, respectively. Second-order parameters have also been derived for precipitation and drip water and show similar values of about 18 per meg for 17O excess, whereas d-excess is 1.5‰ more negative in drip water. Furthermore, the atmospheric signal is shifted towards enriched values in the drip water and fluid inclusions (Δ of ~+10‰ for δD). The isotopic composition of cave drip water exhibits a weak seasonal signal which is shifted by around 8-10 months (groundwater residence time) relative to the precipitation. Moreover, we carried out the first δ17O measurements in speleothem fluid inclusions, as well as the first comparison of δ17O behaviour from meteoric water to fluid-inclusion entrapment in speleothems. This study on precipitation, drip water and fluid inclusions will serve as a speleothem proxy calibration for Milandre cave in order to reconstruct paleotemperatures and moisture source variations for Western Central Europe.
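For reference, the second-order parameters mentioned above are conventionally defined as follows; these are the standard d-excess and logarithmic 17O-excess definitions (with the commonly used 0.528 slope and per meg scaling), not restated in the abstract itself:

```latex
% Standard definitions of the second-order isotope parameters,
% with delta values expressed as fractional deviations (per mil / 1000).
\begin{align}
  d\text{-excess} &= \delta D - 8\,\delta^{18}O \\
  {}^{17}O\text{-excess} &= \ln\!\left(\delta^{17}O + 1\right)
                          - 0.528\,\ln\!\left(\delta^{18}O + 1\right)
\end{align}
% d-excess is usually reported in per mil, 17O-excess in per meg (10^-6).
```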
Abstract:
Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random-effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, and hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and on the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
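As a rough illustration of the hierarchical structure described above, the sketch below simulates capture histories from one plausible reading of the random-effects CJS generative model (a minimal sketch only; the design values, the occasion-level placement of the random effect, and the use of NumPy are our assumptions, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed design values for illustration only (one design point).
t, u = 7, 25             # capture occasions, new releases per occasion
p = 0.6                  # constant recapture probability
mu, sigma = 1.0, 0.5     # mean and SD of logit(S) random effects

def expit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Survival per interval on the logit scale: logit(S_j) ~ Normal(mu, sigma^2).
S = expit(rng.normal(mu, sigma, size=t - 1))

histories = []
for first in range(t - 1):            # release occasion of each cohort
    for _ in range(u):
        h = np.zeros(t, dtype=int)
        h[first] = 1                  # marked and released
        alive = True
        for j in range(first, t - 1):
            alive = alive and rng.random() < S[j]   # survive interval j
            if alive and rng.random() < p:          # recaptured at occasion j+1
                h[j + 1] = 1
        histories.append(h)

X = np.array(histories)               # capture-history matrix for CJS fitting
print(X.shape, X[:3])
```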
Abstract:
Background: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without requiring likelihood computations. Results: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computation (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, through data simulation, computation of summary statistics, estimation of posterior distributions, model choice, and validation of the estimation procedure, to visualization of the results.
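To make the idea concrete, the sketch below shows plain ABC rejection sampling, one of the algorithms listed above, on a toy problem (a minimal sketch, not ABCtoolbox code; the toy model, prior, summary statistics and tolerance are our assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "observed" data: we pretend the generating parameter is unknown.
obs = rng.normal(loc=2.0, scale=1.0, size=100)
s_obs = np.array([obs.mean(), obs.std()])        # summary statistics

def simulate(theta, n=100):
    """Simulator standing in for the model with an intractable likelihood."""
    return rng.normal(loc=theta, scale=1.0, size=n)

def summary(x):
    return np.array([x.mean(), x.std()])

n_sims, eps = 50_000, 0.1
accepted = []
for _ in range(n_sims):
    theta = rng.uniform(-5.0, 5.0)               # draw from the prior
    s_sim = summary(simulate(theta))             # simulate and summarize
    if np.linalg.norm(s_sim - s_obs) < eps:      # keep if close to the data
        accepted.append(theta)

posterior = np.array(accepted)                   # approximate posterior sample
print(posterior.size, posterior.mean(), posterior.std())
```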
Abstract:
Using a highly resolved atmospheric general circulation model, the impact of different glacial boundary conditions on precipitation and atmospheric dynamics in the North Atlantic region is investigated. Six 30-yr time slice experiments of the Last Glacial Maximum at 21 thousand years before the present (ka BP) and of a less pronounced glacial state - the Middle Weichselian (65 ka BP) - are compared to analyse the sensitivity to changes in the ice sheet distribution, in the radiative forcing, and in the prescribed time-varying sea surface temperature and sea ice, which are taken from a lower-resolution but fully coupled atmosphere-ocean general circulation model. The strongest differences are found for simulations with different heights of the Laurentide ice sheet. A high surface elevation of the Laurentide ice sheet leads to a southward displacement of the jet stream and the storm track in the North Atlantic region. These changes in the atmospheric dynamics generate a band of increased precipitation in the mid-latitudes across the Atlantic to southern Europe in winter, while the precipitation pattern in summer is only marginally affected. The impact of the radiative forcing differences between the two glacial periods and of the prescribed time-varying sea surface temperatures and sea ice is of second-order importance compared to that of the Laurentide ice sheet. They affect the atmospheric dynamics and precipitation in a similar but less pronounced manner than the topographic changes.
Abstract:
Strong tropical volcanic eruptions have significant effects on global and regional temperatures. Their effects on precipitation, however, are less well understood. An analysis of hydroclimatic anomalies after 14 strong eruptions during the last 400 years, in climate reconstructions and model simulations, reveals a reduction of the Asian and African summer monsoons and an increase in south-central European summer precipitation in the year following the eruption. The simulations provide evidence for a dynamical link between these phenomena. The weaker monsoon circulations weaken the northern branch of the Hadley circulation, alter the atmospheric circulation over the Atlantic-European sector, and increase precipitation over Europe. This mechanism is able to explain, for instance, the wet summer in parts of Europe during the "year without a summer" of 1816, which up to now has not been explained. This study underlines the importance of atmospheric teleconnections between the tropics and midlatitudes to better understand the regional climate response to stratospheric volcanic aerosols.
Abstract:
Automatic identification and extraction of bone contours from X-ray images is an essential first step for further medical image analysis. In this paper we propose a 3D statistical-model-based framework for extracting the proximal femur contour from calibrated X-ray images. The automatic initialization is solved by an estimation of Bayesian network algorithm that fits a multiple-component geometrical model to the X-ray data. The contour extraction is accomplished by a non-rigid 2D/3D registration between a 3D statistical model and the X-ray images, in which bone contours are extracted by graphical-model-based Bayesian inference. Preliminary experiments on clinical data sets verified its validity.
Abstract:
In this study, we present a novel genotyping scheme to classify German wild-type varicella-zoster virus (VZV) strains and to differentiate them from the Oka vaccine strain (genotype B). This approach is based on analysis of four loci in open reading frames (ORFs) 51 to 58, encompassing a total length of 1,990 bp. The new genotyping scheme produced identical clusters in phylogenetic analyses compared to full-genome sequences from well-characterized VZV strains. Based on genotype A, D, B, and C reference strains, a dichotomous identification key (DIK) was developed and applied to VZV strains obtained from vesicle fluid and cerebrospinal fluid (liquor) samples originating from 42 patients suffering from varicella or zoster between 2003 and 2006. Sequencing of regions in ORFs 51, 52, 53, 56, 57, and 58 identified 18 single-nucleotide polymorphisms (SNPs), including two novel ones, SNP 89727 and SNP 92792 in ORF51 and ORF52, respectively. The DIK, as well as phylogenetic analysis by Bayesian inference, showed that 14 VZV strains belonged to genotype A and 28 VZV strains were classified as genotype D. Neither Japanese (vaccine)-like B strains nor recombinant-like C strains were found among the samples from Germany. The novel genotyping scheme and the DIK were demonstrated to be practical and simple and to allow highly efficient replication of the phylogenetic patterns in VZV initially derived from full-genome DNA sequence analyses. Therefore, this approach may allow us to draw a more comprehensive picture of wild-type VZV strains circulating in Germany and Central Europe by high-throughput procedures in the future.
Abstract:
The direct Bayesian admissible-region approach is an a priori state-free measurement-association and initial orbit determination technique for optical tracks. In this paper, we test a hybrid approach that appends a least-squares estimator to the direct Bayesian method, using measurements taken at the Zimmerwald Observatory of the Astronomical Institute at the University of Bern. Over half of the association pairs agreed with conventional geometric track correlation and least-squares techniques. The remaining pairs cast light on the fundamental limits of conducting tracklet association based solely on dynamical and geometrical information.
Abstract:
Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimates. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between the proxy and the exact model on a learning set of geostatistical realizations for which both the exact and the approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model that assesses the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
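A minimal sketch of the kind of proxy-to-exact error model described above, using ordinary PCA on discretized response curves as a stand-in for FPCA and linear regression as the learning step (the synthetic solvers, data shapes, and the use of scikit-learn are our assumptions, not the paper's implementation):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Synthetic learning set: n realizations, each response discretized on nt times.
n, nt = 200, 50
t = np.linspace(0.0, 1.0, nt)
params = rng.uniform(0.5, 2.0, size=n)                     # "geostatistical" realizations
exact = np.array([np.exp(-p * t) for p in params])         # exact-solver responses
proxy = exact + 0.05 * np.sin(6 * t) * params[:, None]     # biased approximate responses

# Dimensionality reduction of both sets of curves (PCA as an FPCA surrogate).
pca_proxy, pca_exact = PCA(n_components=3), PCA(n_components=3)
z_proxy = pca_proxy.fit_transform(proxy)
z_exact = pca_exact.fit_transform(exact)

# Error model: map proxy scores to exact scores on the learning set.
error_model = LinearRegression().fit(z_proxy, z_exact)

# Predict the exact response of a new realization from its proxy run only.
p_new = 1.3
proxy_new = np.exp(-p_new * t) + 0.05 * np.sin(6 * t) * p_new
z_new = pca_proxy.transform(proxy_new[None, :])
exact_pred = pca_exact.inverse_transform(error_model.predict(z_new))[0]
print(np.abs(exact_pred - np.exp(-p_new * t)).max())       # prediction error
```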
Abstract:
Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20-30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time-series, e.g., in the context of therapeutic brain stimulation. In this paper we present some first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time-series. More specifically, we learn distinct graphical models (so-called Chow-Liu (CL) trees) as models of the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks based on thresholding of the absolute-value Pearson correlation coefficient (CC) matrix. Using various measures, the networks thus obtained are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for also modeling the temporal features of iEEG signals.
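The sketch below illustrates the two ingredients named above on synthetic multichannel data: a Chow-Liu-style tree built as a maximum spanning tree over pairwise (Gaussian) mutual information, and a functional network obtained by thresholding the absolute-value CC matrix (a minimal sketch; the synthetic signals, the Gaussian mutual-information formula, and the threshold value are our assumptions, not the paper's pipeline):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(3)

# Synthetic "iEEG" slice: n_channels signals over n_samples time points,
# with a common driver so that correlations are non-trivial.
n_channels, n_samples = 8, 1000
driver = rng.normal(size=n_samples)
X = 0.6 * driver + rng.normal(size=(n_channels, n_samples))

# Empirical Pearson correlation coefficient (CC) matrix.
C = np.corrcoef(X)

# Functional network: threshold the absolute-value CC matrix.
threshold = 0.3
adjacency = (np.abs(C) > threshold) & ~np.eye(n_channels, dtype=bool)

# Chow-Liu-style tree: maximum spanning tree over pairwise mutual information.
# For (approximately) Gaussian signals, I(i, j) = -0.5 * log(1 - r_ij^2).
MI = -0.5 * np.log(np.clip(1.0 - C**2, 1e-12, None))
np.fill_diagonal(MI, 0.0)
# minimum_spanning_tree minimizes weights, so negate MI to maximize it.
tree = minimum_spanning_tree(-MI).toarray()
cl_edges = [(i, j) for i, j in zip(*np.nonzero(tree))]

print("thresholded edges:", int(adjacency.sum()) // 2)
print("Chow-Liu tree edges:", cl_edges)
```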
Abstract:
Land and water management in semi-arid regions requires detailed information on the distribution of precipitation, including extremes, and on changes therein. Such information is often lacking. This paper describes statistics of mean and extreme precipitation in a unique data set from the Mount Kenya region, encompassing around 50 stations with at least 30 years of data. We describe the data set, including quality control procedures and statistical break detection. Trends in mean precipitation and extreme indices calculated from these data for individual rainy seasons are compared with corresponding trends in reanalysis products. From 1979 to 2011, mean precipitation decreased at 75% of the stations during the 'long rains' (March to May) and increased at 70% of the stations during the 'short rains' (October to December). Corresponding trends are found in the number of heavy precipitation days and in the maximum consecutive 5-day precipitation. Conversely, an increase in consecutive dry days within both main rainy seasons is found. However, trends are statistically significant in only very few cases. Reanalysis data sets agree with observations with respect to interannual variability, while correlations are considerably lower for monthly deviations (ratios) from the mean annual cycle. While some products reproduce the rainfall climatology well and others the spatial trend pattern, no product reproduces both.
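For concreteness, the sketch below computes the three extreme indices mentioned above from a daily precipitation series (a minimal sketch; the 10 mm heavy-day threshold and 1 mm wet-day threshold follow common ETCCDI-style conventions and are our assumptions, as is the synthetic data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily precipitation for one rainy season (mm/day).
precip = rng.gamma(shape=0.4, scale=8.0, size=92)

# Number of heavy precipitation days (here: days with >= 10 mm).
heavy_days = int(np.sum(precip >= 10.0))

# Maximum consecutive 5-day precipitation (largest running 5-day sum).
rx5day = float(np.convolve(precip, np.ones(5), mode="valid").max())

# Maximum number of consecutive dry days (here: days with < 1 mm).
dry = precip < 1.0
cdd, run = 0, 0
for is_dry in dry:
    run = run + 1 if is_dry else 0
    cdd = max(cdd, run)

print(heavy_days, round(rx5day, 1), cdd)
```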