115 results for Bayesian probing
Abstract:
A pyridyl-functionalized diiron dithiolate complex, [μ-(4-pyCH2−NMI-S2)Fe2(CO)6] (3; py = pyridine, NMI = naphthalene monoimide), was synthesized and fully characterized. In the presence of zinc tetraphenylporphyrin (ZnTPP), a self-assembled 3·ZnTPP complex formed readily in CH2Cl2 through coordination of the pyridyl nitrogen to the porphyrin zinc center. Ultrafast photoinduced electron transfer from excited ZnTPP to complex 3 in the supramolecular assembly was observed in real time by monitoring the ν(CO) and ν(CO)NMI spectral changes with femtosecond time-resolved infrared (TRIR) spectroscopy. We confirmed that photoinduced charge separation produced the monoreduced species by comparing the time-resolved IR spectra with conventional IR spectra of 3•− generated by reversible electrochemical reduction. The lifetimes of the charge separation and charge recombination processes were found to be τCS = 40 ± 3 ps and τCR = 205 ± 14 ps, respectively. The charge recombination is much slower than that in an analogous covalent complex, demonstrating the potential of a supramolecular approach to extend the lifetime of the charge-separated state in photocatalytic complexes. The observed vibrational frequency shifts provide a very sensitive probe of the delocalization of the electron-spin density over the different parts of the Fe2S2 complex. The time-resolved and spectroelectrochemical IR spectra, electron paramagnetic resonance spectra, and density functional theory calculations all show that the spin density in 3•− is delocalized over the diiron core and the NMI bridge. This delocalization explains why the complex exhibits low catalytic dihydrogen production even though it features a very efficient photoinduced electron transfer. The ultrafast porphyrin-to-NMI-S2−Fe2(CO)6 photoinduced electron transfer is the first reported example of a supramolecular Fe2S2-hydrogenase model studied by femtosecond TRIR spectroscopy. Our results show that TRIR spectroscopy is a powerful tool for investigating photoinduced electron transfer in potential dihydrogen-producing catalytic complexes, and thus a way to optimize their performance through rational approaches.
Abstract:
We consider the forecasting of macroeconomic variables that are subject to revisions, using Bayesian vintage-based vector autoregressions. The prior incorporates the belief that, after the first few data releases, subsequent ones are likely to consist of revisions that are largely unpredictable. The Bayesian approach allows the joint modelling of the data revisions of more than one variable, while keeping the concomitant increase in parameter estimation uncertainty manageable. Our model provides markedly more accurate forecasts of post-revision values of inflation than do other models in the literature.
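The abstract does not give the model equations, but the flavour of the approach can be conveyed with a deliberately simplified sketch: a conjugate Bayesian VAR whose Gaussian prior is centred on coefficients saying that each later release simply follows the previous release, i.e. that revisions after the first release are unpredictable. Everything below (the ridge-form prior, unit error variance, toy data, and all names) is an assumption for illustration, not the paper's actual vintage-based specification.

```python
import numpy as np

def bvar_posterior_mean(Y, X, B0, lam):
    """Posterior mean of VAR(1) coefficients under a Gaussian prior
    B ~ N(B0, I/lam) with (for simplicity) unit error variance:
    B_post = (X'X + lam*I)^(-1) (X'Y + lam*B0)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Y + lam * B0)

rng = np.random.default_rng(0)
T = 120
# Toy placeholder data: column 0 = first release, column 1 = second release.
Y_full = np.cumsum(rng.normal(size=(T + 1, 2)), axis=0)
Y, X = Y_full[1:], Y_full[:-1]        # VAR(1): regress y_t on y_{t-1}

# Prior mean: both this period's first release and the revised (second)
# release of last period's datum are centred on last period's first
# release, so revisions beyond the first release are unpredictable.
B0 = np.array([[1.0, 1.0],
               [0.0, 0.0]])

print(bvar_posterior_mean(Y, X, B0, lam=10.0))
```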
Abstract:
The performance of rank-dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity and for three ways of linking deterministic and stochastic models: differences in utilities, differences in certainty equivalents, and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative-agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
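As a point of reference for the functional forms being compared, the following minimal Python sketch (an illustration assumed here, not the authors' estimation code) evaluates a rank-dependent utility with the common one-parameter probability weighting function w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ); γ < 1 produces the inverse S-shape commonly employed, while γ > 1 produces the S-shape reported for many individuals.

```python
import numpy as np

def w(p, gamma):
    """One-parameter (Tversky-Kahneman form) probability weighting."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def rdu(outcomes, probs, gamma, u=np.sqrt):
    """Rank-dependent utility: decision weights are differences of the
    weighted decumulative probabilities, taken over outcomes ranked
    from best to worst."""
    order = np.argsort(outcomes)[::-1]                 # best outcome first
    x = np.asarray(outcomes, float)[order]
    p = np.asarray(probs, float)[order]
    wcum = w(np.cumsum(p), gamma)                      # w(P(outcome >= x_i))
    weights = np.diff(np.concatenate(([0.0], wcum)))
    return float(np.sum(weights * u(x)))

# 50/50 lottery over 0 and 100: inverse-S (gamma=0.6) vs S-shaped (gamma=1.4).
for g in (0.6, 1.4):
    print(g, rdu([0.0, 100.0], [0.5, 0.5], g))
```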
Abstract:
This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were not estimable given the available data. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm's movement and much of the energy budget. We show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM's structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
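The rejection-ABC recipe described above (sample from the priors, simulate, keep the simulations nearest the observations) fits in a few lines of Python; the toy Gaussian stand-in below is assumed purely for self-containedness and is not the earthworm IBM.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=50):
    """Stand-in for the individual-based model: a toy Gaussian 'IBM'."""
    mu, sigma = theta
    return rng.normal(mu, sigma, n)

def summaries(x):
    """Reduce a simulated dataset to summary statistics."""
    return np.array([x.mean(), x.std()])

# Pseudo-observed data generated with 'true' parameters (2.0, 1.0).
obs = summaries(simulate((2.0, 1.0)))

# 1. Draw parameters from their priors and simulate.
n_sims = 20_000
prior = np.column_stack([rng.uniform(0.0, 5.0, n_sims),   # mu ~ U(0,5)
                         rng.uniform(0.1, 3.0, n_sims)])  # sigma ~ U(0.1,3)
sims = np.array([summaries(simulate(th)) for th in prior])

# 2. Retain the simulations closest to the observations
#    (Euclidean distance on summaries; keep the best 0.5%).
dist = np.linalg.norm(sims - obs, axis=1)
accepted = prior[np.argsort(dist)[: n_sims // 200]]

print("approximate posterior mean:", accepted.mean(axis=0))
```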
Abstract:
This study investigates the structural features of porcine gastric mucin (PGM) in aqueous dispersions and its interactions with water-soluble polymers (poly(acrylic acid) (PAA), poly(methacrylic acid) (PMAA), poly(ethylene oxide), and poly(ethylene glycol)) using isothermal titration calorimetry, turbidimetric titration, dynamic light scattering, and transmission electron microscopy. It is established that PAA (450 kDa) and PMAA (100 kDa) exhibit strong specific interactions with PGM, causing further aggregation of its particles, while PAA (2 kDa), poly(ethylene oxide) (1 000 kDa), and poly(ethylene glycol) (10 kDa) show no detectable effects on mucin. Sonication of mucin dispersions prior to mixing with PAA (450 kDa) and PMAA (100 kDa) makes these interactions markedly more pronounced.
Abstract:
Microbial degradation is a major determinant of the fate of pollutants in the environment. para-Nitrophenol (PNP) is an EPA-listed priority pollutant with a wide environmental distribution, but little is known about the microorganisms that degrade it in the environment. We studied the diversity of active PNP-degrading bacterial populations in river water using a novel functional marker approach coupled with [13C6]PNP stable isotope probing (SIP). Culturing, together with culture-independent terminal restriction fragment length polymorphism analysis of 16S rRNA gene amplicons, identified Pseudomonas syringae as the major driver of PNP degradation in river water microcosms. This was confirmed by SIP-pyrosequencing of amplified 16S rRNA. Similarly, functional gene analysis showed that degradation followed the Gram-negative bacterial pathway and involved pnpA from Pseudomonas spp. However, analysis of maleylacetate reductase (encoded by mar), an enzyme common to the late stages of both Gram-negative and Gram-positive bacterial PNP degradation pathways, identified a diverse assemblage of bacteria associated with PNP degradation, suggesting that mar is of limited use as a specific marker of PNP biodegradation. Both the pnpA and mar genes were detected in a PNP-degrading isolate, P. syringae AKHD2, which was isolated from river water. Our results suggest that PNP-degrading cultures of Pseudomonas spp. are representative of environmental PNP-degrading populations.
Abstract:
Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented. In some cases we observe an advantage in the use of biased weight estimates, and we present some support for their use, but we advocate caution.
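The paper's constructions are considerably more involved, but the core random-weight idea can be sketched as follows: wherever the importance weight requires the intractable likelihood, substitute a non-negative unbiased simulation-based estimate of it, and the evidence estimates, and hence the Bayes factor, remain consistent. The toy latent-variable model and all names below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

def likelihood_estimate(theta, y, m=100):
    """Placeholder 'intractable' likelihood: a non-negative unbiased
    estimate obtained by averaging the conditional density over m
    auxiliary simulations of a latent variable z ~ N(0, 1)."""
    z = rng.normal(size=m)
    return np.mean(np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2 * np.pi))

def evidence(y, prior_draws):
    """Random-weight importance sampling from the prior: the evidence
    Z = E_prior[L(theta)] is estimated with L replaced by its unbiased
    simulation-based estimate."""
    return np.mean([likelihood_estimate(th, y) for th in prior_draws])

y = 1.3
theta1 = rng.normal(0.0, 1.0, 5000)   # model 1 prior: theta ~ N(0, 1)
theta2 = rng.normal(2.0, 1.0, 5000)   # model 2 prior: theta ~ N(2, 1)

print("estimated Bayes factor:", evidence(y, theta1) / evidence(y, theta2))
```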
Abstract:
Approximate Bayesian computation (ABC) is a popular family of algorithms which perform approximate parameter inference when numerical evaluation of the likelihood function is not possible but data can be simulated from the model. They return a sample of parameter values which produce simulations close to the observed dataset. A standard approach is to reduce the simulated and observed datasets to vectors of summary statistics and to accept when the distance between these is below a specified threshold. ABC can also be adapted to perform model choice. In this article, we present a new software package for R, abctools, which provides methods for tuning ABC algorithms. This includes recent dimension-reduction algorithms to tune the choice of summary statistics, and coverage methods to tune the choice of threshold. We provide several illustrations of these routines on applications taken from the ABC literature.
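abctools itself is an R package; the Python fragment below is only a schematic of one idea it implements, the coverage method for tuning the threshold: if the threshold is adequate, the posterior quantile of the pseudo-true parameter across many leave-one-out ABC runs should be roughly uniform. The toy reference table and all function names are assumptions.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(3)

# Reference table: theta ~ U(0, 10) with scalar summaries s ~ N(theta, 1).
n = 5000
theta = rng.uniform(0, 10, n)
s = rng.normal(theta, 1.0)

def coverage_quantiles(theta, s, eps, n_tests=200):
    """Treat simulations as pseudo-observed datasets (leave-one-out),
    run rejection ABC with threshold eps, and record the posterior
    quantile of each pseudo-true parameter."""
    qs = []
    for i in rng.choice(len(s), n_tests, replace=False):
        mask = np.abs(s - s[i]) < eps
        mask[i] = False                 # exclude the test point itself
        post = theta[mask]
        if post.size:
            qs.append(np.mean(post < theta[i]))
    return np.array(qs)

# A well-calibrated threshold yields ~uniform quantiles (high KS p-value).
for eps in (0.05, 0.5, 5.0):
    q = coverage_quantiles(theta, s, eps)
    print(eps, kstest(q, "uniform").pvalue)
```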
Abstract:
The potential impact of the abrupt 8.2 ka cold event on human demography, settlement patterns and culture in Europe and the Near East has emerged as a key theme in current discussion and debate. We test whether this event had an impact on the Mesolithic population of western Scotland, a case study located within the North Atlantic region where the environmental impact of the 8.2 ka event is likely to have been the most severe. By undertaking a Bayesian analysis of the radiocarbon record and using the number of activity events as a proxy for the size of the human population, we find evidence for a dramatic reduction in the Mesolithic population synchronous with the 8.2 ka event. We interpret this as reflecting the demographic collapse of a low density population that lacked the capability to adapt to the rapid onset of new environmental conditions. This impact of the 8.2 ka event in the North Atlantic region lends credence to the possibility of a similar impact on populations in Continental Europe and the Near East.
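The abstract does not spell out the proxy's construction; a common way to build such a proxy is to sum the posterior calibrated-date densities of the dated activity events into an expected-events-per-year curve. The sketch below uses fabricated Gaussian densities purely for illustration; it is not the Scottish dataset or the authors' Bayesian model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-ins for posterior calibrated-date densities of 80 dated
# "activity events": each a Gaussian over calendar years BP.
years = np.arange(9500, 7500, -1)           # cal BP; 8.2 ka event inside
event_means = rng.uniform(9400, 7600, 80)
densities = np.exp(-0.5 * ((years[None, :] - event_means[:, None]) / 40) ** 2)
densities /= densities.sum(axis=1, keepdims=True)   # each sums to 1

# Summed probability: expected number of activity events per calendar
# year, read as a (taphonomy-ignoring) relative population proxy.
activity = densities.sum(axis=0)
print("expected events within 50 yr of 8200 cal BP:",
      activity[np.abs(years - 8200) < 50].sum())
```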
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, it is important to quantify how uncertainties in the data affect a model's output. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining the spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
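The paper's scheme is richer than this, but the central Monte Carlo step might be sketched as follows, assuming (as an illustration, with invented counts) independent Dirichlet posteriors on the rows of the confusion matrix: each realisation reallocates a site's mapped pixel counts to true classes using probabilities drawn from those posteriors.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy confusion matrix: rows = mapped class, cols = true class,
# entries = validation pixel counts (3 classes).
confusion = np.array([[80,  15,  5],
                      [10, 120, 20],
                      [ 5,  10, 85]])

# Pixel counts per mapped class at one model grid cell (the "site").
mapped_counts = np.array([400, 350, 250])

def sample_proportions(confusion, mapped_counts, n_draws=1000):
    """Draw realisations of the site's true land cover proportions.
    Each confusion-matrix row gets a Dirichlet posterior (flat prior)
    over P(true class | mapped class); each realisation reallocates
    the mapped pixels accordingly."""
    draws = np.empty((n_draws, confusion.shape[1]))
    for d in range(n_draws):
        true_counts = np.zeros(confusion.shape[1])
        for i, n_i in enumerate(mapped_counts):
            p = rng.dirichlet(confusion[i] + 1)   # posterior for row i
            true_counts += n_i * p
        draws[d] = true_counts / mapped_counts.sum()
    return draws

props = sample_proportions(confusion, mapped_counts)
print("mean proportions:", props.mean(axis=0))
print("std  proportions:", props.std(axis=0))
```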