118 results for Bayesian Latent Class
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this paper we present results from two choice experiments (CE), designed to take account of the different negative externalities associated with pesticide use in agricultural production. For cereal production, the most probable impact of pesticide use is a reduction in environmental quality. For fruit and vegetable production, the negative externality is on consumer health. Using latent class models we find evidence of the presence of preference heterogeneity in addition to reasonably high willingness to pay (WTP) estimates for a reduction in the use of pesticides for both environmental quality and consumer health. To place our WTP estimates in a policy context we convert them into an equivalent pesticide tax by type of externality. Our tax estimates suggest that pesticide taxes based on the primary externality resulting from a particular mode of agricultural production are a credible policy option that warrants further consideration.
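The following minimal sketch (Python) illustrates how class-specific willingness to pay is typically recovered from a fitted latent class logit, as the negative ratio of the attribute coefficient to the price coefficient, and then aggregated over classes by their estimated shares. All coefficient and share values are illustrative placeholders, not estimates from the paper.

```python
# Minimal sketch (not the paper's estimation code): once a latent class logit
# has been fitted, class-specific willingness to pay (WTP) for a pesticide
# reduction is the ratio of the attribute coefficient to the negative of the
# price coefficient, and a population-level figure weights these by class shares.
# All numbers below are illustrative placeholders, not results from the paper.

class_shares = [0.55, 0.45]       # estimated probability of membership in each class
beta_pesticide = [0.80, 0.25]     # utility weight on a pesticide-use reduction, by class
beta_price = [-0.04, -0.10]       # marginal utility of money (price coefficient), by class

wtp_by_class = [-bp / bc for bp, bc in zip(beta_pesticide, beta_price)]
wtp_population = sum(s * w for s, w in zip(class_shares, wtp_by_class))

for k, w in enumerate(wtp_by_class):
    print(f"Class {k + 1}: WTP = {w:.2f} per unit reduction in pesticide use")
print(f"Share-weighted population WTP = {wtp_population:.2f}")
```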
Abstract:
Background: Psychotic phenomena appear to form a continuum with normal experience and beliefs, and may build on common emotional interpersonal concerns. Aims: We tested predictions that paranoid ideation is exponentially distributed and hierarchically arranged in the general population, and that persecutory ideas build on more common cognitions of mistrust, interpersonal sensitivity and ideas of reference. Method: Items were chosen from the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II) questionnaire and the Psychosis Screening Questionnaire in the second British National Survey of Psychiatric Morbidity (n = 8580), to test a putative hierarchy of paranoid development using confirmatory factor analysis, latent class analysis and factor mixture modelling analysis. Results: Different types of paranoid ideation ranged in frequency from less than 2% to nearly 30%. Total scores on these items followed an almost perfect exponential distribution (r = 0.99). Our four a priori first-order factors were corroborated (interpersonal sensitivity; mistrust; ideas of reference; ideas of persecution). These mapped onto four classes of individual respondents: a rare, severe, persecutory class with high endorsement of all item factors, including persecutory ideation; a quasi-normal class with infrequent endorsement of interpersonal sensitivity, mistrust and ideas of reference, and no ideas of persecution; and two intermediate classes, characterised respectively by relatively high endorsement of items relating to mistrust and to ideas of reference. Conclusions: The paranoia continuum has implications for the aetiology, mechanisms and treatment of psychotic disorders, while confirming the lack of a clear distinction from normal experiences and processes.
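As a rough illustration of the latent class analysis step, the sketch below fits a small latent class model to binary items with a hand-rolled EM algorithm. It is a toy with randomly generated responses, not the survey data or the authors' factor mixture modelling code; the four-class setting simply mirrors the number of classes reported above.

```python
# Minimal latent class analysis sketch for binary items, fitted by EM.
# Data below are random placeholders, not responses from the psychiatric survey.
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 4))       # 500 respondents, 4 binary items

K = 4                                        # number of latent classes (as in the paper)
N, J = X.shape
pi = np.full(K, 1.0 / K)                     # class mixing proportions
theta = rng.uniform(0.2, 0.8, size=(K, J))   # P(item j endorsed | class k)

for _ in range(200):
    # E-step: posterior probability of each class for each respondent
    log_resp = (X[:, None, :] * np.log(theta) +
                (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2) + np.log(pi)
    log_resp -= log_resp.max(axis=1, keepdims=True)
    resp = np.exp(log_resp)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update mixing proportions and item-endorsement probabilities
    pi = resp.mean(axis=0)
    theta = (resp.T @ X) / resp.sum(axis=0)[:, None]
    theta = np.clip(theta, 1e-6, 1 - 1e-6)

print("class sizes:", np.round(pi, 3))
print("endorsement probabilities by class:\n", np.round(theta, 3))
```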
Abstract:
Sensible and latent heat fluxes are often calculated from bulk transfer equations combined with the energy balance. For spatial estimates of these fluxes, a combination of remotely sensed and standard meteorological data from weather stations is used. The success of this approach depends on the accuracy of the input data and on the accuracy of two variables in particular: aerodynamic and surface conductance. This paper presents a Bayesian approach to improve estimates of sensible and latent heat fluxes by using a priori estimates of aerodynamic and surface conductance alongside remote measurements of surface temperature. The method is validated for time series of half-hourly measurements in a fully grown maize field, a vineyard and a forest. It is shown that the Bayesian approach yields more accurate estimates of sensible and latent heat flux than traditional methods.
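The sketch below illustrates the general idea in a simplified big-leaf setting: priors on aerodynamic and surface conductance are combined with a Penman-Monteith forward model and a remotely sensed surface temperature to give posterior estimates of the sensible and latent heat fluxes. The forward model, grids, prior settings, and forcing values are all assumptions for illustration, not the paper's formulation.

```python
# Minimal sketch of the idea (not the paper's algorithm): place priors on the
# aerodynamic conductance g_a and surface conductance g_s, predict surface
# temperature from a big-leaf Penman-Monteith energy balance, and weight each
# (g_a, g_s) pair by how well it matches the remotely sensed surface temperature.
# All forcing values and prior settings below are illustrative assumptions.
import numpy as np

rho_cp = 1200.0      # volumetric heat capacity of air (J m-3 K-1)
gamma = 66.0         # psychrometric constant (Pa K-1)
delta = 145.0        # slope of the saturation vapour pressure curve (Pa K-1)
Rn_G = 400.0         # available energy Rn - G (W m-2)
D = 1200.0           # vapour pressure deficit (Pa)
Ta = 293.0           # air temperature (K)
Ts_obs, sigma_Ts = 296.5, 0.7   # radiometric surface temperature and its error (K)

# Grids over the two conductances with lognormal priors (the a priori estimates)
ga = np.linspace(0.005, 0.08, 120)[:, None]   # aerodynamic conductance (m s-1)
gs = np.linspace(0.001, 0.03, 120)[None, :]   # surface conductance (m s-1)
prior = (np.exp(-0.5 * ((np.log(ga) - np.log(0.03)) / 0.5) ** 2) *
         np.exp(-0.5 * ((np.log(gs) - np.log(0.01)) / 0.5) ** 2))

# Forward model: Penman-Monteith latent heat flux, residual sensible heat,
# and the surface temperature implied by the bulk transfer equation.
LE = (delta * Rn_G + rho_cp * D * ga) / (delta + gamma * (1.0 + ga / gs))
H = Rn_G - LE
Ts_pred = Ta + H / (rho_cp * ga)

likelihood = np.exp(-0.5 * ((Ts_pred - Ts_obs) / sigma_Ts) ** 2)
post = prior * likelihood
post /= post.sum()

print("posterior mean H  =", round(float((post * H).sum()), 1), "W m-2")
print("posterior mean LE =", round(float((post * LE).sum()), 1), "W m-2")
```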
Abstract:
Undirected graphical models are widely used in statistics, physics and machine vision. However Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of this attention has focussed on the important practical case where the data consists of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) for avoiding the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior due to the use of MCMC to simulate from the latent graphical model, in lieu of being able to do this exactly in general. The supplementary appendix also describes the nature of the resulting approximation.
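For concreteness, the sketch below implements the exchange algorithm for the coupling parameter of a small, fully observed Ising model, with a fixed number of Gibbs sweeps standing in for exact simulation (so, as with the paper's own use of MCMC, the target is only an approximation to the true posterior). The noisy-observation layer handled by particle MCMC in the paper is omitted, and all lattice sizes and tuning values are illustrative.

```python
# Exchange algorithm (Murray et al., 2006) sketch for the Ising coupling theta,
# p(x | theta) proportional to exp(theta * S(x)); the auxiliary draw cancels the
# intractable normalising constant. Checkerboard Gibbs sweeps give an approximate
# auxiliary sample. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
L = 16                                         # lattice side length
ii, jj = np.indices((L, L))

def gibbs_sample(theta, sweeps=50):
    """Approximate draw from the Ising model via checkerboard Gibbs sweeps."""
    x = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for parity in (0, 1):
            mask = (ii + jj) % 2 == parity
            nb = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                  np.roll(x, 1, 1) + np.roll(x, -1, 1))
            p_up = 1.0 / (1.0 + np.exp(-2.0 * theta * nb))
            flips = rng.random((L, L)) < p_up
            x[mask] = np.where(flips[mask], 1, -1)
    return x

def suff_stat(x):
    """Sum of spin products over horizontal and vertical neighbour pairs (periodic)."""
    return float((x * np.roll(x, 1, axis=0)).sum() + (x * np.roll(x, 1, axis=1)).sum())

theta_true = 0.35
x_obs = gibbs_sample(theta_true)               # synthetic "observed" lattice
S_obs = suff_stat(x_obs)

theta, samples = 0.2, []
for it in range(2000):
    theta_prop = theta + 0.05 * rng.normal()
    if 0.0 < theta_prop < 1.0:                 # uniform(0, 1) prior on the coupling
        w = gibbs_sample(theta_prop)           # auxiliary draw at the proposed value
        log_alpha = (theta_prop - theta) * S_obs + (theta - theta_prop) * suff_stat(w)
        if np.log(rng.random()) < log_alpha:
            theta = theta_prop
    samples.append(theta)

print("posterior mean coupling ≈", round(float(np.mean(samples[500:])), 3))
```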
Abstract:
A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5° resolution range from approximately 50% at 1 mm h⁻¹ to 20% at 14 mm h⁻¹. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%–80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm day⁻¹) in comparison with the random error resulting from infrequent satellite temporal sampling (8%–35% at the same rain rate). Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%–15% at 5 mm day⁻¹, with proportionate reductions in latent heating sampling errors.
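The core retrieval step can be sketched as a weighted compositing of database profiles, with weights given by the Gaussian radiative consistency of each simulated brightness temperature vector with the observation. The database, channel count, and error covariance below are synthetic placeholders, not the algorithm's actual supporting database.

```python
# Minimal sketch of the Bayesian database-retrieval idea (not the operational
# algorithm): each simulated cloud profile carries brightness temperatures and a
# surface rain rate, and the estimate is a weighted composite of the database rain
# rates, weighted by radiative consistency with the observed radiances.
# Database values and error covariance below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
n_profiles, n_channels = 20000, 5

# Synthetic database: rain rate (mm/h) and associated brightness temperatures (K)
rain = rng.gamma(shape=0.8, scale=3.0, size=n_profiles)
tb_db = 260.0 - 4.0 * rain[:, None] + rng.normal(0.0, 2.0, (n_profiles, n_channels))

# Observed brightness temperatures and combined model/observation error covariance
tb_obs = np.array([248.0, 246.5, 249.0, 247.0, 250.5])
cov_inv = np.linalg.inv(np.diag(np.full(n_channels, 3.0 ** 2)))

# Gaussian weights measuring the radiative consistency of each database profile
diff = tb_db - tb_obs
w = np.exp(-0.5 * np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
w /= w.sum()

rain_hat = float((w * rain).sum())                            # composited best estimate
rain_sd = float(np.sqrt((w * (rain - rain_hat) ** 2).sum()))  # Bayesian random error
print(f"retrieved rain rate = {rain_hat:.2f} ± {rain_sd:.2f} mm/h")
```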
Abstract:
Scene classification based on latent Dirichlet allocation (LDA) builds on the more general bag-of-visual-words modeling approach, in which the construction of a visual vocabulary is a crucial quantization step that determines the success of the classification. A framework is developed with the following new aspects: Gaussian mixture clustering for the quantization process; the use of an integrated visual vocabulary (IVV), built as the union of all centroids obtained from the separate quantization of each class; and the use of several features, including the edge orientation histogram, CIELab color moments, and the gray-level co-occurrence matrix (GLCM). The experiments are conducted on IKONOS images with six semantic classes (tree, grassland, residential, commercial/industrial, road, and water). The results show that the use of an IVV increases the overall accuracy (OA) by 11 to 12% when implemented on the selected features and by 6% when implemented on all features. The selected combination of CIELab color moments and GLCM provides a better OA than using CIELab color moments or GLCM individually; the individual features increase the OA by only ∼2 to 3%. Moreover, the results show that LDA outperforms C4.5 and the naive Bayes tree classifier in OA by ∼20%. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE) [DOI: 10.1117/1.JRS.8.083690]
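A minimal sketch of the pipeline, using scikit-learn for the Gaussian mixture quantization and the LDA topic model: per-class GMM centroids are pooled into an integrated visual vocabulary, images are encoded as visual-word histograms over that vocabulary, and LDA is fitted to the counts. The random feature vectors stand in for the edge orientation histogram, CIELab color moment, and GLCM descriptors; none of the settings are the authors'.

```python
# Minimal sketch of the quantization pipeline described above, not the authors'
# implementation. Features are random placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(3)
n_classes, words_per_class, feat_dim = 6, 10, 16

# 1) Per-class quantization: GMM centroids become that class's visual words
vocab = []
for c in range(n_classes):
    feats = rng.normal(loc=c, scale=1.0, size=(400, feat_dim))  # placeholder features
    gmm = GaussianMixture(n_components=words_per_class, random_state=0).fit(feats)
    vocab.append(gmm.means_)
ivv = np.vstack(vocab)          # integrated visual vocabulary (union of centroids)

# 2) Encode each image as a histogram of visual-word assignments over the IVV
def encode(image_feats, vocabulary):
    d = ((image_feats[:, None, :] - vocabulary[None, :, :]) ** 2).sum(axis=2)
    words = d.argmin(axis=1)
    return np.bincount(words, minlength=len(vocabulary))

images = [rng.normal(loc=rng.integers(0, n_classes), scale=1.0, size=(200, feat_dim))
          for _ in range(30)]
counts = np.vstack([encode(im, ivv) for im in images])

# 3) Topic model over visual words: per-image topic mixtures feed the classifier
lda = LatentDirichletAllocation(n_components=8, random_state=0)
topic_mix = lda.fit_transform(counts)
print("per-image topic mixture shape:", topic_mix.shape)
```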
Abstract:
Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors (BFs) for these models remains challenging in general. This paper describes new random weight importance sampling and sequential Monte Carlo methods for estimating BFs that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented. In some cases we observe an advantage in the use of biased weight estimates; some support for their use is presented, but we advocate caution.
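The sketch below shows the basic importance sampling estimate of a model evidence and the resulting Bayes factor in a deliberately tractable toy model (Gaussian mean, known variance). In the intractable setting addressed by the paper, the unnormalized likelihood inside each weight cannot be evaluated and is itself replaced by a simulation-based, possibly biased, estimate; that layer is omitted here and all settings are illustrative.

```python
# Importance sampling estimate of the evidence and a Bayes factor in a tractable
# toy model; in the paper's setting each weight would itself be estimated by
# simulation. All settings are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
y = rng.normal(0.4, 1.0, size=30)          # synthetic data
n, ybar = len(y), float(np.mean(y))

# Model 1: mu ~ N(0, 1) prior;  Model 2: mu = 0 fixed (no free parameter)
def log_lik(mu):
    return norm.logpdf(y[:, None], loc=mu, scale=1.0).sum(axis=0)

# Importance sampling estimate of the Model 1 evidence, with a Gaussian proposal
# centred near the likelihood peak
prop_mean, prop_sd = ybar, 1.0 / np.sqrt(n)
mu_draws = rng.normal(prop_mean, prop_sd, size=5000)
log_w = (log_lik(mu_draws) + norm.logpdf(mu_draws, 0.0, 1.0)
         - norm.logpdf(mu_draws, prop_mean, prop_sd))
log_Z1 = np.logaddexp.reduce(log_w) - np.log(len(mu_draws))

# Model 2 evidence is just the likelihood at mu = 0
log_Z2 = float(norm.logpdf(y, 0.0, 1.0).sum())

print("log Bayes factor (M1 vs M2) ≈", round(float(log_Z1 - log_Z2), 2))
```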
Abstract:
The flow dynamics of crystal-rich high-viscosity magma is likely to be strongly influenced by viscous and latent heat release. Viscous heating is observed to play an important role in the dynamics of fluids with temperature-dependent viscosities. The growth of microlite crystals and the accompanying release of latent heat should play a similar role in raising fluid temperatures. Earlier models of viscous heating in magmas have shown the potential for unstable (thermal runaway) flow as described by a Gruntfest number, using an Arrhenius temperature dependence for the viscosity, but have not considered crystal growth or latent heating. We present a theoretical model for magma flow in an axisymmetric conduit and consider both heating effects using Finite Element Method techniques. We consider a constant mass flux in a 1-D infinitesimal conduit segment with isothermal and adiabatic boundary conditions and Newtonian and non-Newtonian magma flow properties. We find that the growth of crystals acts to stabilize the flow field and make the magma less likely to experience a thermal runaway. The additional heating influences crystal growth and can counteract supercooling from degassing-induced crystallization and drive the residual melt composition back towards the liquidus temperature. We illustrate the models with results generated using parameters appropriate for the andesite lava dome-forming eruption at Soufriere Hills Volcano, Montserrat. These results emphasize the radial variability of the magma. Both viscous and latent heating effects are shown to be capable of playing a significant role in the eruption dynamics of Soufriere Hills Volcano. Latent heating is a factor in the top two kilometres of the conduit and may be responsible for relatively short-term (days) transients. Viscous heating is less restricted spatially, but because thermal runaway requires periods of hundreds of days to be achieved, the process is likely to be interrupted. Our models show that thermal evolution of the conduit walls could lead to an increase in the effective diameter of flow and an increase in flux at constant magma pressure.
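As a qualitative illustration of the competing heat sources, the toy zero-dimensional energy balance below combines viscous dissipation, with an exponentially temperature-weakening viscosity, and latent heat from crystallization relaxing towards an equilibrium crystal content. It is not the paper's axisymmetric finite element model, and every parameter value is a placeholder chosen only to make the two terms comparable.

```python
# Toy 0-D energy balance illustrating viscous heating and latent heat release,
# not the paper's conduit model. All parameter values are placeholders.
import numpy as np

rho, cp = 2500.0, 1200.0        # magma density (kg m-3), specific heat (J kg-1 K-1)
Lh = 3.5e5                      # latent heat of crystallization (J kg-1)
mu0, b, T0 = 1.0e7, 0.04, 1123.0   # viscosity scale (Pa s), T-sensitivity (K-1), reference T (K)
gamma_dot = 5.0e-4              # representative shear rate (s-1)
T_liq, dT_range, tau = 1173.0, 150.0, 5.0e5   # liquidus (K), crystallization interval (K), growth time scale (s)

def mu(T):
    """Exponential (Arrhenius-like) decrease of viscosity with temperature."""
    return mu0 * np.exp(-b * (T - T0))

T, phi = 1123.0, 0.2            # initial temperature (K) and crystal fraction
dt, days = 3600.0, 200
for step in range(int(days * 86400 / dt)):
    phi_eq = np.clip((T_liq - T) / dT_range, 0.0, 1.0)   # equilibrium crystallinity
    dphi = (phi_eq - phi) / tau                          # crystal growth/resorption rate
    viscous = mu(T) * gamma_dot ** 2                     # viscous heating (W m-3)
    latent = rho * Lh * dphi                             # latent heating (W m-3)
    T += dt * (viscous + latent) / (rho * cp)
    phi += dt * dphi

print(f"after {days} days: T = {T:.1f} K, crystal fraction = {phi:.2f}")
```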
Abstract:
The purpose of this paper is to show that, for a large class of band-dominated operators on $\ell^\infty(Z,U)$, with $U$ being a complex Banach space, the injectivity of all limit operators of $A$ already implies their invertibility and the uniform boundedness of their inverses. The latter property is known to be equivalent to the invertibility at infinity of $A$, which, on the other hand, is often equivalent to the Fredholmness of $A$. As a consequence, for operators $A$ in the Wiener algebra, we can characterize the essential spectrum of $A$ on $\ell^p(Z,U)$, regardless of $p\in[1,\infty]$, as the union of point spectra of its limit operators considered as acting on $\ell^p(Z,U)$.
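The main characterization can be restated compactly as follows; the notation for the set of limit operators and for the point spectrum is introduced here for readability and may differ from the paper's own.

```latex
% Compact restatement of the essential spectrum characterization described above;
% \sigma^{\mathrm{op}}(A) denotes the set of limit operators of A and
% \sigma_{\mathrm{pt}} the point spectrum (notation introduced for this restatement).
\[
  \sigma_{\mathrm{ess}}\!\left(A;\,\ell^p(\mathbb{Z},U)\right)
  \;=\;
  \bigcup_{B \in \sigma^{\mathrm{op}}(A)} \sigma_{\mathrm{pt}}\!\left(B;\,\ell^p(\mathbb{Z},U)\right),
  \qquad A \in \mathcal{W},\quad p \in [1,\infty],
\]
where $\mathcal{W}$ is the Wiener algebra and the right-hand side is independent of the choice of $p$.
```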
Abstract:
Mecoprop-p [(R)-2-(4-chloro-2-methylphenoxy)propanoic acid] is widely used in agriculture and poses an environmental concern because of its susceptibility to leach from soil to water. We investigated the effect of soil depth on mecoprop-p biodegradation and its relationship with the number and diversity of tfdA-related genes, which are the most widely known genes involved in degradation of the phenoxyalkanoic acid group of herbicides by bacteria. Mecoprop-p half-life (DT50) was approximately 12 days in soil sampled from <30 cm depth, and increased progressively with soil depth, reaching over 84 days at 70–80 cm. In sub-soil there was a lag period of between 23 and 34 days prior to a phase of rapid degradation. No lag phase occurred in top-soil samples prior to the onset of degradation. The maximum degradation rate was the same in top-soil and sub-soil samples. Although diverse tfdAα and tfdA genes were present prior to mecoprop-p degradation, real-time PCR revealed that degradation was associated with proliferation of tfdA genes. The number of tfdA genes and the most probable number of mecoprop-p degrading organisms in soil prior to mecoprop-p addition were below the limits of quantification and detection, respectively. Melting curves from the real-time PCR analysis showed that prior to mecoprop-p degradation both class I and class III tfdA genes were present in top- and sub-soil samples. However, at all soil depths only class III tfdA genes proliferated during degradation. Denaturing gradient gel electrophoresis confirmed that class III tfdA genes were associated with mecoprop-p degradation. Degradation was not associated with the induction of novel tfdA genes in top- or sub-soil samples, and there were no apparent differences in tfdA gene diversity with soil depth prior to or following degradation.
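As a small numerical aside, the sketch below shows what the reported half-lives imply under simple first-order kinetics; the sub-soil lag phase described above is ignored, so this is only a rough comparison rather than the paper's kinetic model.

```python
# Rough comparison of top-soil and sub-soil persistence from the reported DT50
# values, assuming simple first-order decay (the sub-soil lag phase is ignored).
import math

def fraction_remaining(dt50_days, t_days):
    k = math.log(2) / dt50_days          # first-order rate constant (day-1)
    return math.exp(-k * t_days)

for label, dt50 in [("top-soil (<30 cm)", 12.0), ("sub-soil (70-80 cm)", 84.0)]:
    remaining = fraction_remaining(dt50, 60.0)
    print(f"{label}: {remaining * 100:.0f}% of mecoprop-p remaining after 60 days")
```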
Correlating Bayesian date estimates with climatic events and domestication using a bovine case study
Abstract:
The tribe Bovini contains a number of commercially and culturally important species, such as cattle. Understanding their evolutionary time scale is important for distinguishing between post-glacial and domestication-associated population expansions, but estimates of bovine divergence times have been hindered by a lack of reliable calibration points. We present a Bayesian phylogenetic analysis of 481 mitochondrial D-loop sequences, including 228 radiocarbon-dated ancient DNA sequences, using a multi-demographic coalescent model. By employing the radiocarbon dates as internal calibrations, we co-estimate the bovine phylogeny and divergence times in a relaxed-clock framework. The analysis yields evidence for significant population expansions in the taurine cattle, zebu cattle, European aurochs and yak clades. The divergence age estimates support domestication-associated expansion times (less than 12 kyr) for the major haplogroups of cattle. We compare the molecular and palaeontological estimates for the Bison-Bos divergence.
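The intuition behind using radiocarbon-dated sequences as internal calibrations can be sketched with a simple strict-clock, root-to-tip regression: tips sampled further back in time have accumulated less divergence from the root, and the slope against tip age gives the clock rate. The paper's analysis is a full Bayesian co-estimation with a relaxed clock and multi-demographic coalescent priors, not this regression; the ages and distances below are invented placeholders, not the study's data.

```python
# Toy strict-clock illustration of tip-date calibration via root-to-tip regression.
# All tip ages and genetic distances are invented placeholders.
import numpy as np

# Tip ages (years before present; radiocarbon dates for ancient samples, 0 for
# modern samples) and root-to-tip genetic distances (substitutions per site)
tip_age = np.array([0, 0, 0, 3500, 8200, 11000, 14500], dtype=float)
root_to_tip = np.array([0.0160, 0.0155, 0.0162, 0.0148, 0.0131, 0.0122, 0.0110])

# Strict clock: distance = rate * (t_root - tip_age), so the slope of distance
# against tip age is -rate and the intercept equals rate * t_root.
slope, intercept = np.polyfit(tip_age, root_to_tip, 1)
rate = -slope                     # substitutions per site per year
t_root = intercept / rate         # implied age of the root (years before present)

print(f"clock rate ≈ {rate:.2e} subs/site/year")
print(f"implied root age ≈ {t_root / 1000:.1f} kyr BP")
```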