976 results for Classification, Markov chain Monte Carlo, k-nearest neighbours


Relevance:

100.00%

Publisher:

Abstract:

In this paper, we describe an analysis for data collected on a three-dimensional spatial lattice with treatments applied at the horizontal lattice points. Spatial correlation is accounted for using a conditional autoregressive model. Observations are defined as neighbours only if they are at the same depth. This allows the corresponding variance components to vary by depth. We use the Markov chain Monte Carlo method with block updating, together with Krylov subspace methods, for efficient estimation of the model. The method is applicable to both regular and irregular horizontal lattices, and hence to data collected at any set of horizontal sites for a set of depths or heights, for example, water column or soil profile data. The model for the three-dimensional data is applied to agricultural trial data for five separate days taken roughly six months apart in order to determine possible relationships over time. The purpose of the trial is to determine a form of cropping that leads to less moist soils in the root zone and beyond. We estimate moisture for each date, depth and treatment accounting for spatial correlation, and determine relationships of these and other parameters over time.
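As a toy illustration of the conditional autoregressive structure described above, the full conditional of a single site given its same-depth neighbours can be sketched in Python. The function name and the equal-weight neighbourhood are illustrative assumptions, not the paper's implementation:

```python
# Hedged sketch: full conditional N(mean, var) of one lattice site under a
# first-order conditional autoregressive (CAR) model, where only sites at
# the same depth count as neighbours and the variance component sigma2 is
# allowed to differ by depth. Equal neighbour weights are an assumption.

def car_conditional(neighbour_values, rho, sigma2):
    """neighbour_values -- observed values at the same-depth neighbours
    rho    -- spatial dependence parameter, |rho| < 1
    sigma2 -- depth-specific variance component
    """
    n = len(neighbour_values)
    mean = rho * sum(neighbour_values) / n   # shrink towards the neighbour average
    var = sigma2 / n                         # precision grows with neighbour count
    return mean, var

m, v = car_conditional([1.0, 3.0, 2.0], rho=0.8, sigma2=0.6)
```

The depth-specific `sigma2` is what lets the variance components vary by depth, as in the model above.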

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To explore the role of the neighborhood environment in supporting walking. Design: Cross-sectional study of 10,286 residents of 200 neighborhoods. Participants were selected using a stratified two-stage cluster design. Data were collected by mail survey (68.5% response rate). Setting: The Brisbane City Local Government Area, Australia, 2007. Subjects: Brisbane residents aged 40 to 65 years. Measures: Environmental: street connectivity, residential density, hilliness, tree coverage, bikeways, and street lights within a one-kilometer circular buffer from each resident's home; and network distance to nearest river or coast, public transport, shop, and park. Walking: minutes in the previous week, categorized as <30 minutes, ≥30 to <90 minutes, ≥90 to <150 minutes, ≥150 to <300 minutes, and ≥300 minutes. Analysis: The association between each neighborhood characteristic and walking was examined using multilevel multinomial logistic regression, and the model parameters were estimated using Markov chain Monte Carlo simulation. Results: After adjustment for individual factors, the likelihood of walking for more than 300 minutes (relative to <30 minutes) was highest in areas with the most connectivity (OR=1.93, 99% CI 1.32-2.80), the greatest residential density (OR=1.47, 99% CI 1.02-2.12), the least tree coverage (OR=1.69, 99% CI 1.13-2.51), the most bikeways (OR=1.60, 99% CI 1.16-2.21), and the most street lights (OR=1.50, 99% CI 1.07-2.11). The likelihood of walking for more than 300 minutes was also higher among those who lived closest to a river or the coast (OR=2.06, 99% CI 1.41-3.02). Conclusion: The likelihood of meeting (and exceeding) physical activity recommendations on the basis of walking was higher in neighborhoods with greater street connectivity and residential density, more street lights and bikeways, closer proximity to waterways, and less tree coverage. Interventions targeting these neighborhood characteristics may lead to improved environmental quality as well as lower rates of overweight and obesity and associated chronic disease.
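For readers unfamiliar with the reporting convention, the odds ratios quoted above are obtained by exponentiating fitted log-odds coefficients; a minimal sketch with an invented coefficient and standard error (not the study's estimates):

```python
# Hedged sketch: converting a log-odds coefficient from a (multinomial)
# logistic regression into an odds ratio with a 99% interval, as reported
# in the study. beta and se below are illustrative, invented values.
import math

def odds_ratio(beta, se, z=2.576):  # z = 2.576 for a 99% interval
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

or_, (lo, hi) = odds_ratio(beta=0.657, se=0.15)
```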

Relevance:

100.00%

Publisher:

Abstract:

For clinical use in electrocardiogram (ECG) signal analysis, it is important to detect not only the centre of the P wave, the QRS complex and the T wave, but also the time intervals, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability we propose the use of Markov chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study to investigate the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, in which details such as wave morphology and noise levels are variable.
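The idea of fitting wave features by MCMC can be sketched with a toy random-walk Metropolis sampler that locates the centre of a single Gaussian-shaped wave in noise. The fixed width, amplitude and noise level are simplifying assumptions, far simpler than a full ECG model:

```python
# Hedged sketch: locate the centre of one Gaussian-shaped wave in a noisy
# 1-D signal with random-walk Metropolis -- a toy version of the MCMC
# feature-extraction idea, not the paper's full ECG model.
import math, random

random.seed(0)
t = [i / 100 for i in range(100)]
true_centre, width, amp, noise_sd = 0.4, 0.05, 1.0, 0.1
signal = [amp * math.exp(-0.5 * ((x - true_centre) / width) ** 2)
          + random.gauss(0, noise_sd) for x in t]

def log_lik(centre):
    return sum(-0.5 * ((y - amp * math.exp(-0.5 * ((x - centre) / width) ** 2))
                       / noise_sd) ** 2 for x, y in zip(t, signal))

centre, ll, samples = 0.5, log_lik(0.5), []
for _ in range(2000):
    prop = centre + random.gauss(0, 0.02)     # random-walk proposal
    ll_prop = log_lik(prop)
    if math.log(random.random()) < ll_prop - ll:  # Metropolis accept/reject
        centre, ll = prop, ll_prop
    samples.append(centre)
estimate = sum(samples[500:]) / len(samples[500:])  # posterior mean after burn-in
```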

Relevance:

100.00%

Publisher:

Abstract:

An experimental study has been performed to investigate the ignition delay of a modern heavy-duty common-rail diesel engine run with fumigated ethanol substitutions of up to 40% on an energy basis. The ignition delay was determined through statistical modelling in a Bayesian framework; this framework allows the accurate determination of the start of combustion from single consecutive cycles and does not require any differentiation of the in-cylinder pressure signal. At full load, the ignition delay has been shown to decrease with increasing ethanol substitution, and evidence of combustion at high ethanol substitutions prior to diesel injection has also been shown experimentally and by modelling. At half load, by contrast, increasing ethanol substitution increased the ignition delay. A threshold absolute air-to-fuel ratio (mole basis) of above ~110 for consistent operation has been determined from the inter-cycle variability of the ignition delay, a result that agrees well with previous research on other in-cylinder parameters and further highlights the correlation between the air-to-fuel ratio and inter-cycle variability. Numerical modelling to investigate the sensitivity of ethanol combustion has also been performed. It has been shown that ethanol combustion is sensitive to the initial air temperature around the feasible operating conditions of the engine. Moreover, a negative temperature coefficient region of approximately 900-1050 K (the approximate temperature at fuel injection) has been shown for n-heptane and n-heptane/ethanol blends in the numerical modelling. A consequence of this is that the dominant effect influencing the ignition delay under increasing ethanol substitution may be an increase in chemical reactions rather than a change in in-cylinder temperature. Further investigation revealed that the chemical reactions at low ethanol substitutions differ from those at high (> 20%) ethanol substitutions.
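Determining the start of combustion without differentiating the pressure signal can be sketched as a change-point problem: the toy below enumerates candidate change points in a noisy trace and takes the maximum-likelihood split. The piecewise-constant trace is an invented stand-in for in-cylinder pressure data:

```python
# Hedged sketch: infer the "start of combustion" as a change point in a
# noisy trace by scoring every candidate split, rather than
# differentiating the signal. Toy data; the real analysis models
# consecutive in-cylinder pressure cycles in a Bayesian framework.
import random

random.seed(1)
n, true_cp = 200, 120
trace = [random.gauss(0.0 if i < true_cp else 1.0, 0.3) for i in range(n)]

def split_loglik(k):
    # Gaussian log-likelihood with the mean estimated on each side of k
    ll = 0.0
    for seg in (trace[:k], trace[k:]):
        m = sum(seg) / len(seg)
        ll += sum(-0.5 * ((y - m) / 0.3) ** 2 for y in seg)
    return ll

candidates = range(10, n - 10)
cp_hat = max(candidates, key=split_loglik)  # MAP estimate under a flat prior
```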

Relevance:

100.00%

Publisher:

Abstract:

Hierarchical Bayesian models can assimilate surveillance and ecological information to estimate both invasion extent and model parameters for invading plant pests spread by people. A reliability analysis framework that can accommodate multiple dispersal modes is developed to estimate human-mediated dispersal parameters for an invasive species. Uncertainty in the observation process is modelled by accounting for local natural spread and population growth within spatial units. Broad-scale incursion dynamics are based on a mechanistic gravity model with a Weibull distribution modification to incorporate a local pest build-up phase. The model uses Markov chain Monte Carlo simulations to infer the probability of colonisation times for discrete spatial units and to estimate connectivity parameters between these units. The hierarchical Bayesian model, with observational and ecological components, is applied to a surveillance dataset for a spiralling whitefly (Aleurodicus dispersus) invasion in Queensland, Australia. The model structure draws on surveillance data and ecological knowledge and can be used to manage the risk of pest movement.
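A minimal sketch of a gravity-model connectivity weight with a Weibull build-up term, in the spirit of the broad-scale dispersal component above; the function name, parameter names and values are illustrative assumptions, not the paper's formulation:

```python
# Hedged sketch: gravity-style connectivity between two spatial units,
# scaled by a Weibull-CDF build-up term so a newly colonised source takes
# time to become infective. All parameters are illustrative.
import math

def gravity_weight(pop_i, pop_j, dist_ij, t_since_colonised,
                   shape=2.0, scale=3.0, decay=1.5):
    gravity = (pop_i * pop_j) / dist_ij ** decay  # classic gravity kernel
    build_up = 1.0 - math.exp(-(t_since_colonised / scale) ** shape)  # Weibull CDF
    return gravity * build_up

w_early = gravity_weight(1000, 500, 10.0, t_since_colonised=0.5)
w_late = gravity_weight(1000, 500, 10.0, t_since_colonised=10.0)
```

The build-up term rises from 0 towards 1, so connectivity from a recently colonised unit starts small and approaches the plain gravity weight over time.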

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we examine approaches to estimating a Bayesian mixture model at both single and multiple time points for a sample of actual and simulated aerosol particle size distribution (PSD) data. For estimation of a mixture model at a single time point, we use reversible jump Markov chain Monte Carlo (RJMCMC) to estimate the mixture model parameters, including the number of components, which is assumed to be unknown. We compare the results of this approach to a commonly used estimation method in the aerosol physics literature. As PSD data are often measured over time, frequently at small time intervals, we also examine the use of an informative prior for estimation of the mixture parameters which takes into account the correlated nature of the parameters. The Bayesian mixture model offers a promising approach, providing advantages in both estimation and inference.
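A deliberately simplified stand-in for the sampler above: a Gibbs sampler for a two-component Gaussian mixture with known unit variances. The reversible-jump sampler in the paper additionally moves between numbers of components, which this sketch does not attempt:

```python
# Hedged sketch: Gibbs sampling for a two-component Gaussian mixture with
# known variances -- a toy version of mixture estimation by MCMC, not the
# paper's RJMCMC sampler (which also infers the number of components).
import math, random

random.seed(2)
data = [random.gauss(0, 1) for _ in range(150)] + \
       [random.gauss(5, 1) for _ in range(150)]

mu = [min(data), max(data)]  # crude initial component means
for _ in range(200):
    # 1) allocate each point to a component given the current means
    groups = ([], [])
    for y in data:
        p0 = math.exp(-0.5 * (y - mu[0]) ** 2)
        p1 = math.exp(-0.5 * (y - mu[1]) ** 2)
        z = 0 if random.random() < p0 / (p0 + p1) else 1
        groups[z].append(y)
    # 2) resample each mean from its Gaussian full conditional
    for k in (0, 1):
        if groups[k]:
            n = len(groups[k])
            mu[k] = random.gauss(sum(groups[k]) / n, 1 / math.sqrt(n))
mu.sort()
```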

Relevance:

100.00%

Publisher:

Abstract:

Markov random fields (MRFs) are popular in image processing applications for describing spatial dependencies between image units. Here, we review the theory and models of MRFs with an application to improving forest inventory estimates. Typically, autocorrelation between study units is a nuisance in statistical inference, but here we take advantage of the dependencies to smooth noisy measurements by borrowing information from neighbouring units. We build a stochastic spatial model, which we estimate with a Markov chain Monte Carlo simulation method. The smoothed values are validated against another data set, increasing our confidence that the estimates are more accurate than the originals.
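The idea of borrowing information from neighbouring units can be sketched as a single deterministic smoothing sweep on a lattice with fixed weights; the paper instead estimates the smoothing parameters by MCMC, so the weight below is purely illustrative:

```python
# Hedged sketch: one sweep of neighbourhood smoothing in the spirit of a
# Gaussian Markov random field, shrinking each unit towards the mean of
# its 4-connected lattice neighbours. The fixed weight is an assumption.

def mrf_smooth(grid, weight=0.5):
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for i in range(rows):
        for j in range(cols):
            nbrs = [grid[a][b]
                    for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                    if 0 <= a < rows and 0 <= b < cols]
            out[i][j] = (1 - weight) * grid[i][j] + weight * sum(nbrs) / len(nbrs)
    return out

noisy = [[1.0, 1.0, 1.0],
         [1.0, 5.0, 1.0],   # central spike gets pulled towards its neighbours
         [1.0, 1.0, 1.0]]
smoothed = mrf_smooth(noisy)
```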

Relevance:

100.00%

Publisher:

Abstract:

This work addresses the problem of estimating the optimal value function in a Markov decision process from observed state-action pairs. We adopt a Bayesian approach to inference, which allows both the model to be estimated and predictions about actions to be made in a unified framework, providing a principled approach to mimicry of a controller on the basis of observed data. A new Markov chain Monte Carlo (MCMC) sampler is devised for simulation from the posterior distribution over the optimal value function. This sampler includes a parameter expansion step, which is shown to be essential for good convergence properties of the MCMC sampler. As an illustration, the method is applied to learning a human controller.
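For context, the optimal value function that the sampler targets can be computed exactly by value iteration when the MDP is fully known, an assumption the paper's setting does not make; a minimal sketch on an invented two-state MDP:

```python
# Hedged sketch: the optimal value function of a tiny two-state MDP by
# value iteration. The MDP below is invented for illustration; the paper
# instead infers this quantity by MCMC from observed state-action pairs.

# states 0, 1; actions 'stay', 'go'; P[s][a] = (next_state, reward)
P = {0: {'stay': (0, 0.0), 'go': (1, 1.0)},
     1: {'stay': (1, 2.0), 'go': (0, 0.0)}}
gamma = 0.9  # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(500):
    # Bellman optimality update: best action's reward plus discounted value
    V = {s: max(r + gamma * V[s2] for s2, r in P[s].values()) for s in P}
```

Here the fixed point is V(1) = 2/(1-0.9) = 20 (always 'stay') and V(0) = 1 + 0.9*20 = 19 ('go' then 'stay').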

Relevance:

100.00%

Publisher:

Abstract:

Tracer injection techniques have been widely used to investigate flows in porous media, mainly in problems involving the numerical simulation of miscible flows in petroleum reservoirs and the transport of contaminants in aquifers. Subsurface reservoirs are in general heterogeneous and may exhibit significant variations in their properties over several length scales. These spatial variations are incorporated into the equations governing flow inside the porous medium by means of random fields. Such fields can provide a description of the heterogeneities of the subsurface formation in cases where geological knowledge does not supply the detail needed for a deterministic prediction of flow through the porous medium. In this thesis, a lognormal model is adopted for the permeability field in order to reproduce the permeability distribution of the real medium, and the numerical generation of these random fields is performed by the method of Successive Sums of Independent Gaussian Fields (SSCGI). The main goal of this work is the study of uncertainty quantification for the inverse problem of tracer transport in a heterogeneous porous medium, employing a Bayesian approach to update the permeability fields based on measurements of the spatial concentration of the tracer at specific times. A two-stage Markov chain Monte Carlo method is used to sample the posterior probability distribution, and the Markov chain is built from the random reconstruction of the permeability fields. To solve the pressure-velocity problem governing the flow, a mixed finite element method suitable for the accurate computation of fluxes in heterogeneous permeability fields is employed, and a Lagrangian approach, the Forward Integral Tracking (FIT) method, is used in the numerical simulation of the tracer transport problem. Numerical results are obtained and presented for a set of sample realizations of the permeability fields.
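A minimal sketch of generating a one-dimensional lognormal permeability field by exponentiating a sum of independent correlated Gaussian fields, loosely in the spirit of the SSCGI construction; the moving-average smoothing kernel and variance schedule are illustrative choices, not the thesis's method:

```python
# Hedged sketch: lognormal permeability field built by exponentiating a
# sum of independent correlated Gaussian fields. The moving-average
# kernel and the halving variance schedule are illustrative assumptions.
import math, random

random.seed(3)
n = 32

def gaussian_field(n, sigma):
    # white noise smoothed with a moving average -> correlated Gaussian field
    noise = [random.gauss(0, sigma) for _ in range(n + 4)]
    return [sum(noise[i:i + 5]) / 5 for i in range(n)]

# sum several independent fields of decreasing variance, then exponentiate
log_k = [sum(vals) for vals in zip(*(gaussian_field(n, 1.0 / 2 ** j)
                                     for j in range(4)))]
permeability = [math.exp(v) for v in log_k]  # strictly positive, lognormal
```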

Relevance:

100.00%

Publisher:

Abstract:

We consider the inverse reinforcement learning problem, that is, the problem of learning from, and then predicting or mimicking a controller based on state/action data. We propose a statistical model for such data, derived from the structure of a Markov decision process. Adopting a Bayesian approach to inference, we show how latent variables of the model can be estimated, and how predictions about actions can be made, in a unified framework. A new Markov chain Monte Carlo (MCMC) sampler is devised for simulation from the posterior distribution. This step includes a parameter expansion step, which is shown to be essential for good convergence properties of the MCMC sampler. As an illustration, the method is applied to learning a human controller.
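A common likelihood choice in Bayesian inverse reinforcement learning, assumed here purely for illustration, scores each observed state/action pair under a softmax policy over the state's action values; the Q-values and inverse temperature below are invented:

```python
# Hedged sketch: log-likelihood of one observed action under a softmax
# (Boltzmann) policy derived from action values -- a standard modelling
# choice in inverse RL, assumed here for illustration only.
import math

def action_log_prob(q_values, action, beta=2.0):
    """log P(action | state) under a softmax over the state's Q-values."""
    scores = [beta * q for q in q_values.values()]
    log_z = math.log(sum(math.exp(s) for s in scores))  # normalising constant
    return beta * q_values[action] - log_z

q = {'left': 0.2, 'right': 1.0}   # invented Q-values for one state
lp = action_log_prob(q, 'right')
```

Summing such terms over all observed state/action pairs gives the data likelihood that an MCMC sampler would combine with a prior.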

Relevance:

100.00%

Publisher:

Abstract:

We consider the problem of variable selection in regression modeling in high-dimensional spaces where there is known structure among the covariates. This is an unconventional variable selection problem for two reasons: (1) the dimension of the covariate space is comparable to, and often much larger than, the number of subjects in the study, and (2) the covariate space is highly structured, and in some cases it is desirable to incorporate this structural information into the model-building process. We approach this problem through the Bayesian variable selection framework, where we assume that the covariates lie on an undirected graph and formulate an Ising prior on the model space for incorporating structural information. Certain computational and statistical problems arise that are unique to such high-dimensional, structured settings, the most interesting being the phenomenon of phase transitions. We propose theoretical and computational schemes to mitigate these problems. We illustrate our methods on two different graph structures: the linear chain and the regular graph of degree k. Finally, we use our methods to study a specific application in genomics: the modeling of transcription factor binding sites in DNA sequences. © 2010 American Statistical Association.
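The Ising prior over inclusion indicators can be sketched as an unnormalised log-probability that trades off sparsity against agreement between graph neighbours; the parameterisation and weights below are illustrative assumptions:

```python
# Hedged sketch: unnormalised Ising log-prior over 0/1 inclusion
# indicators on an undirected graph, rewarding models in which
# neighbouring covariates are selected together. a (sparsity) and
# b (smoothness) are illustrative weights, not the paper's values.

def ising_log_prior(gamma, edges, a=-0.5, b=0.5):
    """gamma: dict covariate -> 0/1; edges: list of (u, v) pairs."""
    sparsity = a * sum(gamma.values())                       # penalise inclusions
    agreement = b * sum(1 for u, v in edges if gamma[u] == gamma[v])
    return sparsity + agreement

edges = [('x1', 'x2'), ('x2', 'x3')]                         # a linear chain
smooth = ising_log_prior({'x1': 1, 'x2': 1, 'x3': 1}, edges)
rough = ising_log_prior({'x1': 1, 'x2': 0, 'x3': 1}, edges)
```

On this chain the fully agreeing configuration scores higher than the alternating one, which is the structural preference the prior encodes.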

Relevance:

100.00%

Publisher:

Abstract:

We propose a novel unsupervised approach for linking records across arbitrarily many files, while simultaneously detecting duplicate records within files. Our key innovation is to represent the pattern of links between records as a bipartite graph, in which records are directly linked to latent true individuals, and only indirectly linked to other records. This flexible new representation of the linkage structure naturally allows us to estimate the attributes of the unique observable people in the population, calculate k-way posterior probabilities of matches across records, and propagate the uncertainty of record linkage into later analyses. Our linkage structure lends itself to an efficient, linear-time, hybrid Markov chain Monte Carlo algorithm, which overcomes many obstacles encountered by previously proposed methods of record linkage, despite the high-dimensional parameter space. We assess our results on real and simulated data.
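Given MCMC samples of the bipartite linkage structure, with each sample assigning every record to a latent-individual label, pairwise posterior match probabilities follow by counting; a minimal sketch with invented samples:

```python
# Hedged sketch: posterior probability that two records refer to the same
# latent individual, estimated from MCMC samples of record -> latent-id
# assignments. The four "samples" below are invented for illustration.

def match_probability(samples, rec_a, rec_b):
    hits = sum(1 for s in samples if s[rec_a] == s[rec_b])
    return hits / len(samples)

# four posterior samples of record -> latent-individual assignments
samples = [{'r1': 0, 'r2': 0, 'r3': 1},
           {'r1': 0, 'r2': 0, 'r3': 2},
           {'r1': 0, 'r2': 1, 'r3': 1},
           {'r1': 0, 'r2': 0, 'r3': 1}]
p12 = match_probability(samples, 'r1', 'r2')  # r1 and r2 co-assigned in 3 of 4
```

The same counting idea extends to k-way probabilities by checking that all k records share a label within each sample.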

Relevance:

100.00%

Publisher:

Abstract:

Some of the first results are reported from RISE, a new fast camera mounted on the Liverpool Telescope, primarily designed to obtain high time-resolution light curves of transiting extrasolar planets for the purpose of transit timing. A full and a partial transit of WASP-3 are presented, and a Markov-chain Monte Carlo analysis is used to update the parameters from the discovery paper. This results in a planetary radius of 1.29 (+0.05, -0.12) R_J and therefore a density of 0.82 (+0.14, -0.09) rho_J, consistent with previous results. The inclination is 85.06 (+0.16, -0.15) deg, in agreement (but with a significant improvement in precision) with the previously determined value. Central transit times are found to be consistent with the ephemeris given in the discovery paper; however, a new ephemeris calculated using the longer baseline results in T_c(0) = 2454605.55915 +/- 0.00023 HJD and P = 1.846835 +/- 0.000002 days.
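With the updated ephemeris quoted above, future transit centre times follow from the linear relation T_c(E) = T_c(0) + E * P, with E the number of orbits elapsed; a minimal sketch:

```python
# Hedged sketch: predicting transit centre times from a linear ephemeris,
# T_c(E) = T_c(0) + E * P. T0 and P are the values quoted in the text
# (uncertainties omitted for simplicity).

T0 = 2454605.55915   # HJD
P = 1.846835         # days

def transit_time(epoch):
    return T0 + epoch * P

t10 = transit_time(10)   # predicted centre of the 10th transit after T0
```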

Relevance:

100.00%

Publisher:

Abstract:

We present nine newly observed transits of TrES-3, taken as part of a transit timing program using the RISE instrument on the Liverpool Telescope. A Markov-chain Monte Carlo analysis was used to determine the planet-to-star radius ratio and the inclination of the system, which were found to be R_p/R_* = 0.1664 (+0.0011, -0.0018) and i = 81.73 (+0.13, -0.04) deg, respectively, consistent with previous results. The central transit times and uncertainties were also calculated, using a residual-permutation algorithm as an independent check on the errors. A re-analysis of eight previously published TrES-3 light curves was conducted to determine the transit times and uncertainties using consistent techniques. Whilst the transit times were not found to be in agreement with a linear ephemeris, giving chi^2 = 35.07 for 15 degrees of freedom, we interpret this to be the result of systematics in the light curves rather than a real transit timing variation. This is because the light curves that show the largest deviation from a constant period either have relatively little out-of-transit coverage or have clear systematics. A new ephemeris was calculated using the transit times and was found to be T_c(0) = 2454632.62610 +/- 0.00006 HJD and P = 1.3061864 +/- 0.0000005 days. The transit times were then used to place upper mass limits as a function of the period ratio of a potential perturbing planet, showing that our data are sufficiently sensitive to have probed sub-Earth-mass planets in both interior and exterior 2:1 resonances, assuming that the additional planet is in an initially circular orbit.
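The quoted chi-squared test of the transit times against a linear ephemeris is a weighted sum of squared timing residuals; a minimal sketch with invented timings (the paper's observed value is chi^2 = 35.07 for 15 degrees of freedom):

```python
# Hedged sketch: chi-squared of observed transit times against a linear
# ephemeris. T0 and P are the TrES-3 values quoted in the text; the
# observations below are invented for illustration.

T0 = 2454632.62610   # HJD
P = 1.3061864        # days

def ephemeris_chi2(observations):
    """observations: list of (epoch, observed_time, uncertainty)."""
    return sum(((t_obs - (T0 + e * P)) / sigma) ** 2
               for e, t_obs, sigma in observations)

obs = [(0, T0 + 0.0001, 0.0001),        # 1-sigma residual
       (1, T0 + P - 0.0002, 0.0001),    # 2-sigma residual
       (2, T0 + 2 * P, 0.0001)]         # exact
chi2 = ephemeris_chi2(obs)
```

A chi2 much larger than the number of degrees of freedom signals either real timing variations or, as the paper argues here, underestimated systematics.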

Relevance:

100.00%

Publisher:

Abstract:

The IntCal04 and Marine04 radiocarbon calibration curves have been updated from 12 cal kBP (cal kBP is here defined as thousands of calibrated years before AD 1950) and extended to 50 cal kBP, utilizing newly available data sets that meet the IntCal Working Group criteria for pristine corals and other carbonates and for quantification of uncertainty in both the 14C and calendar timescales as established in 2002. No change was made to the curves from 0-12 cal kBP. The curves were constructed using a Markov chain Monte Carlo (MCMC) implementation of the random walk model used for IntCal04 and Marine04. The new curves were ratified at the 20th International Radiocarbon Conference in June 2009 and are available in the Supplemental Material at www.radiocarbon.org.
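The random walk model underlying the curve construction treats the calibration curve as a latent random walk across calendar time; a minimal forward simulation of such a walk (the step variance is an illustrative assumption, and the real model conditions the walk on the radiocarbon data by MCMC):

```python
# Hedged sketch: forward simulation of a Gaussian random walk of the kind
# used as the latent model for the calibration curve. Step size is an
# illustrative assumption; the real analysis infers the walk from data.
import random

random.seed(4)

def random_walk(n_steps, step_sd=1.0, start=0.0):
    path, x = [start], start
    for _ in range(n_steps):
        x += random.gauss(0, step_sd)  # independent Gaussian increments
        path.append(x)
    return path

curve = random_walk(50)
```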