232 results for SEQUENTIAL MONTE-CARLO


Relevance: 80.00%

Abstract:

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource-demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool which makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
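The abstract's central argument, that both inputs and parameters should be treated as stochastic rather than fixed, can be illustrated with a minimal Monte Carlo sketch. The power-law build-up model and the coefficient spreads below are hypothetical stand-ins for the paper's fitted regressions, not its actual values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical power-law build-up model: load = a * days**b.
# Both coefficients are treated as stochastic (posterior-like draws),
# per the abstract's argument against fixed model inputs.
def predict_buildup(days, n_draws=10_000):
    a = rng.normal(1.8, 0.2, n_draws)    # assumed coefficient draws
    b = rng.normal(0.16, 0.03, n_draws)  # assumed exponent draws
    return a * days ** b

draws = predict_buildup(7.0)  # predicted load after 7 antecedent dry days
lo, hi = np.percentile(draws, [2.5, 97.5])
```

Where an ordinary least squares fit with fixed inputs would return only a point estimate, the draws here yield a full predictive interval, which is the practical payoff of the Bayesian Monte Carlo approach the abstract describes.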

Relevance: 80.00%

Abstract:

The selection of optimal camera configurations (camera locations, orientations, etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions to achieve different tasks. Most of them, however, do not generalize well to large-scale networks. To tackle this, we introduce a statistical formulation of the optimal selection of camera configurations and propose a Trans-Dimensional Simulated Annealing (TDSA) algorithm to solve the problem effectively. We compare our approach with a state-of-the-art method based on Binary Integer Programming (BIP) and show that our approach offers similar performance on small-scale problems. However, we also demonstrate the capability of our approach in dealing with large-scale problems and show that it produces better results than two alternative heuristics designed to address the scalability issue of BIP.
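The core of any simulated annealing approach to camera placement can be sketched in a few lines. This is plain fixed-dimension annealing on a toy coverage objective; the paper's trans-dimensional moves, which also change the number of cameras, and its actual objective function are not reproduced here:

```python
import math
import random

random.seed(0)

# Toy task: place cameras in a 10 x 10 area to cover random target points.
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]

def coverage(cams, r=2.5):
    """Objective: number of points within radius r of any camera."""
    return sum(1 for px, py in points
               if any((px - cx) ** 2 + (py - cy) ** 2 <= r * r for cx, cy in cams))

def anneal(n_cams=4, steps=2000, t0=5.0):
    cams = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(n_cams)]
    score = coverage(cams)
    best, best_score = list(cams), score
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9  # linear cooling schedule
        cand = list(cams)
        i = random.randrange(n_cams)
        cand[i] = (cand[i][0] + random.gauss(0, 1), cand[i][1] + random.gauss(0, 1))
        s = coverage(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if s >= score or random.random() < math.exp((s - score) / t):
            cams, score = cand, s
            if s > best_score:
                best, best_score = list(cand), s
    return best, best_score

best_cams, covered = anneal()
```

A trans-dimensional variant would add birth/death moves that insert or delete a camera, with the acceptance ratio adjusted accordingly.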

Relevance: 80.00%

Abstract:

Introduction and aims: Individual smokers from disadvantaged backgrounds are less likely to quit, which contributes to widening inequalities in smoking. Residents of disadvantaged neighbourhoods are more likely to smoke, and neighbourhood inequalities in smoking may also be widening because of neighbourhood differences in rates of cessation. This study examined the association between neighbourhood disadvantage and smoking cessation and its relationship with neighbourhood inequalities in smoking. Design and methods: A multilevel longitudinal study of middle-aged (40-67 years) residents (n=6915) of Brisbane, Australia, who lived in the same neighbourhoods (n=200) in 2007 and 2009. Neighbourhood inequalities in cessation and smoking were analysed using multilevel logistic regression and Markov chain Monte Carlo simulation. Results: After adjustment for individual-level socioeconomic factors, the probability of quitting smoking between 2007 and 2009 was lower for residents of disadvantaged neighbourhoods (9.0%-12.8%) than for their counterparts in more advantaged neighbourhoods (20.7%-22.5%). These inequalities in cessation manifested in widening inequalities in smoking: in 2007 the between-neighbourhood variance in rates of smoking was 0.242 (p≤0.001) and in 2009 it was 0.260 (p≤0.001). In 2007, residents of the most disadvantaged neighbourhoods were 88% (OR 1.88, 95% CrI 1.41-2.49) more likely to smoke than residents of the least disadvantaged neighbourhoods; the corresponding difference in 2009 was 98% (OR 1.98, 95% CrI 1.48-2.66). Conclusion: Fundamentally, social and economic inequalities at the neighbourhood and individual levels cause smoking and cessation inequalities. Reducing these inequalities will require comprehensive, well-funded and targeted tobacco control efforts and equity-based policies that address the social and economic determinants of smoking.
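The MCMC machinery behind estimates like the odds ratios above reduces, in the single-level case, to a random-walk Metropolis sampler for logistic regression. The sketch below uses synthetic data with hypothetical coefficients; the paper's model is multilevel (with neighbourhood random effects), which this deliberately omits:

```python
import math
import random

random.seed(4)

# Synthetic single-level data: exposure x raises the odds of the outcome.
# Intercept -1.0 and slope 1.5 are hypothetical, chosen for illustration.
xs = [random.random() for _ in range(300)]
data = [(x, 1 if random.random() < 1 / (1 + math.exp(1.0 - 1.5 * x)) else 0)
        for x in xs]

def log_post(a, b):
    lp = -(a * a + b * b) / 200.0  # weak normal prior on both coefficients
    for x, y in data:
        eta = a + b * x
        lp += y * eta - math.log(1 + math.exp(eta))  # Bernoulli log-likelihood
    return lp

def metropolis(steps=3000, scale=0.15, burn=1000):
    a = b = 0.0
    lp = log_post(a, b)
    draws = []
    for _ in range(steps):
        a2, b2 = a + random.gauss(0, scale), b + random.gauss(0, scale)
        lp2 = log_post(a2, b2)
        if lp2 >= lp or random.random() < math.exp(lp2 - lp):
            a, b, lp = a2, b2, lp2
        draws.append((a, b))
    return draws[burn:]

draws = metropolis()
b_mean = sum(b for _, b in draws) / len(draws)  # posterior mean of the slope
```

An odds ratio like those reported in the abstract would be recovered as `exp(b_mean)` for a unit change in the exposure.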

Relevance: 80.00%

Abstract:

A major priority for cancer control agencies is to reduce geographical inequalities in cancer outcomes. While the poorer breast cancer survival among socioeconomically disadvantaged women is well established, few studies have looked at the independent contribution that area- and individual-level factors make to breast cancer survival. Here we examine relationships between geographic remoteness, area-level socioeconomic disadvantage and breast cancer survival after adjustment for patients’ socio-demographic characteristics and stage at diagnosis. Multilevel logistic regression and Markov chain Monte Carlo simulation were used to analyze 18 568 breast cancer cases extracted from the Queensland Cancer Registry for women aged 30 to 70 years diagnosed between 1997 and 2006 from 478 Statistical Local Areas in Queensland, Australia. Independent of individual-level factors, area-level disadvantage was associated with breast cancer survival (p=0.032). Compared to women in the least disadvantaged quintile (Quintile 5), women diagnosed while resident in one of the remaining four quintiles had significantly worse survival (OR 1.23, 1.27, 1.30, 1.37 for Quintiles 4, 3, 2 and 1 respectively). Geographic remoteness was not related to lower survival after multivariable adjustment. There was no evidence that the impact of area-level disadvantage varied by geographic remoteness. At the individual level, Indigenous status, blue collar occupations and advanced disease were important predictors of poorer survival. A woman’s survival after a diagnosis of breast cancer depends on the socio-economic characteristics of the area where she lives, independently of her individual-level characteristics. It is crucial that the underlying reasons for these inequalities be identified to appropriately target policies, resources and effective intervention strategies.

Relevance: 80.00%

Abstract:

Parabolic Trough Concentrators (PTC) are the most proven solar collectors for solar thermal power plants, and are suitable for concentrating photovoltaic (CPV) applications. PV cells are sensitive to the spatial uniformity of incident light and to the cell operating temperature. This requires the design of CPV-PTCs to be optimised both optically and thermally. Optical modelling can be performed using Monte Carlo Ray Tracing (MCRT), with conjugate heat transfer (CHT) modelling using computational fluid dynamics (CFD) to analyse the overall designs. This paper develops and evaluates a CHT simulation for a concentrating solar thermal PTC collector. It uses the ray tracing work by Cheng et al. (2010) and thermal performance data for the LS-2 parabolic trough used in the SEGS III-VII plants from Dudley et al. (1994). This is a preliminary step towards developing models to compare the heat transfer performance of faceted absorbers for CPV applications. Reasonable agreement between the simulation results and the experimental data confirms the reliability of the numerical model. The model explores both physical and computational issues for this particular kind of system modelling. The physical issues include the resultant non-uniformity of the boundary heat flux profile and the temperature profile around the tube, and the uneven heating of the HTF. The numerical issues include, most importantly, the design of the computational domain(s), and the solution techniques for the turbulence quantities and the near-wall physics. This simulation confirmed that the optical simulation and the computational CHT simulation of the collector can be accomplished independently.

Relevance: 80.00%

Abstract:

The statistical variance of the total cost of a project is usually estimated by Monte Carlo simulation, on the assumption that analytical approaches are too complicated. This article examines that assumption and shows that, contrary to expectation, the analytical solution is relatively straightforward. It is also shown that the coefficient of variation is unaffected by the size (surface area) of the project when standardised component costs are used. A case study is provided in which real component costs are analysed to obtain the required variance of the total cost. The results confirm previous work in showing that the second-moment (variance) approximation under the assumption of independence considerably underestimates the exact value. The analysis goes on to examine the effects of professional judgement and, with the simulated data used, the approximation turns out to be reasonably accurate: professional judgement absorbs most of the intercorrelations involved. An example is also given in which unit component quantities are priced at their average unit costs, showing once again that the approximation is close to the true value. Finally, the work is extended by showing how to obtain, for each project, the exact variances of the total cost.
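The analytic-versus-simulation comparison the article describes is easy to reproduce on toy numbers: the exact variance of a sum of correlated component costs is the sum of all covariance matrix entries, and the independence assumption drops the off-diagonal terms. The component means, standard deviations and correlation below are illustrative, not the article's case-study data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative component costs (means, standard deviations) and a common
# correlation between components.
means = np.array([100.0, 250.0, 80.0])
sds = np.array([15.0, 30.0, 10.0])
rho = 0.5
cov = rho * np.outer(sds, sds)
np.fill_diagonal(cov, sds ** 2)

# Analytic variance of the total cost: the sum of all covariance entries.
exact_var = cov.sum()

# Monte Carlo check of the same quantity.
totals = rng.multivariate_normal(means, cov, size=200_000).sum(axis=1)
mc_var = totals.var()

# The independence assumption keeps only the diagonal and understates the variance.
indep_var = (sds ** 2).sum()
```

With these numbers the exact variance is 2125 while the independence approximation gives 1225, echoing the article's point that ignoring intercorrelations considerably underestimates the true value.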

Relevance: 80.00%

Abstract:

In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with data observed one-at-a-time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
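The exact-matching likelihood estimator at the heart of this scheme can be shown on a deliberately simplified stand-in. The paper handles dependent, partially observed series inside a particle filter; the toy below uses i.i.d. Poisson counts so that only the matching idea remains, with the unbiased estimate N/(M-1) after simulating until N+1 matches in M trials:

```python
import math
import random

random.seed(3)

# Toy stand-in for an intractable simulator: i.i.d. Poisson counts.
def poisson_draw(lam):
    """Knuth's method; adequate for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def match_prob_estimate(y, lam, n=50):
    """Simulate until n + 1 exact matches of y; n / (trials - 1) is an
    unbiased estimate of P(Y = y), as in the paper's matching scheme."""
    matches = trials = 0
    while matches < n + 1:
        trials += 1
        if poisson_draw(lam) == y:
            matches += 1
    return n / (trials - 1)

obs = [2, 3, 1, 2]  # hypothetical low-valued count series
log_lik_hat = sum(math.log(match_prob_estimate(y, 2.0)) for y in obs)
```

For low-valued counts the matching probabilities are large enough that the repeated simulation stays cheap, which is exactly why the abstract restricts exact matching to this setting; otherwise an ABC-style tolerance is introduced.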

Relevance: 80.00%

Abstract:

The aim of this work is to develop software that is capable of back projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray tracing process, the energy differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated to an energy differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point spread kernels, thus generating energy differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprised of 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model we will use EPID images recorded without any attenuating material in the beam for a number of MLC-defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.

Relevance: 80.00%

Abstract:

A novel in-cylinder pressure method for determining ignition delay has been proposed and demonstrated. This method proposes a new Bayesian statistical model to resolve the start of combustion, defined as the point at which the band-pass in-cylinder pressure deviates from background noise and the combustion resonance begins. Further, it is demonstrated that this method remains accurate in situations where noise is present. The start of combustion can be resolved for each cycle without the need for ad hoc methods such as cycle averaging. Therefore, this method allows for analysis of consecutive cycles and inter-cycle variability studies. Ignition delay obtained by this method and by the net rate of heat release have been shown to give good agreement. However, the use of combustion resonance to determine the start of combustion is preferable over the net rate of heat release method because it does not rely on knowledge of heat losses and still functions accurately in the presence of noise. Results for a six-cylinder turbo-charged common-rail diesel engine run with neat diesel fuel at full, three-quarter and half load have been presented. Under these conditions the ignition delay was shown to increase as the load was decreased, with a significant increase in ignition delay at half load when compared with three-quarter and full loads.
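The detection problem described here, the point where a band-pass signal departs from background noise, can be illustrated on synthetic data. The rolling-RMS threshold below is a crude stand-in for the paper's Bayesian model (the trace, onset location and decay constants are all invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic band-pass pressure trace: Gaussian background noise with a
# decaying resonance injected at sample 600.
n, onset = 1000, 600
x = np.arange(n - onset)
signal = rng.normal(0.0, 1.0, n)
signal[onset:] += 6.0 * np.exp(-x / 150.0) * np.sin(0.3 * x)

# Detect the first sample whose rolling RMS exceeds 3x the background RMS.
w = 25
rms = np.sqrt(np.convolve(signal ** 2, np.ones(w) / w, mode="same"))
noise_rms = rms[:400].mean()
detected = int(np.argmax(rms > 3 * noise_rms))
```

A Bayesian treatment replaces the fixed 3x threshold with a posterior over the change point, which is what lets the paper's method quantify its confidence cycle by cycle rather than rely on a hand-tuned cutoff.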

Relevance: 80.00%

Abstract:

In this study, a treatment plan for a spinal lesion, with all beams transmitted through a titanium vertebral reconstruction implant, was used to investigate the potential effect of a high-density implant on a three-dimensional dose distribution for a radiotherapy treatment. The BEAMnrc/DOSXYZnrc and MCDTK Monte Carlo codes were used to simulate the treatment using both a simplified, rectilinear model and a detailed model incorporating the full complexity of the patient anatomy and treatment plan. The resulting Monte Carlo dose distributions showed that the commercial treatment planning system failed to accurately predict both the depletion of dose downstream of the implant and the increase in scattered dose adjacent to the implant. Overall, the dosimetric effect of the implant was underestimated by the commercial treatment planning system and overestimated by the simplified Monte Carlo model. The value of performing detailed Monte Carlo calculations, using the full patient and treatment geometry, was demonstrated.

Relevance: 80.00%

Abstract:

Recent literature has focused on realized volatility models to predict financial risk. This paper studies the benefit of explicitly modeling jumps in this class of models for value at risk (VaR) prediction. Several popular realized volatility models are compared in terms of their VaR forecasting performances through a Monte Carlo study and an analysis based on empirical data of eight Chinese stocks. The results suggest that careful modeling of jumps in realized volatility models can largely improve VaR prediction, especially for emerging markets where jumps play a stronger role than those in developed markets.
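The abstract's core finding, that modelling jumps matters for VaR, can be illustrated with a toy jump-diffusion Monte Carlo. The return model and its parameters below are invented for the sketch and do not represent any of the paper's fitted realized volatility models:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy daily-return model: a Gaussian diffusion plus a rare, larger jump
# component (2% jump probability, jump scale 0.05 -- illustrative only).
n = 100_000
diffusion = rng.normal(0.0, 0.01, n)
jumps = rng.binomial(1, 0.02, n) * rng.normal(0.0, 0.05, n)
returns = diffusion + jumps

# 99% one-day VaR: the loss threshold exceeded with 1% probability.
var_99 = -np.percentile(returns, 1)
var_99_no_jumps = -np.percentile(diffusion, 1)  # ignoring jumps understates risk
```

The jump-inclusive VaR comes out materially larger than the diffusion-only figure, mirroring the paper's conclusion that jump modelling improves risk prediction, especially where jumps are frequent.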

Relevance: 80.00%

Abstract:

In this paper we analyse the effects of highway traffic flow parameters, such as vehicle arrival rate and density, on the performance of Amplify and Forward (AF) cooperative vehicular networks along a multi-lane highway under free-flow state. We derive analytical expressions for connectivity performance and verify them with Monte Carlo simulations. When AF cooperative relaying is employed together with Maximum Ratio Combining (MRC) at the receivers, the average route error rate shows a 10-20-fold improvement compared to direct communication. A 4-8-fold increase in the maximum number of traversable hops can also be observed at different vehicle densities when AF cooperative communication is used to strengthen communication routes. However, the theoretical upper bound on the maximum number of hops promises even higher performance gains.
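The kind of Monte Carlo verification mentioned here can be sketched for a single link. The fading model, SNR values and the harmonic-mean-style AF bound below are common textbook assumptions, not the paper's highway mobility model:

```python
import numpy as np

rng = np.random.default_rng(11)

# Outage comparison under Rayleigh fading (exponentially distributed SNR powers).
n, mean_snr, threshold = 100_000, 8.0, 3.0

direct = rng.exponential(mean_snr, n)  # source -> destination SNR

# Two-hop AF relay path: end-to-end SNR approximated by g1*g2 / (g1 + g2 + 1),
# a standard upper-bound form for amplify-and-forward relaying.
g1 = rng.exponential(mean_snr, n)
g2 = rng.exponential(mean_snr, n)
relayed = g1 * g2 / (g1 + g2 + 1)

mrc = direct + relayed  # Maximum Ratio Combining adds the branch SNRs

p_out_direct = float(np.mean(direct < threshold))
p_out_mrc = float(np.mean(mrc < threshold))
```

The combined branch falls below the outage threshold far less often than the direct link alone, which is the mechanism behind the route error rate improvements the abstract reports.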

Relevance: 80.00%

Abstract:

The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables, so pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely, which can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but its variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
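The statistical effect the paper exploits, that averaging K independent unbiased likelihood estimates stays unbiased while dividing the variance by K, is easy to demonstrate. The toy "likelihood" below is a simple tail probability under a normal model, and the K estimates run serially here, standing in for the per-CPU estimates of the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy unbiased likelihood estimator: the "likelihood" here is P(Z > 2)
# for Z ~ N(0, 1), estimated by plain Monte Carlo.
def likelihood_estimate(n_samples=500):
    return np.mean(rng.normal(size=n_samples) > 2.0)

def averaged_estimate(k=8):
    # Averaging k independent unbiased estimates stays unbiased
    # and divides the variance by k (one estimate per CPU in the paper).
    return np.mean([likelihood_estimate() for _ in range(k)])

single = np.array([likelihood_estimate() for _ in range(2000)])
averaged = np.array([averaged_estimate() for _ in range(2000)])
```

In a real pseudo-marginal sampler this reduced variance is what keeps the acceptance rate from collapsing, since a noisy likelihood estimate can make the chain stick for long stretches.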

Relevance: 80.00%

Abstract:

Polymeric graphitic carbon nitride materials have attracted increasing attention in recent years owing to their potential applications in energy conversion, environmental protection, and so on. Here, from first-principles calculations, we report the electronic structure modification of graphitic carbon nitride (g-C3N4) in response to carbon doping. We show that each dopant atom can induce a local magnetic moment of 1.0 μB in non-magnetic g-C3N4. At a doping concentration of 1/14, the local magnetic moments of the most stable doping configuration, which has the dopant atom at the center of the heptazine unit, prefer to align in parallel, leading to long-range ferromagnetic (FM) ordering. When the joint N atom is replaced by a C atom, the system favors an antiferromagnetic (AFM) ordering in the unstrained state, but can be tuned to FM ordering by applying biaxial tensile strain. More interestingly, the FM state of the strained system is half-metallic, with abundant states at the Fermi level in one spin channel and a band gap of 1.82 eV in the other spin channel. The Curie temperature (Tc) was also evaluated using mean-field theory and Monte Carlo simulations within the Ising model. Such tunable electron spin-polarization and ferromagnetism are quite promising for applications of graphitic carbon nitride in spintronics.
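The Ising-model Monte Carlo used to evaluate the Curie temperature follows the standard Metropolis scheme, which can be sketched for a plain 2D square lattice. This illustrates how Tc is bracketed by simulating above and below it; the paper's Hamiltonian (derived from its DFT exchange couplings) and mean-field details are not reproduced here:

```python
import math
import random

random.seed(2)

# Metropolis Monte Carlo for a 2D Ising model (J = 1, k_B = 1) on an
# L x L torus with periodic boundary conditions.
def magnetisation(L, T, sweeps=400):
    s = [[1] * L for _ in range(L)]  # start fully ordered
    for _ in range(sweeps * L * L):
        i, j = random.randrange(L), random.randrange(L)
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb  # energy change of flipping spin (i, j)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    return abs(sum(map(sum, s))) / (L * L)

m_low = magnetisation(12, 1.5)   # below the exact 2D critical point, Tc ~ 2.269
m_high = magnetisation(12, 4.0)  # above it, the magnetisation collapses
```

Sweeping the temperature and locating where the magnetisation drops gives the Monte Carlo estimate of Tc, which the paper compares against its mean-field value.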

Relevance: 80.00%

Abstract:

This thesis introduced Bayesian statistics as an analysis technique to isolate resonant frequency information in in-cylinder pressure signals taken from internal combustion engines. Applications of these techniques are relevant to engine design (performance and noise), energy conservation (fuel consumption) and alternative fuel evaluation. The use of Bayesian statistics, over traditional techniques, allowed for a more in-depth investigation into previously difficult-to-isolate engine parameters on a cycle-by-cycle basis. Specifically, these techniques facilitated the determination of the start of pre-mixed and diffusion combustion, and allowed the in-cylinder temperature profile to be resolved on individual consecutive engine cycles. Dr Bodisco further showed the utility of the Bayesian analysis techniques by applying them to in-cylinder pressure signals taken from a compression ignition engine run with fumigated ethanol.