976 results for Simulation Monte-Carlo


Relevance: 90.00%

Abstract:

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, for which a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
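The abstract's central point, that fixed model inputs understate uncertainty, can be sketched with a toy Monte Carlo propagation in which both the model coefficients and the independent variable are treated as random. All distributions and values below are invented placeholders, not figures from the study:

```python
import random
import statistics

random.seed(0)

# Toy build-up model: build_up = intercept + slope * dry_days.
# Every distribution here is an illustrative assumption.
draws = []
for _ in range(10_000):
    slope = random.gauss(0.8, 0.1)      # uncertain model coefficient
    intercept = random.gauss(2.0, 0.3)  # uncertain model coefficient
    dry_days = random.gauss(7.0, 1.5)   # stochastic independent variable
    draws.append(intercept + slope * dry_days)

draws.sort()
lo, hi = draws[249], draws[9749]        # approximate 95% interval
mean = statistics.fmean(draws)
print(f"mean build-up: {mean:.2f} (95% interval [{lo:.2f}, {hi:.2f}])")
```

Replacing the random draws with their fixed means would collapse the interval to a single number, which is the limitation the abstract attributes to ordinary least squares regression with fixed inputs.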

Relevance: 90.00%

Abstract:

Introduction and aims: Individual smokers from disadvantaged backgrounds are less likely to quit, which contributes to widening inequalities in smoking. Residents of disadvantaged neighbourhoods are more likely to smoke, and neighbourhood inequalities in smoking may also be widening because of neighbourhood differences in rates of cessation. This study examined the association between neighbourhood disadvantage and smoking cessation and its relationship with neighbourhood inequalities in smoking. Design and methods: A multilevel longitudinal study of mid-aged (40-67 years) residents (n=6915) of Brisbane, Australia, who lived in the same neighbourhoods (n=200) in 2007 and 2009. Neighbourhood inequalities in cessation and smoking were analysed using multilevel logistic regression and Markov chain Monte Carlo simulation. Results: After adjustment for individual-level socioeconomic factors, the probability of quitting smoking between 2007 and 2009 was lower for residents of disadvantaged neighbourhoods (9.0%-12.8%) than for their counterparts in more advantaged neighbourhoods (20.7%-22.5%). These inequalities in cessation manifested in widening inequalities in smoking: in 2007 the between-neighbourhood variance in rates of smoking was 0.242 (p≤0.001), and in 2009 it was 0.260 (p≤0.001). In 2007, residents of the most disadvantaged neighbourhoods were 88% (OR 1.88, 95% CrI 1.41-2.49) more likely to smoke than residents of the least disadvantaged neighbourhoods; the corresponding difference in 2009 was 98% (OR 1.98, 95% CrI 1.48-2.66). Conclusion: Fundamentally, social and economic inequalities at the neighbourhood and individual levels cause smoking and cessation inequalities. Reducing these inequalities will require comprehensive, well-funded and targeted tobacco control efforts and equity-based policies that address the social and economic determinants of smoking.

Relevance: 90.00%

Abstract:

A major priority for cancer control agencies is to reduce geographical inequalities in cancer outcomes. While the poorer breast cancer survival among socioeconomically disadvantaged women is well established, few studies have examined the independent contributions that area- and individual-level factors make to breast cancer survival. Here we examine relationships between geographic remoteness, area-level socioeconomic disadvantage and breast cancer survival after adjustment for patients' socio-demographic characteristics and stage at diagnosis. Multilevel logistic regression and Markov chain Monte Carlo simulation were used to analyse 18,568 breast cancer cases extracted from the Queensland Cancer Registry for women aged 30 to 70 years diagnosed between 1997 and 2006, from 478 Statistical Local Areas in Queensland, Australia. Independent of individual-level factors, area-level disadvantage was associated with breast cancer survival (p=0.032). Compared to women in the least disadvantaged quintile (Quintile 5), women diagnosed while resident in one of the remaining four quintiles had significantly worse survival (OR 1.23, 1.27, 1.30, 1.37 for Quintiles 4, 3, 2 and 1, respectively). Geographic remoteness was not related to lower survival after multivariable adjustment, and there was no evidence that the impact of area-level disadvantage varied by geographic remoteness. At the individual level, Indigenous status, blue-collar occupation and advanced disease were important predictors of poorer survival. A woman's survival after a diagnosis of breast cancer depends on the socio-economic characteristics of the area where she lives, independently of her individual-level characteristics. It is crucial that the underlying reasons for these inequalities be identified so that policies, resources and effective intervention strategies can be appropriately targeted.

Relevance: 90.00%

Abstract:

In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with data observed one-at-a-time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
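The exact-matching likelihood estimator described above can be sketched on an i.i.d. Poisson model, chosen so that the unbiased estimate can be checked against the exact likelihood (the paper targets state-space count models; this toy, and every name and value in it, is an illustrative assumption):

```python
import math
import random

random.seed(1)

def poisson_draw(lam):
    # Knuth's method; adequate for small rates
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def match_prob_estimate(y, lam, n_particles=200):
    """Repeat the model simulation until n_particles + 1 draws match y
    exactly; n_particles / (trials - 1) is then an unbiased estimate of
    the match probability P(Y = y)."""
    matches, trials = 0, 0
    while matches < n_particles + 1:
        trials += 1
        if poisson_draw(lam) == y:
            matches += 1
    return n_particles / (trials - 1)

lam, data = 2.0, [1, 2, 0, 3, 2]
log_lik_hat = sum(math.log(match_prob_estimate(y, lam)) for y in data)
exact = sum(y * math.log(lam) - lam - math.log(math.factorial(y)) for y in data)
print(f"estimated log-likelihood: {log_lik_hat:.2f}, exact: {exact:.2f}")
```

The `n_particles / (trials - 1)` form is the standard unbiased inverse-binomial estimator; plugging an unbiased, non-negative likelihood estimate of this kind into an MCMC algorithm is what yields exact posterior inference in the pseudo-marginal/PMCMC framework.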

Relevance: 90.00%

Abstract:

In the decision-making of multi-area ATC (Available Transfer Capacity) in an electricity market environment, the existing transmission network resources should be optimally dispatched and employed in a coordinated manner, on the premise that secure system operation is maintained and the associated risk is controllable. Non-sequential Monte Carlo simulation is used to determine the ATC probability density distribution of specified areas under the influence of several uncertainty factors; based on this, a coordinated probabilistic optimal decision-making model, with maximal risk benefit as its objective, is developed for multi-area ATC. NSGA-II is applied to calculate the ATC of each area, taking into account the risk cost caused by relevant uncertainty factors and the synchronous coordination among areas. The essential characteristics of the developed model and the employed algorithm are illustrated using the IEEE 118-bus test system. Simulation results show that the risk of multi-area ATC decision-making is influenced by the uncertainties in power system operation and by the relative importance of the different areas.
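Non-sequential Monte Carlo simulation of this kind draws independent system snapshots and evaluates each one. A toy sketch with hypothetical tie-line capacities, forced-outage rates and a random base transfer (none of these numbers come from the IEEE 118-bus study) is:

```python
import random
import statistics

random.seed(3)

LINE_CAPACITY = [300.0, 250.0, 200.0]  # MW ratings of three tie-lines (assumed)
OUTAGE_PROB = [0.02, 0.03, 0.05]       # forced-outage probabilities (assumed)

def sample_atc():
    # One non-sequential snapshot: independent line states plus a random
    # base transfer already occupying part of the interface.
    available = sum(cap for cap, q in zip(LINE_CAPACITY, OUTAGE_PROB)
                    if random.random() > q)
    base_transfer = random.gauss(400.0, 50.0)
    return max(available - base_transfer, 0.0)

samples = [sample_atc() for _ in range(50_000)]
mean_atc = statistics.fmean(samples)
p_zero = sum(s == 0.0 for s in samples) / len(samples)
print(f"mean ATC: {mean_atc:.0f} MW, P(ATC = 0): {p_zero:.3%}")
```

The histogram of `samples` is the ATC probability density distribution the abstract refers to; the decision-making model then optimises over such distributions rather than over a single deterministic ATC value.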

Relevance: 90.00%

Abstract:

Multiple Sclerosis (MS) is a chronic neurological disease characterized by demyelination associated with infiltrating white blood cells in the central nervous system (CNS). Nitric oxide synthases (NOS) are a family of enzymes that control the production of nitric oxide. It is possible that neuronal NOS is involved in MS pathophysiology, and hence the nNOS gene is a potential candidate for involvement in disease susceptibility. The aim of this study was to determine whether allelic variation at the nNOS gene locus is associated with MS in an Australian cohort. DNA samples obtained from a Caucasian Australian population affected with MS and an unaffected control population, matched for gender, age and ethnicity, were genotyped for a microsatellite polymorphism in the promoter region of the nNOS gene. Allele frequencies were compared using chi-squared-based statistical analyses, with significance tested by Monte Carlo simulation. Allelic analysis of MS cases and controls produced a chi-squared value of 5.63 with simulated P = 0.96 (OR(max) = 1.41, 95% CI: 0.926-2.15). Similarly, a Mann-Whitney U analysis gave a non-significant P-value of 0.377 for allele distribution. No differences in allele frequencies were observed for gender or clinical course subtype (P > 0.05). Statistical analysis indicated that there is no association between this nNOS variant and MS, and hence the gene does not appear to play a genetically significant role in disease susceptibility.
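A simulated P-value for a chi-squared association test, of the kind the abstract describes, can be obtained by permuting case/control labels and recomputing the statistic. A minimal sketch follows; the allele counts are invented for illustration and are not the study's data:

```python
import random

random.seed(0)

def chi_sq(table):
    # Pearson chi-squared statistic for a contingency table
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            e = r * c / total
            stat += (table[i][j] - e) ** 2 / e
    return stat

# 2x3 table of allele counts: cases (row 0) vs controls (row 1); assumed data
observed = [[30, 45, 25], [28, 50, 22]]
obs_stat = chi_sq(observed)

# Pool the observations, then repeatedly reshuffle the group labels
labels = [0] * sum(observed[0]) + [1] * sum(observed[1])
alleles = [j for row in observed for j, n in enumerate(row) for _ in range(n)]

n_sims, exceed = 2000, 0
for _ in range(n_sims):
    random.shuffle(labels)
    table = [[0, 0, 0], [0, 0, 0]]
    for g, a in zip(labels, alleles):
        table[g][a] += 1
    if chi_sq(table) >= obs_stat:
        exceed += 1

p_value = (exceed + 1) / (n_sims + 1)
print(f"chi-squared = {obs_stat:.2f}, simulated P = {p_value:.3f}")
```

The `+1` in numerator and denominator is the usual correction that keeps a Monte Carlo P-value strictly positive and slightly conservative.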

Relevance: 90.00%

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and this form of uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
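The OLSR/WLSR distinction can be illustrated on synthetic heteroscedastic data, where weighting each point by its inverse noise variance stabilises the fit. The model and all numbers are assumptions for illustration only, and the paper's Bayesian/Gibbs sampling layer is omitted:

```python
import random

random.seed(42)

# Synthetic data: y = 2 + 0.5*x with noise whose spread grows with x
xs = [float(i) for i in range(1, 41)]
ys = [2.0 + 0.5 * x + random.gauss(0.0, 0.1 * x) for x in xs]

def fit_line(xs, ys, ws):
    # Closed-form weighted least squares for y = a + b*x (use w = 1 for OLS)
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    b = (sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys))
         / sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)))
    return my - b * mx, b

ols_a, ols_b = fit_line(xs, ys, [1.0] * len(xs))
wls_a, wls_b = fit_line(xs, ys, [1.0 / x ** 2 for x in xs])  # weight = 1/variance
print(f"OLS: a = {ols_a:.2f}, b = {ols_b:.2f}")
print(f"WLS: a = {wls_a:.2f}, b = {wls_b:.2f}")
```

With the noise standard deviation proportional to x, the weights 1/x² are exactly the inverse variances, so the WLS coefficients have markedly lower variance than the OLS ones across repeated data sets.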

Relevance: 90.00%

Abstract:

The parabolic trough concentrating collector is the most mature, proven and widespread technology for large-scale exploitation of solar energy in middle-temperature applications. Assessment of the opportunities and possibilities of the collector system relies on its optical performance. A reliable Monte Carlo ray-tracing model of a parabolic trough collector was developed using Zemax software. The optical performance of an ideal collector depends on the solar spectral distribution, the sunshape, and the spectral selectivity of the associated components. Therefore, each step of the model, including the spectral distribution of the solar energy, trough reflectance, glazing anti-reflection coating and the absorber selective coating, is explained and verified. Two basic outputs of the optical simulation, the radiation flux distribution around the receiver and the optical efficiency, are calculated using the model and verified against a widely accepted analytical profile and measured values, respectively; reasonably good agreement is obtained. Further investigations were carried out to analyse the characteristics of the radiation distribution around the receiver tube at different insolation levels, envelope conditions and receiver selective coatings, and the impact of light scattered from the receiver surface on the efficiency. The model is also capable of analysing the optical performance under variable sunshape, tracking error and collector imperfections, including absorber misalignment with the focal line, absorber de-focal effects, different rim angles and geometric concentrations. The current optical model can play a significant role in understanding the optical aspects of a trough collector and can be employed to extract useful information on optical performance.
In the long run, this optical model will pave the way for the construction of low-cost standalone photovoltaic-thermal hybrid collectors in Australia for small-scale domestic hot water and electricity production.

Relevance: 90.00%

Abstract:

The purpose of this study was to investigate the effect of very small air gaps (less than 1 mm) on the dosimetry of small photon fields used for stereotactic treatments. Measurements were performed with optically stimulated luminescent dosimeters (OSLDs) for 6 MV photons on a Varian 21iX linear accelerator with a Brainlab µMLC attachment, for square field sizes down to 6 mm × 6 mm. Monte Carlo simulations were performed using the EGSnrc C++ user code cavity. It was found that the Monte Carlo model used in this study accurately simulated the OSLD measurements on the linear accelerator. For the 6 mm field size, a 0.5 mm air gap upstream of the active area of the OSLD caused a 5.3% dose reduction relative to a Monte Carlo simulation with no air gap...

Relevance: 90.00%

Abstract:

Objective: Recently, Taylor et al. reported that use of the BrainLAB m3 microMLC for stereotactic radiosurgery results in a decreased out-of-field dose in the direction of leaf motion compared to the out-of-field dose measured in the direction orthogonal to leaf motion [1]. It was recommended that, where possible, patients should be treated with their superior–inferior axes aligned with the microMLC's leaf-motion direction, to minimise out-of-field doses [1]. This study aimed, therefore, to examine the causes of this asymmetry in out-of-field dose and, in particular, to establish that a similar recommendation need not be made for radiotherapy treatments delivered by linear accelerators without external micro-collimation systems. Methods: Monte Carlo simulations were used to study out-of-field dose from different linear accelerators (the Varian Clinac 21iX and 600C and the Elekta Precise) with and without internal MLCs and external microMLCs [2]. Results: Simulation results for the Varian Clinac 600C linear accelerator with the BrainLAB m3 microMLC confirm Taylor et al.'s [1] published experimental data. The out-of-field dose in the leaf-motion direction is deposited by lower-energy (more obliquely scattered) photons than the out-of-field dose in the orthogonal direction. Linear accelerators without microMLCs produce no asymmetry in out-of-field dose. Conclusions: The asymmetry in out-of-field dose previously measured by Taylor et al. [1] results from the shielding characteristics of the BrainLAB m3 microMLC device and is not produced by the linear accelerator to which it is attached.

Relevance: 90.00%

Abstract:

Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc...

Relevance: 90.00%

Abstract:

A bioeconomic model was developed to evaluate the potential performance of brown tiger prawn stock enhancement in Exmouth Gulf, Australia. This paper presents the framework for the bioeconomic model and a risk assessment for all components of a stock enhancement operation, i.e. hatchery, grow-out, releasing, population dynamics, fishery and monitoring, for a commercial-scale enhancement of about 100 metric tonnes, a 25% increase in average annual catch in Exmouth Gulf. The model incorporates uncertainty in parameter estimates by using a distribution for each parameter over a certain range, based on experiments, published data or similar studies. Monte Carlo simulation was then used to quantify the effects of these uncertainties on the model output and on the economic potential of a particular production target. The model incorporates density-dependent effects in the nursery grounds of brown tiger prawns. The results predict that a release of 21 million 1 g prawns would produce an estimated enhanced prawn catch of about 100 t. This scale of enhancement has a 66.5% chance of making a profit. The largest contributor to the overall uncertainty of the enhanced prawn catch was post-release mortality, followed by the density-dependent mortality caused by released prawns. These two mortality rates are the most difficult to estimate in practice and are much under-researched in stock enhancement.
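The Monte Carlo step, sampling each uncertain parameter from a distribution and reading off the fraction of profitable draws, can be sketched with a drastically simplified profit model. Every distribution and value below is an invented placeholder, not an input of the actual bioeconomic model:

```python
import random

random.seed(7)

# Toy profit model: revenue from enhanced catch minus enhancement costs.
# All distributions are illustrative assumptions, not the paper's inputs.
n_draws = 20_000
profitable = 0
for _ in range(n_draws):
    released = 21e6                          # number of 1 g prawns released
    survival = random.uniform(0.30, 0.60)    # post-release survival (most uncertain)
    catch_rate = random.uniform(0.25, 0.45)  # fraction of survivors harvested
    weight_kg = random.gauss(0.030, 0.003)   # harvest weight per prawn (kg)
    price_per_kg = random.gauss(18.0, 2.0)   # price (AUD/kg)
    cost = random.gauss(1.6e6, 0.2e6)        # hatchery, grow-out and release costs

    revenue = released * survival * catch_rate * weight_kg * price_per_kg
    if revenue - cost > 0.0:
        profitable += 1

p_profit = profitable / n_draws
print(f"probability of profit: {p_profit:.1%}")
```

A sensitivity analysis of the kind reported in the abstract would then vary one input distribution at a time and record how much the spread of the profit (or catch) distribution changes.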

Relevance: 90.00%

Abstract:

Textured silicon surfaces are widely used in the manufacturing of solar cells because they increase the light absorption probability and provide antireflection properties. However, these Si surfaces have a high density of surface defects that need to be passivated. In this study, the effect of the microscopic surface texture on the plasma surface passivation of solar cells is investigated. The movement of 10⁵ H+ ions in the texture-modified plasma sheath is studied by Monte Carlo numerical simulation. The hydrogen ions are driven by the combined electric field of the plasma sheath and the textured surface. The ion dynamics is simulated, and the relative ion distribution over the textured substrate is presented. This distribution can be used to interpret the quality of Si dangling-bond saturation and, consequently, the direct plasma surface passivation.

Relevance: 90.00%

Abstract:

It is shown that, owing to selective delivery of ionic and neutral building blocks directly from the ionized gas phase and via surface migration, plasma environments offer a greater degree of deterministic synthesis of ordered nanoassemblies than thermal chemical vapor deposition. The results of hybrid Monte Carlo (gas phase) and adatom self-organization (surface) simulations suggest that higher aspect ratios and better size and pattern uniformity of carbon nanotip microemitters can be achieved via the plasma route. © 2006 American Institute of Physics.

Relevance: 90.00%

Abstract:

Using a Monte Carlo simulation technique, we have calculated the distribution of ion current extracted from low-temperature plasmas and deposited onto a substrate covered with a nanotube array. We have shown that a free-standing carbon nanotube is enclosed in a circular bead of ion current, whereas in square and hexagonal nanotube patterns the ion current is mainly concentrated along the lines connecting the nearest nanotubes. In a very dense array (with a nanotube-spacing to nanotube-height ratio of less than 0.05), the ions do not penetrate to the substrate surface and instead deposit on the side surfaces of the nanotubes.