975 results for QUANTUM MONTE-CARLO


Relevance: 80.00%

Abstract:

A major priority for cancer control agencies is to reduce geographical inequalities in cancer outcomes. While the poorer breast cancer survival among socioeconomically disadvantaged women is well established, few studies have looked at the independent contribution that area- and individual-level factors make to breast cancer survival. Here we examine relationships between geographic remoteness, area-level socioeconomic disadvantage and breast cancer survival after adjustment for patients’ socio-demographic characteristics and stage at diagnosis. Multilevel logistic regression and Markov chain Monte Carlo simulation were used to analyze 18 568 breast cancer cases extracted from the Queensland Cancer Registry for women aged 30 to 70 years diagnosed between 1997 and 2006 from 478 Statistical Local Areas in Queensland, Australia. Independent of individual-level factors, area-level disadvantage was associated with breast cancer survival (p=0.032). Compared to women in the least disadvantaged quintile (Quintile 5), women diagnosed while resident in one of the remaining four quintiles had significantly worse survival (OR 1.23, 1.27, 1.30, 1.37 for Quintiles 4, 3, 2 and 1 respectively). Geographic remoteness was not related to lower survival after multivariable adjustment. There was no evidence that the impact of area-level disadvantage varied by geographic remoteness. At the individual level, Indigenous status, blue collar occupations and advanced disease were important predictors of poorer survival. A woman’s survival after a diagnosis of breast cancer depends on the socioeconomic characteristics of the area where she lives, independently of her individual-level characteristics. It is crucial that the underlying reasons for these inequalities be identified to appropriately target policies, resources and effective intervention strategies.

Relevance: 80.00%

Abstract:

Parabolic Trough Concentrators (PTC) are the most proven solar collectors for solar thermal power plants, and are suitable for concentrating photovoltaic (CPV) applications. PV cells are sensitive to the spatial uniformity of incident light and to the cell operating temperature, so the design of CPV-PTCs must be optimised both optically and thermally. Optical modelling can be performed using Monte Carlo Ray Tracing (MCRT), with conjugate heat transfer (CHT) modelling using computational fluid dynamics (CFD) to analyse the overall designs. This paper develops and evaluates a CHT simulation for a concentrating solar thermal PTC collector. It uses the ray tracing work by Cheng et al. (2010) and thermal performance data for the LS-2 parabolic trough used in the SEGS III-VII plants from Dudley et al. (1994). This is a preliminary step towards developing models to compare the heat transfer performance of faceted absorbers for concentrating photovoltaic (CPV) applications. Reasonable agreement between the simulation results and the experimental data confirms the reliability of the numerical model. The model highlights physical as well as computational issues for this particular kind of system modelling. The physical issues include the resulting non-uniformity of the boundary heat flux profile and the temperature profile around the tube, and uneven heating of the heat transfer fluid (HTF). The numerical issues include, most importantly, the design of the computational domain(s), and the solution techniques for the turbulence quantities and the near-wall physics. This simulation confirmed that the optical simulation and the computational CHT simulation of the collector can be accomplished independently.

Relevance: 80.00%

Abstract:

The statistical variance of the total cost of a project is usually estimated by Monte Carlo simulation, on the assumption that analytical approaches are too complicated. This article examines that assumption and shows that, contrary to expectation, the analytical solution is relatively straightforward. It is also shown that the coefficient of variation is unaffected by the size (floor area) of the project when standardized component costs are used. A case study is provided in which actual component costs are analysed to obtain the required total-cost variance. The results confirm previous work in showing that the second-moment (variance) approximation under the assumption of independence considerably underestimates the exact value. The analysis goes on to examine the effects of professional judgement; with the simulated data used, the approximation turns out to be reasonably accurate, since professional judgement absorbs most of the intercorrelations involved. An example is also given in which unit component quantities are valued at their average unit costs, showing once again that the approximation is close to the true value. Finally, the work is extended to show how to obtain, for each project, the exact variances of the total cost.
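The analytical point can be sketched with a toy calculation (component figures are illustrative, not from the case study): the exact variance of a total cost includes every covariance term, while the independence (second-moment) approximation keeps only the diagonal terms and therefore underestimates the exact value when intercorrelations are positive.

```python
def total_cost_variance(sds, corr):
    """Exact variance of a sum of correlated component costs:
    Var(sum) = sum_ij corr[i][j] * sd[i] * sd[j]."""
    n = len(sds)
    return sum(corr[i][j] * sds[i] * sds[j]
               for i in range(n) for j in range(n))

# Illustrative figures (not from the case study): three components
# with positive intercorrelations.
sds = [10.0, 20.0, 15.0]
corr = [[1.0, 0.5, 0.3],
        [0.5, 1.0, 0.4],
        [0.3, 0.4, 1.0]]

exact = total_cost_variance(sds, corr)
independent = sum(sd ** 2 for sd in sds)  # second-moment approx., independence
print(exact, independent)  # 1255.0 vs 725.0: independence underestimates
```

Here the independence approximation recovers barely over half of the exact variance, mirroring the underestimation reported above.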

Relevance: 80.00%

Abstract:

In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with data observed one-at-a-time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
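As a minimal sketch of the exact-matching idea (with a toy i.i.d. Poisson count standing in for the time-series models of the paper), the likelihood contribution of one observation is the probability that a model simulation reproduces it exactly; repeating the simulation until N+1 matches occur and using N/(K-1), where K is the total number of draws, gives an unbiased estimate of that probability:

```python
import math
import random

def match_probability_estimate(simulate, observed, N, rng):
    """Repeat the model simulation until N + 1 draws match the observed
    count exactly; with K total draws, N / (K - 1) is an unbiased
    (negative-binomial) estimate of the match probability."""
    matches, draws = 0, 0
    while matches < N + 1:
        draws += 1
        if simulate(rng) == observed:
            matches += 1
    return N / (draws - 1)

def simulate(rng):
    """Toy model: draw from Poisson(2) by inversion (stdlib only)."""
    u, k = rng.random(), 0
    p = math.exp(-2.0)
    c = p
    while u > c:
        k += 1
        p *= 2.0 / k
        c += p
    return k

rng = random.Random(1)
estimates = [match_probability_estimate(simulate, 2, 100, rng)
             for _ in range(200)]
print(sum(estimates) / len(estimates))  # close to P(X = 2) = 2 exp(-2) ≈ 0.27
```

In the particle filter of the paper this estimate is formed at each time point given the simulated latent history, rather than for a single i.i.d. draw as here.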

Relevance: 80.00%

Abstract:

The aim of this work is to develop software that is capable of back projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray tracing process, the energy differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated to an energy differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point spread kernels, thus generating energy differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprised of 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model we will use EPID images recorded without any attenuating material in the beam for a number of MLC defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.

Relevance: 80.00%

Abstract:

In the decision-making of multi-area ATC (Available Transfer Capability) in the electricity market environment, the existing resources of the transmission network should be optimally dispatched and employed in a coordinated way, on the premise that secure system operation is maintained and the associated risk is controllable. Non-sequential Monte Carlo simulation is used to determine the ATC probability density distribution of specified areas under the influence of several uncertainty factors; based on this, a coordinated probabilistic optimal decision-making model with maximal risk benefit as its objective is developed for multi-area ATC. The NSGA-II algorithm is applied to calculate the ATC of each area, taking into account the risk cost caused by the relevant uncertainty factors and the synchronous coordination among areas. The essential characteristics of the developed model and the employed algorithm are illustrated using the IEEE 118-bus test system. Simulation results show that the risk of multi-area ATC decision-making is influenced by the uncertainties in power system operation and by the relative importance of the different areas.
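The non-sequential sampling step can be illustrated with a two-line toy corridor (outage rates, ratings and the load range are invented for illustration): each Monte Carlo state independently samples component availabilities and an uncertain base transfer, and the spare capability recorded across states builds up an empirical ATC distribution.

```python
import random
import statistics

def atc_samples(n, rng):
    """Non-sequential Monte Carlo: each sampled system state draws
    independent line availabilities and an uncertain committed transfer
    (all figures invented), recording the spare transfer capability."""
    out = []
    for _ in range(n):
        cap = (100 if rng.random() > 0.02 else 0)   # line 1: 100 MW, 2% outage
        cap += (80 if rng.random() > 0.05 else 0)   # line 2: 80 MW, 5% outage
        load = rng.uniform(60, 120)                 # uncertain committed transfer
        out.append(max(0.0, cap - load))
    return out

rng = random.Random(3)
samples = atc_samples(20_000, rng)
# Mean and 5th percentile of the empirical ATC distribution:
print(statistics.mean(samples), statistics.quantiles(samples, n=20)[0])
```

The resulting distribution (mean and tail quantiles) is the kind of input the probabilistic decision-making model described above would consume.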

Relevance: 80.00%

Abstract:

A novel in-cylinder pressure method for determining ignition delay has been proposed and demonstrated. This method uses a new Bayesian statistical model to resolve the start of combustion, defined as the point at which the band-pass in-cylinder pressure deviates from background noise and the combustion resonance begins. The method is demonstrated to remain accurate in the presence of noise. The start of combustion can be resolved for each cycle without the need for ad hoc methods such as cycle averaging, so the method allows for analysis of consecutive cycles and inter-cycle variability studies. Ignition delay obtained by this method and by the net rate of heat release have been shown to agree well. However, the use of combustion resonance to determine the start of combustion is preferable to the net rate of heat release method because it does not rely on knowledge of heat losses and still functions accurately in the presence of noise. Results for a six-cylinder turbo-charged common-rail diesel engine run with neat diesel fuel at full, three-quarter and half load have been presented. Under these conditions the ignition delay was shown to increase as the load was decreased, with a significant increase in ignition delay at half load compared with three-quarter and full loads.
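A crude threshold detector conveys the underlying idea on synthetic data (the paper's Bayesian model is more principled than this fixed-threshold stand-in): estimate the background noise level from an early quiet stretch of the band-passed signal, then flag the first window whose RMS clearly exceeds it as the start of combustion.

```python
import math
import random

def onset_index(signal, noise_len=200, k=4.0, window=20):
    """Estimate the background noise RMS from an initial quiet stretch,
    then return the index of the first window whose RMS exceeds k times
    that level (a fixed-threshold stand-in for the Bayesian model)."""
    noise_rms = math.sqrt(sum(x * x for x in signal[:noise_len]) / noise_len)
    for i in range(noise_len, len(signal) - window):
        w = signal[i:i + window]
        if math.sqrt(sum(x * x for x in w) / window) > k * noise_rms:
            return i
    return None

# Synthetic band-passed trace: noise only, then a decaying resonance
# beginning at sample 500.
rng = random.Random(0)
sig = [rng.gauss(0.0, 0.01) for _ in range(1000)]
for n in range(500, 1000):
    sig[n] += 0.2 * math.exp(-(n - 500) / 150) * math.sin(0.3 * n)
print(onset_index(sig))  # near 500, where the resonance begins
```

Because the detector works on one pressure trace at a time, it illustrates how cycle averaging can be avoided, even though the Bayesian treatment in the paper handles noise far more gracefully than a hard threshold.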

Relevance: 80.00%

Abstract:

In this study, a treatment plan for a spinal lesion, with all beams transmitted through a titanium vertebral reconstruction implant, was used to investigate the potential effect of a high-density implant on the three-dimensional dose distribution for a radiotherapy treatment. The BEAMnrc/DOSXYZnrc and MCDTK Monte Carlo codes were used to simulate the treatment using both a simplified, rectilinear model and a detailed model incorporating the full complexity of the patient anatomy and treatment plan. The resulting Monte Carlo dose distributions showed that the commercial treatment planning system failed to accurately predict both the depletion of dose downstream of the implant and the increase in scattered dose adjacent to the implant. Overall, the dosimetric effect of the implant was underestimated by the commercial treatment planning system and overestimated by the simplified Monte Carlo model. The value of performing detailed Monte Carlo calculations, using the full patient and treatment geometry, was demonstrated.

Relevance: 80.00%

Abstract:

Recent literature has focused on realized volatility models to predict financial risk. This paper studies the benefit of explicitly modeling jumps in this class of models for value at risk (VaR) prediction. Several popular realized volatility models are compared in terms of their VaR forecasting performance through a Monte Carlo study and an analysis based on empirical data for eight Chinese stocks. The results suggest that careful modeling of jumps in realized volatility models can substantially improve VaR prediction, especially for emerging markets, where jumps play a stronger role than in developed markets.

Relevance: 80.00%

Abstract:

In this paper we analyse the effects of highway traffic flow parameters, such as vehicle arrival rate and density, on the performance of Amplify and Forward (AF) cooperative vehicular networks along a multi-lane highway under the free flow state. We derive analytical expressions for connectivity performance and verify them with Monte Carlo simulations. When AF cooperative relaying is employed together with Maximum Ratio Combining (MRC) at the receivers, the average route error rate shows a 10-20 fold improvement compared to direct communication. A 4-8 fold increase in the maximum number of traversable hops can also be observed at different vehicle densities when AF cooperative communication is used to strengthen communication routes. However, the theoretical upper bound on the maximum number of hops promises even higher performance gains.

Relevance: 80.00%

Abstract:

The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables, so pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely, which can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on each of the CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but its variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
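The averaging idea can be sketched as follows (a synthetic noisy estimator stands in for the likelihood estimate, and threads stand in for the separate CPUs of the paper): each worker returns an independent unbiased estimate, and their average is still unbiased with variance reduced by roughly the number of workers.

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def noisy_likelihood_estimate(seed):
    """Stand-in for an unbiased but noisy likelihood estimate:
    true value 1.0 plus zero-mean noise."""
    rng = random.Random(seed)
    return 1.0 + rng.gauss(0.0, 0.5)

def averaged_estimate(pool, seeds):
    """Average independent estimates computed in parallel; the average
    is still unbiased, with variance cut by the number of workers."""
    parts = list(pool.map(noisy_likelihood_estimate, seeds))
    return sum(parts) / len(parts)

single = [noisy_likelihood_estimate(s) for s in range(1000)]
with ThreadPoolExecutor(max_workers=4) as pool:
    averaged = [averaged_estimate(pool, range(1000 + 4 * s, 1004 + 4 * s))
                for s in range(1000)]
print(statistics.stdev(single), statistics.stdev(averaged))
```

Both sets of estimates centre on the true value, but the averaged estimates are markedly less variable, which is what keeps the pseudo-marginal chain mixing well.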

Relevance: 80.00%

Abstract:

Polymeric graphitic carbon nitride materials have attracted increasing attention in recent years owing to their potential applications in energy conversion, environmental protection, and so on. Here, from first-principles calculations, we report the electronic structure modification of graphitic carbon nitride (g-C3N4) in response to carbon doping. We show that each dopant atom can induce a local magnetic moment of 1.0 μB in non-magnetic g-C3N4. At a doping concentration of 1/14, the local magnetic moments of the most stable doping configuration, which has the dopant atom at the center of the heptazine unit, prefer to align in parallel, leading to long-range ferromagnetic (FM) ordering. When the joint N atom is replaced by a C atom, the system favors antiferromagnetic (AFM) ordering in the unstrained state, but can be tuned to FM ordering by applying biaxial tensile strain. More interestingly, the FM state of the strained system is half-metallic, with abundant states at the Fermi level in one spin channel and a band gap of 1.82 eV in the other spin channel. The Curie temperature (Tc) was also evaluated using mean-field theory and Monte Carlo simulations within the Ising model. Such tunable electron spin-polarization and ferromagnetism are quite promising for applications of graphitic carbon nitride in spintronics.
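The Monte Carlo step mentioned last can be sketched with a minimal Metropolis simulation of the 2D Ising model (a generic J = 1 square lattice, not the exchange parameters fitted for g-C3N4): the magnetization is large below the critical temperature and collapses above it, which is how Tc is located in practice.

```python
import math
import random

def ising_magnetization(L, T, sweeps, rng):
    """Metropolis Monte Carlo on an L x L Ising lattice (J = 1, periodic
    boundaries); returns |magnetization| per spin, averaged over the
    second half of the run."""
    s = [[1] * L for _ in range(L)]
    samples = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            dE = 2 * s[i][j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i][j] = -s[i][j]
        if sweep >= sweeps // 2:
            samples.append(abs(sum(map(sum, s))) / (L * L))
    return sum(samples) / len(samples)

rng = random.Random(0)
m_cold = ising_magnetization(8, 1.5, 400, rng)  # below Tc (~2.27 for J = 1)
m_hot = ising_magnetization(8, 4.0, 400, rng)   # above Tc
print(m_cold, m_hot)  # ordered vs. disordered phase
```

Scanning the temperature between these two regimes and watching where the magnetization drops gives the Monte Carlo estimate of Tc; the paper does the same with exchange couplings taken from its first-principles calculations.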

Relevance: 80.00%

Abstract:

In the electricity market environment, coordination of the reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process for risk-benefit maximization. Against this background, it is necessary that the ATC be optimally allocated and utilized within the relevant security constraints. First, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize the risk benefits. Then, the developed model is solved by the fast non-dominated sorting genetic algorithm (NSGA-II), which decreases the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and the employed algorithm.

Relevance: 80.00%

Abstract:

This thesis introduced Bayesian statistics as an analysis technique to isolate resonant frequency information in in-cylinder pressure signals taken from internal combustion engines. Applications of these techniques are relevant to engine design (performance and noise), energy conservation (fuel consumption) and alternative fuel evaluation. The use of Bayesian statistics, over traditional techniques, allowed for a more in-depth investigation into previously difficult to isolate engine parameters on a cycle-by-cycle basis. Specifically, these techniques facilitated the determination of the start of pre-mixed and diffusion combustion and for the in-cylinder temperature profile to be resolved on individual consecutive engine cycles. Dr Bodisco further showed the utility of the Bayesian analysis techniques by applying them to in-cylinder pressure signals taken from a compression ignition engine run with fumigated ethanol.

Relevance: 80.00%

Abstract:

In Australia, and elsewhere, the movement of trains on long-haul rail networks is usually planned in advance. Typically, a train plan is developed to confirm that the required train movements and track maintenance activities can occur. The plan specifies when track segments will be occupied by particular trains and maintenance activities. On the day of operation, a train controller monitors and controls the movement of trains and maintenance crews, and updates the train plan in response to unplanned disruptions. It can be difficult to predict how good a plan will be in practice. The main performance indicator for a train service should be reliability - the proportion of trains running the service that complete at or before the scheduled time. We define the robustness of a planned train service to be the expected reliability. The robustness of individual train services and for a train plan as a whole can be estimated by simulating the train plan many times with random, but realistic, perturbations to train departure times and segment durations, and then analysing the distributions of arrival times. This process can also be used to set arrival times that will achieve a desired level of robustness for each train service.
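The estimation procedure described above can be sketched as follows (all departure-delay and over-run figures are invented for illustration): simulate the service many times with random perturbations and take the proportion of on-time runs as the robustness estimate.

```python
import random

def simulate_service(segments, scheduled_arrival, n_runs, rng):
    """Robustness (expected reliability) of one train service: the
    fraction of simulated runs arriving at or before the scheduled
    time, under random perturbations of departure and segment times."""
    on_time = 0
    for _ in range(n_runs):
        t = max(0.0, rng.gauss(0.0, 2.0))            # departure delay (min)
        for d in segments:
            t += d * rng.uniform(1.0, 1.15)          # up to 15% over-run
        if t <= scheduled_arrival:
            on_time += 1
    return on_time / n_runs

rng = random.Random(42)
segments = [30, 45, 25]                              # planned run times (min)
r_tight = simulate_service(segments, 109, 5_000, rng)
r_slack = simulate_service(segments, 118, 5_000, rng)
print(r_tight, r_slack)  # more schedule slack -> higher robustness
```

Running this in reverse, searching for the smallest scheduled arrival time whose simulated robustness meets a target, is how arrival times can be set to achieve a desired level of robustness for each service.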