975 results for markov chains monte carlo methods


Relevance:

100.00%

Publisher:

Abstract:

Proton beams in medical applications deal with relatively thick targets such as the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through Al absorbers of various thicknesses. The spectra were compared with experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution of the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For relatively thin targets all the methods give practically identical results, but this is not the case for thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called Bethe-Bloch region. Therefore the disagreements observed among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions. © 2009 American Institute of Physics.
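
The stopping power at issue in the Bethe-Bloch region can be illustrated with a minimal sketch of the uncorrected Bethe formula for protons in aluminum (no shell or density-effect corrections, heavy-projectile approximation for the maximum energy transfer; the mean excitation energy I = 166 eV for Al is an assumed round value, so this is an order-of-magnitude illustration, not any of the GEANT4 models):

```python
import math

# Uncorrected Bethe stopping-power sketch for protons in aluminum.
K = 0.307075       # MeV cm^2 / mol, coefficient of the Bethe formula
ME_C2 = 0.510999   # electron rest energy, MeV
MP_C2 = 938.272    # proton rest energy, MeV

def bethe_dedx(t_mev, z_med=13, a_med=26.98, i_ev=166.0, rho=2.70):
    """Return -dE/dx in MeV/cm for a proton of kinetic energy t_mev in Al."""
    gamma = 1.0 + t_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma ** 2
    i_mev = i_ev * 1e-6
    # heavy-projectile approximation for the maximum energy transfer W_max
    w_max = 2.0 * ME_C2 * beta2 * gamma ** 2
    log_arg = 2.0 * ME_C2 * beta2 * gamma ** 2 * w_max / i_mev ** 2
    return rho * K * (z_med / a_med) / beta2 * (0.5 * math.log(log_arg) - beta2)

dedx = bethe_dedx(19.68)   # the beam energy quoted in the abstract
```

Because -dE/dx grows steeply as the proton slows down, a percent-level difference between stopping-power models compounds along a thick absorber, which is why the final spectra can disagree noticeably.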

Relevance:

100.00%

Publisher:

Abstract:

Proton computed tomography (pCT) deals with relatively thick targets such as the human head or trunk. Thus, the fidelity of pCT as a tool for proton therapy planning depends on the accuracy of the physical formulas used for proton interaction with thick absorbers. Although the overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes such as TRIM/SRIM, MCNPX and GEANT4 do not agree with each other. Attempts to validate the codes against experimental data for thick absorbers face some difficulties: only a few data sets are available, and the existing data sets were acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of a reduced calibration curve, i.e. the range-energy dependence normalized, on the range scale, by the full projected CSDA range for the given initial proton energy in the given material, taken from the NIST PSTAR database, and, on the final proton energy scale, by the given initial proton energy. This approach is almost energy and material independent. The results of our analysis are important for pCT development because the contradictions observed at arbitrarily low initial proton energies can now be easily scaled to typical pCT energies. © 2010 American Institute of Physics.
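
The reduced-calibration-curve normalization described above amounts to two divisions; a tiny sketch makes the scaling explicit (the CSDA range value here is an assumed placeholder, not a PSTAR lookup):

```python
# Reduced calibration curve: normalize depth by the full projected CSDA
# range R_CSDA(E0) for the initial energy E0 (tabulated in NIST PSTAR),
# and the final energy by E0 itself.
def reduce_point(depth_cm, e_final_mev, e0_mev, csda_range_cm):
    return depth_cm / csda_range_cm, e_final_mev / e0_mev

# hypothetical numbers: 19.68 MeV protons in Al, R_CSDA ~ 0.18 cm (assumed)
x, y = reduce_point(depth_cm=0.09, e_final_mev=14.0,
                    e0_mev=19.68, csda_range_cm=0.18)
```

Plotted in these reduced coordinates, data taken at different initial energies and in different materials collapse onto nearly the same curve, which is what makes the comparison almost energy and material independent.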

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a new approach to the optimal placement of phasor measurement units for fault location on electric power distribution systems, using the Greedy Randomized Adaptive Search Procedure (GRASP) metaheuristic and Monte Carlo simulation. The proposed placement model is a general methodology that can be used to place devices that record voltage sag magnitudes for any fault location algorithm that uses voltage information measured at a limited set of nodes along the feeder. An overhead, three-phase, three-wire, 13.8 kV, 134-node, real-life feeder model is used to evaluate the algorithm. Tests show that the results of the fault location methodology were improved by the new optimized meter placement obtained with this approach. © 2011 IEEE.
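
The GRASP construction step can be sketched generically: repeat a greedy build where each pick is drawn at random from a restricted candidate list (RCL) of near-best choices, and keep the best result. The coverage objective and toy instance below are stand-ins for the paper's voltage-sag-based scoring, which this sketch does not reproduce:

```python
import random

# GRASP-style randomized greedy placement of a fixed budget of meters.
# "covers[n]" is the set of fault scenarios meter n can discriminate
# (a hypothetical stand-in objective).
def grasp_placement(n_nodes, covers, budget, alpha=0.3, iters=50, seed=7):
    rng = random.Random(seed)
    best, best_cov = None, -1
    for _ in range(iters):
        placed, covered = set(), set()
        while len(placed) < budget:
            gains = {n: len(covers[n] - covered)
                     for n in range(n_nodes) if n not in placed}
            gmax, gmin = max(gains.values()), min(gains.values())
            # restricted candidate list: picks within alpha of the best gain
            rcl = [n for n, g in gains.items()
                   if g >= gmax - alpha * (gmax - gmin)]
            pick = rng.choice(rcl)
            placed.add(pick)
            covered |= covers[pick]
        if len(covered) > best_cov:
            best, best_cov = placed, len(covered)
    return best, best_cov

# toy instance: meter i "covers" fault scenarios i..i+2 (hypothetical)
covers = {i: set(range(i, min(i + 3, 10))) for i in range(10)}
best, cov = grasp_placement(10, covers, budget=4)
```

In the paper's setting, the Monte Carlo stage would then simulate random faults against each candidate placement; here only the construction phase is shown.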

Relevance:

100.00%

Publisher:

Abstract:

In this paper a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem iteratively through a process of solving smaller subproblems, each associated with one area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; it is only necessary to exchange border information related to the tie-lines between areas. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it only requires solving 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
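
The 2n evaluations of the Two-Point Estimate Method can be sketched for the symmetric special case (zero skewness, as for Gaussian loads): each uncertain input is evaluated at its mean ± √n standard deviations, with weight 1/(2n), while all other inputs sit at their means. Hong's general scheme shifts these locations by the skewness; a toy linear function stands in here for an actual power-flow solver:

```python
import math

# Two-Point Estimate Method (2n scheme), zero-skewness case: 2n model
# evaluations yield the first two moments of the output.
def tpm(model, mu, sigma):
    n = len(mu)
    m1 = m2 = 0.0
    for k in range(n):
        for sign in (+1.0, -1.0):
            x = list(mu)
            x[k] = mu[k] + sign * math.sqrt(n) * sigma[k]
            y = model(x)
            w = 1.0 / (2 * n)       # equal weights in the symmetric case
            m1 += w * y
            m2 += w * y * y
    return m1, math.sqrt(max(m2 - m1 * m1, 0.0))

# toy "power flow": a linear function of three uncertain loads (assumed)
model = lambda x: 2.0 * x[0] - 0.5 * x[1] + x[2]
mean, std = tpm(model, mu=[1.0, 2.0, 3.0], sigma=[0.1, 0.2, 0.1])
```

For a linear model the scheme reproduces the exact mean and standard deviation, which is why 2n deterministic runs can replace thousands of Monte Carlo samples when the mapping is close to linear around the operating point.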

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an approach for probabilistic analysis of unbalanced three-phase weakly meshed distribution systems considering uncertainty in load demand. To achieve high computational efficiency, this approach uses both an efficient method for probabilistic analysis and a radial power flow. The probabilistic method used is the well-known Two-Point Estimate Method, while the compensation-based radial power flow is used to exploit the topological characteristics of distribution systems. The proposed generation model allows modeling either a PQ or a PV bus at the connection point between the network and the distributed generator. In addition, it allows control of the generator operating conditions, such as the field current and the power delivered at the terminals. Results of tests on the IEEE 37-bus system are given to illustrate the operation and effectiveness of the proposed approach. Monte Carlo simulation is used to validate the results. © 2011 IEEE.

Relevance:

100.00%

Publisher:

Abstract:

In this paper the point estimation method is applied to solve the probabilistic power flow problem for unbalanced three-phase distribution systems. Through this method, the probability distribution functions of the voltages (magnitude and angle) as well as of the active and reactive power flows in the branches of the distribution system are determined. Two different point estimation schemes are presented (the 2m and 2m+1 point schemes). To test the proposed methodology, the IEEE 34- and 123-bus test systems are used. The results obtained with both schemes are compared with those obtained by Monte Carlo Simulation (MCS).
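
The 2m+1 scheme differs from the 2m scheme in that it also uses the kurtosis of each input and adds one run with all inputs at their means. The sketch below is the Gaussian-input special case (skewness 0, kurtosis 3), where the two points per variable sit at the mean ± √3 standard deviations with weight 1/6; a toy linear function again stands in for the three-phase power flow:

```python
import math

# Hong's 2m+1 point-estimate scheme, Gaussian-input case: 2m shifted runs
# plus one run at the mean vector, with weight w0 = 1 - m/3.
def pem_2m1(model, mu, sigma):
    m = len(mu)
    w0 = 1.0 - m / 3.0
    y0 = model(list(mu))
    m1, m2 = w0 * y0, w0 * y0 * y0
    for k in range(m):
        for sign in (+1.0, -1.0):
            x = list(mu)
            x[k] = mu[k] + sign * math.sqrt(3.0) * sigma[k]
            y = model(x)
            m1 += y / 6.0
            m2 += y * y / 6.0
    return m1, math.sqrt(max(m2 - m1 * m1, 0.0))

# toy stand-in model with three uncertain inputs (hypothetical)
model = lambda x: x[0] + 3.0 * x[1] - 2.0 * x[2]
mean, std = pem_2m1(model, mu=[1.0, 1.0, 1.0], sigma=[0.1, 0.1, 0.1])
```

Note that for m > 3 Gaussian inputs the central weight w0 becomes negative; the scheme remains a valid moment-matching rule, but the weights can no longer be read as probabilities.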

Relevance:

100.00%

Publisher:

Abstract:

The use of QoS parameters to evaluate the quality of service in a mesh network is essential, especially when providing multimedia services. This paper proposes an algorithm for planning wireless mesh networks so as to satisfy a set of QoS parameters, given a set of test points (TPs) and potential access points (APs). Examples of QoS parameters include the probability of packet loss and the mean delay in responding to a request. The proposed algorithm uses a mathematical programming model to determine an adequate topology for the network and Monte Carlo simulation to verify whether the QoS parameters are satisfied. The results obtained show that the proposed algorithm is able to find satisfactory solutions.
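
The Monte Carlo verification step can be sketched as follows: sample random offered load for a candidate topology, evaluate a delay model for each sample, and accept the design if the estimated probability of meeting the bound reaches a target. The M/M/1 mean-delay formula and the uniform load range below are assumed stand-ins for whatever traffic and network model the planner actually uses:

```python
import random

# Monte Carlo check of a delay-bound QoS constraint for one candidate link.
def qos_satisfied(capacity_pkts_s, delay_bound_s, load_sampler,
                  target=0.95, trials=10_000, seed=1):
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        lam = load_sampler(rng)                    # sampled arrival rate
        if lam < capacity_pkts_s:                  # queue is stable
            delay = 1.0 / (capacity_pkts_s - lam)  # M/M/1 mean sojourn time
            ok += delay <= delay_bound_s
    p = ok / trials
    return p >= target, p

feasible, p = qos_satisfied(
    capacity_pkts_s=1000.0, delay_bound_s=0.01,
    load_sampler=lambda rng: rng.uniform(500.0, 950.0))
```

In the planning loop, an infeasible verdict would send the mathematical programming model back to propose a different AP topology.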

Relevance:

100.00%

Publisher:

Abstract:

The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 pb⁻¹ of data collected in pp collisions at √s = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV/c is above 95% over the whole region of pseudorapidity covered by the CMS muon system, |η| < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV/c is higher than 90% over the full η range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV/c and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV/c. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. © 2012 IOP Publishing Ltd and Sissa Medialab srl.

Relevance:

100.00%

Publisher:

Abstract:

Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. We then add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages, the events depend on each other in a multiplicative, geometric-like process, while at long times they are independent. We conclude that the main ingredients of a good minimalist model of tumor growth at the mesoscopic level are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
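
The stochastic local-growth ingredient can be sketched as a minimal Eden-like aggregate on a square lattice: at each step a random empty neighbour of the colony is occupied, and an exponential waiting time with rate proportional to the perimeter size mimics a dynamical Monte Carlo (Gillespie-style) clock. This is an assumed toy version, not the authors' full model:

```python
import random

# Eden-like aggregate with Gillespie-style waiting times.
def eden_growth(n_cells, seed=0):
    rng = random.Random(seed)
    colony = {(0, 0)}
    perimeter = {(1, 0), (-1, 0), (0, 1), (0, -1)}
    t = 0.0
    times = []                                   # colony-growth event times
    while len(colony) < n_cells:
        t += rng.expovariate(len(perimeter))     # rate = perimeter size
        site = rng.choice(sorted(perimeter))     # uniform perimeter pick
        perimeter.discard(site)
        colony.add(site)
        x, y = site
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb not in colony:
                perimeter.add(nb)
        times.append(t)
    return colony, times

colony, times = eden_growth(200)
```

Histogramming the inter-event intervals `times[i+1] - times[i]` over many runs is the kind of waiting-time analysis the abstract refers to.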

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Animal Science (Zootecnia) - FMVZ

Relevance:

100.00%

Publisher:

Abstract:

A complete analysis of the sensitivity to new Hbb̄ couplings from the process e+e− → bb̄νν̄ at the next generation of linear colliders was performed. Such new couplings are predicted by many extensions of the Standard Model. The results are comparable to a previous study in which a global fit analysis for L = 500 fb⁻¹ and √s = 500 GeV resulted in a relative accuracy of 2.2% on the gHbb Yukawa coupling.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we propose a bivariate distribution for bivariate survival times based on the Farlie-Gumbel-Morgenstern copula to model the dependence in bivariate survival data. The proposed model allows for the presence of censored data and covariates. For inference, a Bayesian approach via Markov Chain Monte Carlo (MCMC) is considered. Further, some discussion of model selection criteria is given. To examine outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. The newly developed procedures are illustrated with a simulation study and a real dataset.
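
The Farlie-Gumbel-Morgenstern copula C(u, v) = uv[1 + θ(1−u)(1−v)], |θ| ≤ 1, admits a simple conditional-inversion sampler: draw u uniformly, then invert the conditional CDF C(v|u) = (1+a)v − av², a = θ(1−2u), which is a quadratic in v. The sketch below shows only the copula stage; to obtain dependent survival times one would then apply inverse marginal CDFs to (u, v):

```python
import math
import random

# Sample one (u, v) pair from the FGM copula by conditional inversion.
def fgm_sample(theta, rng):
    u, p = rng.random(), rng.random()
    a = theta * (1.0 - 2.0 * u)
    if abs(a) < 1e-12:          # conditional is uniform when a ~ 0
        return u, p
    # solve a*v^2 - (1+a)*v + p = 0 for the root lying in [0, 1]
    v = ((1.0 + a) - math.sqrt((1.0 + a) ** 2 - 4.0 * a * p)) / (2.0 * a)
    return u, v

rng = random.Random(42)
pairs = [fgm_sample(0.8, rng) for _ in range(20000)]
```

FGM dependence is mild by construction (Spearman's rho is θ/3), which is why the copula is popular for survival data with weak to moderate association.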

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

In this paper we propose a hybrid hazard regression model with threshold stress that includes the proportional hazards and the accelerated failure time models as particular cases. To express the behavior of lifetimes, the generalized gamma distribution is assumed, and an inverse power law model with a threshold stress is considered. For parameter estimation we develop a sampling-based posterior inference procedure based on Markov chain Monte Carlo techniques. We assume proper but vague priors for the parameters of interest. A simulation study investigates the frequentist properties of the proposed estimators obtained under the assumption of vague priors. Further, some discussion of model selection criteria is given. The methodology is illustrated on simulated and real lifetime data sets.
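
Sampling-based posterior inference of this kind can be sketched with a generic random-walk Metropolis loop. The paper's sampler targets the generalized-gamma/threshold-stress posterior; here a standard normal log-density is an assumed stand-in target so the sketch stays self-contained:

```python
import math
import random

# Random-walk Metropolis: propose x' = x + N(0, scale), accept with
# probability min(1, exp(logpost(x') - logpost(x))).
def metropolis(logpost, x0, steps=20000, scale=0.8, seed=3):
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

chain = metropolis(lambda x: -0.5 * x * x, x0=0.0)  # stand-in target N(0, 1)
burned = chain[5000:]                                # discard burn-in
mean = sum(burned) / len(burned)
```

For the actual model one would run such a kernel (or a component-wise variant) over each parameter of the hazard model, with the log-posterior built from the generalized gamma likelihood and the vague priors.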

Relevance:

100.00%

Publisher:

Abstract:

Hepatitis C virus (HCV) is a public health problem throughout the world, and 3% of the world's population is infected with this virus. It is estimated that 3-4 million individuals are infected every year. Around 1.5% of the Brazilian population is estimated to be anti-HCV positive, and the Northeast region shows the highest prevalence in Brazil. The aim of this study was to characterize the HCV genotypes circulating in Pernambuco State (PE), located in the Northeast region of Brazil. This study included 85 anti-HCV-positive patients followed up between 2004 and 2011. For genotyping, a 380 bp fragment of HCV RNA in the NS5B region was amplified by nested PCR. Phylogenetic analysis was conducted by Bayesian Markov chain Monte Carlo (MCMC) simulation using BEAST v.1.5.3. Of the 85 samples, 63 (74.1%) were positive for the NS5B fragment and successfully sequenced. Subtype 1b was the most prevalent in this population (42 samples; 66.7%), followed by 3a (16; 25.4%), 1a (4; 6.3%) and 2b (1; 1.6%). Twelve (63.1%) and seven (36.9%) of the patients with both HCV and schistosomiasis were infected with subtypes 1b and 3a, respectively. Brazil is a large country with many different population backgrounds, so a large variation in the frequencies of HCV genotypes across its territory is to be expected. This study reports the HCV genotypes from Pernambuco State, where subtype 1b was found to be the most prevalent. Phylogenetic analysis suggests the presence of different HCV strains circulating within this population. © 2012 Elsevier B.V. All rights reserved.