61 results for Markov Model with Monte-Carlo microsimulations
Abstract:
Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes, since they allow the analysis of root causes even when the number of input variables is high. In this work, we investigate the performance of three classification techniques within a Monte Carlo strategy with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity, namely a LASSO-like implementation with L1-norm regularization and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, situations where the number of variables exceeds the number of available data points, and unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to obtain more robust insights into the problem under analysis when faced with challenging modelling conditions.
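As a minimal illustration of the kind of Monte Carlo classification strategy described above, the sketch below repeatedly resamples a placeholder dataset and scores a naive Bayes classifier and an L1-penalized logistic regression with scikit-learn, while counting how often each input variable survives the L1 penalty. The relevance vector machine is omitted because it is not part of scikit-learn, and all data and settings are illustrative assumptions rather than the authors' setup.

```python
# Minimal sketch (not the authors' code): naive Bayes and L1-penalized logistic
# regression evaluated inside a Monte Carlo resampling loop. X, y are placeholder
# arrays; the relevance vector machine is omitted.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                 # placeholder: many input variables
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # placeholder binary outcome

models = {
    "naive_bayes": GaussianNB(),
    "logreg_l1": LogisticRegression(penalty="l1", solver="liblinear", C=0.5),
}

n_runs = 100                                   # Monte Carlo repetitions over random splits
scores = {name: [] for name in models}
selection_counts = np.zeros(X.shape[1])        # how often each variable survives the L1 penalty

for run in range(n_runs):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=run)
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        scores[name].append(balanced_accuracy_score(y_te, model.predict(X_te)))
    selection_counts += (models["logreg_l1"].coef_.ravel() != 0)

for name in models:
    print(name, np.mean(scores[name]))
print("selection frequency of first 5 inputs:", selection_counts[:5] / n_runs)
```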
Abstract:
Gold nanoparticles (GNPs) have shown potential to be used as a radiosensitizer for radiation therapy. Despite extensive research activity to study GNP radiosensitization using photon beams, only a few studies have been carried out using proton beams. In this work, Monte Carlo simulations were used to assess the dose enhancement of GNPs for proton therapy. The enhancement effect was compared between a clinical proton spectrum, a clinical 6 MV photon spectrum, and a kilovoltage photon source similar to those used in many radiobiology lab settings. We showed that the mechanism by which GNPs can lead to dose enhancements in radiation therapy differs when comparing photon and proton radiation. The GNP dose enhancement using protons can be up to a factor of 14 and is independent of proton energy, while the dose enhancement is highly dependent on the photon energy used. For the same amount of energy absorbed in the GNP, interactions with protons, kVp photons and MV photons produce similar doses within several nanometers of the GNP surface, with differences below 15% for the first 10 nm. However, secondary electrons produced by kilovoltage photons have the longest range in water compared with protons and MV photons; for example, they cause a dose enhancement 20 times higher than that caused by protons 10 μm away from the GNP surface. We conclude that GNPs have the potential to enhance radiation therapy depending on the type of radiation source. Proton therapy can be enhanced significantly only if the GNPs are in close proximity to the biological target.
Abstract:
Density functional calculations have been performed for ring isomers of sulfur with up to 18 atoms, and for chains with up to ten atoms. There are many isomers of both types, and the calculations predict the existence of new forms. Larger rings and chains are very flexible, with numerous local energy minima. Apart from a small but consistent overestimate of the bond lengths, the results reproduce experimental structures where known. Calculations are also performed on the energy surfaces of S8 rings, on the interaction between a pair of such rings, and on the reaction between one S8 ring and the triplet diradical S8 chain. The results for potential energies, vibrational frequencies, and reaction mechanisms in sulfur rings and chains provide essential ingredients for Monte Carlo simulations of the liquid–liquid phase transition. The results of these simulations will be presented in Part II.
Abstract:
Density functional calculations of the structure, potential energy surface and reactivity of organic systems closely related to bisphenol-A-polycarbonate (BPA-PC) provide the basis for a model describing the ring-opening polymerization of its cyclic oligomers by nucleophilic molecules. Monte Carlo simulations using this model show a strong tendency to polymerize that increases with density and temperature, and is greater in 3D than in 2D. Entropy in the distribution of inter-particle bonds is the driving force for chain formation.
Abstract:
Fixed-node diffusion Monte Carlo computations are used to determine the ground state energy and electron density for jellium spheres with up to N = 106 electrons and background densities corresponding to the electron gas parameter 1 ≤ r_s ≤ 5.62. We analyze the density and size dependence of the surface energy, and we extrapolate our data to the thermodynamic limit. The results agree well with the predictions of density functional computations using the local density approximation. In the case of N = 20, we extend our computation to higher densities and identify a transition between atomic- and jellium-like nodal structures occurring at the background density corresponding to r_s = 0.13. In this case the local density approximation is unable to reproduce the changes in the correlation energy due to the discontinuous transition in the ground state nodal structure. We discuss the relevance of our results for nonlocal approximations to density functional theory.
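For readers less familiar with the electron gas parameter quoted above, the short sketch below evaluates the standard relation between r_s (in Bohr radii) and the jellium background density, n = 3 / (4π (r_s a_0)^3); the sampled r_s values merely illustrate the quoted range and are not taken from the paper.

```python
# Sketch of the standard relation between the Wigner-Seitz parameter r_s
# (in units of the Bohr radius) and the jellium background density.
import math

A0_CM = 0.529177e-8  # Bohr radius in cm

def background_density(r_s):
    """Electron density (cm^-3) of jellium with electron gas parameter r_s."""
    return 3.0 / (4.0 * math.pi * (r_s * A0_CM) ** 3)

for r_s in (1.0, 2.0, 4.0, 5.62):  # illustrative values spanning the quoted range
    print(f"r_s = {r_s:4.2f}  ->  n = {background_density(r_s):.3e} cm^-3")
```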
Abstract:
First steps are taken to model the electrochemical deposition of metals in nanometer-sized cavities. In the present work, the electrochemical deposition of Cu atoms in nanometer-sized holes dug on Au(111) is investigated through Monte Carlo simulations using the embedded atom method to represent particle interactions. By sweeping the chemical potential of Cu, a cluster is allowed to grow within the hole, rising four atomic layers above the surface. Its lateral extension remains confined to the area defined by the borders of the original defect.
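To make the chemical-potential sweep concrete, here is a toy grand-canonical Metropolis sketch in which adatoms are inserted into or removed from a small periodic lattice as the chemical potential mu is raised; the nearest-neighbour pair energy is a crude placeholder, not the embedded atom method interactions used in the study.

```python
# Toy grand-canonical Monte Carlo sketch: adatoms are inserted into / removed
# from lattice sites with a Metropolis rule controlled by the chemical potential
# mu. The pair energy is a placeholder for the embedded atom interactions.
import numpy as np

rng = np.random.default_rng(1)
L = 20                       # toy surface patch, periodic boundaries
occ = np.zeros((L, L), int)  # 1 = Cu adatom present
EPS = -0.2                   # placeholder nearest-neighbour bond energy (eV)
KT = 0.025                   # thermal energy (eV)

def site_energy(i, j):
    """Bond energy gained by an adatom at (i, j) from its four periodic neighbours."""
    nn = occ[(i + 1) % L, j] + occ[i - 1, j] + occ[i, (j + 1) % L] + occ[i, j - 1]
    return EPS * nn

def sweep(mu):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        if occ[i, j] == 0:                      # attempt insertion
            dE = site_energy(i, j) - mu
        else:                                   # attempt removal
            dE = -site_energy(i, j) + mu
        if rng.random() < np.exp(-dE / KT):
            occ[i, j] ^= 1

for mu in np.linspace(-0.6, 0.0, 7):            # sweep the Cu chemical potential
    for _ in range(200):
        sweep(mu)
    print(f"mu = {mu:+.2f} eV   coverage = {occ.mean():.2f}")
```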
Abstract:
Structural and kinetic aspects of 2-D irreversible metal deposition under potentiostatic conditions are analyzed by means of dynamic Monte Carlo simulations employing embedded atom potentials for a model system. Three limiting models, all considering adatom diffusion, were employed to describe adatom deposition. The first model (A) considers adatom deposition on any free substrate site on the surface at the same rate. The second model (B) considers adatom deposition only on substrate sites which exhibit no neighboring sites occupied by adatoms. The third model (C) allows deposition at higher rates on sites presenting neighboring sites occupied by adatoms. Under the proper conditions, the coverage (theta) versus time (t) relationship for the three cases can be heuristically fitted to the functional form theta = 1 - exp(-beta*t^alpha), where alpha and beta are fitting parameters. We suggest that the value of the exponent alpha can be employed to distinguish experimentally between the three cases: while model A trivially delivers alpha = 1, models B and C are characterized by alpha < 1 and alpha > 1, respectively.
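The suggestion that the exponent alpha discriminates between the three deposition models can be illustrated with a short fit: the sketch below generates synthetic coverage-versus-time data and recovers alpha and beta with scipy's curve_fit; the data and starting values are placeholders.

```python
# Sketch: fitting coverage-vs-time data to theta(t) = 1 - exp(-beta * t**alpha)
# to recover the exponent alpha proposed as a discriminator between deposition
# models A (alpha = 1), B (alpha < 1) and C (alpha > 1). The "data" are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def coverage(t, alpha, beta):
    return 1.0 - np.exp(-beta * t**alpha)

rng = np.random.default_rng(2)
t = np.linspace(0.1, 20.0, 60)
theta_obs = coverage(t, alpha=1.4, beta=0.05) + rng.normal(0, 0.01, t.size)

(alpha_fit, beta_fit), _ = curve_fit(coverage, t, theta_obs, p0=(1.0, 0.1))
print(f"alpha = {alpha_fit:.2f}, beta = {beta_fit:.3f}")  # alpha > 1 would suggest model C
```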
Abstract:
Nano- and meso-scale simulations of chemical ordering kinetics in nano-layered L1(0)-AB binary intermetallics were performed. At the nano (atomistic) scale, a Monte Carlo (MC) technique with a vacancy mechanism of atomic migration, implemented with diverse models for the system energetics, was used. The meso-scale microstructure evolution was, in turn, simulated by means of an MC procedure applied to a system built of meso-scale voxels ordered in particular L1(0) variants. The voxels were free to change L1(0) variant and interacted with antiphase-boundary energies evaluated within the nano-scale simulations. The study addressed FePt thin layers, considered as a material for ultra-high-density magnetic storage media, and revealed metastability of the L1(0) c-variant superstructure with monoatomic planes parallel to the (001)-oriented layer surface and off-plane easy magnetization. The layers, originally perfectly ordered in the c-variant, showed discontinuous precipitation of a- and b-L1(0)-variant domains running in parallel with homogeneous disordering (i.e. the generation of antisite defects). The domains nucleated heterogeneously on the free monoatomic Fe surface of the layer, grew inwards into its volume and relaxed towards an equilibrium microstructure of the system.
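As a schematic of the nano-scale ingredient only, the sketch below runs Metropolis Monte Carlo on a toy binary lattice in which atomic migration proceeds exclusively through jumps of a single vacancy; the checkerboard start and the ordering energy are placeholders, not the paper's FePt energetics or the L1(0) geometry.

```python
# Toy Metropolis Monte Carlo with a vacancy mechanism: atoms move only by
# exchanging places with a single vacancy (marked by NaN). The ordering energy
# simply favours unlike nearest neighbours; all values are placeholders.
import numpy as np

rng = np.random.default_rng(3)
L = 16
lattice = (np.indices((L, L)).sum(axis=0) % 2).astype(float)  # perfectly ordered checkerboard
vac = (L // 2, L // 2)
lattice[vac] = np.nan                                         # NaN marks the single vacancy
V_ORD = 0.1   # placeholder energy favouring unlike nearest neighbours (eV)
KT = 0.05     # thermal energy (eV)

def site_energy(i, j):
    """Bond energy of the atom at (i, j) with its occupied nearest neighbours."""
    s = lattice[i, j]
    e = 0.0
    for ni, nj in ((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L):
        t = lattice[ni, nj]
        if not np.isnan(t):
            e += -V_ORD if t != s else V_ORD
    return e

for _ in range(50_000):                                       # vacancy-mechanism kinetics
    i, j = vac
    neigh = [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]
    ni, nj = neigh[rng.integers(4)]
    e_old = site_energy(ni, nj)                               # atom's bonds before the jump
    lattice[i, j], lattice[ni, nj] = lattice[ni, nj], lattice[i, j]  # trial vacancy jump
    e_new = site_energy(i, j)                                 # atom's bonds after the jump
    if rng.random() < np.exp(-(e_new - e_old) / KT):
        vac = (ni, nj)                                        # accept: the vacancy has moved
    else:
        lattice[i, j], lattice[ni, nj] = lattice[ni, nj], lattice[i, j]  # reject: undo

shifted = np.roll(lattice, 1, axis=0)
mask = ~np.isnan(lattice) & ~np.isnan(shifted)
print("fraction of unlike vertical bonds:", np.mean(lattice[mask] != shifted[mask]))
```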
Abstract:
We propose a new approach for the inversion of anisotropic P-wave data based on Monte Carlo methods combined with a multigrid approach. Simulated annealing facilitates objective minimization of the functional characterizing the misfit between observed and predicted traveltimes, as controlled by the Thomsen anisotropy parameters (epsilon, delta). Cycling between finer and coarser grids enhances the computational efficiency of the inversion process, thus accelerating the convergence of the solution while acting as a regularization technique for the inverse problem. Multigrid perturbation samples the probability density function without requiring the user to adjust tuning parameters. This increases the probability that the preferred global, rather than a poor local, minimum is attained. Undertaking multigrid refinement and Monte Carlo search in parallel produces more robust convergence than does the initially more intuitive approach of completing them sequentially. We demonstrate the usefulness of the new multigrid Monte Carlo (MGMC) scheme by applying it to (a) synthetic, noise-contaminated data reflecting an isotropic subsurface of constant slowness, horizontally layered geologic media and discrete subsurface anomalies; and (b) a crosshole seismic data set acquired by previous authors at the Reskajeage test site in Cornwall, UK. Inverted distributions of slowness (s) and the Thomsen anisotropy parameters (epsilon, delta) compare favourably with those obtained previously using a popular matrix-based method. Reconstruction of the Thomsen epsilon parameter is particularly robust compared with that of slowness and the Thomsen delta parameter, even in the face of complex subsurface anomalies. The Thomsen epsilon and delta parameters have enhanced sensitivities to bulk-fabric and fracture-based anisotropies in the TI medium at Reskajeage. Because reconstruction of slowness (s) is intimately linked to that of epsilon and delta in the MGMC scheme, inverted images of phase velocity reflect the integrated effects of these two modes of anisotropy. The new MGMC technique thus promises to facilitate rapid inversion of crosshole P-wave data for seismic slownesses and the Thomsen anisotropy parameters, with minimal user input in the inversion process.
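The Monte Carlo core of such a scheme can be sketched with a generic simulated-annealing loop over a traveltime-misfit objective; the straight-ray forward operator, noise level, cooling schedule and grid below are placeholder assumptions, and the multigrid cycling and anisotropic parameterization are omitted.

```python
# Minimal simulated-annealing sketch for a traveltime-misfit objective, as a
# stand-in for the Monte Carlo core of an MGMC-style inversion. G and t_obs
# are placeholders for a path-length forward operator and observed traveltimes.
import numpy as np

rng = np.random.default_rng(4)
n_cells, n_rays = 25, 60
G = rng.uniform(0.0, 1.0, size=(n_rays, n_cells))      # placeholder path-length matrix
s_true = 0.5 + 0.1 * rng.random(n_cells)                # "true" slowness, unknown in practice
t_obs = G @ s_true + rng.normal(0, 0.005, n_rays)       # noisy synthetic traveltimes

def misfit(s):
    return np.sum((G @ s - t_obs) ** 2)

s = np.full(n_cells, 0.55)                              # starting model
f = misfit(s)
T = 1.0
for _ in range(20_000):
    trial = s.copy()
    trial[rng.integers(n_cells)] += rng.normal(0, 0.02) # perturb one cell
    f_trial = misfit(trial)
    if f_trial < f or rng.random() < np.exp(-(f_trial - f) / T):
        s, f = trial, f_trial                           # accept better (or occasionally worse) models
    T *= 0.9995                                         # cooling schedule

print(f"final misfit {f:.4f}; rms slowness error {np.sqrt(np.mean((s - s_true)**2)):.3f}")
```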
Abstract:
In astrophysical systems, radiation-matter interactions are important in transferring energy and momentum between the radiation field and the surrounding material. This coupling often makes it necessary to consider the role of radiation when modelling the dynamics of astrophysical fluids. During the last few years, there have been rapid developments in the use of Monte Carlo methods for numerical radiative transfer simulations. Here, we present an approach to radiation hydrodynamics that is based on coupling Monte Carlo radiative transfer techniques with finite-volume hydrodynamical methods in an operator-split manner. In particular, we adopt an indivisible packet formalism to discretize the radiation field into an ensemble of Monte Carlo packets and employ volume-based estimators to reconstruct the radiation field characteristics. In this paper, the numerical tools of this method are presented and their accuracy is verified in a series of test calculations. Finally, as a practical example, we use our approach to study the influence of the radiation-matter coupling on the homologous expansion phase and the bolometric light curve of Type Ia supernova explosions.
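The indivisible-packet and volume-based-estimator ideas can be illustrated with a one-dimensional grey toy: packets of fixed energy stream through a slab, are scattered or absorbed as whole packets, and every path segment contributes to a path-length estimate of the energy deposited in the cell it crosses. The opacity, albedo and geometry below are illustrative assumptions, not the paper's setup.

```python
# Toy indivisible-packet transport with a volume-based (path-length) estimator
# in a 1-D grey slab. Packets are never split: an interaction either scatters
# or absorbs the whole packet. All parameters are placeholders.
import numpy as np

rng = np.random.default_rng(5)
n_cells, slab_width = 20, 1.0
dx = slab_width / n_cells
chi = 5.0                        # total opacity (1/length)
albedo = 0.8                     # probability that an interaction is a scattering
n_packets, e_packet = 20_000, 1.0

energy = np.zeros(n_cells)       # path-length estimator accumulator per cell

for _ in range(n_packets):
    x = 0.0
    mu = np.sqrt(1.0 - rng.random())                     # packet enters at the left boundary
    cell, alive = 0, True
    while alive and 0 <= cell < n_cells:
        s_int = -np.log(1.0 - rng.random()) / chi        # distance to the next interaction
        edge = (cell + 1) * dx if mu > 0 else cell * dx
        s_edge = (edge - x) / mu                         # distance to the cell boundary
        s = min(s_int, s_edge)
        energy[cell] += e_packet * s                     # path-length contribution
        x += mu * s
        if s_int < s_edge:                               # interaction inside the cell
            if rng.random() < albedo:
                mu = 2.0 * rng.random() - 1.0            # isotropic scattering, packet kept whole
                if mu == 0.0:
                    mu = 1e-12                           # guard against a grazing direction
            else:
                alive = False                            # absorption removes the whole packet
        else:                                            # packet crossed into the next cell
            x = edge
            cell += 1 if mu > 0 else -1

print("relative energy-deposition profile:", energy / energy[0])
```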
Abstract:
Quality of care is an important aspect of healthcare monitoring, which is used to ensure that the healthcare system is delivering care of the highest standard. With populations growing older, there is an increased urgency in making sure that the healthcare delivered is of the highest standard. Healthcare providers are under increased pressure to ensure that this is the case, with the public and government expecting a healthcare system of the highest quality. Quality of care is difficult to model and measure because of the many ways of defining it. This paper introduces a potential model which could be used to take quality of care into account when modelling length of stay. The Coxian phase-type distribution is used to model length of stay, and the associated quality of care is incorporated into the Coxian using a hidden Markov model. Covariates are also introduced to determine their impact on the hidden level and so identify what can potentially affect quality of care. This model is applied to geriatric patient data from the Lombardy region of Italy. The results obtained highlight that bed numbers and the type of hospital (public or private) can have an effect on the quality of care delivered.
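To show what a Coxian phase-type model of length of stay looks like in practice, the sketch below simulates stays from a three-phase Coxian distribution, in which a patient either progresses to the next phase or is discharged from the current one; the rates are arbitrary placeholders, and the hidden quality-of-care layer and covariates are omitted.

```python
# Sketch of a three-phase Coxian phase-type model for length of stay: from each
# phase a patient either moves on (rate lam[i]) or is absorbed/discharged
# (rate mu[i]). Rates are placeholders, not the fitted Lombardy values.
import numpy as np

rng = np.random.default_rng(6)
lam = [0.8, 0.4]        # onward transition rates from phases 1 and 2 (per day)
mu = [0.3, 0.2, 0.1]    # absorption (discharge) rates from phases 1, 2, 3 (per day)

def sample_los():
    """Draw one length of stay (days) from the Coxian phase-type distribution."""
    los, phase = 0.0, 0
    while True:
        out_rate = mu[phase] + (lam[phase] if phase < len(lam) else 0.0)
        los += rng.exponential(1.0 / out_rate)            # time spent in this phase
        if phase == len(lam) or rng.random() < mu[phase] / out_rate:
            return los                                    # absorbed (discharged)
        phase += 1                                        # otherwise move to the next phase

stays = np.array([sample_los() for _ in range(10_000)])
print(f"mean LOS {stays.mean():.1f} days, 90th percentile {np.quantile(stays, 0.9):.1f} days")
```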
Abstract:
In this paper, a novel and effective lip-based biometric identification approach with the Discrete Hidden Markov Model Kernel (DHMMK) is developed. Lips are described by shape features (both geometrical and sequential) on two different grid layouts: rectangular and polar. These features are then specifically modeled by a DHMMK and learnt by a support vector machine classifier. Our experiments are carried out in a ten-fold cross-validation fashion on three different datasets: the GPDS-ULPGC Face Dataset, the PIE Face Dataset and the RaFD Face Dataset. Results show that our approach achieves average classification accuracies of 99.8%, 97.13% and 98.10%, using only two training images per class, on these three datasets, respectively. Our comparative studies further show that the DHMMK achieves a 53% improvement over the baseline HMM approach. The comparative ROC curves also confirm the efficacy of the proposed lip-contour-based biometrics learned by the DHMMK. We also show that the performance of linear and RBF SVMs is comparable under the framework of the DHMMK.
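The general pattern of combining discrete hidden Markov models with an SVM can be sketched as follows: discrete observation sequences (standing in for quantised lip-shape codes) are scored against per-class HMMs with the forward algorithm, and the vectors of per-class log-likelihoods are fed to an SVM. This likelihood-score representation is only a stand-in for the paper's DHMM kernel, and all HMM parameters and data below are random placeholders.

```python
# Sketch: score discrete sequences against per-class HMMs (forward algorithm)
# and train an SVM on the resulting log-likelihood vectors. Placeholders only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
N_STATES, N_SYMBOLS, N_CLASSES, SEQ_LEN = 4, 8, 3, 30

def random_hmm():
    A = rng.dirichlet(np.ones(N_STATES), size=N_STATES)   # transition matrix
    B = rng.dirichlet(np.ones(N_SYMBOLS), size=N_STATES)  # emission matrix
    pi = rng.dirichlet(np.ones(N_STATES))                 # initial distribution
    return A, B, pi

def log_likelihood(seq, hmm):
    """Forward algorithm (with per-step scaling) for a discrete-symbol HMM."""
    A, B, pi = hmm
    alpha = pi * B[:, seq[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in seq[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

def sample_seq(hmm):
    A, B, pi = hmm
    s, seq = rng.choice(N_STATES, p=pi), []
    for _ in range(SEQ_LEN):
        seq.append(rng.choice(N_SYMBOLS, p=B[s]))
        s = rng.choice(N_STATES, p=A[s])
    return seq

class_hmms = [random_hmm() for _ in range(N_CLASSES)]      # placeholder per-class HMMs

# Represent each toy sequence by its per-class HMM log-likelihoods, then fit an SVM.
X, y = [], []
for c, hmm in enumerate(class_hmms):
    for _ in range(40):
        seq = sample_seq(hmm)
        X.append([log_likelihood(seq, h) for h in class_hmms])
        y.append(c)
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```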
Abstract:
Objective To present a first- and second-trimester Down syndrome screening strategy, whereby second-trimester marker determination is contingent on the first-trimester results. Unlike non-disclosure sequential screening (the Integrated test), which requires all women to have markers in both trimesters, this allows a large proportion of the women to complete screening in the first trimester. Methods Two first-trimester risk cut-offs defined three types of results: positive and referred for early diagnosis; negative with screening complete; and intermediate, needing second-trimester markers. Multivariate Gaussian modelling with Monte Carlo simulation was used to estimate the false-positive rate for a fixed 85% detection rate. The false-positive rate was evaluated for various early detection rates and early test completion rates. Model parameters were taken from the SURUSS trial. Results Completion of screening in the first trimester for 75% of women resulted in a 30% early detection rate and a 55% second-trimester detection rate (85% in total), with a false-positive rate only 0.1% above that achievable by the Integrated test. The screen-positive rate was 0.1% in the first trimester and 4.7% for those continuing to be tested in the second trimester. If the early detection rate were increased to 45%, or the early completion rate were increased to 80%, there would be a further 0.1% increase in the false-positive rate. Conclusion Contingent screening can achieve results comparable with the Integrated test but with earlier completion of screening for most women. Both strategies need to be evaluated in large-scale prospective studies, particularly in relation to psychological impact and practicability.
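The logic of a two-cut-off contingent policy is easy to explore with a toy Monte Carlo simulation: simulate risk scores for affected and unaffected pregnancies, refer high first-trimester risks immediately, complete screening for low risks, and send the intermediate group on to a combined second-trimester decision. The Gaussian score distributions, prevalence and cut-offs below are purely illustrative assumptions, not the SURUSS-derived parameters used by the authors.

```python
# Toy Monte Carlo sketch of contingent screening with two first-trimester risk
# cut-offs. All distributions and thresholds are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(8)
n = 1_000_000
prevalence = 1 / 500

affected = rng.random(n) < prevalence
# placeholder log-risk scores: affected pregnancies score higher on average
risk1 = np.where(affected, rng.normal(1.5, 1.0, n), rng.normal(0.0, 1.0, n))
risk2 = np.where(affected, rng.normal(1.5, 1.0, n), rng.normal(0.0, 1.0, n))

HI_CUT, LO_CUT = 2.5, 0.5        # first-trimester cut-offs (placeholders)
SECOND_CUT = 2.0                 # combined-risk cut-off for the contingent group

early_positive = risk1 >= HI_CUT                 # referred for early diagnosis
early_complete = risk1 < LO_CUT                  # screening finished in the first trimester
contingent = ~early_positive & ~early_complete   # need second-trimester markers
late_positive = contingent & ((risk1 + risk2) / 2 >= SECOND_CUT)
positive = early_positive | late_positive

detection = positive[affected].mean()
false_positive = positive[~affected].mean()
print(f"early completion {1 - contingent.mean():.1%}, "
      f"detection {detection:.1%}, false-positive {false_positive:.1%}")
```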