950 results for Monte Carlo Simulation


Relevance: 80.00%

Publisher:

Abstract:

In this paper, we apply a simulation-based approach for estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, using approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches we refer to here. We find that replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data loses little precision in the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
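
To make the ABC idea concrete, here is a minimal sketch under stated assumptions: a toy discrete-time ward model (not the paper's actual nosocomial model), a two-number summary statistic, and an arbitrary tolerance; all parameter values are illustrative.

```python
# ABC rejection for a transmission rate in a toy ward model (illustrative).
import numpy as np

rng = np.random.default_rng(1)

def simulate_incidence(beta, n_days=100, n_patients=20, p_discharge=0.1):
    """Toy ward model: each day, every uncolonised patient becomes colonised
    with probability beta; colonised patients are discharged (and replaced by
    uncolonised admissions) with probability p_discharge."""
    colonised = np.zeros(n_patients, dtype=bool)
    new_cases = np.zeros(n_days, dtype=int)
    for day in range(n_days):
        acquire = ~colonised & (rng.random(n_patients) < beta)
        new_cases[day] = acquire.sum()
        colonised |= acquire
        colonised &= rng.random(n_patients) >= p_discharge  # discharges
    return new_cases

def summary(cases):
    # low-dimensional summary statistic compared in place of the likelihood
    return np.array([cases.mean(), cases.std()])

s_obs = summary(simulate_incidence(0.03))   # "observed" data, true beta = 0.03

accepted = []
for _ in range(5000):
    beta = rng.uniform(0.0, 0.2)            # draw from the prior
    if np.linalg.norm(summary(simulate_incidence(beta)) - s_obs) < 0.1:
        accepted.append(beta)               # ABC rejection step

print(f"ABC posterior mean for beta: {np.mean(accepted):.3f} (true 0.03)")
```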

Relevance: 80.00%

Publisher:

Abstract:

This paper considers vector error-correction models (VECMs) for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration between the permanent components of series reduces the rank of the long-run multiplier matrix, a common feature among the transitory components leads to a rank reduction in the matrix summarizing short-run dynamics. The common feature also implies that there exist linear combinations of the first-differenced variables in a cointegrated VAR that are white noise, and traditional tests focus on testing for this characteristic. An alternative, however, is to test the rank of the short-run dynamics matrix directly. Consequently, we use the literature on testing the rank of a matrix to produce some alternative test statistics. We also show that these are identical to one of the traditional tests. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to re-examine an existing empirical study. Finally, the approach is applied to check for the presence of common dynamics in DSGE models.
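
For concreteness, a hedged sketch of the restriction being tested, in standard VECM notation with a single short-run lag (the lag order here is illustrative):

```latex
\Delta y_t = \alpha\beta' y_{t-1} + \Gamma_1 \Delta y_{t-1} + \varepsilon_t,
\qquad
\delta'\Delta y_t \ \text{white noise}
\;\Longleftrightarrow\;
\delta'\alpha = 0 \ \text{and} \ \delta'\Gamma_1 = 0
```

Under standard conditions, a common feature vector δ therefore corresponds to a reduction in the rank of [α Γ₁], and the rank of that matrix can be tested directly rather than testing the white-noise property itself.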

Relevance: 80.00%

Publisher:

Abstract:

Exposure to ultrafine particles (diameter less than 100 nm) is an important topic in epidemiological and toxicological studies. This study used the average particle number size distribution data obtained from our measurement survey of major micro-environments, together with people's activity pattern data from the Italian Human Activity Pattern Survey, to estimate the tracheobronchial and alveolar dose of submicrometer particles for different population age groups in Italy. We developed a numerical methodology based on the Monte Carlo method in order to estimate the best combination from a probabilistic point of view. More than 10^6 different cases were analyzed according to a purpose-built subroutine, and our results showed that the daily alveolar particle number and surface area deposited, for all of the age groups considered, were equal to 1.5 × 10^11 particles and 2.5 × 10^15 m², respectively, varying slightly for males and females living in Northern or Southern Italy. In terms of tracheobronchial deposition, the corresponding daily particle number and surface area for all age groups were equal to 6.5 × 10^10 particles and 9.9 × 10^14 m², respectively. Overall, the highest contributions were found to come from indoor cooking (females), time spent at work (males) and transportation, i.e. traffic-derived particles (children).
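
A minimal sketch of this kind of Monte Carlo dose calculation, assuming made-up microenvironment concentrations, occupancy hours, ventilation rate and deposited fraction (none are the paper's values):

```python
# Illustrative Monte Carlo dose sketch: a hypothetical time-activity pattern
# combined with per-microenvironment particle concentrations and an assumed
# alveolar deposited fraction. Every number here is made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# microenvironment -> ((mean concentration [particles/cm^3], sd), hours/day)
ENVS = {
    "home_cooking": ((1e5, 4e4), 1.0),
    "transport":    ((4e4, 1.5e4), 1.5),
    "work":         ((1e4, 3e3), 8.0),
    "outdoor":      ((2e4, 8e3), 1.5),
    "sleep":        ((8e3, 2e3), 12.0),
}
VENTILATION = 0.6   # m^3 of air inhaled per hour (assumed adult value)
DEP_FRACTION = 0.3  # assumed alveolar deposited fraction

def daily_dose():
    """Draw one day's alveolar-deposited particle number."""
    dose = 0.0
    for (mu, sd), hours in ENVS.values():
        conc = max(rng.normal(mu, sd), 0.0) * 1e6          # particles/m^3
        dose += conc * VENTILATION * hours * DEP_FRACTION  # particles deposited
    return dose

doses = np.array([daily_dose() for _ in range(20000)])
print(f"median daily alveolar dose: {np.median(doses):.2e} particles")
print("5th-95th percentile:", np.percentile(doses, [5, 95]))
```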

Relevance: 80.00%

Publisher:

Abstract:

Automated feature extraction and correspondence determination is an extremely important problem in the face recognition community, as it often forms the foundation of the normalisation and database construction phases of many recognition and verification systems. This paper presents a completely automatic feature extraction system based upon a modified volume descriptor. These features form a stable descriptor for faces and are utilised in a reversible jump Markov chain Monte Carlo correspondence algorithm to automatically determine the correspondences that exist between faces. The developed system is invariant to changes in pose and occlusion, and results indicate that it is also robust to the minor face deformations that may be present with variations in expression.
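
As an illustration of trans-dimensional MCMC over correspondences, here is a hedged birth/death sketch on synthetic features; the real system scores matches with volume descriptors and geometric consistency, not the toy Gaussian descriptors assumed here.

```python
# Birth/death (trans-dimensional) MCMC over a variable-size set of feature
# matches. Target: pi(M) proportional to exp(sum over (i,j) of LAMBDA - cost_ij).
import numpy as np

rng = np.random.default_rng(3)

nA = nB = 12
featA = rng.normal(size=(nA, 8))
perm = rng.permutation(nB)
featB = featA[perm] + 0.1 * rng.normal(size=(nB, 8))  # noisy copies of featA

cost = 0.5 * np.linalg.norm(featA[:, None] - featB[None, :], axis=2) ** 2
LAMBDA = 3.0  # reward per match; trades off match count vs descriptor distance

def free_pairs(match):
    """Pairs (i, j) that can be added without reusing a feature."""
    used_a = {i for i, _ in match}
    used_b = {j for _, j in match}
    return [(i, j) for i in range(nA) for j in range(nB)
            if i not in used_a and j not in used_b]

match = set()
for _ in range(20000):
    if rng.random() < 0.5:                       # birth move: add a pair
        free = free_pairs(match)
        if not free:
            continue
        i, j = free[rng.integers(len(free))]
        # target ratio times proposal ratio (uniform death vs uniform birth)
        log_acc = (LAMBDA - cost[i, j]
                   + np.log(len(free)) - np.log(len(match) + 1))
        if np.log(rng.random()) < log_acc:
            match.add((i, j))
    elif match:                                  # death move: remove a pair
        i, j = list(match)[rng.integers(len(match))]
        log_acc = (cost[i, j] - LAMBDA + np.log(len(match))
                   - np.log(len(free_pairs(match - {(i, j)}))))
        if np.log(rng.random()) < log_acc:
            match.discard((i, j))

truth = {(int(perm[j]), j) for j in range(nB)}
print(f"{len(match)} matches found, {len(match & truth)} correct")
```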

Relevance: 80.00%

Publisher:

Abstract:

Most unsignalised intersection capacity calculation procedures are based on gap acceptance models, so the accuracy of critical gap estimation affects the accuracy of capacity and delay estimation. Several methods have been published to estimate drivers' sample mean critical gap, with the Maximum Likelihood Estimation (MLE) technique regarded as the most accurate. This study assesses three novel methods, the Average Central Gap (ACG), Strength Weighted Central Gap (SWCG) and Mode Central Gap (MCG) methods, against MLE for their fidelity in rendering true sample mean critical gaps. A Monte Carlo event-based simulation model was used to draw the maximum rejected gap and the accepted gap for each of a sample of 300 drivers across 32 simulation runs. The simulation mean critical gap was varied between 3 s and 8 s, while the offered gap rate was varied between 0.05 veh/s and 0.55 veh/s. This study affirms that MLE provides a close-to-perfect fit to simulation mean critical gaps across a broad range of conditions. The MCG method also provides an almost perfect fit and has superior computational simplicity and efficiency to MLE. The SWCG method performs robustly under high flows but poorly under low to moderate flows. Further research is recommended using field traffic data, under a variety of minor-stream and major-stream flow conditions and for a variety of minor-stream movement types, to compare critical gap estimates from MLE and MCG. Should the MCG method prove as robust as MLE, serious consideration should be given to its adoption for estimating critical gap parameters in guidelines.
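
A sketch of the MLE benchmark under its usual assumptions (each driver's critical gap is log-normal and bracketed between their largest rejected gap and their accepted gap); the simulated data and parameter values below are illustrative, not the study's.

```python
# Maximum likelihood critical-gap estimation from (max rejected, accepted) gaps.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(7)

# Simulate 300 drivers with true mean critical gap near 4 s (illustrative).
mu_true, sigma_true = np.log(4.0), 0.25
t_crit = rng.lognormal(mu_true, sigma_true, 300)

# Offer each driver exponential gaps; record max rejected gap and accepted gap.
max_rej, acc = [], []
for tc in t_crit:
    rejected = 0.0
    while True:
        gap = rng.exponential(1 / 0.3)        # offered gaps, rate 0.3 veh/s
        if gap >= tc:
            acc.append(gap)
            max_rej.append(rejected)
            break
        rejected = max(rejected, gap)
max_rej, acc = np.array(max_rej), np.array(acc)

def neg_log_lik(params):
    """Each critical gap lies in (max_rej_i, acc_i): likelihood F(b) - F(a)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                 # keep sigma positive
    cdf = lambda t: lognorm.cdf(t, s=sigma, scale=np.exp(mu))
    p = np.clip(cdf(acc) - cdf(max_rej), 1e-12, None)
    return -np.log(p).sum()

res = minimize(neg_log_lik, x0=[np.log(4.0), np.log(0.3)], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)  # mean of a log-normal
print(f"estimated mean critical gap: {mean_hat:.2f} s (true ~4.13 s)")
```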

Relevance: 80.00%

Publisher:

Abstract:

Often CAD models already exist for parts of a geometry being simulated using GEANT4. Direct import of these CAD models into GEANT4, however, may not be possible, and complex components may be difficult to define via other means. Solutions that allow users to work around the GEANT4 toolkit's limited support for loading predefined CAD geometries have been presented by others; however, these solutions require intermediate file format conversion using commercial software. Herein we describe a technique that allows CAD models to be loaded directly as geometry, without the need for commercial software or intermediate file format conversion. Robustness of the interface was tested using a set of CAD models of varying complexity; for the models used in testing, no import errors were reported and all geometry was found to be navigable by GEANT4. Funding source: Cancer Australia (Department of Health and Ageing) Research Grant 614217.
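
The core of such an interface is turning a CAD mesh into per-facet data. This illustrative sketch (in Python, whereas GEANT4 itself is C++) parses an ASCII STL fragment into triangular facets; in GEANT4 each facet would become a G4TriangularFacet added to a G4TessellatedSolid, which then behaves like any native solid.

```python
# Toy ASCII-STL fragment standing in for an exported CAD model.
STL = """solid demo
facet normal 0 0 1
  outer loop
    vertex 0 0 0
    vertex 1 0 0
    vertex 0 1 0
  endloop
endfacet
endsolid demo
"""

def read_ascii_stl(lines):
    """Collect triangular facets: each is a 3-tuple of (x, y, z) vertices."""
    facets, verts = [], []
    for line in lines:
        parts = line.split()
        if parts[:1] == ["vertex"]:
            verts.append(tuple(float(v) for v in parts[1:4]))
        elif parts[:1] == ["endfacet"]:
            if len(verts) == 3:
                facets.append(tuple(verts))
            verts = []
    return facets

facets = read_ascii_stl(STL.splitlines())
print(f"{len(facets)} triangular facet(s):", facets)
```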

Relevance: 80.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and they are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area seems to have evolved independently of the development of industrial statistical process control methods; analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. First, a framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; a data-capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context are considered, including the necessity of monitoring attribute data and correlated quality characteristics. To this end, multivariate control charts from the industrial context are adapted to monitor the radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate the search for root causes within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates of the change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, checked against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and of major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts, and variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts; in this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance, as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen here in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
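
As a sketch of the simplest scenario described, a single step change in a Poisson process, the change-time posterior can be computed exactly under conjugate Gamma priors (the thesis's richer scenarios use MCMC); the data and prior settings below are illustrative.

```python
# Exact posterior over the change time of a step change in Poisson counts.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(11)

# Simulated monitored counts with a step change at t = 60 (illustrative).
y = np.concatenate([rng.poisson(2.0, 60), rng.poisson(4.0, 40)])
n = len(y)
a, b = 1.0, 1.0   # Gamma(a, b) prior on the rate in each segment

def log_marginal(counts):
    """log of the Gamma-Poisson marginal likelihood of a segment,
    up to the y! terms, which are constant across candidate change times."""
    S, L = counts.sum(), len(counts)
    return gammaln(a + S) + a * np.log(b) - gammaln(a) - (a + S) * np.log(b + L)

# Posterior over the change time tau, with a uniform prior over 1..n-1.
log_post = np.array([log_marginal(y[:tau]) + log_marginal(y[tau:])
                     for tau in range(1, n)])
post = np.exp(log_post - log_post.max())
post /= post.sum()
print(f"posterior mode of the change time: {post.argmax() + 1}")
print(f"P(tau in [55, 65]): {post[54:65].sum():.2f}")
```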

Relevance: 80.00%

Publisher:

Abstract:

The purpose of this study was to investigate the effect of very small air gaps (less than 1 mm) on the dosimetry of small photon fields used for stereotactic treatments. Measurements were performed with optically stimulated luminescent dosimeters (OSLDs) for 6 MV photons on a Varian 21iX linear accelerator with a Brainlab μMLC attachment, for square field sizes down to 6 mm × 6 mm. Monte Carlo simulations were performed using the EGSnrc C++ user code cavity. It was found that the Monte Carlo model used in this study accurately simulated the OSLD measurements on the linear accelerator. For the 6 mm field size, the 0.5 mm air gap upstream of the active area of the OSLD caused a 5.3% dose reduction relative to a Monte Carlo simulation with no air gap. A hypothetical 0.2 mm air gap caused a dose reduction of more than 2%, emphasizing that even the tiniest air gaps can cause a large reduction in measured dose. The negligible effect at an 18 mm field size illustrated that the electronic disequilibrium caused by such small air gaps only affects the dosimetry of very small fields. When performing small field dosimetry, care must be taken to avoid any air gaps, as they can often be present when detectors are inserted into solid phantoms; it is recommended that very small field dosimetry be performed in liquid water. When using small photon fields, sub-millimetre air gaps can also affect patient dosimetry if they cannot be spatially resolved on a CT scan. However, the effect on the patient is debatable: the dose reduction caused by a 1 mm air gap starts at 19% in the first 0.1 mm behind the air gap, decreases to less than 5% after just 2 mm, and electronic equilibrium is fully re-established after 5 mm.

Relevance: 80.00%

Publisher:

Abstract:

Alveolar- and tracheobronchial-deposited submicrometer particle number and surface area doses received by different age groups in Australia are presented. Activity patterns were combined with microenvironmental data through a Monte Carlo method: particle number distributions for the most significant microenvironments were obtained from our measurement survey data, and people's activity pattern data were taken from the Australian Human Activity Pattern Survey. The daily alveolar particle number (surface area) dose received by all age groups was equal to 3.0 × 10^10 particles (4.5 × 10^2 mm²), varying slightly between males and females. In contrast to gender, lifestyle was found to significantly affect the daily dose, with the highest depositions characterizing adults; the main contribution was due to indoor microenvironments. Finally, a comparison between Italian and Australian people in terms of received particle dose is reported; it shows that different cooking styles can affect dose levels, with higher doses received by Italians, mainly due to their particular cooking activity.

Relevance: 80.00%

Publisher:

Abstract:

In this work, a Langevin dynamics model of the diffusion of water in articular cartilage was developed. Numerical simulations of the translational dynamics of water molecules and their interaction with collagen fibers were used to study the quantitative relationship between the organization of the collagen fiber network and the diffusion tensor of water in model cartilage. Langevin dynamics was used to simulate water diffusion in both ordered and partially disordered cartilage models. In addition, an analytical approach was developed to estimate the diffusion tensor for a network comprising a given distribution of fiber orientations. The key findings are that (1) an approximately linear relationship was observed between collagen volume fraction and the fractional anisotropy of the diffusion tensor in fiber networks of a given degree of alignment; (2) for any given fiber volume fraction, fractional anisotropy follows a fiber alignment dependency similar to the square of the second Legendre polynomial of cos(θ), with the minimum anisotropy occurring at approximately the magic angle (θ_MA); and (3) a decrease in the principal eigenvalue and an increase in the transverse eigenvalues is observed as the fiber orientation angle θ progresses from 0° to 90°. The corresponding diffusion ellipsoids are prolate for θ < θ_MA, spherical for θ ≈ θ_MA, and oblate for θ > θ_MA. Expansion of the model to include discrimination between the combined effects of alignment disorder and collagen fiber volume fraction on the diffusion tensor is discussed.
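
A numerical sketch of the analytical idea, assuming illustrative parallel/perpendicular diffusivities and fibers spread uniformly in azimuth on a cone at polar angle θ (not the paper's model parameters):

```python
# Average an axially symmetric local diffusion tensor over fiber directions
# on a cone of polar angle theta, then inspect eigenvalues and FA.
import numpy as np

D_PAR, D_PERP = 2.0e-9, 0.5e-9   # m^2/s along / across a fiber (assumed)

def mean_tensor(theta, n_phi=720):
    phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    # fiber unit vectors on a cone of polar angle theta, uniform in azimuth
    u = np.stack([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.full_like(phi, np.cos(theta))], axis=1)
    # local tensor D_perp*I + (D_par - D_perp) * u u^T, averaged over phi
    outer = np.einsum("ni,nj->ij", u, u) / n_phi
    return D_PERP * np.eye(3) + (D_PAR - D_PERP) * outer

for deg in (0, 30, 54.7, 70, 90):
    w = np.linalg.eigvalsh(mean_tensor(np.radians(deg)))
    fa = np.sqrt(1.5 * ((w - w.mean())**2).sum() / (w**2).sum())
    print(f"theta = {deg:5.1f} deg  "
          f"eigenvalues (x1e-9 m^2/s) = {np.round(w / 1e-9, 3)}  FA = {fa:.3f}")
# FA is minimal near the magic angle (~54.7 deg), where the averaged tensor is
# closest to isotropic; the ellipsoid is prolate below it and oblate above it.
```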

Relevance: 80.00%

Publisher:

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC), so that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm using pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process, and we assess the loss of information from not observing all variables.
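
A toy version of the design loop under stated assumptions: a pure-death process, an illustrative prior and candidate observation times, and expected ABC posterior precision as the utility. The prior draws are reused across designs, with one cheap model simulation per draw per design and no likelihood evaluations.

```python
# Choose an observation time for a pure-death process that maximizes the
# expected precision of the ABC posterior for the death rate (illustrative).
import numpy as np

rng = np.random.default_rng(5)
N0 = 50   # initial population size (assumed)

def survivors(rate, t, n):
    # each individual is independently still alive at time t with prob e^{-rate*t}
    return rng.binomial(N0, np.exp(-rate * t), size=n)

M = 20000
theta = rng.uniform(0.01, 2.0, M)   # prior draws, reused for every design

def expected_precision(t, n_obs_draws=200, keep=200):
    sims = survivors(theta, t, M)   # one simulation per prior draw ("reference table")
    prec = []
    # average utility over prior-predictive "observed" datasets
    for y in survivors(theta[rng.integers(M, size=n_obs_draws)], t, n_obs_draws):
        nearest = np.argsort(np.abs(sims - y))[:keep]   # ABC rejection step
        prec.append(1.0 / np.var(theta[nearest]))       # posterior precision
    return np.mean(prec)

for t in (0.1, 0.5, 1.0, 2.0, 4.0):
    print(f"t = {t:4.1f}  expected ABC posterior precision = {expected_precision(t):.1f}")
```

Observing too early (almost everyone survives) or too late (almost everyone is dead for the faster rates) is uninformative, so the utility peaks at an intermediate observation time.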