962 results for Monte-Carlo-Simulation
Abstract:
This work contributes to the definition and assessment of structural robustness, with special emphasis on the reliability of reinforced concrete structures under corrosion of the longitudinal reinforcement. In this paper, several authors' proposals for defining and measuring structural robustness are analyzed and discussed. A probabilistic robustness index is defined, based on the decrease of the reliability index over all possible damage levels. Damage is expressed as the corrosion level of the longitudinal reinforcement in terms of rebar weight loss, and it changes both the cross-sectional area of the rebar and the bond strength. The proposed methodology is illustrated by means of an application example. To account for the impact of reinforcement corrosion on the growth of the failure probability, an advanced methodology based on the strong discontinuities approach and an isotropic continuum damage model for concrete is adopted. The methodology consists of a two-step analysis: in the first step, a cross-sectional analysis captures phenomena such as the expansion of the reinforcement due to the accumulation of corrosion products and the damage and cracking of the concrete surrounding the reinforcement; in the second step, a 2D deteriorated structural model is built from the results of the first step. This methodology, combined with Monte Carlo simulation, is then used to compute the failure probability and the reliability index of the structure for different corrosion levels. Finally, structural robustness is assessed using the proposed probabilistic index.
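The core Monte Carlo step of this abstract — estimating a failure probability and the corresponding reliability index for a given corrosion level — can be sketched in Python. The limit state, the distributions, and every numeric parameter below are illustrative assumptions, not the paper's two-step deteriorated structural model:

```python
import math
import random
from statistics import NormalDist

def reliability_index(corrosion_loss, n_samples=200_000, seed=42):
    """Estimate failure probability Pf and reliability index beta for a
    simplified limit state g = R - S, where the resistance R scales with
    the remaining rebar area (1 - corrosion weight-loss fraction).
    All distribution parameters are illustrative assumptions."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        # Resistance: lognormal, median reduced by the corrosion loss.
        r = rng.lognormvariate(math.log(300.0 * (1.0 - corrosion_loss)), 0.10)
        # Load effect: normal.
        s = rng.gauss(150.0, 30.0)
        if r - s < 0.0:
            failures += 1
    pf = failures / n_samples
    # beta = -Phi^{-1}(Pf); guard the degenerate sample outcomes.
    beta = -NormalDist().inv_cdf(pf) if 0.0 < pf < 1.0 else float("inf")
    return pf, beta
```

Running the sketch for increasing corrosion levels traces out the decreasing reliability index on which the proposed robustness index is built.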
Abstract:
Old timber structures may show significant variation in cross-section geometry along the same element, as a result of both construction methods and deterioration. As a consequence, defining the geometric parameters in situ may be both time-consuming and costly. This work presents the results of inspections carried out on different timber structures. Based on the results obtained, different simplified geometric models are proposed to efficiently model the geometry variations found. Probabilistic modelling techniques are also used to define safety parameters of existing timber structures subjected to dead and live loads, namely self-weight and wind actions. The parameters of the models were defined as probabilistic variables, and the safety of a selected case study was assessed using the Monte Carlo simulation technique. Assuming a target reliability index, a model was defined for both the residual cross section and the time-dependent evolution of deterioration. It was thus possible to compute probabilities of failure and reliability indices, as well as time-dependent deterioration curves for this structure. The results provide a proposal for defining the cross-section geometric parameters of existing timber structures with different levels of decay, using a simplified probabilistic geometry model and a remaining-capacity factor for the decayed areas. This model can be used to assess the safety of the structure at present and to predict future performance.
Abstract:
Assessing the safety of existing timber structures is of paramount importance for taking reliable decisions on repair actions and their extent. The results obtained through semi-probabilistic methods are unrealistic, as the partial safety factors in codes are calibrated for the uncertainty present in new structures. To overcome these limitations, and also to include the effects of decay in the safety analysis, probabilistic methods based on Monte Carlo simulation are applied here to assess the safety of existing timber structures. In particular, the impact of decay on structural safety is analyzed and discussed, using a simple structural model similar to that used in current semi-probabilistic analysis.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
INTRODUCTION: Monte Carlo simulations have been used for selecting optimal antibiotic regimens for the treatment of bacterial infections. The aim of this study was to assess the pharmacokinetic and pharmacodynamic target attainment of intravenous β-lactam regimens commonly used to treat bloodstream infections (BSIs) caused by Gram-negative rod-shaped organisms in a Brazilian teaching hospital. METHODS: In total, 5,000 patients were included in the Monte Carlo simulations of distinct antimicrobial regimens to estimate the likelihood of achieving free drug concentrations above the minimum inhibitory concentration (fT > MIC) for the periods required to clear distinct target organisms. Microbiological data were obtained from blood culture isolates harvested in our hospital from 2008 to 2010. RESULTS: In total, 614 bacterial isolates, including Escherichia coli, Enterobacter spp., Klebsiella pneumoniae, Acinetobacter baumannii, and Pseudomonas aeruginosa, were analyzed. Piperacillin/tazobactam failed to achieve a cumulative fraction of response (CFR) > 90% for any of the isolates. While standard dosing (short infusion) of β-lactams achieved target attainment for BSIs caused by E. coli and Enterobacter spp., pharmacodynamic target attainment against K. pneumoniae isolates was only achieved with ceftazidime and meropenem (prolonged infusion). Lastly, only prolonged infusion of high-dose meropenem approached an ideal CFR against P. aeruginosa, and no antimicrobial regimen achieved an ideal CFR against A. baumannii. CONCLUSIONS: These data reinforce the use of prolonged infusions of high-dose β-lactam antimicrobials as a reasonable strategy for the treatment of BSIs caused by multidrug-resistant Gram-negative bacteria in Brazil.
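The fT > MIC target-attainment calculation this abstract relies on can be sketched as a small Monte Carlo routine. The one-compartment IV bolus model and every pharmacokinetic parameter (volume, clearance, free fraction) are illustrative assumptions, not the study's regimens or its MIC data:

```python
import math
import random

def cfr_beta_lactam(dose_mg, tau_h, mic_dist, target_ft=0.5,
                    n_patients=5000, seed=7):
    """Monte Carlo estimate of the cumulative fraction of response (CFR):
    the fraction of simulated patients whose free drug concentration stays
    above a randomly drawn MIC for at least target_ft of the dosing
    interval tau_h. One-compartment IV bolus model; all parameter values
    are illustrative assumptions."""
    rng = random.Random(seed)
    fu = 0.7  # assumed free (unbound) drug fraction
    hits = 0
    for _ in range(n_patients):
        v = rng.lognormvariate(math.log(25.0), 0.25)   # volume, L
        cl = rng.lognormvariate(math.log(8.0), 0.30)   # clearance, L/h
        k = cl / v                                     # elimination rate, 1/h
        c0 = fu * dose_mg / v                          # free peak conc., mg/L
        # Draw a MIC from the empirical distribution [(mic, probability), ...].
        u, mic = rng.random(), mic_dist[-1][0]
        for m, p in mic_dist:
            u -= p
            if u <= 0.0:
                mic = m
                break
        # Time above MIC for C(t) = c0 * exp(-k t).
        t_above = math.log(c0 / mic) / k if c0 > mic else 0.0
        if min(t_above, tau_h) / tau_h >= target_ft:
            hits += 1
    return hits / n_patients
```

A CFR above 90% under such a simulation is the usual criterion for calling a regimen adequate against the sampled MIC distribution.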
Abstract:
Extreme value models are widely used in different areas. The Birnbaum–Saunders distribution is receiving considerable attention due to its physical arguments and its good properties. We propose a methodology based on extreme value Birnbaum–Saunders regression models, which includes model formulation, estimation, inference and checking. We further conduct a simulation study for evaluating its performance. A statistical analysis with real-world extreme value environmental data using the methodology is provided as illustration.
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Electromagnetic compatibility, lightning, crosstalk surge voltages, Monte Carlo simulation, accident initiator
Abstract:
This paper shows how a high-level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. Parallelization is implemented so that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. A detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.
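The parallelization pattern the paper describes — split the draws among workers, then combine the partial results — can be illustrated in Python (rather than Octave) with the standard multiprocessing module. The π-estimation target is just a stand-in for any embarrassingly parallel Monte Carlo problem:

```python
import random
from multiprocessing import Pool

def _count_hits(args):
    """Worker: count darts landing inside the unit quarter-circle.
    Each worker gets its own seed so the streams do not trivially overlap."""
    n, seed = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def parallel_pi(n_total=400_000, n_workers=4):
    """Estimate pi by splitting the Monte Carlo draws across processes
    and summing the workers' hit counts."""
    per_worker = n_total // n_workers
    jobs = [(per_worker, 1000 + i) for i in range(n_workers)]
    with Pool(n_workers) as pool:
        hits = sum(pool.map(_count_hits, jobs))
    return 4.0 * hits / (per_worker * n_workers)
```

As in the paper's setup, the caller never touches the parallel machinery: the splitting and combining are hidden behind a single function call.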
Abstract:
We propose a nonlinear heterogeneous panel unit root test of the null hypothesis that all units are unit root processes against the alternative that a proportion of units are generated by globally stationary ESTAR processes while the remaining non-zero proportion are generated by unit root processes. The proposed test is simple to implement and accommodates cross-sectional dependence. We show that the distribution of the test statistic is free of nuisance parameters as (N, T) → ∞. Monte Carlo simulation shows that our test has correct size and, when the data are generated by globally stationary ESTAR processes, has better power than the recent test proposed in Pesaran [2007]. Various applications are provided.
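A globally stationary ESTAR process of the kind allowed under the alternative hypothesis can be simulated directly. The transition function and parameter values below are generic illustrations of the process class, not the paper's test statistic:

```python
import math
import random

def estar_drift(y, gamma=-0.5, theta=1.0):
    """ESTAR adjustment term: gamma * y * (1 - exp(-theta * y^2)).
    Near zero the term vanishes (local unit root behaviour); for large |y|
    it pulls the process back toward zero when gamma is negative."""
    return gamma * y * (1.0 - math.exp(-theta * y * y))

def simulate_estar(t_len, gamma=-0.5, theta=1.0, seed=1):
    """Simulate dy_t = estar_drift(y_{t-1}) + eps_t with N(0,1) shocks."""
    rng = random.Random(seed)
    y, path = 0.0, []
    for _ in range(t_len):
        y += estar_drift(y, gamma, theta) + rng.gauss(0.0, 1.0)
        path.append(y)
    return path

def simulate_random_walk(t_len, seed=1):
    """Pure unit root process with the same shocks, for comparison."""
    rng = random.Random(seed)
    y, path = 0.0, []
    for _ in range(t_len):
        y += rng.gauss(0.0, 1.0)
        path.append(y)
    return path
```

With the same shock sequence, the random walk wanders while the ESTAR path stays bounded, which is exactly the distinction the test is designed to detect.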
Abstract:
The effects of structural breaks in dynamic panels are more complicated than in time series models as the bias can be either negative or positive. This paper focuses on the effects of mean shifts in otherwise stationary processes within an instrumental variable panel estimation framework. We show the sources of the bias and a Monte Carlo analysis calibrated on United States bank lending data demonstrates the size of the bias for a range of auto-regressive parameters. We also propose additional moment conditions that can be used to reduce the biases caused by shifts in the mean of the data.
Abstract:
Since the discovery of X-rays in 1895, ionizing radiation has become part of our lives. Its use in medicine has brought significant health benefits to the population globally. The benefit of any diagnostic procedure is to reduce uncertainty about the patient's health; however, radiation exposure also has potential detrimental effects, and radiation protection authorities have therefore become strict about controlling radiation risks.

There are various situations where the radiation risk needs to be evaluated. International authority bodies point to the increasing number of radiologic procedures and recommend population surveys. These surveys provide valuable data that help public health authorities prioritize and focus on the patient groups that are most highly exposed. Physicians, in turn, need to be aware of the radiation risks of diagnostic procedures in order to justify and optimize each procedure and inform the patient.

The aim of this work was to examine the different aspects of radiation protection and to investigate a new method for estimating patient radiation risks.

The first part of this work concerned radiation risk assessment from the regulatory authority's point of view. A population dose survey was performed to evaluate the annual population exposure. This survey determined the contribution of different imaging modalities to the total collective dose, as well as the annual effective dose per caput. It revealed that although interventional procedures are not frequent, they contribute significantly to the collective dose. Among the main results of this work, interventional cardiology procedures were shown to be dose-intensive, so more attention should be paid to optimizing their exposure.

The second part of the project addressed patient- and physician-oriented risk assessment. In this part, interventional cardiology procedures were studied by means of Monte Carlo simulations. Organ radiation doses as well as effective doses were estimated. Cancer incidence risks for different organs were calculated for different sexes and ages at exposure, using the lifetime attributable risks provided by the Biological Effects of Ionizing Radiation Report VII. The advantages and disadvantages of the latter results were examined as an alternative method for estimating radiation risks. The results show that this method is the most accurate currently available for estimating radiation risks. The conclusions of this work may guide future studies in the field of radiation protection in medicine.
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
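The Internal Model idea in this abstract — correlate lines of business through a copula, then read the capital requirement off the simulated aggregate — can be sketched with a Gaussian copula and two lognormal lines. The marginals, volumes, and the SCR definition as the 99.5% VaR of the aggregate loss minus its mean are assumptions for illustration, not the Spanish market calibration:

```python
import math
import random

def scr_gaussian_copula(rho, n_sims=100_000, seed=3):
    """Sketch of an SCR estimate: simulate the aggregate underwriting loss
    of two lines of business whose dependence is a Gaussian copula with
    correlation rho, and return VaR_99.5% minus the mean. Lognormal
    marginals and all numbers are illustrative assumptions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        # Correlated standard normals (2D Cholesky).
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        # Applying the lognormal quantile to each normal is equivalent to
        # the usual normal -> uniform -> marginal copula construction.
        loss1 = math.exp(math.log(100.0) + 0.3 * z1)
        loss2 = math.exp(math.log(80.0) + 0.4 * z2)
        totals.append(loss1 + loss2)
    totals.sort()
    var_995 = totals[int(0.995 * n_sims)]
    mean = sum(totals) / n_sims
    return var_995 - mean
```

Varying rho (or swapping in a heavier-tailed copula) is precisely the sensitivity analysis the paper performs: stronger dependence fattens the aggregate tail and raises the SCR.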
Abstract:
This paper examines why a financial entity's solvency capital might be underestimated when the total amount required is obtained directly from a single risk measure. Using Monte Carlo simulation, we show that, in some instances, a common risk measure such as Value-at-Risk is not subadditive when certain dependence structures are considered. Higher risk evaluations are obtained under independence between random variables than under comonotonicity. The paper therefore stresses the relationship between dependence structures and capital estimation.
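The non-subadditivity the abstract demonstrates by simulation can also be seen in a deterministic discrete example (the loss values and probabilities are illustrative, not from the paper): two small independent losses whose individual 95% VaR is zero, yet whose sum has a positive 95% VaR, while the comonotonic sum keeps VaR additive.

```python
def discrete_var(dist, alpha):
    """VaR_alpha of a discrete loss distribution given as a list of
    (loss, probability) pairs: the smallest loss whose cumulative
    probability reaches alpha."""
    acc = 0.0
    for loss, prob in sorted(dist):
        acc += prob
        if acc >= alpha:
            return loss
    return sorted(dist)[-1][0]

# Two i.i.d. losses: 100 with probability 0.04, otherwise 0.
single = [(0.0, 0.96), (100.0, 0.04)]
# Sum under independence: P(0)=0.96^2, P(100)=2*0.96*0.04, P(200)=0.04^2.
indep_sum = [(0.0, 0.9216), (100.0, 0.0768), (200.0, 0.0016)]
# Sum under comonotonicity: both losses occur together or not at all.
comon_sum = [(0.0, 0.96), (200.0, 0.04)]
```

At the 95% level, each loss alone has VaR 0 and so does the comonotonic sum, but the independent sum has VaR 100 — VaR of the sum exceeds the sum of the VaRs, which is exactly the subadditivity failure the paper stresses.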