869 results for Risk model
Abstract:
We consider a spectrally-negative Markov additive process as a model of a risk process in a random environment. Following recent interest in alternative ruin concepts, we assume that ruin occurs when an independent Poissonian observer sees the process as negative, where the observation rate may depend on the state of the environment. Using an approximation argument and spectral theory, we establish an explicit formula for the resulting survival probabilities in this general setting. We also discuss an efficient evaluation of the involved quantities and provide a numerical illustration.
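A minimal Monte Carlo sketch of this ruin concept, assuming a toy two-state environment that modulates the drift, volatility and observation rate of a Brownian surplus; the parameters and the function `estimate_poissonian_survival` are illustrative choices, not the model or the explicit formula of the abstract:

```python
import numpy as np

def estimate_poissonian_survival(u0=5.0, T=30.0, dt=0.01, n_paths=500, seed=0):
    """Estimate survival probability under Poissonian observation of ruin.

    Toy setting: a two-state Markov environment switches at rate q; each state
    has its own drift, volatility and observation rate.  Ruin is declared only
    if the surplus is negative when the Poissonian observer looks.
    """
    rng = np.random.default_rng(seed)
    mu = np.array([1.0, 0.2])        # drift per state (assumed values)
    sigma = np.array([0.5, 1.0])     # volatility per state
    omega = np.array([0.5, 2.0])     # observation rate per state
    q = 0.3                          # environment switching rate
    survived = 0
    for _ in range(n_paths):
        x, state, ruined = u0, 0, False
        for _ in range(int(T / dt)):
            # evolve the surplus in the current environment state
            x += mu[state] * dt + sigma[state] * np.sqrt(dt) * rng.standard_normal()
            # the observer looks during this step with probability omega[state]*dt
            if x < 0 and rng.random() < omega[state] * dt:
                ruined = True
                break
            # the environment switches with probability q*dt
            if rng.random() < q * dt:
                state = 1 - state
        if not ruined:
            survived += 1
    return survived / n_paths

print(estimate_poissonian_survival())
```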
Abstract:
The activity of Fuego volcano during the 1999-2013 eruptive episode is studied through field, remote sensing and observatory records. Mapping of the deposits allows quantification of the erupted volumes and of the areas affected by the largest eruptions of this period. A wide range of volcanic processes results in a diversity of products and associated deposits, including minor airfall tephra, rockfall avalanches, lava flows, and pyroclastic flows. The activity can be characterized by long-term, low-level background activity and sporadic larger explosive eruptions. Although the background activity erupts lava and ash at a low rate (~0.1 m³/s), the persistence of such activity over time results in a significant contribution (~30%) to the eruption budget during the studied period. Larger eruptions produced the majority of the volume of products during the studied period, mainly during three large events (May 21, 1999, June 29, 2003, and September 13, 2012), mostly in the form of pyroclastic flows. A total volume of ~1.4 × 10⁸ m³ was estimated from the mapped deposits and the estimated background eruption rate. Subsequent remobilization of pyroclastic flow material by stream erosion in the highly confined Barranca channels leads to lahar generation, triggered either by normal rainfall or by extreme rainfall events. A reassessment of the types of products and volumes erupted during the 1970s allows the activity since 1999 to be compared with this older activity, and suggests that many of the eruptive phenomena at Fuego may share similar mechanisms despite the differences in scale between them. The deposits of large pyroclastic flows erupted during the 1970s are remarkably similar in appearance to the deposits of pyroclastic flows from the 1999-2013 period, despite their much larger volume; this is also the case for prehistoric eruptions. Radiocarbon dating of pyroclastic flow deposits suggests that Fuego has produced large eruptions many times during the last ~2 ka, including larger eruptions during the last 500 years, which has important hazard implications. A survey was conducted among the local residents living near the volcano about their expectations of possible future crises. The results show that people are aware of the risk they could face in case of a large eruption and are therefore willing to evacuate in such a case. However, their decision to evacuate may also be influenced by the conditions under which the evacuation would take place. If the evacuation represents a potential loss of their livelihood or property, they will be more hesitant to leave their villages during a large eruption. The prospect of facing hardship during the evacuation and in the shelters may further cause reluctance to evacuate. A short discussion of some of the issues regarding risk assessment and management through an early warning system is presented in the last chapter.
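A quick back-of-the-envelope check of the quoted ~30% background contribution, assuming the ~0.1 m³/s background rate was sustained over the roughly 14-year episode (the exact duration and the constancy of the rate are simplifying assumptions):

```python
# Background volume from a sustained ~0.1 m3/s eruption rate over 1999-2013
rate_m3_per_s = 0.1
years = 14                                    # assumed continuous activity
seconds = years * 365.25 * 24 * 3600
background_volume = rate_m3_per_s * seconds   # ~4.4e7 m3
total_volume = 1.4e8                          # mapped deposits + background estimate
print(f"background volume ~ {background_volume:.2e} m3")
print(f"fraction of total ~ {background_volume / total_volume:.0%}")   # ~32%, consistent with ~30%
```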
Abstract:
In this thesis, an intermediate-complexity river flood risk model for future climate scenarios is presented and validated. The model belongs to the category of tools that aim to meet the needs identified by the World Climate Research Programme (WCRP) for addressing the effects of climate change. The goal is to develop, following a "bottom-up" approach to regional climate risk, tools that can help decision-makers implement adaptation to climate change. The model presented here is based entirely on open-source data provided by the Copernicus services. The contribution of this thesis concerns the development of a model, formulated by Ruggieri et al., for estimating the damages of river flood events at specific global warming levels (GWLs). The model was tested on three medium-sized river basins in Emilia-Romagna: the Panaro, Reno and Secchia. In this work, the model is subjected to sensitivity tests with respect to an assumption stated in its formulation, and analyses are then carried out on the multi-model ensemble used for the projections. The model is then validated by comparing the damages estimated in the present climate for the three rivers with observed damages, and by comparing simulated discharges with observed ones. Finally, the damages associated with flood events are estimated for three future climate scenarios characterized by GWLs of 1.5 °C, 2.0 °C and 3.0 °C.
Abstract:
2000 Mathematics Subject Classification: 60K10, 62P05
Abstract:
Although cigarette smoking and alcohol consumption increase risk for head and neck cancers, there have been few attempts to model risks quantitatively and to formally evaluate cancer site-specific risks. The authors pooled data from 15 case-control studies and modeled the excess odds ratio (EOR) to assess risk by total exposure (pack-years and drink-years) and its modification by exposure rate (cigarettes/day and drinks/day). The smoking analysis included 1,761 laryngeal, 2,453 pharyngeal, and 1,990 oral cavity cancers, and the alcohol analysis included 2,551 laryngeal, 3,693 pharyngeal, and 3,116 oral cavity cancers, with over 8,000 controls. Above 15 cigarettes/day, the EOR/pack-year decreased with increasing cigarettes/day, suggesting that smoking more cigarettes/day for a shorter duration was less deleterious than smoking fewer cigarettes/day for a longer duration. Estimates of EOR/pack-year were homogeneous across sites, while the effects of cigarettes/day varied, indicating that the greater laryngeal cancer risk derived from differential cigarettes/day effects and not from pack-years. EOR/drink-year estimates increased through 10 drinks/day, suggesting that more drinks/day for a shorter duration was more deleterious than fewer drinks/day for a longer duration. Above 10 drinks/day, data were limited. EOR/drink-year estimates varied by site, while drinks/day effects were homogeneous, indicating that the greater pharyngeal/oral cavity cancer risk with alcohol consumption derived from the differential effects of drink-years and not drinks/day.
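As a hedged illustration of an excess odds ratio model of this kind (not the authors' fitted model: the linear-in-total-exposure form, the log-rate modifier and the parameter values below are all assumptions), the qualitative pattern for smoking can be reproduced with a rate-dependent EOR/pack-year:

```python
import math

def odds_ratio(pack_years, cigs_per_day, beta=0.08, gamma=-0.3):
    """Toy EOR model: OR = 1 + EOR, with EOR proportional to total exposure
    (pack-years) and modified by the exposure rate (cigarettes/day).

    A negative gamma reproduces the qualitative finding that, above ~15
    cigarettes/day, more cigarettes/day for a shorter duration is less
    deleterious than fewer cigarettes/day for a longer duration at equal
    pack-years.
    """
    if cigs_per_day > 15:
        rate_modifier = math.exp(gamma * math.log(cigs_per_day / 15.0))
    else:
        rate_modifier = 1.0
    return 1.0 + beta * pack_years * rate_modifier

# Equal total exposure (30 pack-years) accumulated at different rates:
print(odds_ratio(30, 15))   # 15 cigs/day for 40 years
print(odds_ratio(30, 40))   # 40 cigs/day for 15 years -> smaller OR
```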
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Manufacturing companies today face an increasingly challenging environment due to the unpredictability of markets. To survive, enterprises need to keep innovating and to deliver products with new internal or external characteristics. Strategies and solutions exist at different organisational levels, from strategic to operational, while technology is growing fastest at the operational level, more specifically in manufacturing systems. This means that companies have to deal with the changes brought by emergent manufacturing systems, even though these can be expensive and difficult to implement. An agile manufacturing system can help cope with market changeability. Evolvable Production Systems (EPS) is an emergent paradigm which aims to bring new solutions to deal with changeability. The paradigm is characterised by modularity and intends to introduce high flexibility and dynamism at shop-floor level through the evolution of new computational devices and technology. This approach gives enterprises the ability to plug and unplug devices, allowing fast reconfiguration of the production line without reprogramming. There is little doubt about the advantages and benefits of this emerging technology, but its feasibility and applicability are still in question. Most research in this area focuses on the technical side, explaining the advantages of such systems, while there is insufficient work discussing the implementation risks from different perspectives, including that of the business owner. The main objective of this work is to propose a methodology and model to identify, classify and measure the potential risks associated with an implementation of this emergent paradigm. To quantify the proposed comprehensive risk model, an intelligent decision system is developed employing a Fuzzy Inference System to capture expert knowledge, as there are no historical data and little prior research in this area. The result is a vulnerability assessment of implementing EPS technology in manufacturing companies, with a focus on SMEs. The present dissertation draws on the knowledge and experience of experts who were involved in the FP7 project IDEAS, one of the leading projects in this area.
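A minimal Mamdani-style fuzzy inference sketch in plain Python, assuming two hypothetical risk factors (technology maturity and integration complexity) and illustrative triangular membership functions and rules; none of these come from the dissertation's actual rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def eps_implementation_risk(maturity, complexity):
    """Toy Mamdani inference: inputs in [0, 10], output risk score in [0, 10]."""
    # Fuzzify inputs (assumed linguistic terms)
    maturity_low, maturity_high = tri(maturity, 0, 0, 5), tri(maturity, 5, 10, 10)
    complexity_low, complexity_high = tri(complexity, 0, 0, 5), tri(complexity, 5, 10, 10)

    # Rule base (assumed): low maturity AND high complexity -> high risk, etc.
    fire_high = min(maturity_low, complexity_high)
    fire_medium = max(min(maturity_low, complexity_low), min(maturity_high, complexity_high))
    fire_low = min(maturity_high, complexity_low)

    # Aggregate the clipped output sets and defuzzify by centroid
    z = np.linspace(0, 10, 501)
    out = np.maximum.reduce([
        np.minimum(fire_low, tri(z, 0, 0, 5)),
        np.minimum(fire_medium, tri(z, 2.5, 5, 7.5)),
        np.minimum(fire_high, tri(z, 5, 10, 10)),
    ])
    return float(np.sum(z * out) / (np.sum(out) + 1e-12))

print(eps_implementation_risk(maturity=3, complexity=8))  # fairly high risk score
```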
Abstract:
This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely deflated risk and reinsurance risk models. We investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations, and the findings are then applied in insurance, for instance to approximations of Value-at-Risk and of conditional tail expectations. The second part of the thesis is devoted to three bivariate models. The first model is concerned with bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible thanks to a tuning parameter, and their asymptotic distributions are obtained under some second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real insurance data set.
The objective of the second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are widely useful in statistical applications; they are generated mainly by normal mean-variance mixtures and scale mixtures of skew-normal distributions, which distinguish the tail dependence structure as shown by our principal results. The third bivariate risk model is concerned with the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
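As a hedged illustration of the quantity at stake (not the estimator class proposed in the thesis), a standard empirical estimate of the upper tail dependence coefficient λ_U = lim_{u→1} P(V > u | U > u) from a bivariate sample, with the threshold k playing the role of a tuning parameter:

```python
import numpy as np

def empirical_upper_tail_dependence(x, y, k):
    """Empirical upper tail dependence coefficient.

    Counts joint exceedances of the (n - k)-th order statistics:
    lambda_hat = (1/k) * #{i : rank(x_i) > n - k and rank(y_i) > n - k},
    where k is a tuning parameter (number of tail observations used).
    """
    n = len(x)
    rx = np.argsort(np.argsort(x)) + 1   # ranks 1..n
    ry = np.argsort(np.argsort(y)) + 1
    joint = np.sum((rx > n - k) & (ry > n - k))
    return joint / k

# Usage on simulated data with strong positive dependence:
rng = np.random.default_rng(1)
z = rng.standard_normal(10_000)
x = z + 0.3 * rng.standard_normal(10_000)
y = z + 0.3 * rng.standard_normal(10_000)
print(empirical_upper_tail_dependence(x, y, k=200))
```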
Abstract:
Remote sensing and geographical information technologies were used to discriminate areas of high and low risk for contracting kala-azar, or visceral leishmaniasis. Satellite data were digitally processed to generate maps of land cover and spectral indices, such as the normalised difference vegetation index and the wetness index. To map estimated vector abundance and indoor climate data, local polynomial interpolations based on weighted values were used. Attribute layers were prepared from illiteracy rates and the unemployed proportion of the population and associated with village boundaries. Pearson's correlation coefficient was used to estimate the relationship between environmental variables and disease incidence across the study area. The cell values of each input raster in the analysis were assigned values from an evaluation scale. Simple weightings/ratings based on the degree of favourable conditions for kala-azar transmission were applied to all the variables, leading to a geo-environmental risk model. Variables such as land use/land cover, vegetation conditions, surface dampness, the indoor climate, illiteracy rates and the size of the unemployed population were considered for inclusion in the geo-environmental kala-azar risk model. The risk model was stratified into areas of "risk" and "non-risk" for the disease, based on the calculation of risk indices. The described approach constitutes a promising tool for microlevel kala-azar surveillance and aids in directing control efforts.
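A minimal weighted-overlay sketch of this kind of geo-environmental risk model, assuming raster layers already reclassified to a common 1-5 evaluation scale and illustrative weights; the layers, weights and cut-off are placeholders, not those of the study:

```python
import numpy as np

# Input rasters reclassified to a 1-5 favourability scale (toy 3x3 grids)
layers = {
    "land_cover":   np.array([[5, 4, 2], [3, 3, 1], [2, 2, 1]]),
    "ndvi":         np.array([[4, 4, 3], [3, 2, 2], [2, 1, 1]]),
    "wetness":      np.array([[5, 3, 2], [4, 3, 2], [3, 2, 1]]),
    "illiteracy":   np.array([[4, 4, 4], [3, 3, 2], [2, 2, 2]]),
    "unemployment": np.array([[3, 3, 2], [3, 2, 2], [2, 1, 1]]),
}
weights = {"land_cover": 0.30, "ndvi": 0.20, "wetness": 0.25,
           "illiteracy": 0.15, "unemployment": 0.10}   # assumed weights, sum to 1

# Weighted overlay: risk index = sum_j w_j * layer_j
risk_index = sum(weights[name] * raster for name, raster in layers.items())

# Stratify into "risk" / "non-risk" at an assumed cut-off of 3.0
risk_zone = risk_index >= 3.0
print(risk_index.round(2))
print(risk_zone)
```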
Abstract:
In the traditional actuarial risk model, if the surplus is negative, the company is ruined and has to go out of business. In this paper we distinguish between ruin (negative surplus) and bankruptcy (going out of business), where the probability of bankruptcy is a function of the level of negative surplus. The idea for this notion of bankruptcy comes from the observation that in some industries, companies can continue doing business even though they are technically ruined. Assuming that dividends can only be paid with a certain probability at each point of time, we derive closed-form formulas for the expected discounted dividends until bankruptcy under a barrier strategy. Subsequently, the optimal barrier is determined, and several explicit identities for the optimal value are found. The surplus process of the company is modeled by a Wiener process (Brownian motion).
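A Monte Carlo sketch of the ruin-versus-bankruptcy distinction for a Brownian surplus under a dividend barrier, assuming an illustrative bankruptcy-rate function of the negative surplus, a Poissonian dividend-decision clock and illustrative parameters; this approximates the objective numerically and is not the paper's closed-form solution:

```python
import numpy as np

def discounted_dividends(barrier, u0=2.0, mu=0.5, sigma=1.0, delta=0.05, gamma=5.0,
                         bankruptcy_rate=lambda x: 2.0 * max(-x, 0.0),
                         dt=0.01, T=30.0, n_paths=300, seed=0):
    """Expected discounted dividends until bankruptcy under a barrier strategy.

    The surplus follows a Brownian motion with drift.  Dividends can only be
    paid at the ring times of an independent Poisson clock (rate gamma); at
    such a time the excess of the surplus over the barrier is paid out.
    While the surplus is negative (ruin), bankruptcy occurs at rate
    bankruptcy_rate(surplus), i.e. with probability rate*dt per time step.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_paths):
        x, t, paid = u0, 0.0, 0.0
        while t < T:
            x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            if x > barrier and rng.random() < gamma * dt:
                paid += np.exp(-delta * t) * (x - barrier)   # pay the excess as a dividend
                x = barrier
            elif x < 0 and rng.random() < bankruptcy_rate(x) * dt:
                break                                        # bankruptcy: company exits
            t += dt
        total += paid
    return total / n_paths

# Compare a few candidate barriers to locate a rough optimum
for b in (0.5, 1.5, 3.0):
    print(b, round(discounted_dividends(b), 3))
```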
Abstract:
Background: Post-surgical management of stage I seminoma includes surveillance with repeated CT scans, with treatment reserved for those who relapse, or adjuvant treatment with either immediate radiation therapy (RT) or carboplatin. Cancer-specific survival is close to 100%, so cure without long-term sequelae of treatment is the aim. Our goal is to estimate the risk of death from radiation-induced secondary cancers (SC) for patients undergoing surveillance (S), adjuvant RT or adjuvant carboplatin (AC).

Materials and Methods: We measured organ doses from CT scans (3 phases each) of a seminoma patient who was part of the active surveillance strategy, and from a man undergoing adjuvant 20-Gy RT and a 30-Gy salvage RT treatment to the para-aortic area using helical intensity-modulated RT (Tomotherapy®) with accurate delineation of organs at risk and a CTV-to-PTV expansion of 1 cm. Effective doses to organs in mSv were estimated according to the tissue-weighting factor recommendations of the International Commission on Radiological Protection 103 (Ann ICRP 2007). We estimated SC incidence and mortality for a population of 10,000 people based on the excess absolute risk model from the Biological Effects of Ionizing Radiation (BEIR) VII report (Health Risks from Exposure to Low Levels of Ionizing Radiation, NRC, The National Academies Press, Washington, DC, 2006), assuming a seminoma diagnosis at age 30 and a total life expectancy of 80 years.

Results: The nominal risk of a fatal secondary cancer was calculated as 1.5% for 15 abdominal CT scans, 14.8% for adjuvant RT (20 Gy, para-aortic field) and 22.2% for salvage RT (30 Gy). The calculation assumed that the risk of relapse on surveillance and on AC was 15% and 4% respectively, and that all relapsing patients were salvaged with RT.

Strategy | CT abdomen/pelvis (n = SC risk) | RT (dose, % of patients = SC risk) | Total SC risk
Active surveillance | 15 = 1.5% | 30 Gy in 15% of pts = 3.3% | 4.8%
Adjuvant carboplatin | 7 = 0.7% | 30 Gy in 4% of pts = 0.88% | 1.58%
Adjuvant radiotherapy | 7 = 0.7% | 20 Gy in 100% of pts = 14.8% | 15.5%

Conclusions: These data suggest that: 1) adjuvant radiotherapy is harmful and should no longer be regarded as a standard option for stage I seminoma; 2) AC seems to be an option to reduce radiation-induced cancers. Limitations: the study does not consider secondary cancers due to chemotherapy with AC (unknown), and the use of BEIR VII for risk modeling with the higher doses of RT needs to be validated.
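The totals in the table follow from composing the CT-induced risk with the RT-induced risk weighted by the fraction of patients who receive that RT course; a quick check using the figures quoted in the abstract:

```python
# Total SC risk = CT-induced risk + (fraction receiving RT) x (SC risk of that RT course)
strategies = {
    # name: (CT-induced risk, fraction treated with RT, SC risk of the RT course)
    "Active surveillance":   (0.015, 0.15, 0.222),
    "Adjuvant carboplatin":  (0.007, 0.04, 0.222),
    "Adjuvant radiotherapy": (0.007, 1.00, 0.148),
}
for name, (ct_risk, frac_rt, rt_risk) in strategies.items():
    total = ct_risk + frac_rt * rt_risk
    print(f"{name}: {total:.2%}")   # ~4.83%, ~1.59%, ~15.50%
```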
Abstract:
OBJECTIVE: The aim of this study was to assess the association between frailty and risk for heart failure (HF) in older adults. BACKGROUND: Frailty is common in the elderly and is associated with adverse health outcomes. The impact of frailty on HF risk is not known. METHODS: We assessed the association between frailty, using the Health ABC Short Physical Performance Battery (HABC Battery) and the Gill index, and incident HF in 2825 participants aged 70 to 79 years. RESULTS: Mean age of participants was 74 ± 3 years; 48% were men and 59% were white. During a median follow-up of 11.4 (7.1-11.7) years, 466 participants developed HF. Compared to non-frail participants, moderate (HR 1.36, 95% CI 1.08-1.71) and severe frailty (HR 1.88, 95% CI 1.02-3.47) by the Gill index were associated with a higher risk for HF. The HABC Battery score was linearly associated with HF risk after adjusting for the Health ABC HF Model (HR 1.24, 95% CI 1.13-1.36 per SD decrease in score) and remained significant when death was controlled for as a competing risk (HR 1.30; 95% CI 1.00-1.55). Results were comparable across age, sex, and race, and in sub-groups based on diabetes mellitus or cardiovascular disease at baseline. Addition of HABC Battery scores to the Health ABC HF Risk Model improved discrimination (change in C-index, 0.014; 95% CI 0.010-0.018) and appropriately reclassified 13.4% (net reclassification improvement 0.073, 95% CI 0.021-0.125; P = .006) of participants (8.3% who developed HF and 5.1% who did not). CONCLUSIONS: Frailty is independently associated with risk of HF in older adults.
Abstract:
We characterize the value function of maximizing the total discounted utility of dividend payments for a compound Poisson insurance risk model when strictly positive transaction costs are included, leading to an impulse control problem. We illustrate that well-known simple strategies can be optimal in the case of exponential claim amounts. Finally, we develop a numerical procedure to deal with general claim amount distributions.
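A simulation sketch of the ingredients of this control problem, assuming exponential claims and a simple illustrative impulse strategy that, whenever the surplus exceeds an upper level, pays it down to a lower level at a fixed transaction cost K; the strategy, the (identity) utility and all parameters are placeholders, not the optimal policy characterized in the paper:

```python
import numpy as np

def discounted_utility_of_dividends(upper=8.0, lower=3.0, u0=5.0, c=1.5,
                                    lam=1.0, claim_mean=1.0, K=0.5,
                                    delta=0.05, T=100.0, n_paths=3000, seed=0,
                                    utility=lambda x: x):
    """Monte Carlo value of an impulse dividend strategy for the
    Cramer-Lundberg surplus: premiums at rate c, Poisson(lam) claims with
    exponential sizes.  When the surplus reaches `upper`, the amount
    (upper - lower - K) is paid as a dividend (net of the fixed cost K) and
    the surplus drops to `lower`.  Stops at ruin or at the horizon T.
    """
    rng = np.random.default_rng(seed)
    values = np.empty(n_paths)
    for p in range(n_paths):
        x, t, value = u0, 0.0, 0.0
        while t < T:
            w = rng.exponential(1.0 / lam)            # time to the next claim
            # does the surplus hit `upper` before the next claim arrives?
            if x < upper and x + c * w > upper:
                t_hit = t + (upper - x) / c
                value += np.exp(-delta * t_hit) * utility(upper - lower - K)
                x, t = lower, t_hit                   # restart the (memoryless) claim clock
                continue
            t += w
            x += c * w - rng.exponential(claim_mean)  # premium income minus the claim
            if x < 0:
                break                                  # ruin
        values[p] = value
    return values.mean()

print(round(discounted_utility_of_dividends(), 3))
```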
Abstract:
This paper studies a risk measure inherited from ruin theory and investigates some of its properties. Specifically, we consider a value-at-risk (VaR)-type risk measure defined as the smallest initial capital needed to ensure that the ultimate ruin probability is less than a given level. This VaR-type risk measure turns out to be equivalent to the VaR of the maximal deficit of the ruin process in infinite time. A related Tail-VaR-type risk measure is also discussed.
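As a hedged worked example (assuming the classical Cramér-Lundberg model with exponential claims, which need not be the setting of the paper), the ruin probability has the closed form ψ(u) = (λμ/c)·exp(−(1/μ − λ/c)·u), so the VaR-type measure, i.e. the smallest initial capital u with ψ(u) ≤ α, can be computed directly:

```python
import math

def ruin_probability(u, lam=1.0, mu=1.0, c=1.5):
    """Ultimate ruin probability for the Cramer-Lundberg model with
    Poisson(lam) claims of exponential size (mean mu) and premium rate c."""
    rho = lam * mu / c                 # must be < 1 (net profit condition)
    R = 1.0 / mu - lam / c             # adjustment coefficient
    return rho * math.exp(-R * u)

def var_type_capital(alpha, lam=1.0, mu=1.0, c=1.5):
    """Smallest initial capital u such that the ruin probability is <= alpha."""
    rho = lam * mu / c
    R = 1.0 / mu - lam / c
    return max(0.0, math.log(rho / alpha) / R)

u_star = var_type_capital(alpha=0.01)
print(u_star, ruin_probability(u_star))   # ruin probability equals 0.01 at u_star
```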
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of actions, of the consequences of inappropriate decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazard, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-hazard risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as the intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including to 160 governments. The results and the data generated are made available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas.

The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
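A minimal sketch of the kind of index construction described above, assuming gridded hazard frequency, exposed population and a vulnerability proxy combined multiplicatively and then aggregated by country; the multiplicative form, the toy layers and the aggregation are illustrative assumptions, not the model or data used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 1000                                            # toy grid cells

# Illustrative gridded inputs
hazard_frequency = rng.gamma(2.0, 0.5, n_cells)           # expected events per year
exposed_population = rng.lognormal(8.0, 1.0, n_cells)     # people per cell
vulnerability = rng.uniform(0.001, 0.02, n_cells)         # fraction affected per event
country = rng.integers(0, 5, n_cells)                     # country id of each cell

# Cell-level risk = hazard x exposure x vulnerability (expected people affected per year)
cell_risk = hazard_frequency * exposed_population * vulnerability

# Aggregate to a country-level index for comparison
for cid in range(5):
    mask = country == cid
    print(f"country {cid}: expected people affected per year ~ {cell_risk[mask].sum():,.0f}")
```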