955 results for Multivariate risk model
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Manufacturing companies today face an increasingly challenging environment due to the unpredictability of markets. To survive, enterprises need to keep innovating and deliver products with new internal or external characteristics. Strategies and solutions exist at every organisational level, from strategic to operational, and technology is advancing fastest at the operational level, more specifically in manufacturing systems. This means that companies have to cope with the changes brought by emergent manufacturing systems, even though these can be expensive and difficult to implement. An agile manufacturing system can help cope with market changeability. Evolvable Production Systems (EPS) is an emergent paradigm that aims to bring new solutions for dealing with changeability. The paradigm is characterised by modularity and intends to introduce high flexibility and dynamism at the shop-floor level through the evolution of new computational devices and technology. This new approach gives enterprises the ability to plug and unplug devices, allowing fast reformulation of the production line without reprogramming. There is no doubt about the advantages and benefits of this emerging technology, but its feasibility and applicability are still in question. Most research in this area focuses on the technical side, explaining the advantages of these systems, while insufficient work discusses the implementation risks from other perspectives, including that of the business owner. The main objective of this work is to propose a methodology and model to identify, classify and measure the potential risks associated with implementing this emergent paradigm. To quantify the proposed comprehensive risk model, an intelligent decision system is developed employing a Fuzzy Inference System to capture the knowledge of experts, as there are neither historical data nor sufficient research in this area.
The result is a vulnerability assessment of implementing EPS technology in manufacturing companies, with a focus on SMEs. The present dissertation drew on the knowledge and experience of experts involved in the FP7 project IDEAS, one of the leading projects in this area.
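The quantification step the abstract describes, expert judgments fed through a Fuzzy Inference System, can be sketched as follows. This is a minimal Mamdani-style illustration with assumed membership ranges, rule base, and output centroids; it is not the dissertation's actual model, and the names `tri`, `fuzzify`, and `risk_score` are hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(x):
    """Map a 0-10 expert rating to low/med/high membership degrees (assumed ranges)."""
    return {"low": tri(x, -5.0, 0.0, 5.0),
            "med": tri(x, 0.0, 5.0, 10.0),
            "high": tri(x, 5.0, 10.0, 15.0)}

def risk_score(likelihood, impact):
    """Combine two expert ratings (0-10) into a crisp 0-10 risk score."""
    L, I = fuzzify(likelihood), fuzzify(impact)
    centers = {"low": 1.5, "med": 5.0, "high": 8.5}  # assumed output-set centroids
    order = ["low", "med", "high"]
    num = den = 0.0
    for li in order:
        for ii in order:
            strength = min(L[li], I[ii])  # rule strength: min of the antecedents
            # Assumed rule base: resulting risk level is the worse of the two inputs.
            level = order[max(order.index(li), order.index(ii))]
            num += strength * centers[level]
            den += strength
    return num / den if den else 0.0
```

A real system would elicit the membership functions and rules from the experts themselves; the point here is only the fuzzify/infer/defuzzify pipeline.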
Abstract:
Background: Several researchers seek methods for the selection of homogeneous groups of animals in experimental studies, because homogeneity is an indispensable prerequisite for the randomization of treatments. The lack of robust methods that comply with statistical and biological principles leads researchers to use empirical or subjective methods, influencing their results. Objective: To develop a multivariate statistical model for the selection of a homogeneous group of animals for experimental research, and to develop a computational package implementing it. Methods: The set of echocardiographic data of 115 male Wistar rats with supravalvular aortic stenosis (AoS) was used as an example for model development. Initially, the data were standardized to become dimensionless. Then, the variance matrix of the set was submitted to principal components analysis (PCA), aiming to reduce the parametric space while retaining the relevant variability. That technique established a new Cartesian system into which the animals were allocated, and finally a confidence region (ellipsoid) was built for the profile of the animals' homogeneous responses. The animals located inside the ellipsoid were considered as belonging to the homogeneous batch; those outside the ellipsoid were considered spurious. Results: The PCA established eight descriptive axes that retained 88.71% of the accumulated variance of the data set. The allocation of the animals in the new system and the construction of the confidence region revealed six spurious animals, leaving a homogeneous batch of 109 animals. Conclusion: The biometric criterion presented proved to be effective, because it considers the animal as a whole, analyzing all measured parameters jointly, in addition to having a small discard rate.
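The selection pipeline described above (standardize, project onto principal components, flag animals outside a chi-square confidence ellipsoid) can be sketched in numpy. The data below are synthetic stand-ins, and the number of retained components and the 95% cutoff are illustrative assumptions, not the study's choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(115, 6))  # synthetic stand-in: 115 animals x 6 parameters
X[:3] += 6                     # plant three clearly spurious animals

# 1) Standardize so all parameters become dimensionless.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2) PCA via eigendecomposition of the variance (covariance) matrix.
cov = np.cov(Z, rowvar=False)
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

k = 2                          # retain the leading components (assumed)
scores = Z @ vecs[:, :k]

# 3) Confidence ellipsoid: Mahalanobis distance in PC space vs. chi-square cutoff.
d2 = (scores ** 2 / vals[:k]).sum(axis=1)
CHI2_95_DF2 = 5.991            # tabulated chi-square 0.95 quantile, 2 df
spurious = np.where(d2 > CHI2_95_DF2)[0]
homogeneous = np.where(d2 <= CHI2_95_DF2)[0]
```

Animals whose squared distance exceeds the cutoff fall outside the ellipsoid and are discarded; the rest form the homogeneous batch.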
Abstract:
Abstract Background: 30%-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% had ambulatory class IV. Clinical, electrocardiographic and echocardiographic variables were assessed by using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25% and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD and hospitalization due to congestive heart failure increased the risk of death at hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors of mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on the analysis of simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and adjustment, were validated internally, and are useful in the selection, monitoring and counseling of patients indicated for CRT.
Abstract:
Abstract Background: Cardiac resynchronization therapy (CRT) is the treatment recommended by leading global guidelines. However, 30%-40% of selected patients are non-responders. Objective: To develop an echocardiographic model to predict cardiac death or transplantation (Tx) 1 year after CRT. Method: Observational, prospective study, with the inclusion of 116 patients, aged 64.89 ± 11.18 years, 69.8% male, 68.1% in NYHA FC III and 31.9% in FC IV, 71.55% with left bundle-branch block, and median ejection fraction (EF) of 29%. Evaluations were made in the pre-implantation period and 6-12 months afterwards, and correlated with cardiac mortality/Tx at the end of follow-up. Cox and logistic regression analyses were performed with ROC and Kaplan-Meier curves. The model was internally validated by bootstrapping. Results: There were 29 (25%) deaths/Tx during follow-up of 34.09 ± 17.9 months. Cardiac mortality/Tx was 16.3%. In the multivariate Cox model, EF < 30%, grade III/IV diastolic dysfunction and grade III mitral regurgitation at 6-12 months were independently related to increased cardiac mortality or Tx, with hazard ratios of 3.1, 4.63 and 7.11, respectively. The area under the ROC curve was 0.78. Conclusion: EF lower than 30%, severe diastolic dysfunction and severe mitral regurgitation indicate poor prognosis 1 year after CRT. The combination of two of those variables indicates the need for other treatment options.
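Both CRT studies above rely on Kaplan-Meier curves to describe survival. A minimal pure-Python Kaplan-Meier estimator shows the underlying computation; the follow-up data in the usage example are invented for illustration, not the studies' data.

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  : follow-up times (e.g. months)
    events : 1 if the endpoint (e.g. cardiac death/Tx) occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # product-limit update
            out.append((t, surv))
        n_at_risk -= ties                    # drop everyone observed at time t
        i += ties
    return out

# Toy example: 5 patients, deaths at t=1, 3, 4; censored at t=2, 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

Survival drops only at event times; censored patients simply leave the risk set.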
Abstract:
This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models: a deflated risk model and a reinsurance risk model. We investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations. Finally, the findings are formulated into applications in insurance, for instance approximations of Value-at-Risk and conditional tail expectations.
The second part of the thesis is devoted to three bivariate models. The first model concerns bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible thanks to a tuning parameter, and their asymptotic distributions are obtained under second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real insurance data set. The objective of the second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are widely useful in statistical applications; they are generated mainly by the normal mean-variance mixture and the scaled skew-normal mixture, which distinguish the tail dependence structure as shown by our principal results.
The third bivariate risk model concerns the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
Abstract:
Remote sensing and geographical information technologies were used to discriminate areas of high and low risk for contracting kala-azar, or visceral leishmaniasis. Satellite data were digitally processed to generate maps of land cover and spectral indices, such as the normalised difference vegetation index and wetness index. To map estimated vector abundance and indoor climate data, local polynomial interpolation based on weighted values was used. Attribute layers were prepared based on illiteracy and the unemployed proportion of the population and associated with village boundaries. Pearson's correlation coefficient was used to estimate the relationship between environmental variables and disease incidence across the study area. The cell values for each input raster in the analysis were assigned values from the evaluation scale. Simple weightings/ratings based on the degree of favourable conditions for kala-azar transmission were used for all the variables, leading to a geo-environmental risk model. Variables such as land use/land cover, vegetation conditions, surface dampness, the indoor climate, illiteracy rates and the size of the unemployed population were considered for inclusion in the geo-environmental kala-azar risk model. The risk model was stratified into areas of "risk" and "non-risk" for the disease, based on calculation of risk indices. The described approach constitutes a promising tool for microlevel kala-azar surveillance and aids in directing control efforts.
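The weighted-overlay step the abstract outlines (cells rated on a common evaluation scale, ratings combined by weights, then stratified into "risk"/"non-risk" by a cutoff on the index) might look like this in numpy. The toy rasters, weights, and cutoff are assumptions for illustration, not the study's values.

```python
import numpy as np

# Toy 3x3 rasters already rated on a 1-5 favourability scale for transmission.
vegetation = np.array([[5, 4, 1], [3, 2, 1], [4, 5, 2]])
wetness    = np.array([[4, 4, 2], [2, 1, 1], [5, 4, 1]])
illiteracy = np.array([[3, 5, 1], [2, 2, 1], [4, 3, 2]])

# Assumed relative weights reflecting each variable's favourability for transmission.
weights = {"veg": 0.4, "wet": 0.35, "ill": 0.25}

# Weighted overlay: per-cell risk index.
index = (weights["veg"] * vegetation
         + weights["wet"] * wetness
         + weights["ill"] * illiteracy)

# Stratify into "risk" / "non-risk" zones with an assumed cutoff.
risk_zone = index >= 3.0
```

A real analysis would use georeferenced rasters (e.g. via a GIS library) and weights derived from the correlation analysis, but the arithmetic is the same.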
Abstract:
In the traditional actuarial risk model, if the surplus is negative, the company is ruined and has to go out of business. In this paper we distinguish between ruin (negative surplus) and bankruptcy (going out of business), where the probability of bankruptcy is a function of the level of negative surplus. The idea for this notion of bankruptcy comes from the observation that in some industries, companies can continue doing business even though they are technically ruined. Assuming that dividends can only be paid with a certain probability at each point of time, we derive closed-form formulas for the expected discounted dividends until bankruptcy under a barrier strategy. Subsequently, the optimal barrier is determined, and several explicit identities for the optimal value are found. The surplus process of the company is modeled by a Wiener process (Brownian motion).
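As a baseline for the model above, the classical de Finetti dividend problem for a Brownian surplus (without the paper's bankruptcy-probability feature) has a well-known closed form: under a barrier strategy at level b, the expected discounted dividends are V(x; b) = h(x)/h'(b) with h(x) = exp(r1 x) - exp(r2 x), where r1 > 0 > r2 solve (sigma^2/2) r^2 + mu r - delta = 0, and the optimal barrier satisfies h''(b*) = 0. A sketch with illustrative parameters:

```python
import math

mu, sigma, delta = 0.5, 1.0, 0.05  # drift, volatility, discount rate (assumed)

# Roots of (sigma^2/2) r^2 + mu r - delta = 0.
disc = math.sqrt(mu * mu + 2 * delta * sigma * sigma)
r1 = (-mu + disc) / (sigma * sigma)  # positive root
r2 = (-mu - disc) / (sigma * sigma)  # negative root

def h(x):
    return math.exp(r1 * x) - math.exp(r2 * x)

def hp(b):
    return r1 * math.exp(r1 * b) - r2 * math.exp(r2 * b)

def value(x, b):
    """Expected discounted dividends from surplus x under barrier b (for x <= b)."""
    return h(x) / hp(b)

# Optimal barrier: h''(b*) = 0  =>  b* = ln(r2^2 / r1^2) / (r1 - r2).
b_star = math.log(r2 * r2 / (r1 * r1)) / (r1 - r2)
```

The paper's modifications (probabilistic bankruptcy at negative surplus, dividends payable only with some probability) change h and the optimality condition, but the barrier-value structure V = h(x)/h'(b) is the template being generalized.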
Abstract:
BACKGROUND: Many factors affect survival in haemodialysis (HD) patients. Our aim was to study whether the quality of clinical care affects survival in this population, when adjusted for demographic characteristics and co-morbidities. METHODS: We studied survival in 553 patients treated by chronic HD during March 2001 in 21 dialysis facilities in western Switzerland. Indicators of quality of care were established for anaemia control, calcium-phosphate product, serum albumin, pre-dialysis blood pressure (BP), type of vascular access and dialysis adequacy (spKt/V), and their baseline values were related to 3-year survival. The modified Charlson co-morbidity index (including age) and transplantation status were also considered as predictors of survival. RESULTS: Three-year survival status was obtained for 96% of the patients; 39% (211/541) of these patients had died. The 3-year survival was 50%, 62% and 69%, respectively, in patients who had 0-2, 3 and ≥4 fulfilled indicators of quality of care (test for linear trend, P < 0.001). In a Cox multivariate analysis model, the absence of transplantation, a higher modified Charlson score, decreased fulfilment of indicators of good clinical care and low pre-dialysis systolic BP were independent predictors of death. CONCLUSION: Good clinical care improves survival in HD patients, even after adjustment for availability of transplantation and co-morbidities.
Abstract:
Background: Post-surgical management of stage I seminoma includes surveillance (S) with repeated CT scans, with treatment reserved for those who relapse, or adjuvant treatment with either immediate radiation therapy (RT) or carboplatin. Cancer-specific survival is close to 100%; cure without long-term sequelae of treatment is the aim. Our goal is to estimate the risk of death from radiation-induced secondary cancers (SC) for patients undergoing S, adjuvant RT or adjuvant carboplatin (AC). Materials and Methods: We measured organ doses from the CT scans (three phases each) of a seminoma patient who was part of the active surveillance strategy, and from a man undergoing adjuvant 20-Gy RT and 30-Gy salvage RT to the para-aortic area using helical intensity-modulated RT (Tomotherapy®), with accurate delineation of organs at risk and a CTV-to-PTV expansion of 1 cm. Effective doses to organs in mSv were estimated according to the tissue-weighting factor recommendations of the International Commission on Radiological Protection 103 (Ann ICRP 2007). We estimated SC incidence and mortality for a population of 10,000 people based on the excess absolute risk model from the Biological Effects of Ionizing Radiation (BEIR) VII report (Health Risks from Exposure to Low Levels of Ionizing Radiation, NRC, The National Academies Press, Washington, DC, 2006), assuming a seminoma diagnosis at age 30 and a total life expectancy of 80 years. Results: The nominal risk of a fatal secondary cancer was calculated as 1.5% for 15 abdominal CT scans, 14.8% for adjuvant RT (20-Gy para-aortic field) and 22.2% for salvage RT (30 Gy). The calculation assumed that the risk of relapse on surveillance and on AC was 15% and 4%, respectively, and that all patients were salvaged at relapse with RT.
Strategy | n CT abdomen/pelvis = SC % | RT dose and % receiving treatment = SC % | Total SC risk (%)
Active surveillance | 15 = 1.5% | 30 Gy in 15% of pts = 3.3% | 4.8
Adjuvant carboplatin | 7 = 0.7% | 30 Gy in 4% of pts = 0.88% | 1.58
Adjuvant radiotherapy | 7 = 0.7% | 20 Gy in 100% of pts = 14.8% | 15.5
Conclusions: These data suggest that: 1) adjuvant radiotherapy is harmful and should no longer be regarded as a standard option for stage I seminoma; 2) AC appears to be an option for reducing radiation-induced cancers. Limitations: the study does not consider secondary cancers due to chemotherapy with AC (unknown), and the use of BEIR VII for risk modelling with higher RT doses needs to be validated.
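The effective-dose step described in the methods (per-organ doses weighted by ICRP 103 tissue-weighting factors) reduces to a weighted sum. A sketch with a subset of the published factors and invented organ doses, not the study's measurements:

```python
# Subset of ICRP Publication 103 tissue-weighting factors
# (dimensionless; the full set sums to 1 over all tissues).
W_T = {"gonads": 0.08, "colon": 0.12, "stomach": 0.12,
       "bladder": 0.04, "liver": 0.04, "bone_marrow": 0.12}

def effective_dose(organ_doses_mSv):
    """E = sum over tissues T of w_T * H_T, for the organs actually estimated."""
    return sum(W_T[t] * d for t, d in organ_doses_mSv.items())

# Hypothetical per-organ equivalent doses (mSv) from one abdominal CT phase.
ct_phase = {"colon": 12.0, "stomach": 10.0, "liver": 11.0, "bladder": 8.0}
```

Repeating this over the scan phases and RT plans yields the per-strategy effective doses that feed the BEIR VII excess absolute risk model.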
Abstract:
OBJECTIVE: The aim of this study was to assess the association between frailty and risk for heart failure (HF) in older adults. BACKGROUND: Frailty is common in the elderly and is associated with adverse health outcomes; its impact on HF risk is not known. METHODS: We assessed the association between frailty, using the Health ABC Short Physical Performance Battery (HABC Battery) and the Gill index, and incident HF in 2825 participants aged 70 to 79 years. RESULTS: Mean age of participants was 74 ± 3 years; 48% were men and 59% were white. During a median follow-up of 11.4 (7.1-11.7) years, 466 participants developed HF. Compared to non-frail participants, those with moderate (HR 1.36, 95% CI 1.08-1.71) and severe frailty (HR 1.88, 95% CI 1.02-3.47) by the Gill index had a higher risk for HF. The HABC Battery score was linearly associated with HF risk after adjusting for the Health ABC HF Model (HR 1.24, 95% CI 1.13-1.36 per SD decrease in score) and remained significant when death was controlled for as a competing risk (HR 1.30; 95% CI 1.00-1.55). Results were comparable across age, sex and race, and in sub-groups based on diabetes mellitus or cardiovascular disease at baseline. Addition of HABC Battery scores to the Health ABC HF Risk Model improved discrimination (change in C-index, 0.014; 95% CI 0.018-0.010) and appropriately reclassified 13.4% (net reclassification improvement 0.073, 95% CI 0.021-0.125; P = .006) of participants (8.3% who developed HF and 5.1% who did not). CONCLUSIONS: Frailty is independently associated with risk of HF in older adults.
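The net reclassification improvement quoted above has a simple closed form: NRI = (up - down)/events + (down - up)/non-events, counting moves between risk categories after the new marker is added. A sketch with made-up reclassification counts, not the study's data:

```python
def nri(up_events, down_events, n_events, up_non_events, down_non_events, n_non_events):
    """Two-category net reclassification improvement.

    'Up' moves (to a higher risk category) are good for events and bad for
    non-events; 'down' moves are the reverse.
    """
    return ((up_events - down_events) / n_events
            + (down_non_events - up_non_events) / n_non_events)

# Hypothetical example: among 466 events, 60 move up and 21 move down;
# among 2359 non-events, 80 move down and 45 move up.
improvement = nri(60, 21, 466, 45, 80, 2359)
```

A symmetric pattern of moves (equal up and down in each group) yields an NRI of zero, i.e. no net gain from the added marker.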
Abstract:
We characterize the value function of maximizing the total discounted utility of dividend payments for a compound Poisson insurance risk model when strictly positive transaction costs are included, leading to an impulse control problem. We illustrate that well-known simple strategies can be optimal in the case of exponential claim amounts. Finally, we develop a numerical procedure to deal with general claim amount distributions.
Abstract:
This paper studies a risk measure inherited from ruin theory and investigates some of its properties. Specifically, we consider a value-at-risk (VaR)-type risk measure defined as the smallest initial capital needed to ensure that the ultimate ruin probability is less than a given level. This VaR-type risk measure turns out to be equivalent to the VaR of the maximal deficit of the ruin process in infinite time. A related Tail-VaR-type risk measure is also discussed.
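For the classical Cramér-Lundberg model with exponential claims, this VaR-type risk measure (the smallest initial capital keeping the ultimate ruin probability below a given level) can be computed in closed form, since the ruin probability itself is explicit. A sketch with illustrative parameters, not the paper's setting:

```python
import math

# Assumed parameters: Poisson claim rate, exp(beta) claim sizes, premium rate.
# The net profit condition c > lam / beta must hold.
lam, beta, c = 1.0, 1.0, 1.5

def psi(u):
    """Ultimate ruin probability with exponential claims:
    psi(u) = (lam / (c * beta)) * exp(-(beta - lam / c) * u)."""
    return (lam / (c * beta)) * math.exp(-(beta - lam / c) * u)

def var_type_capital(eps):
    """Smallest initial capital u >= 0 with psi(u) <= eps (closed form, clipped at 0)."""
    u = math.log(lam / (c * beta * eps)) / (beta - lam / c)
    return max(u, 0.0)
```

For tolerance levels above psi(0) no capital is needed at all, which is why the closed-form solution is clipped at zero.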
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.
Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.
The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.
The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters), and several thousand hours of processing were necessary. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors, for several hazards (floods, tropical cyclones, earthquakes and landslides). Two different multiple risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.
Locally, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation exacerbates landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of possible evolution.
These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
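The country-comparison step described above (combining per-hazard risk estimates into a multiple risk index) can be sketched as a min-max normalisation followed by a weighted sum. The countries, values and weights below are invented for illustration, not the study's data:

```python
import numpy as np

countries = ["A", "B", "C", "D"]
# Rows: countries; columns: flood, cyclone, earthquake, landslide risk estimates
# (in arbitrary, hazard-specific units before normalisation).
risk = np.array([[10.0, 0.0, 5.0, 1.0],
                 [ 2.0, 8.0, 1.0, 0.0],
                 [ 6.0, 4.0, 9.0, 3.0],
                 [ 0.0, 1.0, 0.0, 2.0]])

# Min-max normalise each hazard column so the units become comparable.
norm = (risk - risk.min(axis=0)) / (risk.max(axis=0) - risk.min(axis=0))

# Assumed per-hazard weights; a weighted sum gives the multiple risk index.
weights = np.array([0.3, 0.3, 0.2, 0.2])
index = norm @ weights

ranking = [countries[i] for i in np.argsort(index)[::-1]]
```

Real indexes of this kind need care with zero-range columns and with the choice of weights, which materially drive the country ranking.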
Abstract:
BACKGROUND: Single-nucleotide polymorphisms (SNPs) in immune genes have been associated with susceptibility to invasive mold infection (IMI) among hematopoietic stem cell transplant recipients, but not among solid-organ transplant (SOT) recipients. METHODS: Twenty-four SNPs from systematically selected genes were genotyped among 1101 SOT recipients (715 kidney transplant recipients, 190 liver transplant recipients, 102 lung transplant recipients, 79 heart transplant recipients, and 15 recipients of other transplants) from the Swiss Transplant Cohort Study. Associations between SNPs and the end point were assessed by the log-rank test and Cox regression models. Cytokine production upon Aspergillus stimulation was measured by enzyme-linked immunosorbent assay in peripheral blood mononuclear cells (PBMCs) from healthy volunteers and correlated with the relevant genotypes. RESULTS: Mold colonization (n = 45) and proven/probable IMI (n = 26) were associated with polymorphisms in the genes encoding interleukin 1β (IL1B; rs16944; recessive mode, P = .001 for colonization and P = .00005 for IMI, by the log-rank test), interleukin 1 receptor antagonist (IL1RN; rs419598; P = .01 and P = .02, respectively), and β-defensin 1 (DEFB1; rs1800972; P = .001 and P = .0002, respectively). The associations with IL1B and DEFB1 remained significant in a multivariate regression model (P = .002 for IL1B rs16944; P = .01 for DEFB1 rs1800972). The presence of 2 copies of the rare allele of rs16944 or rs419598 was associated with reduced Aspergillus-induced interleukin 1β and tumor necrosis factor α secretion by PBMCs. CONCLUSIONS: Functional polymorphisms in IL1B and DEFB1 influence susceptibility to mold infection in SOT recipients. This observation may contribute to individual risk stratification.
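The log-rank test used above compares time-to-event outcomes between genotype groups. A minimal pure-Python two-sample log-rank chi-square statistic (1 df), on invented data; real analyses would use a statistics package, and this sketch omits the p-value lookup:

```python
def logrank_chi2(t1, e1, t2, e2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).

    t1, e1 : times and event flags (1 = event, 0 = censored) for group 1
    t2, e2 : the same for group 2
    """
    event_times = sorted({t for t, e in list(zip(t1, e1)) + list(zip(t2, e2)) if e == 1})
    o_minus_e = var = 0.0
    for t in event_times:
        n1 = sum(1 for x in t1 if x >= t)          # at risk in group 1
        n2 = sum(1 for x in t2 if x >= t)          # at risk in group 2
        n = n1 + n2
        d1 = sum(1 for x, e in zip(t1, e1) if x == t and e == 1)
        d2 = sum(1 for x, e in zip(t2, e2) if x == t and e == 1)
        d = d1 + d2
        o_minus_e += d1 - d * n1 / n               # observed minus expected, group 1
        if n > 1:                                  # hypergeometric variance term
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e * o_minus_e / var if var > 0 else 0.0

# Toy example: group 1 fails early (t = 1, 2), group 2 late (t = 3, 4).
chi2 = logrank_chi2([1, 2], [1, 1], [3, 4], [1, 1])
```

Identical groups give a statistic of zero; the larger the statistic, the stronger the evidence that the survival curves differ.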