972 results for risk modelling
Abstract:
The aim of this research was primarily to examine the relevance of patient parameters, ward structures, procedures and practices to the potential hazards of wound cross-infection and of nasal colonisation with multiply resistant strains of Staphylococcus aureus, which is thought to provide a useful indication of a patient's general susceptibility to wound infection. Information from a large cross-sectional survey involving 12,000 patients from some 41 hospitals and 375 wards was collected over the five-year period 1967-72, and its validity was checked before any subsequent analysis was carried out. Many environmental factors and procedures previously thought (but never conclusively proved) to influence wound infection or nasal colonisation rates were assessed and subsequently dismissed as not significant, provided that the standard of the current range of practices and procedures is maintained and not allowed to deteriorate. Retrospective analysis revealed that the probability of wound infection was influenced by the patient's age, duration of pre-operative hospitalisation, sex, type of wound, presence and type of drain, number of patients in the ward, and other special risk factors, whilst nasal colonisation was influenced by the patient's age, total duration of hospitalisation, sex, antibiotics, proportion of occupied beds in the ward, average distance between bed centres and special risk factors. A multivariate regression technique was used to develop statistical models consisting of the patient and environmental factors found to have a significant influence on the risks of wound infection and nasal colonisation.
A relationship between wound infection and nasal colonisation was then established and this led to the development of a more advanced model for predicting wound infections, taking advantage of the additional knowledge of the patient's state of nasal colonisation prior to operation.
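The kind of multivariate model described above can be sketched as a logistic regression over patient and ward factors. Everything below is illustrative: the coefficients, the choice of predictors and the function name are invented for the sketch and are not the values fitted in the survey.

```python
import math

def wound_infection_risk(age, preop_days, male, drain, ward_size):
    """Hypothetical logistic risk model; coefficients are illustrative
    placeholders, not the values fitted in the survey."""
    z = (-4.0                          # baseline log-odds
         + 0.02 * age                  # risk rises with age
         + 0.05 * preop_days           # longer pre-operative stay
         + 0.30 * (1 if male else 0)
         + 0.60 * (1 if drain else 0)
         + 0.01 * ward_size)           # more patients in the ward
    return 1.0 / (1.0 + math.exp(-z))  # logistic link: log-odds -> probability

# An older patient with a drain and a long pre-operative stay scores higher
low = wound_infection_risk(age=30, preop_days=1, male=False, drain=False, ward_size=10)
high = wound_infection_risk(age=75, preop_days=10, male=True, drain=True, ward_size=30)
```

The more advanced model mentioned in the abstract would add a pre-operative nasal colonisation indicator as a further predictor in the same way.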
Abstract:
This study proposes an integrated analytical framework for effective management of project risks, combining a multiple-criteria decision-making technique with decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in central India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, the involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause-and-effect diagram, analysed using the analytic hierarchy process, and responses are developed using a risk map. Additionally, decision tree analysis allows various options for risk response development to be modelled and optimises the selection of the risk-mitigating strategy. The proposed risk management framework could easily be adopted and applied in any project and integrated with other project management knowledge areas.
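The decision tree step for choosing among risk responses can be illustrated with a minimal expected-monetary-value comparison. The options, probabilities and cost figures below are invented for the sketch; the abstract does not supply numbers.

```python
def expected_cost(option):
    """Expected monetary value of a risk-response option: its upfront cost
    plus the probability-weighted impact of each chance outcome."""
    return option["cost"] + sum(p * impact for p, impact in option["outcomes"])

# Hypothetical responses to a single construction risk:
# each outcome is a (probability, impact-cost) pair at a chance node
options = {
    "mitigate": {"cost": 50, "outcomes": [(0.2, 100), (0.8, 0)]},
    "accept":   {"cost": 0,  "outcomes": [(0.6, 200), (0.4, 0)]},
}
best = min(options, key=lambda name: expected_cost(options[name]))
```

Here mitigation costs 50 upfront but cuts the chance of a 100-unit loss to 0.2, so its expected cost (70) beats acceptance (120).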
Abstract:
One of the main challenges in classifying clinical data is determining how to handle missing features. Most research favours imputing missing values or discarding records that include missing data, both of which can degrade accuracy once missing values exceed a certain level. In this research we propose a methodology for handling data sets with a large percentage of missing values and high variability in which particular values are missing. Feature selection is performed by picking variables sequentially, in order of maximum correlation with the dependent variable and minimum correlation with the variables already selected. Classification models are then generated individually for each test case, based on its particular feature set and the matching data values available in the training population. The method was applied to anonymised real-patient mental-health data, where the task was to predict the suicide risk judgement clinicians would give for each patient's data, with eleven possible outcome classes (zero to ten, representing no risk to maximum risk). The results compare favourably with alternative methods and have the advantage of ensuring that explanations of risk are based only on the data given, not on imputed data. This is important for clinical decision support systems that use human expertise for modelling and for explaining predictions.
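The selection rule described (maximum correlation with the target, minimum correlation with already-selected variables) can be sketched as a greedy loop. This is a plain reading of the description, not the authors' exact method; in particular, scoring a candidate as relevance minus redundancy is an assumed way of combining the two criteria.

```python
def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(features, target, k):
    """Greedily pick k features: maximise |corr| with the target while
    penalising |corr| with the features already chosen."""
    chosen = []
    while len(chosen) < k:
        def score(name):
            relevance = abs(corr(features[name], target))
            redundancy = max((abs(corr(features[name], features[c]))
                              for c in chosen), default=0.0)
            return relevance - redundancy
        remaining = [f for f in features if f not in chosen]
        chosen.append(max(remaining, key=score))
    return chosen

# "c" tracks the target best; "a" is nearly a copy of "c", so the more
# independent "b" is picked second despite its weaker raw correlation
features = {"a": [1, 2, 3, 5], "b": [2, 1, 4, 3], "c": [2, 4, 6, 9]}
selected = select_features(features, [1, 2, 3, 4], k=2)
```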
Abstract:
Reliability modelling and verification are indispensable in modern manufacturing, especially for reducing product development risk. After a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented, together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, that is, product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example, showing how the new reliability modelling and verification procedure is used to manage and deploy production resources.
Abstract:
2000 Mathematics Subject Classification: 62J12, 62P10.
Abstract:
Analysis of risk measures associated with price series movements, and their prediction, is of strategic importance in the financial markets as well as to policy makers, particularly for short- and long-term planning when setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organisation can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument for measuring risk, and is evaluated by analysing the negative/positive tail of the probability distribution of returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, freedom from large outliers, and satisfaction of the Gauss-Markov assumptions. In practice, however, LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions, even when the error terms possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Applications of LSE-based regression models may therefore be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
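The contrast between L1-, L2- and L∞-norm fits can be illustrated with a brute-force slope fit on toy data containing one fat-tail outlier. The grid search, data and slope-only model are purely illustrative, not the estimation procedure used in the study.

```python
def fit_slope(xs, ys, loss):
    """Fit y = b*x by brute-force search over the slope b, minimising
    the chosen norm of the residuals."""
    candidates = [i / 1000.0 for i in range(-5000, 5001)]  # b in [-5, 5]
    return min(candidates,
               key=lambda b: loss([y - b * x for x, y in zip(xs, ys)]))

l1 = lambda r: sum(abs(e) for e in r)    # least absolute deviations
l2 = lambda r: sum(e * e for e in r)     # least squares (LSE)
linf = lambda r: max(abs(e) for e in r)  # minimax / Chebyshev

# Data on the line y = 2x, except for one fat-tail outlier in the last point
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 20]

b1 = fit_slope(xs, ys, l1)
b2 = fit_slope(xs, ys, l2)
binf = fit_slope(xs, ys, linf)
```

The L1 fit recovers the underlying slope 2 despite the outlier, while the L2 and especially the L∞ fits are pulled towards it, which is why LSE can mislead when tails are fat.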
Abstract:
The carbon-dioxide emissions of a gas-fuelled power plant subject to compliance under the EU Emissions Trading System are modelled using a real-option model on four underlying instruments (off-peak and peak electricity prices, the gas price and the emission quota). The profit-maximising plant operates, and pollutes, only if the margin it can realise on the electricity produced is positive. Total future carbon-dioxide emissions can then be expressed as a sum of European-style binary spread options. Within the model, the expected value and the probability density function of emissions can be estimated; from the density, the Value at Risk of the emission-quota position can be determined, which gives the cost, at a given confidence level, of meeting the plant's compliance obligation. In the stochastic model the underlying instruments follow geometric Ornstein-Uhlenbeck processes, which the author fitted to publicly available market data from the German energy exchange (EEX). Using the simulation model, the author examines how ceteris paribus changes in various technological and market factors affect the emissions level and the cost of compliance (the Value at Risk).
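A minimal sketch of simulating one underlying price as a geometric Ornstein-Uhlenbeck process, i.e. the log-price mean-reverts towards a long-run level. All parameter values are invented for illustration, not calibrated to EEX data.

```python
import math
import random

def simulate_gou(s0, kappa, theta, sigma, dt, n, rng):
    """Simulate a geometric Ornstein-Uhlenbeck price path: the log-price
    mean-reverts towards theta at speed kappa with volatility sigma."""
    x = math.log(s0)
    path = [s0]
    for _ in range(n):
        x += kappa * (theta - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(math.exp(x))
    return path

# Illustrative parameters only: one year of daily steps for a price near 30
rng = random.Random(42)
path = simulate_gou(s0=30.0, kappa=2.0, theta=math.log(30.0),
                    sigma=0.3, dt=1.0 / 250.0, n=250, rng=rng)
```

Simulating such paths for all four underlyings and counting the days with a positive spread is what turns the plant model into a Monte Carlo estimate of emissions and their Value at Risk.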
Abstract:
The adoption of the new directive known as Solvency II creates a new situation for the calculation of insurers' capital requirements in the European Union. Modelling the operation of insurers, the study analyses how certain characteristics of an insurance portfolio affect the capital requirement in a theoretical model in which the capital values are computed according to the Solvency II rules. The model takes an insurance risk module and a financial risk module into account, first by calculating the capital for the two risk types separately and then jointly within a common model (for comparison with the Solvency II results). The theoretical results show that the capital requirements computed in these two cases can differ, and they also make it possible to study the factors behind the differences.
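The separate-versus-joint treatment of risk modules can be illustrated with the Solvency II standard formula's square-root aggregation, in which correlation below one produces a diversification benefit relative to simply adding the modules. The module values and the 0.25 correlation below are hypothetical inputs, not results from the study.

```python
import math

def aggregate_scr(scrs, corr):
    """Solvency II standard-formula aggregation of per-module capital
    requirements: sqrt(sum_ij rho_ij * SCR_i * SCR_j)."""
    n = len(scrs)
    total = sum(corr[i][j] * scrs[i] * scrs[j]
                for i in range(n) for j in range(n))
    return math.sqrt(total)

# Hypothetical insurance and financial risk modules with 0.25 correlation;
# the aggregated figure falls below the simple sum of 160
scrs = [100.0, 60.0]
rho = [[1.0, 0.25], [0.25, 1.0]]
combined = aggregate_scr(scrs, rho)
```

With full correlation (rho = 1 everywhere) the formula reduces to the plain sum, so the gap between the two numbers is exactly the kind of difference the study examines.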
Abstract:
The paper analyzes a special corporate banking product, the so-called cash-pool, which has gained remarkable popularity in recent years as firms try to centralize and manage their liquidity more efficiently. The novelty of this paper is the formalization of a valuation model which can serve as the basis for a Monte Carlo simulation to assess the most important benefits firms derive from pooling their cash holdings. The literature emphasizes several benefits of cash-pooling, such as interest rate savings, economies of scale and reduced cash-flow volatility. The presented model focuses on the interest rate savings, complemented by a new aspect: the reduced counterparty risk toward the bank. The main conclusion of the analysis is that the value of a cash-pool is higher for firms with large, diverse and volatile cash-flows and less access to the capital markets, especially if the partner bank is risky and offers a high interest spread. It is also shown that cash-pooling is no longer the privilege of large multinational firms, as the initial direct costs can easily be regained within a year even for SMEs.
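The interest-rate-savings component can be sketched by comparing the net interest on separate subsidiary accounts with the interest on the pooled net position. The balances and rates below are invented, and this sketch deliberately ignores the counterparty-risk aspect the paper adds.

```python
def interest_cost(balances, debit_rate, credit_rate):
    """Annual net interest with separate accounts: interest paid on
    negative balances minus interest earned on positive ones."""
    paid = sum(-b * debit_rate for b in balances if b < 0)
    earned = sum(b * credit_rate for b in balances if b > 0)
    return paid - earned

def pooled_cost(balances, debit_rate, credit_rate):
    """With a cash-pool the bank applies the rates to the net position only."""
    return interest_cost([sum(balances)], debit_rate, credit_rate)

# Hypothetical subsidiary balances and a 4-point interest spread:
# offsetting positions are where pooling pays off
balances = [500.0, -300.0, -100.0]  # nets to +100
separate = interest_cost(balances, debit_rate=0.06, credit_rate=0.02)
pooled = pooled_cost(balances, debit_rate=0.06, credit_rate=0.02)
savings = separate - pooled
```

Repeating this comparison over simulated cash-flow paths is the Monte Carlo valuation idea: the larger and more volatile the offsetting balances, and the wider the spread, the larger the expected savings.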
Abstract:
To effectively assess and mitigate the risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory, to test the transferability of the model to a broader region in the Canadian High Arctic. The resulting susceptibility maps were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and a generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets, while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim), with areas under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally well, with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. 
Terrain attributes associated with the initiation of disturbances were similar regardless of location. Disturbances commonly occurred on slopes between 4 and 15°, below the Holocene marine limit, and in areas with low potential incoming solar radiation.
Abstract:
Modelling the susceptibility of permafrost slopes to disturbance can identify areas at risk to future disturbance and result in safer infrastructure and resource development in the Arctic. In this study, we use terrain attributes derived from a digital elevation model, an inventory of permafrost slope disturbances known as active-layer detachments (ALDs) and generalised additive modelling to produce a map of permafrost slope disturbance susceptibility for an area on northern Melville Island, in the Canadian High Arctic. By examining terrain variables and their relative importance, we identified factors important for initiating slope disturbance. The model was calibrated and validated using 70 and 30 per cent of a data-set of 760 mapped ALDs, including disturbed and randomised undisturbed samples. The generalised additive model calibrated and validated very well, with areas under the receiver operating characteristic curve of 0.89 and 0.81, respectively, demonstrating its effectiveness at predicting disturbed and undisturbed samples. ALDs were most likely to occur below the marine limit on slope angles between 3 and 10° and in areas with low values of potential incoming solar radiation (north-facing slopes).
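The AUROC figures reported in these susceptibility studies can be computed from labels and scores with the pairwise-ranking definition of the statistic. The toy labels and scores below are invented; only the metric itself comes from the abstracts.

```python
def auroc(labels, scores):
    """AUROC via its ranking interpretation: the probability that a random
    positive (disturbed) sample scores above a random negative (undisturbed)
    one, counting ties as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Toy susceptibility scores: disturbed samples (label 1) mostly rank higher
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
auc = auroc(labels, scores)
```

A value of 0.5 means the map ranks disturbed and undisturbed sites no better than chance, while values such as the reported 0.81-0.89 indicate strong discrimination.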
Abstract:
Shelf seas comprise approximately 7% of the world’s oceans and host enormous economic activity. Development of energy installations (e.g. Offshore Wind Farms (OWFs), tidal turbines) in response to increased demand for renewable energy requires a careful analysis of potential impacts. Recent remote sensing observations have identified kilometre-scale impacts from OWFs. Existing modelling evaluating monopile impacts has fallen into two camps: small-scale models with individually resolved turbines looking at local effects; and large-scale analyses with sub-grid-scale turbine parameterisations. This work straddles both scales through a 3D unstructured grid model (FVCOM): wind turbine monopiles in the eastern Irish Sea are explicitly described in the grid, whilst the overall grid domain covers the south-western UK shelf. Localised regions of decreased velocity extend up to 250 times the monopile diameter away from the monopile. Shelf-wide, the amplitude of the M2 tidal constituent increases by up to 7%. The turbines enhance localised vertical mixing, which decreases seasonal stratification, and the spatial extent of this effect reaches well beyond the turbines into the surrounding seas. With significant expansion of OWFs on continental shelves, this work highlights the importance of how OWFs may impact coastal (e.g. increased flooding risk) and offshore (e.g. stratification and nutrient cycling) areas.
Abstract:
The identification of subjects at high risk for Alzheimer’s disease is important for prognosis and early intervention. We investigated the polygenic architecture of Alzheimer’s disease and the accuracy of Alzheimer’s disease prediction models, including and excluding the polygenic component in the model. This study used genotype data from the powerful dataset comprising 17 008 cases and 37 154 controls obtained from the International Genomics of Alzheimer’s Project (IGAP). Polygenic score analysis tested whether the alleles found to associate with disease in one sample set were significantly enriched in the cases relative to the controls in an independent sample. The disease prediction accuracy was investigated in a subset of the IGAP data, a sample of 3049 cases and 1554 controls (for whom APOE genotype data were available), by means of sensitivity, specificity, area under the receiver operating characteristic curve (AUC) and positive and negative predictive values. We observed significant evidence for a polygenic component enriched in Alzheimer’s disease (P = 4.9 × 10⁻²⁶). This enrichment remained significant after APOE and other genome-wide associated regions were excluded (P = 3.4 × 10⁻¹⁹). The best prediction accuracy, AUC = 78.2% (95% confidence interval 77-80%), was achieved by a logistic regression model with APOE, the polygenic score, sex and age as predictors. In conclusion, Alzheimer’s disease has a significant polygenic component, which has predictive utility for Alzheimer’s disease risk and could be a valuable research tool complementing experimental designs, including preventative clinical trials, stem cell selection and high/low risk clinical studies. In modelling a range of sample disease prevalences, we found that polygenic scores almost double case prediction from chance, with increased prediction at the polygenic extremes.
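The accuracy measures reported (sensitivity, specificity and predictive values) follow directly from a confusion matrix. The counts below are hypothetical, chosen only to sum to the stated subset sizes of 3049 cases and 1554 controls; they are not the study's actual results.

```python
def prediction_metrics(tp, fp, fn, tn):
    """Standard accuracy measures from a confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # cases correctly flagged
        "specificity": tn / (tn + fp),  # controls correctly cleared
        "ppv": tp / (tp + fp),          # flagged subjects who are cases
        "npv": tn / (tn + fn),          # cleared subjects who are controls
    }

# Hypothetical counts: 2100 + 949 = 3049 cases, 400 + 1154 = 1554 controls
m = prediction_metrics(tp=2100, fp=400, fn=949, tn=1154)
```

Because PPV and NPV depend on prevalence, the same classifier yields different predictive values when the case/control mix changes, which is why the study models a range of sample disease prevalences.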
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging because of reinforcing feedbacks between multiple drivers. We conducted semistructured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach proved to have great potential to support decision making for risk management: it helped identify key forcing variables and generated insights into the potential risks and trade-offs of different strategies. The “Hands-off” scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production under drought conditions. The “Fire management” scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the “Fire suppression” scenario. The findings highlight the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a “boundary object” to facilitate collaboration and integration of different perceptions of fire in the region. 
This approach also has the potential to inform decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
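A fuzzy cognitive map run can be sketched as repeated squashed matrix-vector updates over concept activations. The three concepts, the weights and the sigmoid update rule below are a simplified, hypothetical illustration, not the maps elicited in the focus groups.

```python
import math

def fcm_step(state, weights):
    """One synchronous FCM update: each concept's next activation is the
    sigmoid of the weighted sum of incoming influences."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-sum(weights[j][i] * state[j]
                                       for j in range(n))))
            for i in range(n)]

def fcm_run(state, weights, steps=50):
    """Iterate the map until (in practice) activations settle."""
    for _ in range(steps):
        state = fcm_step(state, weights)
    return state

# Hypothetical 3-concept map: drought drives fire risk, which drives
# agricultural loss; weights[j][i] is the influence of concept j on i
concepts = ["drought", "fire_risk", "agri_loss"]
W = [[0.0, 0.8, 0.2],
     [0.0, 0.0, 0.9],
     [0.0, 0.0, 0.0]]
final = fcm_run([1.0, 0.0, 0.0], W)
```

Scenario analysis then amounts to clamping or rewiring concepts (e.g. adding a "controlled burning" node that dampens fire risk) and comparing the settled activations, which is how the "Hands-off", "Fire management" and "Fire suppression" scenarios can be contrasted.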