910 results for Risk models


Relevance: 60.00%

Abstract:

Unmanned Aircraft Systems (UAS) describe a diverse range of aircraft that are operated without a human pilot on-board. Unmanned aircraft range from small rotorcraft, which can fit in the palm of your hand, through to fixed wing aircraft comparable in size to that of a commercial passenger jet. The absence of a pilot on-board allows these aircraft to be developed with unique performance capabilities facilitating a wide range of applications in surveillance, environmental management, agriculture, defence, and search and rescue. However, regulations relating to the safe design and operation of UAS first need to be developed before the many potential benefits from these applications can be realised. According to the International Civil Aviation Organization (ICAO), a Risk Management Process (RMP) should support all civil aviation policy and rulemaking activities (ICAO 2009). The RMP is described in the international standard ISO 31000:2009 (ISO, 2009a). This standard is intentionally generic and high-level, providing limited guidance on how it can be effectively applied to complex socio-technical decision problems such as the development of regulations for UAS. Through the application of principles and tools drawn from systems philosophy and systems engineering, this thesis explores how the RMP can be effectively applied to support the development of safety regulations for UAS.

A sound systems-theoretic foundation for the RMP is presented in this thesis. Using the case-study scenario of a UAS operation over an inhabited area, and through the novel application of principles drawn from general systems modelling philosophy, a consolidated framework of the definitions of the concepts of safe, risk and hazard is developed. The framework is novel in that it facilitates the representation of broader subjective factors in an assessment of the safety of a system; describes the issues associated with the specification of a system boundary; makes explicit the hierarchical nature of the relationship between the concepts and the subsequent constraints that exist between them; and can be evaluated using a range of analytic or deliberative modelling techniques.

Following the general sequence of the RMP, the thesis explores the issues associated with the quantified specification of safety criteria for UAS. A novel risk analysis tool is presented. In contrast to existing risk tools, the analysis tool presented in this thesis quantifiably characterises both the societal and individual risk of UAS operations as a function of the flight path of the aircraft. A novel structuring of the risk evaluation and risk treatment decision processes is then proposed. The structuring is achieved through the application of the Decision Support Problem Technique, a modelling approach that has previously been used to effectively model complex engineering design processes and to support decision-making in relation to airspace design.

The final contribution made by this thesis is the development of an airworthiness regulatory framework for civil UAS. A novel "airworthiness certification matrix" is proposed as a basis for the definition of UAS "Part 21" regulations. The resulting airworthiness certification matrix provides a flexible, systematic and justifiable method for promulgating airworthiness regulations for UAS. In addition, an approach for deriving "Part 1309" regulations for UAS is presented. In contrast to existing approaches, the approach presented in this thesis facilitates a traceable and objective tailoring of system-level reliability requirements across the diverse range of UAS operations.

The significance of the research contained in this thesis is clearly demonstrated by its practical, real-world outcomes. Industry regulatory development groups and the Civil Aviation Safety Authority have endorsed the proposed airworthiness certification matrix. The risk models have also been used to support research undertaken by the Australian Department of Defence. Ultimately, it is hoped that the outcomes from this research will play a significant part in the shaping of regulations for civil UAS, here in Australia and around the world.
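The risk analysis tool described above characterises societal and individual risk as a function of the aircraft's flight path, but the abstract does not give its equations. As a rough illustration of what such a ground-risk calculation involves, the sketch below accumulates a casualty expectation (probability of failure × population density × lethal area × shelter factor) over flight-path segments; all parameter names and values are assumptions for illustration, not the thesis's model.

    import math

    # Illustrative parameters -- assumptions, not values from the thesis
    FAILURE_RATE_PER_HR = 1e-4   # probability of a ground-impact failure per flight hour
    LETHAL_AREA_M2 = 2.0         # ground area within which a person struck becomes a casualty
    SHELTER_FACTOR = 0.3         # fraction of impacts not mitigated by buildings or shelter
    SPEED_MPS = 20.0             # assumed cruise ground speed

    def segment_casualty_expectation(length_m, pop_density_per_m2):
        """Expected ground casualties for one flight-path segment."""
        exposure_hr = (length_m / SPEED_MPS) / 3600.0
        p_failure = 1.0 - math.exp(-FAILURE_RATE_PER_HR * exposure_hr)
        return p_failure * pop_density_per_m2 * LETHAL_AREA_M2 * SHELTER_FACTOR

    def flight_path_risk(segments):
        """Societal risk accumulated over (segment_length_m, population_density_per_m2) pairs."""
        return sum(segment_casualty_expectation(l, d) for l, d in segments)

    # Example: a path crossing rural, suburban and densely inhabited areas
    path = [(5000, 1e-5), (3000, 1e-3), (1000, 5e-3)]
    print(f"Expected casualties per flight: {flight_path_risk(path):.2e}")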

Relevance: 60.00%

Abstract:

Barmah Forest virus (BFV) disease is one of the most widespread mosquito-borne diseases in Australia. The number of outbreaks and the incidence rate of BFV in Australia have attracted growing concern about the spatio-temporal complexity and underlying risk factors of BFV disease. A large number of notifications have been recorded continuously in Queensland since 1992, yet little is known about the spatial and temporal characteristics of the disease. I aim to use notification data to better understand the effects of climatic, demographic, socio-economic and ecological risk factors on the spatial epidemiology of BFV disease transmission, develop predictive risk models and forecast future disease risks under climate change scenarios.

Computerised data files of daily notifications of BFV disease and climatic variables in Queensland during 1992-2008 were obtained from Queensland Health and the Australian Bureau of Meteorology, respectively. Projections of climate data for the years 2025, 2050 and 2100 were obtained from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). Data on socio-economic, demographic and ecological factors were also obtained from the relevant government departments as follows: 1) socio-economic and demographic data from the Australian Bureau of Statistics; 2) wetlands data from the Department of Environment and Resource Management; and 3) tidal readings from the Queensland Department of Transport and Main Roads. Disease notifications were geocoded, and spatial and temporal patterns of disease were investigated using geostatistics.

Visualisation of BFV disease incidence rates through mapping reveals substantial spatio-temporal variation at the statistical local area (SLA) level over time. Results reveal high incidence rates of BFV disease along coastal areas compared with the whole of Queensland. A Mantel-Haenszel chi-square analysis for trend reveals a statistically significant relationship between BFV disease incidence rates and age group (χ² = 7587, p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. A cluster analysis was used to detect hot spots/clusters of BFV disease at the SLA level. The most likely spatial and space-time clusters are detected at the same locations across coastal Queensland (p < 0.05). The study demonstrates heterogeneity of disease risk at the SLA level and reveals the spatial and temporal clustering of BFV disease in Queensland.

Discriminant analysis was employed to establish a link between wetland classes, climate zones and BFV disease, because the importance of wetlands in the transmission of BFV disease remains unclear. The multivariable discriminant modelling analyses demonstrate that the wetland types saline 1, riverine and saline tidal influence were the most significant risk factors for BFV disease in all climate and buffer zones, while lacustrine, palustrine, estuarine, saline 2 and saline 3 wetlands were less important. The model accuracies were 76%, 98% and 100% for BFV risk in the subtropical, tropical and temperate climate zones, respectively. This study demonstrates that BFV disease risk varied with wetland class and climate zone, and suggests that wetlands may act as potential breeding habitats for BFV vectors.

Multivariable spatial regression models were applied to assess the impact of spatial climatic, socio-economic and tidal factors on BFV disease in Queensland. Spatial regression models were developed to account for spatial effects and generated superior estimates over a traditional regression model. In the spatial regression models, BFV disease incidence shows an inverse relationship with minimum temperature, low tide and distance to coast, and a positive relationship with rainfall in coastal areas, whereas for the whole of Queensland the disease shows an inverse relationship with minimum temperature and high tide and a positive relationship with rainfall. This study determines the most significant spatial risk factors for BFV disease across Queensland.

Empirical models were developed to forecast the future risk of BFV disease outbreaks in coastal Queensland using existing climatic, socio-economic and tidal conditions under climate change scenarios. Logistic regression models were developed using BFV disease outbreak data for the existing period (2000-2008). The most parsimonious model had high sensitivity, specificity and accuracy, and was used to estimate and forecast BFV disease outbreaks for the years 2025, 2050 and 2100 under climate change scenarios for Australia.

Important contributions arising from this research are that: (i) it is innovative in identifying high-risk coastal areas by creating buffers based on grid centroids and using fine-grained spatial units, i.e., mesh blocks; (ii) a spatial regression method was used to account for spatial dependence and heterogeneity of data in the study area; (iii) it determined a range of potential spatial risk factors for BFV disease; and (iv) it predicted the future risk of BFV disease outbreaks under climate change scenarios in Queensland, Australia.

In conclusion, the thesis demonstrates that the distribution of BFV disease exhibits distinct spatial and temporal variation, influenced by a range of spatial risk factors including climatic, demographic, socio-economic, ecological and tidal variables. The thesis demonstrates that spatial regression methods can be applied to better understand the transmission dynamics of BFV disease and its risk factors. The research findings show that disease notification data can be integrated with multi-factorial risk factor data to develop predictive models and forecast future potential disease risks under climate change scenarios. This thesis may have implications for BFV disease control and prevention programs in Queensland.
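The forecasting stage described above fits logistic regression models to historical outbreak data and scores them under future climate scenarios. The abstract does not report the final covariates or coefficients, so the sketch below only illustrates the general workflow with hypothetical variable names and synthetic data.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical training rows (one per area-year, 2000-2008); covariates are illustrative
    df = pd.DataFrame({
        "min_temp":   rng.normal(16, 3, 200),
        "rainfall":   rng.gamma(2, 50, 200),
        "high_tide":  rng.normal(2.1, 0.3, 200),
        "dist_coast": rng.uniform(0, 50, 200),
    })
    # Synthetic outbreak label, purely so the sketch runs end to end
    signal = 0.3 * df["rainfall"] / 50 - 0.2 * df["min_temp"] + rng.normal(0, 1, 200)
    df["outbreak"] = (signal > signal.median()).astype(int)

    X = df.drop(columns="outbreak")
    model = LogisticRegression(max_iter=1000).fit(X, df["outbreak"])
    print("in-sample AUC:", roc_auc_score(df["outbreak"], model.predict_proba(X)[:, 1]))

    # Score a hypothetical warmer, wetter 2050 scenario with the fitted model
    scenario_2050 = X.assign(min_temp=X["min_temp"] + 1.5, rainfall=X["rainfall"] * 1.1)
    risk_2050 = model.predict_proba(scenario_2050)[:, 1]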

Relevance: 60.00%

Abstract:

This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors which condition, influence or potentially contribute to an occurrence are captured both for safety occurrences and for precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritisation of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion on how the system can also be applied to other types of railway safety occurrences.

Relevance: 60.00%

Abstract:

There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
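The three randomised alternatives discussed above differ only in what is randomised: the cluster's arm (parallel), the sequence of periods (cross-over), or the period in which each cluster crosses over to the intervention (stepped-wedge). The sketch below generates illustrative exposure matrices (clusters × periods, 1 = intervention) for a hypothetical set of ICUs; it is an illustration of the designs, not code from the review.

    import numpy as np

    rng = np.random.default_rng(1)
    n_clusters, n_periods = 6, 7   # hypothetical ICUs and study periods

    def parallel(n_clusters, n_periods):
        """Randomise each ICU to control (0) or intervention (1) for the whole study."""
        arm = rng.permutation([0, 1] * (n_clusters // 2))
        return np.tile(arm[:, None], (1, n_periods))

    def cross_over(n_clusters, n_periods):
        """Randomise the sequence: half start on the intervention, all switch at mid-study."""
        first = rng.permutation([0, 1] * (n_clusters // 2))
        m = np.empty((n_clusters, n_periods), dtype=int)
        m[:, : n_periods // 2] = first[:, None]
        m[:, n_periods // 2 :] = 1 - first[:, None]
        return m

    def stepped_wedge(n_clusters, n_periods):
        """Randomise the time of cross-over: each ICU switches at a different period."""
        step = rng.permutation(np.arange(1, n_clusters + 1))
        return (np.arange(n_periods)[None, :] >= step[:, None]).astype(int)

    print(stepped_wedge(n_clusters, n_periods))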

Relevance: 60.00%

Abstract:

This study sought predictors of mortality in patients aged ≥75 years with a first ST-segment elevation myocardial infarction (STEMI) and evaluated the validity of the GUSTO-I and TIMI risk models. Clinical variables, treatment and mortality data from 433 consecutive patients were collected. Univariable and multivariable logistic regression analyses were applied to identify baseline factors associated with 30-day mortality. Subsequently, a model predicting 30-day mortality was created and compared with the performance of the GUSTO-I and TIMI models. After adjustment, a higher Killip class was the most important predictor (OR 16.1; 95% CI 5.7-45.6). Elevated heart rate, longer time delay to admission, hyperglycemia and older age were also associated with increased risk. Patients with hypercholesterolemia had a significantly lower risk (OR 0.46; 95% CI 0.24-0.86). Discrimination (c-statistic 0.79, 95% CI 0.75-0.84) and calibration (Hosmer-Lemeshow 6, p = 0.5) of our model were good. The GUSTO-I and TIMI risk scores produced adequate discrimination within our dataset (c-statistic 0.76, 95% CI 0.71-0.81, and c-statistic 0.77, 95% CI 0.72-0.82, respectively), but calibration was not satisfactory (HL 21.8, p = 0.005 for GUSTO-I, and HL 20.6, p = 0.008 for TIMI). In conclusion, short-term mortality in elderly patients with a first STEMI depends most importantly on initial clinical and hemodynamic status. The GUSTO-I and TIMI models are not sufficiently accurate to provide an exact estimate of 30-day mortality risk.
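Discrimination (c-statistic) and calibration (Hosmer-Lemeshow) are the two criteria used above to judge the models. A minimal sketch of how both can be computed from predicted 30-day mortality probabilities is shown below, using the standard decile-based Hosmer-Lemeshow statistic and synthetic data; it is not the study's own code.

    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_auc_score

    def hosmer_lemeshow(y_true, y_prob, n_groups=10):
        """Hosmer-Lemeshow goodness of fit: chi-square over deciles of predicted risk."""
        order = np.argsort(y_prob)
        chi2 = 0.0
        for g in np.array_split(order, n_groups):
            n, obs, exp = len(g), y_true[g].sum(), y_prob[g].sum()
            chi2 += (obs - exp) ** 2 / exp + ((n - obs) - (n - exp)) ** 2 / (n - exp)
        return chi2, stats.chi2.sf(chi2, df=n_groups - 2)

    # Synthetic example standing in for 433 patients' predictions and outcomes
    rng = np.random.default_rng(0)
    y_prob = rng.beta(2, 8, size=433)        # predicted 30-day mortality risks
    y_true = rng.binomial(1, y_prob)         # observed deaths consistent with the predictions

    c_statistic = roc_auc_score(y_true, y_prob)   # discrimination
    hl, p = hosmer_lemeshow(y_true, y_prob)       # calibration
    print(f"c-statistic = {c_statistic:.2f}, Hosmer-Lemeshow = {hl:.1f} (p = {p:.2f})")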

Relevance: 60.00%

Abstract:

A growing number of respected commentators now argue that regulatory capture of public agencies and public policy by leading banks was one of the main causal factors behind the financial crisis of 2007–2009, resulting in a permissive regulatory environment. This regulatory environment placed faith in banks' own internal risk models, contributed to pro-cyclical behaviour and turned a blind eye to excessive risk taking. The article argues that a form of 'multi-level regulatory capture' characterized the global financial architecture prior to the crisis. Simultaneously, regulatory capture fed off, but also nourished, the financial boom, in a fashion that mirrored the life cycle of the boom itself. Minimizing future financial booms and crises will require continuous, conscious and explicit efforts to restrain financial regulatory capture now and into the future. The article assesses the extent to which this has been achieved in current global financial governance reform efforts and highlights some of the persistent difficulties that will continue to hamper efforts to restrain regulatory capture. The evidence concerning the extent to which regulatory capture is being effectively restrained is somewhat mixed, and where it is happening it is largely unintentional and accidental. Recent reforms have overlooked the political causes of the crisis and have failed to focus explicitly or systematically on regulatory capture.

Relevance: 60.00%

Abstract:

OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.

DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.

SETTING: Primary and secondary care.

PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).

INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.

MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).

RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detectable at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
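The pathways above are compared on incremental cost per QALY against a willingness-to-pay threshold of £30,000. A minimal sketch of that comparison is shown below; the per-patient cost and QALY figures are invented purely for illustration and are not results from the evaluation.

    # Hypothetical discounted per-patient costs (GBP) and QALYs for two pathways
    pathways = {
        "treat all":         {"cost":  900.0, "qalys": 14.10},
        "biennial hospital": {"cost": 1600.0, "qalys": 14.12},
    }
    WTP_PER_QALY = 30_000  # willingness-to-pay threshold used in the evaluation

    base, alt = pathways["treat all"], pathways["biennial hospital"]
    icer = (alt["cost"] - base["cost"]) / (alt["qalys"] - base["qalys"])
    print(f"ICER = £{icer:,.0f} per QALY gained")
    print("cost-effective at threshold" if icer <= WTP_PER_QALY else "not cost-effective at threshold")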

LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon, which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.

CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.

FUNDING: The National Institute for Health Research Health Technology Assessment Programme.

Relevance: 60.00%

Abstract:

We introduce an algebraic operator framework to study discounted penalty functions in renewal risk models. For inter-arrival and claim size distributions with rational Laplace transform, the usual integral equation is transformed into a boundary value problem, which is solved by symbolic techniques. The factorization of the differential operator can be lifted to the level of boundary value problems, amounting to iteratively solving first-order problems. This leads to an explicit expression for the Gerber-Shiu function in terms of the penalty function.
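For reference, the Gerber-Shiu (discounted penalty) function solved for here is, in its standard textbook form (notation below is generic, not necessarily the paper's),

    m(u) \;=\; \mathbb{E}\!\left[ e^{-\delta \tau}\, w\bigl(U(\tau^-),\, |U(\tau)|\bigr)\, \mathbf{1}_{\{\tau < \infty\}} \;\middle|\; U(0) = u \right],

where U(t) is the surplus process, τ the time of ruin, U(τ⁻) the surplus immediately before ruin, |U(τ)| the deficit at ruin, w the penalty function and δ ≥ 0 the force of interest; the ruin probability is recovered by taking w ≡ 1 and δ = 0.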

Relevance: 60.00%

Abstract:

We present a new simulation approach for the joint density function of the surplus prior to ruin and the deficit at ruin, for risk models driven by Lévy subordinators. The approach is inspired by the ladder-height decomposition of the ruin probability in the classical model. That model, driven by a compound Poisson process, is a special case of the more general model driven by a subordinator, for which the ladder-height decomposition of the ruin probability also applies. The expected discounted penalty function, also known as the Gerber-Shiu function (GS function), was introduced as a unifying approach to the study of quantities related to the ruin event. The ruin probability and the joint density of the surplus prior to ruin and the deficit at ruin are special cases of the GS function. Expressions for these two quantities can be found in the literature, but they are difficult to use because they take the form of infinite series of convolutions with no closed analytical form. However, since both are derived from the GS function, the expressions for the two quantities share a certain resemblance, which allows us to draw on the ladder-height decomposition of the ruin probability to derive a simulation approach for the joint density. We give a detailed introduction to the risk models studied in this thesis, for which the simulation can be carried out. To motivate this work, we briefly introduce the broad field of risk measures and compute some of them for these risk models. This work contributes to a better understanding of the behaviour of risk models driven by subordinators with respect to the eventuality of ruin, as it provides a numerical point of view that is absent from the literature.
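In the classical compound Poisson special case with exponential claims, the ladder-height decomposition referred to above can be simulated directly: ruin occurs when a geometric number of i.i.d. ladder heights exceeds the initial surplus, and with Exp(μ) claims the ladder heights are again Exp(μ). The sketch below implements this textbook special case as a minimal illustration; it is not the subordinator-driven models of the thesis.

    import numpy as np

    rng = np.random.default_rng(42)

    def ruin_probability_mc(u, lam, mu, c, n_sim=200_000):
        """Ruin probability in the classical Cramér-Lundberg model (Poisson rate lam,
        Exp(mu) claims, premium rate c) via the ladder-height decomposition."""
        rho = lam / (c * mu)                        # probability a new ladder height occurs
        assert rho < 1, "positive safety loading required"
        k = rng.geometric(1 - rho, size=n_sim) - 1  # number of ladder heights, on {0, 1, ...}
        # With Exp(mu) claims the ladder-height law is again Exp(mu), so the sum of k
        # heights is Gamma(k, 1/mu); paths with k = 0 contribute zero.
        heights = rng.gamma(shape=np.maximum(k, 1), scale=1.0 / mu) * (k > 0)
        return np.mean(heights > u)

    u, lam, mu, c = 10.0, 1.0, 1.0, 1.5
    exact = (lam / (c * mu)) * np.exp(-(mu - lam / c) * u)   # closed form for exponential claims
    print(f"simulated {ruin_probability_mc(u, lam, mu, c):.4f} vs exact {exact:.4f}")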

Relevance: 60.00%

Abstract:

The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications set by the organization, while functional quality refers to the manner in which the service is delivered to the customer and can be assessed from customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analysed using statistical techniques to identify the weakest dimension. To assess the technical aspects of availability, six months of live outage data from base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. Failure patterns were modelled using competing risk models, and the probability distributions of service outage and restoration were parameterised. Since the availability of the network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service level agreements on availability should be aware of the variability of these elements and its effect on their interactions. The availability variations were studied by designing a discrete time event simulation model with probabilistic input parameters. The probability distribution parameters obtained from the live data analysis were used to design experiments to define the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also provides a relative measure of the effectiveness of different service providers.
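The availability study above alternates sampled failure and restoration times drawn from the distributions fitted to the outage data. Those fitted distributions and parameters are not given in the abstract, so the sketch below assumes Weibull times to failure and lognormal restoration times purely to illustrate how availability and its variability can be estimated from such a model.

    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_availability(horizon_h=24 * 182,          # roughly six months
                              wb_shape=1.2, wb_scale=400.0, # assumed time-to-failure distribution (hours)
                              ln_mu=1.0, ln_sigma=0.8,      # assumed log restoration time (hours)
                              n_runs=2000):
        """Estimate availability = uptime / total time by alternating sampled
        time-to-failure and time-to-restore until the horizon is reached."""
        results = []
        for _ in range(n_runs):
            t = up = 0.0
            while t < horizon_h:
                ttf = wb_scale * rng.weibull(wb_shape)       # uptime until the next failure
                up += min(ttf, horizon_h - t)
                t += ttf
                if t >= horizon_h:
                    break
                t += rng.lognormal(ln_mu, ln_sigma)          # downtime while service is restored
            results.append(up / horizon_h)
        return np.mean(results), np.percentile(results, [5, 95])

    mean_avail, (p5, p95) = simulate_availability()
    print(f"mean availability {mean_avail:.4f}, 5th-95th percentile [{p5:.4f}, {p95:.4f}]")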

Relevance: 60.00%

Abstract:

This thesis works through the intuition and the mathematics of the generalised ruin-theory model presented by Li and Lu, who draw on Reinhard's ruin-probability calculations, using the definitions, mean and variance of the jump-telegraph process found in Óscar López's doctoral thesis. The risk process is then simulated and, finally, the ruin probability is computed numerically for a worked example. Throughout, claims are assumed to follow an exponential distribution.
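The numerical step amounts to simulating surplus paths and counting how often they fall below zero. The sketch below simulates a deliberately simplified two-regime stand-in (premium rate alternating at exponential switching times, with an exponentially distributed claim paid at each switch) rather than the exact Li-Lu / jump-telegraph specification, whose parameters the abstract does not give.

    import numpy as np

    rng = np.random.default_rng(3)

    def ruin_probability(u0=10.0, premiums=(2.0, 0.5), switch_rates=(1.0, 1.0),
                         claim_mean=1.0, horizon=100.0, n_paths=20_000):
        """Monte Carlo finite-horizon ruin probability for a simplified regime-switching
        surplus: the premium rate alternates between two values at exponential switching
        times and an exponential claim is paid at each switch."""
        ruined = 0
        for _ in range(n_paths):
            u, t, state = u0, 0.0, 0
            while t < horizon:
                dt = rng.exponential(1.0 / switch_rates[state])  # time until the next switch
                u += premiums[state] * min(dt, horizon - t)       # premium income in this regime
                t += dt
                if t >= horizon:
                    break
                u -= rng.exponential(claim_mean)                  # claim at the switching epoch
                if u < 0:
                    ruined += 1
                    break
                state = 1 - state                                  # change regime
        return ruined / n_paths

    print(f"estimated finite-horizon ruin probability: {ruin_probability():.4f}")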

Relevance: 60.00%

Abstract:

This study measures the market risk of the TES (Colombian government bond) portfolio of a particular Colombian bank, forecasting Value at Risk (VaR) with several multivariate volatility models: EWMA, orthogonal GARCH and robust GARCH, together with VaR models based on the normal and Student's t distributions. Their performance is evaluated using the backtesting methodology proposed by Candelon et al. (2011), based on the generalised method of moments, along with the independence and conditional coverage tests proposed by Christoffersen and Pelletier (2004) and by Berkowitz, Christoffersen and Pelletier (2010). The results show that the best VaR specification for measuring the market risk of Colombian banks' TES portfolios is the one built from EWMA volatilities under the normal distribution, since it satisfies the unconditional coverage, independence and conditional coverage hypotheses, as well as the requirements laid down in Basel II and in current Colombian regulation.
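As an illustration of the winning specification, the sketch below computes a one-day 99% VaR from RiskMetrics-style EWMA volatility (λ = 0.94, an assumed value) and backtests it with the simple Kupiec unconditional-coverage likelihood ratio; it is a generic illustration on synthetic data, not the study's code or the GMM-based backtests it actually uses.

    import numpy as np
    from scipy import stats

    def ewma_var(returns, lam=0.94, alpha=0.99):
        """One-step-ahead normal VaR from EWMA (RiskMetrics-style) volatility."""
        var_t = np.empty_like(returns)
        var_t[0] = returns[:20].var()                 # crude initial variance estimate
        for t in range(1, len(returns)):
            var_t[t] = lam * var_t[t - 1] + (1 - lam) * returns[t - 1] ** 2
        return stats.norm.ppf(1 - alpha) * np.sqrt(var_t)   # negative loss thresholds

    def kupiec_test(returns, var_forecast, alpha=0.99):
        """Unconditional coverage LR test: do exceedances occur at the nominal rate?"""
        exceed = returns < var_forecast
        n, x, p = len(exceed), exceed.sum(), 1 - alpha
        pi = x / n
        lr = -2 * (x * np.log(p) + (n - x) * np.log(1 - p)
                   - x * np.log(pi) - (n - x) * np.log(1 - pi))
        return lr, stats.chi2.sf(lr, df=1)

    rng = np.random.default_rng(11)
    rets = rng.standard_t(df=5, size=1500) * 0.01     # synthetic daily portfolio returns
    var99 = ewma_var(rets)
    lr, p_value = kupiec_test(rets[1:], var99[1:])
    print(f"exceedances: {(rets[1:] < var99[1:]).sum()}, Kupiec LR = {lr:.2f}, p = {p_value:.3f}")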

Relevance: 60.00%

Abstract:

The dependence between financial series is a fundamental input for risk models. Value at Risk (VaR) is one of the most widely used measures for financial risk management. Several estimation methods currently exist, such as historical simulation, which assumes no distribution for the returns of the risk factors or assets, and parametric methods, which assume normally distributed returns. This paper introduces copula theory as a measure of dependence between series and estimates an ARMA-GARCH-copula model to compute the VaR of a portfolio composed of two financial series, the US dollar-peso and euro-peso exchange rates. The results show that the copula-based VaR estimate is more accurate than that of the traditional methods.
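A minimal sketch of the copula step is shown below: the two return series are mapped to uniforms through their empirical distributions, a Gaussian copula correlation is estimated on the normal scores, joint scenarios are simulated, mapped back through the empirical quantiles and aggregated into portfolio losses. The GARCH filtering stage and the specific copula family used in the study are omitted; everything here, including the synthetic data, is a generic illustration.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    # Synthetic stand-ins for the two FX return series (USD-COP and EUR-COP)
    n = 1500
    usd = rng.standard_t(df=6, size=n) * 0.006
    eur = 0.7 * usd + rng.standard_t(df=6, size=n) * 0.004
    returns = np.column_stack([usd, eur])

    # 1. Map each margin to (0, 1) with its empirical CDF, then to normal scores
    u = stats.rankdata(returns, axis=0) / (n + 1)
    z = stats.norm.ppf(u)

    # 2. Gaussian copula parameter: correlation of the normal scores
    rho = np.corrcoef(z.T)

    # 3. Simulate joint scenarios from the copula and map back via empirical quantiles
    sims = rng.multivariate_normal(mean=[0, 0], cov=rho, size=100_000)
    u_sim = stats.norm.cdf(sims)
    scenarios = np.column_stack([np.quantile(returns[:, j], u_sim[:, j]) for j in range(2)])

    # 4. Portfolio VaR for equal weights
    portfolio = scenarios @ np.array([0.5, 0.5])
    var_99 = -np.quantile(portfolio, 0.01)
    print(f"one-day 99% VaR (fraction of portfolio value): {var_99:.4f}")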

Relevance: 60.00%

Abstract:

Investment risk models with infinite variance provide a better description of distributions of individual property returns in the IPD UK database over the period 1981 to 2003 than normally distributed risk models. This finding mirrors results in the US and Australia using identical methodology. Real estate investment risk is heteroskedastic, but the characteristic exponent of the investment risk function is constant across time, although it may vary by property type. Asset diversification is far less effective at reducing the impact of non-systematic investment risk on real estate portfolios than in the case of assets with normally distributed investment risk. The results, therefore, indicate that multi-risk-factor portfolio allocation models based on measures of investment codependence from finite-variance statistics are ineffective in the real estate context.
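The claim above, that diversification is less effective when returns have infinite variance, can be illustrated numerically: for i.i.d. alpha-stable returns with characteristic exponent alpha < 2, the dispersion of an equally weighted portfolio of n assets shrinks like n^(1/alpha - 1) rather than the n^(-1/2) of the normal case. The sketch below uses scipy's levy_stable generator with an assumed alpha = 1.6 to show the slower decay; it illustrates the statistical point only and is not the paper's IPD analysis.

    import numpy as np
    from scipy.stats import levy_stable, norm

    rng = np.random.default_rng(2024)
    alpha = 1.6            # assumed characteristic exponent; alpha < 2 implies infinite variance
    n_periods = 20_000

    def iqr(x):
        """Interquartile range: a dispersion measure that exists even without a finite variance."""
        q75, q25 = np.percentile(x, [75, 25])
        return q75 - q25

    for n in (1, 5, 25, 100):
        stable_rets = levy_stable.rvs(alpha, beta=0.0, size=(n_periods, n), random_state=rng)
        normal_rets = norm.rvs(size=(n_periods, n), random_state=rng)
        # dispersion of an equally weighted portfolio of n assets
        print(f"n={n:3d}  stable IQR={iqr(stable_rets.mean(axis=1)):.3f}"
              f"  normal IQR={iqr(normal_rets.mean(axis=1)):.3f}")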