947 results for Risk based deployment


Relevance: 100.00%

Abstract:

This study presents a decision-making method for selecting maintenance policies for power plant equipment. The method is based on risk analysis concepts. Its first step consists of identifying, on the basis of risk concepts, the equipment that is critical to power plant operational performance and availability. The second step proposes potential maintenance policies that could be applied to the critical equipment in order to increase its availability. The costs associated with each potential maintenance policy must be estimated, including the maintenance costs and the cost of failure, which measures the consequences of critical equipment failure for power plant operation. Once the failure probabilities and the costs of failure are estimated, a decision-making procedure is applied to select the best maintenance policy. The decision criterion is to minimize the equipment cost of failure, considering the costs and likelihood of occurrence of the failure scenarios. The method is applied to the analysis of a lubricating oil system used in the journal bearings of a gas turbine with a nominal output of more than 150 MW, installed in an open-cycle thermoelectric power plant. A design modification, the installation of a redundant oil pump, is proposed to improve the availability of the lubricating oil system.
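
The decision criterion described above, minimizing the maintenance cost plus the probability-weighted cost of failure across candidate policies, can be illustrated with a small numerical sketch. The policy names, probabilities and costs below are hypothetical and are not taken from the study; they only show how the criterion ranks the alternatives.

```python
# Hypothetical illustration of risk-based maintenance policy selection.
# Expected total cost = maintenance cost
#                     + sum over failure scenarios of (probability * cost of consequences).
policies = {
    # policy name: (annual maintenance cost, {scenario: (probability, failure cost)})
    "run-to-failure":      (10_000, {"bearing seizure": (0.08, 900_000)}),
    "preventive overhaul": (60_000, {"bearing seizure": (0.02, 900_000)}),
    "redundant oil pump":  (50_000, {"bearing seizure": (0.005, 900_000)}),
}

def expected_total_cost(maintenance_cost, scenarios):
    """Maintenance cost plus the probability-weighted cost of failure."""
    expected_cost_of_failure = sum(p * c for p, c in scenarios.values())
    return maintenance_cost + expected_cost_of_failure

for name, (m_cost, scenarios) in sorted(policies.items(),
                                        key=lambda kv: expected_total_cost(*kv[1])):
    print(f"{name:20s} expected total cost = {expected_total_cost(m_cost, scenarios):>9,.0f}")
```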

Relevance: 100.00%

Abstract:

Penalizing line management for the occurrence of lost-time injuries has in some cases had unintended negative consequences, which are discussed. An alternative system is suggested that penalizes line management for accidents where the combination of the probability of recurrence and the maximum reasonable consequences of such a recurrence exceeds an agreed limit. A reward is given for prompt, effective control of the risk to below the agreed risk limit; the reward is smaller than the penalty. High-risk accidents require independent investigation by a safety officer using analytical techniques. Two case examples are given to illustrate the system. Continuous safety improvement is driven by a planned reduction in the agreed risk limit over time and by rewards for proactive risk assessment and control.
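
As a sketch of the decision rule described above, the snippet below scores an accident by combining the probability of recurrence with the maximum reasonable consequence and compares it against the agreed risk limit. The scoring scale, threshold value and example figures are invented for illustration only.

```python
# Hypothetical sketch: penalize high residual risk, reward prompt risk reduction.
def residual_risk(prob_recurrence: float, max_consequence: float) -> float:
    """Risk score = probability of recurrence x maximum reasonable consequence."""
    return prob_recurrence * max_consequence

def management_outcome(risk_before: float, risk_after: float, agreed_limit: float) -> str:
    if risk_before <= agreed_limit:
        return "no action: risk already below the agreed limit"
    if risk_after <= agreed_limit:
        return "penalty for the accident, partly offset by a smaller reward for prompt control"
    return "penalty, and independent investigation by a safety officer"

# Example: probability 0.3 of recurrence, consequence scored 10 on an arbitrary severity scale,
# agreed risk limit of 2.0 on the same scale; corrective action cuts the probability to 0.05.
before = residual_risk(0.3, 10)
after = residual_risk(0.05, 10)
print(management_outcome(before, after, agreed_limit=2.0))
```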

Relevance: 100.00%

Abstract:

The use of renewable energy resources is a vital path for humanity to achieve sustainable development. In this effort, wind energy has emerged as one of the leading options, having grown almost exponentially in recent years. However, despite its relative maturity, the technology still faces problems and challenges. Notwithstanding the empirical experience gained by the wind industry over the last thirty years and the efforts made to improve the operational reliability of turbines, failure rates remain high. Given current maintenance practices for turbines and wind farms and the (sometimes catastrophic) failure characteristics, there is a need to optimize wind turbine maintenance strategies and reduce life-cycle costs, so as to maximize the return on investment. This work describes the current state of knowledge with respect to the intended objective, the collection of real operation and maintenance data, the applicability of the models chosen to estimate the probability of failure, and the assessment of consequences and risk. On this basis, a decision-support tool was developed, based on RBI (Risk Based Inspection) and RBIM (Risk Based Inspection and Maintenance) models applied to wind turbines.
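
A minimal sketch of the risk calculation at the core of an RBI/RBIM-style decision-support tool is given below; the components, probabilities of failure, consequence values and threshold are placeholders and do not come from the thesis.

```python
# Hypothetical sketch of an RBI-style ranking: risk = probability of failure x consequence.
components = [
    # (component, annual probability of failure, consequence of failure in EUR)
    ("gearbox",      0.10, 250_000),
    ("generator",    0.05, 120_000),
    ("pitch system", 0.15,  30_000),
    ("main bearing", 0.03, 180_000),
]

risk_threshold = 10_000  # EUR/year above which inspection/maintenance is prioritized (illustrative)

for name, pof, cof in sorted(components, key=lambda c: c[1] * c[2], reverse=True):
    risk = pof * cof
    action = "inspect/maintain" if risk > risk_threshold else "monitor"
    print(f"{name:14s} risk = {risk:>8,.0f} EUR/year -> {action}")
```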

Relevance: 100.00%

Abstract:

Dissertation submitted for the degree of Doctor in Electrical and Computer Engineering

Relevance: 100.00%

Abstract:

Since the financial crisis, risk-based portfolio allocations have gained a great deal in popularity. This increase in popularity is primarily due to the fact that they make no assumptions about the expected returns of the assets in the portfolio; these portfolios implicitly put risk management at the heart of asset allocation, hence their recent appeal. This paper compares four well-known risk-based portfolio allocation methods: minimum variance, maximum diversification, inverse volatility and equally weighted risk contribution. Empirical backtests are performed over rising interest rate periods from 1953 to 2015. Additionally, I compare these portfolios to simpler allocation methods, such as equally weighted and a 60/40 asset-allocation mix. This paper helps to answer the question of whether these portfolios can survive in a rising interest rate environment.
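
Of the four methods compared, inverse volatility is the simplest to state: each asset is weighted by the reciprocal of its volatility. The sketch below uses illustrative asset classes and volatilities, not data from the paper.

```python
import numpy as np

# Hypothetical sketch of an inverse-volatility allocation: weight_i proportional to 1/sigma_i.
asset_names = ["equities", "bonds", "commodities", "real estate"]
volatilities = np.array([0.18, 0.05, 0.22, 0.14])  # illustrative annualized volatilities

inv_vol = 1.0 / volatilities
weights = inv_vol / inv_vol.sum()   # normalize so the weights sum to 1

for name, w in zip(asset_names, weights):
    print(f"{name:12s} weight = {w:.2%}")

# Equally weighted benchmark, one of the simple allocations used for comparison in the paper.
equal_weights = np.full(len(asset_names), 1.0 / len(asset_names))
print(f"{'equal weight':12s} weight = {equal_weights[0]:.2%} per asset")
```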

Relevance: 100.00%

Abstract:

Research has demonstrated that landscape- or watershed-scale processes can influence in-stream aquatic ecosystems through the delivery of fine sediment, solutes and organic matter. Testing such impacts upon populations of organisms (i.e. at the catchment scale) has not proven straightforward, and differences have emerged in the conclusions reached. This is partly (1) because different studies have focused upon different scales of enquiry, but also (2) because the emphasis upon upstream land cover has rarely addressed the extent to which such land covers are hydrologically connected, and hence able to deliver diffuse pollution, to the drainage network. However, there is a third issue. In order to develop suitable hydrological models, we need to conceptualise the process cascade. To do this, we need to know what matters to the organism being impacted by the hydrological system, so that we can identify which processes need to be modelled. Acquiring such knowledge is not easy, especially for organisms like fish that may occupy very different locations in the river over relatively short periods of time. Inevitably, however, hydrological modellers have started by building up piecemeal the aspects of the problem that we think matter to fish. Herein, we report two developments: (a) for the case of sediment-associated diffuse pollution from agriculture, a risk-based modelling framework, SCIMAP, has been developed, which is distinct because it has an explicit focus upon hydrological connectivity; and (b) we use spatially distributed ecological data to infer the processes, and the associated process parameters, that matter to salmonid fry. We apply the model to spatially distributed salmon fry data from the River Eden, Cumbria, England. The analysis shows, quite surprisingly, that arable land covers are relatively unimportant as drivers of fry abundance. What matters most is intensive pasture, a land cover that could be associated with a number of stressors on salmonid fry (e.g. pesticides, fine sediment) and which allows us to identify a series of risky field locations where this land cover is readily connected to the river system by overland flow.
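
The central idea, that a land cover only generates in-stream risk where it is hydrologically connected to the channel, can be sketched as a simple weighting. The land-cover risk weights and connectivity values below are invented and do not reproduce SCIMAP itself.

```python
# Hypothetical sketch: field-scale diffuse-pollution risk as
# (land-cover risk weight) x (hydrological connectivity to the channel).
fields = [
    # (field id, land cover, connectivity 0..1, e.g. from a topographic flow-path analysis)
    ("F1", "intensive pasture", 0.9),
    ("F2", "intensive pasture", 0.2),
    ("F3", "arable",            0.8),
    ("F4", "woodland",          0.7),
]

land_cover_risk = {"intensive pasture": 0.8, "arable": 0.5, "woodland": 0.1}  # illustrative weights

risky_fields = sorted(
    ((fid, cover, land_cover_risk[cover] * conn) for fid, cover, conn in fields),
    key=lambda x: x[2], reverse=True,
)
for fid, cover, risk in risky_fields:
    print(f"{fid}: {cover:18s} delivered risk = {risk:.2f}")
```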

Relevance: 100.00%

Abstract:

This thesis examines the effect of operating leverage and financial leverage on the value premium in the Finnish stock markets in 2002-2012. The purpose of the thesis is to examine whether operating leverage and financial leverage affect a firm's BE/ME and stock returns. The accounting data have been collected from the Amadeus database and the market-based data from the Datastream database. The sample used in this thesis covers the years 1998 to 2012. This thesis confirms previous findings of a tight connection between operating leverage and BE/ME, and reinforces previous findings that the relation between financial leverage and BE/ME is not robust. In turn, a relation between operating leverage, BE/ME and stock returns is not clearly observed over the 2002-2012 period in the Finnish stock markets.

Relevance: 100.00%

Abstract:

Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
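
The area, population and GDP percentages quoted above come from aggregating a per-region risk classification. A schematic version of that bookkeeping is sketched below with made-up regions, each flagged according to whether the risk of substantial precipitation change is higher under geoengineering than under elevated carbon dioxide.

```python
# Hypothetical sketch of the aggregation behind the quoted percentages.
# Each region carries its share of area, population and GDP, plus a flag saying whether
# geoengineering increases the risk of substantial precipitation change relative to high CO2.
regions = [
    # (name, area share, population share, GDP share, risk increased under geoengineering?)
    ("region A", 0.25, 0.30, 0.40, False),
    ("region B", 0.30, 0.20, 0.25, True),
    ("region C", 0.45, 0.50, 0.35, False),
]

def share_with_increased_risk(regions, index):
    """Sum a given share (1 = area, 2 = population, 3 = GDP) over regions where risk increases."""
    return sum(r[index] for r in regions if r[4])

print(f"area with increased risk:       {share_with_increased_risk(regions, 1):.0%}")
print(f"population with increased risk: {share_with_increased_risk(regions, 2):.0%}")
print(f"GDP with increased risk:        {share_with_increased_risk(regions, 3):.0%}")
```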

Relevance: 100.00%

Abstract:

Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational use, due in part to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, attended by operational forecasters and academics working in the field of hydrometeorology. The aim of the game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness to pay for a forecast, the results of the game show that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
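
The difficulty of turning a probability of occurrence into a binary decision is often illustrated with the standard cost-loss rule, which is not spelled out in the abstract but captures the trade-off the game exercises: act whenever the forecast probability times the avoidable loss exceeds the cost of acting. The numbers below are illustrative.

```python
# Standard cost-loss decision rule (illustrative numbers, not taken from the game itself):
# protect if forecast_probability * loss_if_flooded > cost_of_protection.
def should_protect(forecast_probability: float, cost_of_protection: float, loss_if_flooded: float) -> bool:
    return forecast_probability * loss_if_flooded > cost_of_protection

cost, loss = 20_000, 200_000      # protection cost and avoidable flood loss (illustrative)
for p in (0.05, 0.10, 0.20, 0.50):
    decision = "protect" if should_protect(p, cost, loss) else "do nothing"
    print(f"forecast probability {p:.0%}: {decision}")
# The break-even probability is cost/loss = 0.10; near this point the decision is most
# sensitive to forecast quality, which is where willingness-to-pay for a forecast peaks.
```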

Relevance: 100.00%

Abstract:

Land development in the vicinity of airports often leads to land uses that can attract birds that are hazardous to aviation operations. For this reason, certain forms of land use have traditionally been discouraged within prescribed distances of Canadian airports. However, this often leads to an unrealistic prohibition of land use in the vicinity of airports located in urban settings, and it is often unclear that the desired safety goals have been achieved. This paper describes a model that was created to assist in the development of zoning regulations for a future airport site in Canada. The framework links land use to bird-related safety risks and aircraft operations by categorizing the predictable relationships between: (i) different land uses found in urbanized and urbanizing settings near airports; (ii) bird species; and (iii) the different safety risks to aircraft during various phases of flight, the latter assessed relative to the runway approach and departure paths. Bird species are ranked to reflect the potential severity of an impact with an aircraft (using bird weight, flocking characteristics and flight behaviours). These criteria are then employed to chart bird-related safety risks relative to runway reference points. Each form of land use is categorized to reflect the degree to which it attracts hazardous bird species. From this information, hazard and risk matrices have been developed and applied to the future airport setting, thereby providing risk-based guidance on land uses ranging from prohibited to acceptable. The framework has subsequently been applied to an existing Canadian airport and is currently being adapted for national application. It provides a risk-based and science-based approach that offers municipalities and property owners flexibility in managing the risks to aviation related to their land use.
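
A schematic version of the hazard/risk matrix logic described above is sketched below; the species scores, land-use attraction categories and matrix thresholds are invented for illustration and are not the values used in the Canadian framework.

```python
# Hypothetical sketch of a bird-hazard risk matrix for land-use zoning near an airport.
# Severity reflects bird weight, flocking and flight behaviour; likelihood reflects how
# strongly a land use attracts the species and whether it lies near approach/departure paths.
severity_rank = {"canada goose": 3, "gull": 2, "starling": 1}            # 3 = most severe (illustrative)
attraction = {"waste facility": 3, "stormwater pond": 2, "warehouse": 1}  # 3 = strong attractant

def land_use_guidance(land_use: str, species: str, near_flight_path: bool) -> str:
    likelihood = attraction[land_use] + (1 if near_flight_path else 0)
    risk = severity_rank[species] * likelihood   # simple matrix: severity x likelihood
    if risk >= 9:
        return "prohibited"
    if risk >= 5:
        return "acceptable with mitigation"
    return "acceptable"

print(land_use_guidance("waste facility", "canada goose", near_flight_path=True))   # prohibited
print(land_use_guidance("stormwater pond", "gull", near_flight_path=False))         # acceptable
```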

Relevance: 100.00%

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure; however, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems: the broader RO solution is found first, and the optimum results are then used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but depends on the actual structural configuration.
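
The objective that distinguishes risk optimization from the other two formulations can be written as a total expected cost. A minimal sketch with invented cost figures and failure probabilities is shown below.

```python
# Hypothetical sketch of the three objectives compared in the paper.
# DDO:  minimize manufacturing cost subject to safety factors.
# RBDO: minimize manufacturing cost subject to a failure-probability constraint.
# RO:   minimize total expected cost = manufacturing cost + sum_i pf_i * cost_of_failure_i.
def total_expected_cost(manufacturing_cost, failure_modes):
    """failure_modes: list of (failure probability, cost of the failure consequences)."""
    return manufacturing_cost + sum(pf * cf for pf, cf in failure_modes)

# Two candidate designs with different failure modes (illustrative numbers).
designs = {
    "design A": (100_000, [(1e-2, 5_000_000), (1e-3, 20_000_000)]),
    "design B": (120_000, [(1e-3, 5_000_000), (1e-4, 20_000_000)]),
}

for name, (c, modes) in designs.items():
    print(f"{name}: total expected cost = {total_expected_cost(c, modes):,.0f}")
# RO selects design B here even though its manufacturing cost is higher, because the
# expected cost of failure dominates; a DDO or RBDO formulation with fixed constraints
# need not reach the same conclusion when failure-mode costs differ.
```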

Relevance: 100.00%

Abstract:

The frequency of random accidental events originating from lines and equipment is generally estimated on the basis of information contained in specialized databases. The data in these databases cover incidents that occurred in various types of installations, ranging from chemical to petrochemical plants. Some of these databases are also rather dated, since they refer to incidents that occurred many years ago. As a result, the leak frequency values provided by the databases are very conservative. To overcome this limitation and account for technical progress, the API Recommended Practice 581 guideline, published in 2000 and subsequently updated in 2008, introduced a criterion for determining leak frequencies tailored to a specific plant, by means of corrective factors that account for the component failure mechanism, the safety management system and the effectiveness of the inspection activity. This thesis aims to highlight the evolution of the approach to estimating leak frequencies from piping. It is organized as follows. Chapter 1 is introductory. Chapter 2 addresses the leak frequencies available in general-purpose databases. Chapter 3 illustrates two approaches, one qualitative and one quantitative, for identifying the lines with the highest priority for inspection. Chapter 4 is devoted to the description of the API Recommended Practice 581 guideline. Chapter 5 presents the application of the line selection criteria of Chapter 3 to a case study and the definition of the inspection activity according to the API Recommended Practice 581 guideline. Finally, Chapter 6 presents the conclusions of the study.
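
The correction described in the guideline can be sketched as a generic leak frequency multiplied by plant-specific factors. The factor names follow the philosophy summarized above (a damage/failure-mechanism factor and a management-systems factor), but the numerical values below are illustrative and are not taken from API RP 581.

```python
# Hypothetical sketch of an API RP 581-style adjusted leak frequency:
# adjusted frequency = generic frequency x damage factor x management-systems factor.
generic_leak_frequency = {    # illustrative generic frequencies per year, by hole size
    "small":   1e-4,
    "medium":  1e-5,
    "rupture": 1e-6,
}

damage_factor = 2.0              # > 1: active damage mechanism, limited inspection (illustrative)
management_systems_factor = 0.8  # < 1: better-than-average safety management system (illustrative)

for hole_size, generic in generic_leak_frequency.items():
    adjusted = generic * damage_factor * management_systems_factor
    print(f"{hole_size:8s} generic {generic:.1e}/yr -> adjusted {adjusted:.1e}/yr")
```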

Relevance: 100.00%

Abstract:

Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has low sensitivity compared to serological tests. The results of artificial digestion tests in Switzerland were evaluated over a period of 15 years to determine when freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after 12 years at the latest. A new risk-based surveillance approach was then developed based on serology. Risk-based surveillance was also assessed over 15 years, starting in 2010. It was shown that, using this design, the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
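
The freedom-from-infection criterion used above (95% probability that prevalence is below the design value) is commonly evaluated by asking how many all-negative samples are needed for surveillance to have detected infection if it were present at that prevalence. The sketch below uses that standard calculation; the test sensitivities are illustrative and are not the Swiss figures.

```python
import math

# Hypothetical sketch: number of all-negative samples needed to be 95% sure of detecting
# infection if it were present at the design prevalence (so that freedom can be claimed).
def samples_required(design_prevalence: float, sensitivity: float, target_confidence: float = 0.95) -> int:
    """Smallest n with 1 - (1 - prevalence * sensitivity)**n >= target_confidence."""
    p_positive_per_sample = design_prevalence * sensitivity
    return math.ceil(math.log(1.0 - target_confidence) / math.log(1.0 - p_positive_per_sample))

design_prevalence = 1e-6   # 0.0001 % as in the abstract
for sensitivity, label in [(0.40, "artificial digestion (illustrative sensitivity)"),
                           (0.95, "serology (illustrative sensitivity)")]:
    n = samples_required(design_prevalence, sensitivity)
    print(f"{label:45s} samples required ~ {n:,}")
# The required sample size scales inversely with test sensitivity; combining a more sensitive
# test with risk-based targeting of higher-risk pigs reduces it further, which is consistent
# with the factor-of-4 reduction reported in the abstract.
```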