25 results for Risk based Maintenance
in CentAUR: Central Archive University of Reading - UK
Abstract:
Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
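The classification described above can be sketched in code. This is an illustrative toy, not the authors' implementation: each region is classed by whether geoengineering raises or lowers its risk of substantial change relative to elevated CO2, then sub-classed by the baseline CO2-driven risk; the function name and the 0.5 "high baseline risk" threshold are assumptions for the example.

```python
# Minimal sketch of a two-way risk-based classification for one region.
# classify_region and the 0.5 threshold are illustrative assumptions,
# not the framework's published definitions.

def classify_region(risk_geo: float, risk_co2: float,
                    high_risk_threshold: float = 0.5) -> str:
    """Return a qualitative class for one region.

    risk_geo -- probability of substantial climate change under geoengineering
    risk_co2 -- probability under elevated carbon dioxide alone
    """
    direction = "increased" if risk_geo > risk_co2 else "decreased"
    baseline = "high" if risk_co2 >= high_risk_threshold else "low"
    return f"{direction} risk (baseline {baseline})"

print(classify_region(0.7, 0.3))  # geoengineering worsens a low-baseline region
print(classify_region(0.2, 0.6))  # geoengineering helps a high-baseline region
```

Aggregating such per-region labels over area, population and GDP grids would yield summary statistics of the kind quoted in the abstract.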
Abstract:
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants’ willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
Abstract:
This paper presents an assessment of the implications of climate change for global river flood risk. It is based on the estimation of flood frequency relationships at a grid resolution of 0.5 × 0.5°, using a global hydrological model with climate scenarios derived from 21 climate models, together with projections of future population. Four indicators of the flood hazard are calculated: change in the magnitude and return period of flood peaks; flood-prone population and cropland exposed to substantial change in flood frequency; and a generalised measure of regional flood risk based on combining frequency curves with generic flood damage functions. Under one climate model, emissions and socioeconomic scenario (HadCM3 and SRES A1b), in 2050 the current 100-year flood would occur at least twice as frequently across 40 % of the globe, approximately 450 million flood-prone people and 430 thousand km2 of flood-prone cropland would be exposed to a doubling of flood frequency, and global flood risk would increase by approximately 187 % over the risk in 2050 in the absence of climate change. There is strong regional variability (most adverse impacts would be in Asia), and considerable variability between climate models. In 2050, the range in increased exposure across 21 climate models under SRES A1b is 31–450 million people and 59 to 430 thousand km2 of cropland, and the change in risk varies between −9 and +376 %. The paper presents impacts by region, and also presents relationships between change in global mean surface temperature and impacts on the global flood hazard. There are a number of caveats with the analysis: it is based on one global hydrological model only, the climate scenarios are constructed using pattern-scaling, and the precise impacts are sensitive to some of the assumptions in the definition and application of the indicators.
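The idea of the 100-year flood recurring "at least twice as frequently" can be illustrated with a simple annual-maximum flood-frequency model. The sketch below assumes a Gumbel distribution with made-up location/scale parameters and an arbitrary climate-driven shift; it is not the paper's hydrological model, only a demonstration of how a return period is recomputed under a shifted distribution.

```python
# Illustrative Gumbel flood-frequency sketch (assumed distribution and
# parameters, not the study's model): how the return period of today's
# 100-year flood shortens when the flood-peak distribution shifts.
import math

def gumbel_quantile(T, loc, scale):
    """Flood magnitude with return period T years (annual-maximum Gumbel)."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

def return_period(q, loc, scale):
    """Return period (years) of magnitude q under a Gumbel distribution."""
    p_exceed = 1.0 - math.exp(-math.exp(-(q - loc) / scale))
    return 1.0 / p_exceed

# Baseline climate: magnitude of today's 100-year flood
q100 = gumbel_quantile(100, loc=1000.0, scale=200.0)

# Future climate: location parameter shifted upward (illustrative +15%)
t_future = return_period(q100, loc=1150.0, scale=200.0)
print(round(t_future, 1))  # the same flood now recurs roughly twice as often
```

In the study, the analogous calculation is repeated on every 0.5° grid cell for each of the 21 climate-model scenarios.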
Abstract:
As control systems have developed and the implications of poor hygienic practices have become better known, the evaluation of the hygienic status of premises has become more critical. The assessment of the overall status of premises hygiene can provide useful management data indicating whether the premises are improving or whether, whilst still meeting legal requirements, they might be failing to maintain previously high standards. Since the creation of the Meat Hygiene Service (MHS) for the United Kingdom, one of the aims of the service has been to monitor hygiene on different premises to provide a means of comparing standards and to identify and encourage improvements. This desire led to the implementation of a scoring system known as the hygiene assessment system (HAS). This paper analyses English slaughterhouses' HAS scores between 1998 and 2005, outlining the main incidents throughout this period. Although rising initially, the later results displayed a clear decrease in the general hygiene scores. These revealing results coincide with the start of a new meat inspection system where, after several years of discussion, risk-based inspection is finally becoming a reality within Europe. The paper considers the implications of these changes for the way hygiene standards will be monitored in the future.
Abstract:
Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data, essentially representative of an assumed stationary climate. Recent developments have made available ‘morphed’ equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity for generating new design summer years which can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period which are comparable with those in current use. Four methodologies for the generation of future years are described, and their output related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
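The "weighted cooling degree hours" measure of summer warmth mentioned above can be sketched briefly. The base temperature of 22 °C and the squared weighting below are illustrative assumptions about the metric's form, not a quotation of the article's exact definition.

```python
# A minimal sketch of weighted cooling degree hours (WCDH) for ranking
# candidate summers. Base temperature (22 degC) and squared weighting are
# illustrative assumptions.

def weighted_cdh(hourly_temps, base=22.0):
    """Sum of squared exceedances above a base temperature (degC^2.h).

    Squaring weights hot hours more heavily, so a summer with a few very
    hot spells ranks above one with many mildly warm hours.
    """
    return sum((t - base) ** 2 for t in hourly_temps if t > base)

# Two toy "summers": many mildly warm hours vs. a short heatwave
mild = [23.0] * 10      # ten hours at 1 degC above base -> 10.0
heatwave = [27.0] * 2   # two hours at 5 degC above base -> 50.0
print(weighted_cdh(mild), weighted_cdh(heatwave))
```

Applied to a candidate year's full hourly temperature series, such a metric gives a single warmth score by which years from the weather generator can be ranked and a near-extreme year selected.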
Abstract:
This paper seeks to discuss EU policies relating to securities markets, created in the wake of the financial crisis, and how ICT and specifically e-Government can be utilised within this context. This study utilises the UK as a basis for our discussion. The recent financial crisis has caused a change of perspective in relation to government services and policies. The regulation of the financial sector has been heavily criticised and so is undergoing radical change in the UK and the rest of Europe. New regulatory bodies are being defined with more focus on taking a risk-based, system-wide approach to regulating the financial sector. This approach aims to prevent financial institutions becoming too big to fail and thus requiring massive government bail-outs. In addition, a new wave of EU regulation is on the way to update risk management practices and to further protect investors. This paper discusses the reasons for the financial crisis and the UK’s past and future regulatory landscape. The current and future approach and strategies adopted by the UK’s financial regulators are reviewed, as is the lifecycle of EU Directives. The regulatory responses to the crisis are discussed and upcoming regulatory hotspots identified. Discussion of these issues provides the context for our evaluation of the role of e-Government and ICT in improving the regulatory system. We identify several processes which are essential for regulatory compliance and discuss how ICT is central to their implementation. The processes considered include those required for internal control and monitoring, risk management, record keeping and disclosure to regulatory bodies. We find these processes offer an excellent opportunity to adopt an e-Government approach to improve services to both regulated businesses and individual investors through the benefits derived from a more effective and efficient regulatory system.
Abstract:
In this article, we illustrate experimentally an important consequence of the stochastic component in choice behaviour which has not been acknowledged so far: namely, its potential to produce ‘regression to the mean’ (RTM) effects. We employ a novel approach to individual choice under risk, based on repeated multiple-lottery choices (i.e. choices among many lotteries), to show how the high degree of stochastic variability present in individual decisions can crucially distort certain results through RTM effects. We demonstrate the point in the context of a social comparison experiment.
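The mechanism the abstract describes can be reproduced with a few lines of simulation. This is an illustrative toy under stated assumptions (identical underlying preferences, Gaussian choice noise), not the authors' experimental design: selecting the "top" decision-makers on one noisy measurement and re-measuring them shows the selected group regressing toward the mean even though nothing about them has changed.

```python
# Toy regression-to-the-mean demo: scores are true value + noise, so a
# group selected for extreme round-1 scores looks ordinary in round 2.
# All parameters (n, noise_sd, top-100 cut) are illustrative assumptions.
import random

random.seed(1)
n = 1000
true_value = 0.0   # everyone shares the same underlying preference
noise_sd = 1.0     # stochastic component of each choice score

round1 = [true_value + random.gauss(0, noise_sd) for _ in range(n)]
round2 = [true_value + random.gauss(0, noise_sd) for _ in range(n)]

# Select the round-1 "top performers" and re-measure them in round 2
top = sorted(range(n), key=lambda i: round1[i], reverse=True)[:100]
mean1 = sum(round1[i] for i in top) / len(top)
mean2 = sum(round2[i] for i in top) / len(top)
print(mean1 > mean2)  # the selected group regresses toward the mean
```

The danger for experimenters is that the drop from mean1 to mean2 looks like a treatment effect when it is purely an artefact of selecting on a noisy variable.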
Abstract:
Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of their performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that an increase in residential mortgage loans seems to improve banks’ performance in terms of both profitability and credit risk in good, pre-financial-crisis market conditions. These findings may help explain why banks rush to lend on property during booms, given the positive effect it has on performance. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle.
Abstract:
The evidence for anthropogenic climate change continues to strengthen, and concerns about severe weather events are increasing. As a result, scientific interest is rapidly shifting from detection and attribution of global climate change to prediction of its impacts at the regional scale. However, nearly everything we have any confidence in when it comes to climate change is related to global patterns of surface temperature, which are primarily controlled by thermodynamics. In contrast, we have much less confidence in atmospheric circulation aspects of climate change, which are primarily controlled by dynamics and exert a strong control on regional climate. Model projections of circulation-related fields, including precipitation, show a wide range of possible outcomes, even on centennial timescales. Sources of uncertainty include low-frequency chaotic variability and the sensitivity to model error of the circulation response to climate forcing. As the circulation response to external forcing appears to project strongly onto existing patterns of variability, knowledge of errors in the dynamics of variability may provide some constraints on model projections. Nevertheless, higher scientific confidence in circulation-related aspects of climate change will be difficult to obtain. For effective decision-making, it is necessary to move to a more explicitly probabilistic, risk-based approach.
Abstract:
The globalization of trade in fish has created many challenges for the developing world, specifically with regard to food safety and quality. International organisations have established a good basis for standards in international trade. Whilst these requirements are frequently embraced by the major importers (such as Japan, the EU and the USA), they often impose additional safety requirements and regularly identify batches which fail to meet their strict standards. Creating an effective national seafood control system which meets both internal national needs and the requirements of the export market can be challenging. Many countries adopt a dual system where seafood products for the major export markets are subject to tight control whilst the majority of products (whether for the local market or for more regional trade) are less tightly controlled. With regional liberalization also occurring, deciding on appropriate controls is complex. In the Sultanate of Oman, fisheries production is one of the country's chief sources of economic revenue after oil production and is a major source of the national food supply. In this paper the structure of the fish supply chain is analysed, highlighting the different routes operating for the different markets. Although much of the fish is consumed within Oman, there is a major export trade to the local regional markets. Much smaller quantities meet the more stringent standards imposed by the major importing countries, and exports to these are limited. The paper considers the development of the Omani fish control system, including the key legislative documents and the administrative structures that have been developed. Establishing modern controls which satisfy the demands of the major importers is possible but places additional costs on businesses. Enhanced controls such as HACCP and other management standards are required but can be difficult to justify when alternative markets do not specify them.
These enhanced controls do, however, provide additional consumer protection and can bring benefits to local consumers. The Omani government is attempting to upgrade the system of controls and has made tremendous progress toward implementing HACCP and introducing enhanced management systems into its industrial sector. Strengthened legislation and government support, including subsidies, have encouraged some businesses to implement HACCP. The current control systems have been reviewed and a SWOT analysis used to identify key factors for their future development. The study shows that seafood products in the supply chain are often exposed to lengthy handling and distribution processes before reaching consumers, a typical issue faced by many developing countries. As seafood products are often perishable, their safety is compromised if not adequately controlled. Enforcement of current food safety laws in the Sultanate of Oman is shared across various government agencies. Consequently, there is a need to harmonize all regulatory requirements, enhance domestic food protection and continue working towards a fully risk-based approach in order to compete successfully in the global market.
Abstract:
The extent to which a given extreme weather or climate event is attributable to anthropogenic climate change is a question of considerable public interest. From a scientific perspective, the question can be framed in various ways, and the answer depends very much on the framing. One such framing is a risk-based approach, which answers the question probabilistically, in terms of a change in likelihood of a class of event similar to the one in question, and natural variability is treated as noise. A rather different framing is a storyline approach, which examines the role of the various factors contributing to the event as it unfolded, including the anomalous aspects of natural variability, and answers the question deterministically. It is argued that these two apparently irreconcilable approaches can be viewed within a common framework, where the most useful level of conditioning will depend on the question being asked and the uncertainties involved.
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, giving a 2 x 2 x 2 x 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
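The effect of periodic inspection-with-eradication can be illustrated with a drastically simplified growth model. The sketch below is far simpler than the published simulation (no trade network, no spatial dispersal, no stochasticity) and all parameter values are illustrative assumptions; it only shows the arithmetic of multiplicative spread punctuated by 90-day inspections that remove 80% of infected plants.

```python
# Toy illustration of inspection-with-eradication: infections grow
# multiplicatively; every 90 days an inspection detects and removes 80%
# of infected plants. Growth rate and initial count are made up.

def simulate(days, daily_growth=1.02, inspect_every=90, removal=0.8):
    infected = 10.0
    for day in range(1, days + 1):
        infected *= daily_growth          # natural + trade-driven spread
        if inspect_every and day % inspect_every == 0:
            infected *= (1.0 - removal)   # eradicate detected plants
    return infected

no_control = simulate(720, inspect_every=0)
controlled = simulate(720)
print(controlled < no_control)  # periodic inspections curb the epidemic
```

The toy also hints at the paper's caveat: if growth between inspections outpaces removal (here, if `daily_growth**inspect_every` exceeds `1 / (1 - removal)`), the infected count still rises from one inspection to the next, which is the "spread prior to detection" failure mode.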