24 results for decision framework

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

General principles of climate change adaptation for biodiversity have been formulated, but do not help prioritize actions. This is inhibiting their integration into conservation planning. We address this need with a decision framework that identifies and prioritizes actions to increase the adaptive capacity of species. The framework classifies species according to their current distribution and projected future climate space, as a basis for selecting appropriate decision trees. Decisions rely primarily on expert opinion, with additional information from quantitative models, where data are available. The framework considers in-situ management, followed by interventions at the landscape scale and finally translocation or ex-situ conservation. Synthesis and applications: From eight case studies, the key interventions identified for integrating climate change adaptation into conservation planning were local management and expansion of sites. We anticipate that, in combination with consideration of socio-economic and local factors, the decision framework will be a useful tool for conservation and natural resource managers to integrate adaptation measures into conservation plans.
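As a rough illustration of the classification step such a framework performs (the categories and rules below are hypothetical simplifications for this listing, not the paper's actual decision trees), consider:

```python
def classify_species(in_current_climate_space: bool, in_future_climate_space: bool) -> str:
    """Hypothetical triage by overlap between a species' current distribution
    and its projected future climate space."""
    if in_current_climate_space and in_future_climate_space:
        return "in-situ management"                 # conditions remain suitable locally
    if in_future_climate_space:
        return "landscape-scale intervention"       # help the species track shifting climate
    return "translocation or ex-situ conservation"  # little suitable climate space remains

print(classify_species(True, False))  # translocation or ex-situ conservation
```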

Relevance:

60.00%

Publisher:

Abstract:

Flood forecasting increasingly relies on numerical weather prediction forecasts to achieve longer lead times. One of the key difficulties emerging in constructing a decision framework for these flood forecasts is what to do when consecutive forecasts are so different that they lead to different conclusions regarding the issuing of warnings or the triggering of other action. In this opinion paper we explore some of the issues surrounding such forecast inconsistency (also known as "Jumpiness", "Turning points", "Continuity" or the number of "Swings"). We define forecast inconsistency; discuss the reasons why forecasts might be inconsistent; consider how we should analyse inconsistency, what we should do about it and how we should communicate it; and ask whether it is a totally undesirable property. Consistency is increasingly emerging as a hot topic in many forecasting environments.
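One simple way to quantify such jumpiness is to count how often successive forecasts flip a threshold-based decision. This is a generic sketch, not a measure the paper prescribes:

```python
def flip_count(forecasts, threshold):
    """Count how often consecutive forecasts cross a decision threshold
    in opposite directions (a simple 'jumpiness' measure)."""
    exceed = [f >= threshold for f in forecasts]
    return sum(a != b for a, b in zip(exceed, exceed[1:]))

# Five consecutive river-level forecasts against a warning threshold of 4.0 m
print(flip_count([3.8, 4.2, 3.9, 4.5, 4.4], 4.0))  # 3
```

A high flip count for the same event would signal exactly the warn/don't-warn dilemma the paper describes.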

Relevance:

60.00%

Publisher:

Abstract:

Many countries have conservation plans for threatened species, but such plans have generally been developed without taking into account the potential impacts of climate change. Here, we apply a decision framework, specifically developed to identify and prioritise climate change adaptation actions and demonstrate its use for 30 species threatened in the UK. Our aim is to assess whether government conservation recommendations remain appropriate under a changing climate. The species, associated with three different habitats (lowland heath, broadleaved woodland and calcareous grassland), were selected from a range of taxonomic groups (primarily moths and vascular plants, but also including bees, bryophytes, carabid beetles and spiders). We compare the actions identified for these threatened species by the decision framework with those included in existing conservation plans, as developed by the UK Government's statutory adviser on nature conservation. We find that many existing conservation recommendations are also identified by the decision framework. However, there are large differences in the spatial prioritisation of actions when explicitly considering projected climate change impacts. This includes recommendations for actions to be carried out in areas where species do not currently occur, in order to allow them to track movement of suitable conditions for their survival. Uncertainties in climate change projections are not a reason to ignore them. Our results suggest that existing conservation plans, which do not take into account potential changes in suitable climatic conditions for species, may fail to maximise species persistence. Comparisons across species also suggest a more habitat-focused approach could be adopted to enable climate change adaptation for multiple species.

Relevance:

30.00%

Publisher:

Abstract:

A multi-scale framework for decision support is presented that uses a combination of experiments, models, communication, education and decision support tools to arrive at a realistic strategy to minimise diffuse pollution. Effective partnerships between researchers and stakeholders play a key part in the successful implementation of this strategy. The Decision Support Matrix (DSM) is introduced as a set of visualisations that can be used at all scales, both to inform decision making and as a communication tool in stakeholder workshops. A demonstration farm is presented and one of its fields is taken as a case study. Hydrological and nutrient flow path models are used for event-based simulation (TOPCAT), catchment-scale modelling (INCA) and field-scale flow visualisation (TopManage). One of the DSMs, the Phosphorus Export Risk Matrix (PERM), is discussed in detail. The PERM was developed iteratively as a point of discussion in stakeholder workshops, and as a decision support and education tool. The resulting interactive PERM contains a set of questions and proposed remediation measures that reflect both expert and local knowledge. Education and visualisation tools such as GIS, risk indicators, TopManage and the PERM are found to be invaluable in communicating improved farming practice to stakeholders. (C) 2008 Elsevier Ltd. All rights reserved.
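As a loose sketch of how a risk-matrix lookup in the spirit of the PERM might work (the factors and levels here are invented for illustration; they are not the PERM's actual questions or categories):

```python
def phosphorus_export_risk(source_high: bool, transport_high: bool) -> str:
    """Hypothetical two-factor risk matrix: combine a source factor
    (e.g. high soil P status) with a transport factor (e.g. runoff pathway)."""
    if source_high and transport_high:
        return "high"    # P source present and a pathway to water exists
    if source_high or transport_high:
        return "medium"  # one factor alone gives an intermediate risk
    return "low"

print(phosphorus_export_risk(True, False))  # medium
```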

Relevance:

30.00%

Publisher:

Abstract:

A range of funding schemes and policy instruments exist to effect enhancement of the landscapes and habitats of the UK. While a number of assessments of these mechanisms have been conducted, little research has been undertaken to compare both quantitatively and qualitatively their relative effectiveness across a range of criteria. It is argued that few tools are available for such a multi-faceted evaluation of effectiveness. A form of Multiple Criteria Decision Analysis (MCDA) is justified and utilized as a framework in which to evaluate the effectiveness of nine mechanisms in relation to the protection of existing areas of chalk grassland and the creation of new areas in the South Downs of England. These include established schemes, such as the Countryside Stewardship and Environmentally Sensitive Area Schemes, along with other less common mechanisms, for example, land purchase and tender schemes. The steps involved in applying an MCDA to evaluate such mechanisms are identified and the process is described. Quantitative results from the comparison of the effectiveness of different mechanisms are presented, although the broader aim of the paper is that of demonstrating the performance of MCDA as a tool for measuring the effectiveness of mechanisms aimed at landscape and habitat enhancement.
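The simplest MCDA aggregation, a weighted sum of criterion scores, can be sketched as follows. The mechanisms, criteria, weights and scores are invented for illustration; the paper's specific MCDA formulation is not reproduced here:

```python
def weighted_scores(weights, performance):
    """Weighted-sum MCDA: each mechanism's overall score is the
    criterion-weighted sum of its performance scores (0-1 scale)."""
    return {m: sum(w * s for w, s in zip(weights, scores))
            for m, scores in performance.items()}

weights = [0.5, 0.3, 0.2]  # e.g. habitat outcome, cost, landowner uptake (illustrative)
performance = {
    "stewardship scheme": [0.8, 0.6, 0.7],
    "land purchase":      [0.9, 0.2, 0.5],
}
print(weighted_scores(weights, performance))
```

The weights would typically be elicited from stakeholders, which is where much of the qualitative value of an MCDA exercise lies.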

Relevance:

30.00%

Publisher:

Abstract:

The games-against-nature approach to the analysis of uncertainty in decision-making relies on the assumption that the behaviour of a decision-maker can be explained by concepts such as maximin, minimax regret, or a similarly defined criterion. In reality, however, these criteria represent a spectrum, and the actual behaviour of a decision-maker is most likely to embody a mixture of such idealisations. This paper proposes that, in a game-theoretic approach to decision-making under uncertainty, a more realistic representation of a decision-maker's behaviour can be achieved by synthesising games-against-nature with goal programming into a single framework. The proposed formulation is illustrated using a well-known example from the literature on mathematical programming models for agricultural decision-making. (c) 2005 Elsevier Inc. All rights reserved.
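The spectrum of criteria the paper refers to can be made concrete with a small sketch showing maximin and minimax regret recommending different alternatives for the same payoff matrix (illustrative payoffs; the paper's goal-programming synthesis is not reproduced here):

```python
# Payoff matrix: rows are decision alternatives, columns are states of nature.
payoffs = {
    "conservative crop": [20, 20, 20],
    "risky crop":        [0, 50, 50],
}

def maximin(payoffs):
    """Choose the alternative whose worst-case payoff is best."""
    return max(payoffs, key=lambda a: min(payoffs[a]))

def minimax_regret(payoffs):
    """Choose the alternative whose worst-case regret is smallest."""
    n = len(next(iter(payoffs.values())))
    best = [max(p[s] for p in payoffs.values()) for s in range(n)]
    return min(payoffs, key=lambda a: max(best[s] - payoffs[a][s] for s in range(n)))

print(maximin(payoffs))        # conservative crop
print(minimax_regret(payoffs)) # risky crop
```

That the two criteria disagree is exactly why a real decision-maker's behaviour is better modelled as a mixture.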

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to present two multi-criteria decision-making models, an Analytic Hierarchy Process (AHP) model and an Analytic Network Process (ANP) model, for the assessment of deconstruction plans, and to compare the two models with an experimental case study. Deconstruction planning is under pressure to reduce operating costs, adverse environmental impacts and duration, while improving productivity and safety in accordance with structure characteristics, site conditions and past experience. To achieve these targets in deconstruction projects, there is a pressing need for a formal procedure by which contractors can select the most appropriate deconstruction plan. Because a number of factors influence the selection of deconstruction techniques, engineers need effective tools to conduct the selection process. In this regard, multi-criteria decision-making methods such as AHP have been adopted to support deconstruction technique selection in previous research, in which it has been shown that the AHP method can help decision-makers make informed decisions on deconstruction technique selection based on a sound technical framework. In this paper, the authors present the application and comparison of two decision-making models, the AHP model and the ANP model, for deconstruction plan assessment. The paper concludes that both AHP and ANP are viable and capable tools for deconstruction plan assessment under the same set of evaluation criteria. However, although the ANP can measure relationships among selection criteria and their sub-criteria, which are normally ignored in the AHP, the authors also note that whether the ANP model provides a more accurate result should be examined in further research.
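The AHP's core computation derives priority weights from a pairwise-comparison matrix; the geometric-mean row approximation below is a standard shortcut for the principal eigenvector. The criteria and judgements are illustrative, not the paper's case study:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority vector: geometric mean of each row of the
    pairwise-comparison matrix, normalised to sum to 1."""
    gm = [math.prod(row) ** (1 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Pairwise comparisons of three illustrative criteria: cost, duration, safety.
# M[i][j] > 1 means criterion i is judged more important than criterion j.
M = [[1,   3, 1/2],
     [1/3, 1, 1/5],
     [2,   5, 1]]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # ≈ [0.309, 0.109, 0.582]
```

The ANP generalises this by allowing dependence and feedback among criteria, which is the difference the paper evaluates.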

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel intelligent multiple-controller framework incorporating a fuzzy-logic-based switching and tuning supervisor along with a generalised learning model (GLM) for an autonomous cruise control application. The proposed methodology combines the benefits of a conventional proportional-integral-derivative (PID) controller, and a PID structure-based (simultaneous) zero and pole placement controller. The switching decision between the two nonlinear fixed structure controllers is made on the basis of the required performance measure using a fuzzy-logic-based supervisor, operating at the highest level of the system. The supervisor is also employed to adaptively tune the parameters of the multiple controllers in order to achieve the desired closed-loop system performance. The intelligent multiple-controller framework is applied to the autonomous cruise control problem in order to maintain a desired vehicle speed by controlling the throttle plate angle in an electronic throttle control (ETC) system. Sample simulation results using a validated nonlinear vehicle model are used to demonstrate the effectiveness of the multiple-controller with respect to adaptively tracking the desired vehicle speed changes and achieving the desired speed of response, whilst penalising excessive control action. Crown Copyright (C) 2008 Published by Elsevier B.V. All rights reserved.
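A minimal discrete PID update of the kind that underlies one of the two controllers can be sketched as follows. The gains, timestep and signals are illustrative and not taken from the paper's validated vehicle model:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*∫e dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt          # accumulate the integral term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.1, kd=0.05, dt=0.1)
throttle = pid.update(setpoint=70.0, measured=65.0)  # 5 km/h speed error
```

In the paper's framework, a fuzzy-logic supervisor would switch between and retune such controllers at run time rather than fixing the gains as done here.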

Relevance:

30.00%

Publisher:

Abstract:

When competing strategies for development programs, clinical trial designs, or data analysis methods exist, the alternatives need to be evaluated in a systematic way to facilitate informed decision making. Here we describe a refinement of the recently proposed clinical scenario evaluation framework for the assessment of competing strategies. The refinement is achieved by subdividing key elements previously proposed into new categories, distinguishing between quantities that can be estimated from preexisting data and those that cannot, and between aspects under the control of the decision maker and those determined by external constraints. The refined framework is illustrated by an application to a design project for an adaptive seamless design for a clinical trial in progressive multiple sclerosis.

Relevance:

30.00%

Publisher:

Abstract:

As the building industry moves towards low-impact buildings, research attention is being drawn to the reduction of carbon dioxide emissions and waste. From design and construction through operation and demolition, various building materials are used throughout the whole building lifecycle, involving significant energy consumption and waste generation. Building Information Modelling (BIM) is emerging as a tool that can support holistic design decision-making for reducing embodied carbon and waste production over the building lifecycle. This study aims to establish a framework for assessing embodied carbon and waste underpinned by BIM technology. Based on a review of current research, the framework comprises functional modules for embodied carbon computation: a module for waste estimation, a knowledge base of construction and demolition methods, a repository of building component information, and an inventory of construction materials' energy and carbon. Through both static 3D model visualisation and dynamic modelling supported by the framework, embodied energy (carbon), waste and associated costs can be analysed within the boundaries of cradle-to-gate, construction, operation and demolition. The proposed holistic modelling framework makes it possible to analyse embodied carbon and waste, including associated costs, from different building lifecycle perspectives. It brings together existing, segmented embodied carbon and waste estimation into a unified model, so that interactions between parameters across the building lifecycle phases can be better understood. It can thus improve design-decision support for optimal low-impact building development. The applicability of this framework is anticipated to be developed and tested on industrial projects in the near future.

Relevance:

30.00%

Publisher:

Abstract:

There is growing concern worldwide about reducing greenhouse gas emissions. In its post-Copenhagen report on climate change, the UK recently set targets of a 34% reduction in emissions by 2020 and an 80% reduction by 2050, relative to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when these two tools are used. Besides, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), with little consideration of incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool to support sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. An application framework is also proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.

Relevance:

30.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to un-initialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies. 
However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) Most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how to best approach forecast uncertainty.
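One simple deterministic comparison in the spirit of question (1) — does initialization improve on the uninitialized baseline? — can be sketched with an RMSE-based skill score. The data below are invented; the paper's actual metrics include mean-square-skill-score-style and probabilistic measures applied to CMIP5 hindcasts:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error of a hindcast against observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

obs           = [0.1, 0.3, 0.2, 0.5, 0.4]   # observed anomalies (illustrative)
initialized   = [0.2, 0.3, 0.1, 0.4, 0.4]   # initialized hindcast ensemble mean
uninitialized = [0.3, 0.3, 0.3, 0.3, 0.3]   # uninitialized projection

# Positive skill means initialization improved on the uninitialized baseline.
skill = 1 - rmse(initialized, obs) / rmse(uninitialized, obs)
print(round(skill, 2))  # 0.45
```

In practice this comparison would be made per grid point or smoothed region, over many start dates, and with the bias adjustments the paper discusses.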

Relevance:

30.00%

Publisher:

Abstract:

In a world where massive amounts of data are recorded on a large scale we need data mining technologies to gain knowledge from the data in a reasonable time. The Top Down Induction of Decision Trees (TDIDT) algorithm is a very widely used technology to predict the classification of newly recorded data. However, alternative technologies have been derived that often produce better rules but do not scale well on large datasets. Such an alternative to TDIDT is the PrismTCS algorithm. PrismTCS performs particularly well on noisy data but does not scale well on large datasets. In this paper we introduce Prism and investigate its scaling behaviour. We describe how we improved the scalability of the serial version of Prism and investigate its limitations. We then describe our work to overcome these limitations by developing a framework to parallelise algorithms of the Prism family and similar algorithms. We also present the scale-up results of a first prototype implementation.
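A compact sketch of Prism-style separate-and-conquer rule induction for a single target class is shown below on a toy dataset. This illustrates only the serial idea (repeatedly add the attribute-value test with the highest precision for the target class, then remove covered instances); it is not the paper's parallel framework:

```python
def precision(instances, attr, value, target):
    """Fraction of instances matching attr == value that belong to `target`."""
    matched = [i for i in instances if i[attr] == value]
    return sum(i["class"] == target for i in matched) / len(matched)

def induce_rules(instances, target):
    """Prism-style induction: grow each rule one attribute-value test at a
    time until it covers only `target` instances, then separate and repeat."""
    rules, remaining = [], list(instances)
    while any(i["class"] == target for i in remaining):
        rule, covered = {}, remaining
        while any(i["class"] != target for i in covered):
            candidates = {(a, v) for i in covered for a, v in i.items()
                          if a != "class" and a not in rule}
            a, v = max(candidates,
                       key=lambda av: precision(covered, av[0], av[1], target))
            rule[a] = v
            covered = [i for i in covered if i[a] == v]
        rules.append(rule)
        remaining = [i for i in remaining
                     if not all(i.get(a) == v for a, v in rule.items())]
    return rules

weather = [
    {"outlook": "sunny", "windy": "no",  "class": "play"},
    {"outlook": "sunny", "windy": "yes", "class": "play"},
    {"outlook": "rain",  "windy": "no",  "class": "play"},
    {"outlook": "rain",  "windy": "yes", "class": "stay"},
]
# Induces two rules covering "play": outlook=sunny and windy=no (order may vary).
print(induce_rules(weather, "play"))
```

The scaling problem the paper addresses arises because each candidate-test evaluation scans the covered instances, which becomes expensive on large datasets and motivates parallelisation.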