964 results for Policy Modelling


Relevance: 30.00%

Abstract:

Biosecurity is a great challenge to policy-makers globally. Biosecurity policies aim either to prevent invasions before they occur or to eradicate and/or effectively manage the invasive species and diseases once an invasion has occurred. Such policies have traditionally been directed towards professional producers in natural resource based sectors, including agriculture. Given the wide scope of issues threatened by invasive species and diseases, it is important to account for the several types of stakeholders involved. We investigate the problem of an invasive insect pest feeding on an agricultural crop with heterogeneous producers: profit-oriented professional farmers and utility-oriented hobby farmers. We start from an ecological-economic model conceptually similar to the one developed by Eiswerth and Johnson [Eiswerth, M.E. and Johnson, W.S., 2002. Managing nonindigenous invasive species: insights from dynamic analysis. Environmental and Resource Economics 23, 319-342.] and extend it in three ways. First, we make explicit the relationship between the invaded state carrying capacity and farmers' planting decisions. Second, we add another producer type into the framework and hence account for the existence of both professional and hobby farmers. Third, we provide a theoretical contribution by discussing two alternative types of equilibria. We also apply the model to an empirical case to extract a number of stylised facts and, in particular, to assess: a) under which circumstances the invasion is unlikely to be controllable; and b) how extending control policies to hobby farmers could affect both types of producers. (C) 2008 Elsevier B.V. All rights reserved.
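To make the modelling approach concrete, the sketch below is a toy discrete-time logistic invasion model in the spirit of Eiswerth and Johnson (2002), with the carrying capacity tied to the area planted by professional and hobby farmers. It is not the authors' model; every function name, parameter and value is an illustrative assumption.

```python
# Illustrative sketch only: a discrete-time logistic invasion model in the spirit of
# Eiswerth and Johnson (2002), with the carrying capacity tied to the total area
# planted by professional and hobby farmers. All names and values are assumptions.

def simulate_invasion(x0, years, r, area_professional, area_hobby,
                      control_professional, control_hobby, removal_rate):
    """Simulate the invaded stock x_t; carrying capacity scales with planted area."""
    k = area_professional + area_hobby          # assumed: K proportional to planted area
    x = x0
    trajectory = [x]
    for _ in range(years):
        growth = r * x * (1 - x / k)            # logistic spread of the pest
        # control applied only by those farmer types covered by the policy
        control = removal_rate * x * (control_professional * area_professional
                                      + control_hobby * area_hobby) / k
        x = max(x + growth - control, 0.0)
        trajectory.append(x)
    return trajectory

# Compare a policy covering professionals only with one extended to hobby farmers.
prof_only = simulate_invasion(5.0, 30, r=0.6, area_professional=80, area_hobby=20,
                              control_professional=1, control_hobby=0, removal_rate=0.5)
both      = simulate_invasion(5.0, 30, r=0.6, area_professional=80, area_hobby=20,
                              control_professional=1, control_hobby=1, removal_rate=0.5)
print(round(prof_only[-1], 1), round(both[-1], 1))
```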

Relevance: 30.00%

Abstract:

Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information. Copyright © 2003 John Wiley & Sons, Ltd.
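The net benefit parameter referred to above is conventionally NB(λ) = λ·ΔE − ΔC, for willingness-to-pay λ, incremental health benefit ΔE and incremental cost ΔC, with NB > 0 indicating that the new intervention is cost-effective. The sketch below shows a simple frequentist estimate and confidence interval built from trial-level summaries; all figures are invented and this is not the paper's analysis of the hernia trial data.

```python
# Illustrative sketch of a frequentist net-benefit analysis from trial summaries.
# NB(lambda) = lambda * (mean effect difference) - (mean cost difference); the new
# intervention is judged cost-effective when NB > 0. All figures are invented.
import math

def net_benefit(lam, d_effect, d_cost):
    return lam * d_effect - d_cost

def nb_confidence_interval(lam, d_effect, var_effect, d_cost, var_cost, cov, z=1.96):
    nb = net_benefit(lam, d_effect, d_cost)
    var_nb = lam**2 * var_effect + var_cost - 2 * lam * cov
    half = z * math.sqrt(var_nb)
    return nb - half, nb + half

# Hypothetical trial results: the new surgery gains 0.02 QALYs at an extra cost of 300.
lam = 20000.0                     # willingness to pay per unit of health benefit
lo, hi = nb_confidence_interval(lam, d_effect=0.02, var_effect=0.0001,
                                d_cost=300.0, var_cost=40000.0, cov=0.5)
print(lo, hi)                     # point estimate NB = 100; the interval spans zero here
```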

Relevance: 30.00%

Abstract:

We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure, the linkage between natural vegetation and the horticultural trade, the presence or absence of disease spread in commercial trade, and the presence or absence of inspections-with-eradication, giving a 2 x 2 x 2 x 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection. (C) 2009 Elsevier B.V. All rights reserved.
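A minimal stochastic sketch of the inspection dynamics described above (not the published simulation): infected sites multiply at a constant per-site rate, and inspections at fixed intervals remove each infected site with an imperfect detection probability. The spread rate, time horizon and seeding are assumptions chosen for illustration.

```python
# Illustrative stochastic sketch (not the published model): infected sites spread at a
# constant per-site rate and are periodically inspected with imperfect detection.
import numpy as np

def run_epidemic(days=720, spread_prob=0.02, inspect_interval=90, detection=0.8,
                 n_start=10, seed=0):
    rng = np.random.default_rng(seed)
    infected = n_start
    for day in range(1, days + 1):
        infected += rng.binomial(infected, spread_prob)      # new infections founded
        if day % inspect_interval == 0:
            infected -= rng.binomial(infected, detection)    # detected sites removed
    return infected

# Replicate runs with and without inspections, echoing the factorial design above.
with_insp    = [run_epidemic(seed=i) for i in range(50)]
without_insp = [run_epidemic(inspect_interval=10**6, seed=i) for i in range(50)]
print(np.mean(with_insp), np.mean(without_insp))
```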

Relevance: 30.00%

Abstract:

Climate change is one of the major challenges facing economic systems at the start of the 21st century. Reducing greenhouse gas emissions will require both restructuring the energy supply system (production) and addressing the efficiency and sufficiency of the social uses of energy (consumption). The energy production system is a complicated supply network of interlinked sectors with 'knock-on' effects throughout the economy. End-use energy consumption is governed by complex sets of interdependent cultural, social, psychological and economic variables, driven by shifts in consumer preference and technological development trajectories. To date, few models have been developed for exploring alternative joint energy production-consumption systems. The aim of this work is to propose one such model. This is achieved in a methodologically coherent manner through the integration of qualitative input-output models of production with Bayesian belief network models of consumption at the point of final demand. The resulting integrated framework can be applied either (relatively) quickly and qualitatively to explore alternative energy scenarios, or as a fully developed quantitative model to derive or assess specific energy policy options. The qualitative applications are explored here.
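A minimal sketch, under invented coefficients, of how an input-output production model can be driven by a probabilistic description of consumption at final demand; it illustrates the coupling idea only and is not the qualitative framework proposed in the paper.

```python
# Minimal sketch of the coupling idea, not the authors' framework: a two-sector
# Leontief input-output model of energy production driven by a final-demand vector
# whose expected value comes from a toy "belief network" over consumer behaviour.
# All coefficients and probabilities are illustrative assumptions.
import numpy as np

A = np.array([[0.10, 0.30],        # inter-sector technical coefficients
              [0.20, 0.05]])       # (e.g. electricity and gas supply sectors)

# Toy consumption side: probability of "high" vs "low" household demand under a
# scenario, and the corresponding final-demand vectors per sector.
p_high = 0.4
demand = p_high * np.array([120.0, 60.0]) + (1 - p_high) * np.array([80.0, 40.0])

# Total sectoral output needed to meet that expected final demand: x = (I - A)^-1 f
x = np.linalg.solve(np.eye(2) - A, demand)
print(x)
```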

Relevance: 30.00%

Abstract:

Quadratic programming techniques were applied to household food consumption data in England and Wales to estimate likely changes in diet under healthy eating guidelines, and the consequences this would have for agriculture and land use in England and Wales. The first step entailed imposing nutrient restrictions on food consumption following dietary recommendations suggested by the UK Department of Health. The resulting diet was used, in a second step, as a proxy for demand in agricultural commodities, to test the impact of such a scenario on food production and land use in England and Wales and the impacts of this on agricultural landscapes. Results of the diet optimisation indicated a large drop in consumption of foods rich in saturated fats and sugar, essentially cheese and sugar-based products, along with lesser cuts of fat and meat products. Conversely, consumption of fruit and vegetables, cereals, and flour would increase to meet dietary fibre recommendations. Such a shift in demand would dramatically affect production patterns: the financial net margin of England and Wales agriculture would rise, due to increased production of high market value and high economic margin crops. Some regions would, however, be negatively affected, mostly those dependent on beef cattle and sheep production that could not benefit from an increased demand for cereals and horticultural crops. The effects of these changes would also be felt in upstream industries, such as animal feed suppliers. While arable-dominated landscapes would be little affected, pastoral landscapes would suffer through loss of grazing management and, possibly, land abandonment, especially in upland areas.
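A minimal sketch of the quadratic-programming step described above: choose consumption quantities as close as possible, in squared deviation, to the observed diet while satisfying nutrient constraints. The foods, nutrient coefficients and limits are invented for illustration, and scipy's SLSQP solver stands in for whatever solver the study used.

```python
# Illustrative sketch of the diet-adjustment idea: a small quadratic programme that
# keeps the new diet close to the observed one subject to nutrient limits.
# Foods, nutrient contents and limits below are invented for the example.
import numpy as np
from scipy.optimize import minimize

observed = np.array([3.0, 2.0, 1.5])          # weekly kg of cheese, bread, vegetables
sat_fat  = np.array([0.21, 0.01, 0.00])       # saturated fat per kg
fibre    = np.array([0.00, 0.05, 0.03])       # fibre per kg

def deviation(x):
    return np.sum((x - observed) ** 2)        # stay as close to the current diet as possible

constraints = [
    {"type": "ineq", "fun": lambda x: 0.45 - sat_fat @ x},   # saturated-fat ceiling
    {"type": "ineq", "fun": lambda x: fibre @ x - 0.15},     # fibre floor
]
result = minimize(deviation, observed, constraints=constraints,
                  bounds=[(0, None)] * 3, method="SLSQP")
print(result.x)                               # adjusted diet: less cheese, more plants
```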

Relevance: 30.00%

Abstract:

This paper critically explores the politics that mediate the use of environmental science assessments as the basis of resource management policy. Drawing on recent literature in the political ecology tradition that has emphasised the politicised nature of the production and use of scientific knowledge in environmental management, the paper analyses a hydrological assessment in a small river basin in Chile, undertaken in response to concerns over the possible overexploitation of groundwater resources. The case study illustrates the limitations of an approach based predominantly on hydrogeological modelling to ascertain the effects of increased groundwater abstraction. In particular, it identifies the subjective ways in which the assessment was interpreted and used by the state water resources agency to underpin water allocation decisions in accordance with its own interests, and the role that a desocialised assessment played in reproducing unequal patterns of resource use and configuring uneven waterscapes. Nevertheless, as Chile’s ‘neoliberal’ political-economic framework privileges the role of science and technocracy, producing other forms of environmental knowledge to complement environmental science is likely to be contentious. In conclusion, the paper considers the potential of mobilising the concept of the hydrosocial cycle to further critically engage with environmental science.

Relevance: 30.00%

Abstract:

Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target as an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich enough to model technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
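A minimal agent-based sketch of the approach recommended above (not a model from the paper): each householder agent applies a simple rule combining an acceptable payback period with social influence from earlier adopters. The rule, the peer-influence weight and all parameter values are assumptions.

```python
# Minimal agent-based sketch: householder agents decide whether to adopt an
# efficiency measure using a simple rule of payback tolerance plus social influence.
import random

class Householder:
    def __init__(self, payback_tolerance):
        self.payback_tolerance = payback_tolerance   # years the agent will accept
        self.adopted = False

    def decide(self, payback_years, adoption_share, rng):
        if self.adopted:
            return
        social_pull = 0.3 * adoption_share            # assumed peer-influence weight
        if payback_years <= self.payback_tolerance or rng.random() < social_pull:
            self.adopted = True

rng = random.Random(1)
agents = [Householder(payback_tolerance=rng.uniform(2, 12)) for _ in range(1000)]
for year in range(10):
    share = sum(a.adopted for a in agents) / len(agents)
    for a in agents:
        a.decide(payback_years=8.0, adoption_share=share, rng=rng)
print(sum(a.adopted for a in agents) / len(agents))   # final adoption share
```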

Relevance: 30.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change arise mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to address the question of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.

Relevance: 30.00%

Abstract:

The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
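As a complement to the agent-based prototype described above, the sketch below illustrates the simplest of the candidate methods named in the abstract, a two-state Markov chain for green-technology adoption in the housing stock; the transition probabilities are assumptions, not estimates from the paper.

```python
# Illustrative two-state Markov chain for technology adoption in the housing stock.
# The transition probabilities are assumptions, not estimates from the paper.
import numpy as np

# States: [not adopted, adopted]; row i gives the transition probabilities from state i.
P = np.array([[0.95, 0.05],    # 5% of non-adopters adopt each year
              [0.00, 1.00]])   # adoption is assumed irreversible

state = np.array([1.0, 0.0])   # initial shares of households
for year in range(2020, 2051):
    state = state @ P
print(f"projected adopter share by 2050: {state[1]:.2f}")
```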

Relevance: 30.00%

Abstract:

In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that is to be added to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
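A minimal sketch of how such a weather-variable bias might be quantified, using synthetic stand-ins for GCM-driven and reanalysis-driven daily short-wave radiation; none of the numbers come from the study.

```python
# Illustrative sketch only: quantifying the bias in an air-quality-relevant weather
# variable (e.g. summer short-wave radiation) between GCM-driven and reanalysis-driven
# regional simulations. The series below are synthetic stand-ins for model output.
import numpy as np

rng = np.random.default_rng(42)
reanalysis_driven = rng.normal(230.0, 40.0, size=92)                      # daily means, W m-2 (synthetic)
gcm_driven        = reanalysis_driven + rng.normal(15.0, 10.0, size=92)   # assumed positive bias

bias = np.mean(gcm_driven - reanalysis_driven)
rmse = np.sqrt(np.mean((gcm_driven - reanalysis_driven) ** 2))
print(f"mean bias: {bias:.1f} W m-2, RMSE: {rmse:.1f} W m-2")
```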

Relevance: 30.00%

Abstract:

Anaerobic digestion (AD) technologies convert organic wastes and crops into methane-rich biogas for heating, electricity generation and vehicle fuel. Farm-based AD has proliferated in some EU countries, driven by favourable policies promoting sustainable energy generation and GHG mitigation. Despite increased state support, there are still few AD plants on UK farms, leading to a lack of normative data on the viability of AD in the whole-farm context. Farmers and lenders are therefore reluctant to fund AD projects, and policy makers are hampered in their attempts to design policies that adequately support the industry. Existing AD studies and modelling tools do not adequately capture the farm context within which AD operates. This paper demonstrates a whole-farm optimisation modelling approach to assess the viability of AD in a more holistic way, accounting for such issues as AD scale, synergies and conflicts with other farm enterprises, choice of feedstocks, digestate use and impact on farm Net Margin. This modelling approach demonstrates, for example, that AD is complementary to dairy enterprises but competes with arable enterprises for farm resources, and that reduced nutrient purchases significantly improve Net Margin on arable farms, although AD scale is constrained by the capacity of farmland to absorb nutrients in AD digestate.
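A minimal sketch of the whole-farm optimisation idea (not the paper's model): a small linear programme chooses enterprise areas and herd size to maximise net margin subject to land availability and a digestate-nitrogen balance. All margins and coefficients are invented for illustration.

```python
# Minimal whole-farm optimisation sketch: choose hectares of an arable crop, a forage
# crop fed to an AD plant, and dairy cows to maximise net margin subject to land and
# digestate-nitrogen limits. All margins and coefficients are invented.
from scipy.optimize import linprog

# Decision variables: [arable_ha, ad_forage_ha, dairy_cows]
net_margin = [-650.0, -450.0, -900.0]      # linprog minimises, so negate the margins

A_ub = [
    [1.0, 1.0, 0.4],                       # land: cows also need grazing/forage area
    [-40.0, 30.0, 25.0],                   # digestate N balance: arable absorbs, AD and cows add
]
b_ub = [200.0,                             # 200 ha available
        0.0]                               # net N from digestate must not exceed crop uptake

res = linprog(net_margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, -res.fun)                     # enterprise mix and whole-farm net margin
```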

Relevance: 30.00%

Abstract:

Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal-pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; however, in general, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops, novel in two elements: the use of pollinator records rather than expert knowledge to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nation-wide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDMs derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict pollination service of wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contribution of wild and managed pollinators and also to identify areas potentially vulnerable to low pollination service provision, which can help direct local-scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators, resulting from environmental change or policy scenarios.
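A minimal sketch of the SDM-to-pollination-service coupling described above: a gridded occurrence-probability surface (as a MaxEnt-style SDM might produce) is smoothed with a distance-decay foraging kernel to score relative pollination service on crop cells. The rasters, cell size and 2 km decay distance are synthetic assumptions, not the calibrated PSM.

```python
# Illustrative coupling sketch: smooth SDM-style occurrence probabilities with a
# distance-decay foraging kernel to score pollination service on crop cells.
# The rasters and the decay distance are assumptions.
import numpy as np

rng = np.random.default_rng(0)
occurrence = rng.random((50, 50))            # stand-in for SDM habitat suitability
crop_cells = rng.random((50, 50)) > 0.8      # stand-in for field-bean fields

cell_km, decay_km = 1.0, 2.0
radius = 3                                   # truncate the kernel at 3 cells
offsets = np.arange(-radius, radius + 1)
dy, dx = np.meshgrid(offsets, offsets, indexing="ij")
kernel = np.exp(-np.hypot(dy, dx) * cell_km / decay_km)
kernel /= kernel.sum()

# Convolve suitability with the foraging kernel (explicit loop, no SciPy needed).
service = np.zeros_like(occurrence)
padded = np.pad(occurrence, radius, mode="edge")
for i in range(occurrence.shape[0]):
    for j in range(occurrence.shape[1]):
        service[i, j] = np.sum(padded[i:i + 2*radius + 1, j:j + 2*radius + 1] * kernel)

print("mean service on crop cells:", service[crop_cells].mean().round(3))
```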

Relevance: 30.00%

Abstract:

Over the last decade the English planning system has placed greater emphasis on the financial viability of development. ‘Calculative’ practices have been used to quantify and capture land value uplifts. Development viability appraisal (DVA) has become a key part of the evidence base used in planning decision-making and informs both ‘site-specific’ negotiations about the level of land value capture for individual schemes and ‘area-wide’ planning policy formation. This paper investigates how implementation of DVA is governed in planning policy formation. It is argued that the increased use of DVA raises important questions about how planning decisions are made and operationalised, not least because DVA is often poorly understood by some key stakeholders. The paper uses the concept of governance to thematically analyse semi-structured interviews conducted with the producers of DVAs and considers key procedural issues including (in)consistencies in appraisal practices, levels of stakeholder consultation and the potential for client and producer bias. Whilst stakeholder consultation is shown to be integral to the appraisal process in order to improve the quality of the appraisals and to legitimise the outputs, participation is restricted to industry experts and excludes some interest groups, including local communities. It is concluded that, largely because of its recent adoption and knowledge asymmetries between local planning authorities and appraisers, DVA is a weakly governed process characterised by emerging and contested guidance and is therefore ‘up for grabs’.
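For readers unfamiliar with the 'calculative practice' at the heart of DVA, the sketch below shows a generic residual appraisal of the kind such viability appraisals rest on; none of the figures or rates come from the paper.

```python
# Generic illustration of a simple residual appraisal, the calculation that development
# viability appraisals typically rest on. All figures are invented.
def residual_land_value(gross_development_value, build_costs, fees_rate,
                        developer_profit_rate, planning_obligations):
    """Value left for land after costs, fees, target profit and policy contributions."""
    fees = build_costs * fees_rate
    profit = gross_development_value * developer_profit_rate
    return gross_development_value - build_costs - fees - profit - planning_obligations

# A scheme is typically argued to be "viable" if the residual exceeds a benchmark land value.
residual = residual_land_value(gross_development_value=10_000_000, build_costs=6_000_000,
                               fees_rate=0.10, developer_profit_rate=0.20,
                               planning_obligations=500_000)
print(residual)   # 10m - 6m - 0.6m - 2m - 0.5m = 0.9m
```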

Relevance: 30.00%

Abstract:

This paper develops background considerations to help better frame the results of a CGE exercise. Three main criticisms are usually levelled at CGE efforts. First, they are too aggregate, their conclusions failing to shed light on relevant sectors or issues. Second, they imply huge data requirements: timeliness is frequently jeopardised by out-dated sources, with benchmarks referring to realities gone by. Finally, results are meaningless, as they answer wrong or ill-posed questions; modelling demands end up creating a rather artificial context in which the original questions lose content. In spite of a positive outlook on the first two, the crucial questions lie in the third point. After elaborating these questions, and trying to answer some of them, the text argues that CGE models can come closer to reality. Even if their use is still too scarce to give way to a fruitful symbiosis between negotiations and simulation results, they remain the only available technique providing a global, inter-related way of capturing the economy-wide effects of several different policies. International organisations can play a major role in supporting and encouraging improvements. They are also uniquely positioned to enhance information and data sharing, as well as to bring people from various origins together to share their experiences. Serious and complex homework is, however, required to correct at least the most dangerous present shortcomings of the technique.