993 results for Operational planning
Abstract:
In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. It is argued that current approaches are regressive and fail to reflect how the ability of sites to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise rather than replace the existing practice of development viability appraisal. It is based upon the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is 'good enough' to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a more durable, equitable, simpler, consistent and cheaper method for policy formation regarding planning obligations.
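The 'good enough' viability logic this abstract describes can be sketched as a simple calculation: for a given development value, the maximum planning obligation is whatever surplus remains after standardised returns. All figures, percentages and function names below are illustrative assumptions, not taken from the paper.

```python
# Sketch of a 'good enough' viability model: schemes with similar
# development values yield similar obligation capacity. Figures are
# purely illustrative.

def max_planning_obligation(development_value, build_cost,
                            developer_return_pct=0.20,
                            landowner_premium=500_000):
    """Obligation capacity = value - costs - standardised returns,
    floored at zero for marginal schemes."""
    developer_return = development_value * developer_return_pct
    surplus = (development_value - build_cost
               - developer_return - landowner_premium)
    return max(surplus, 0.0)

# A high-value scheme can support substantial obligations...
high_value = max_planning_obligation(10_000_000, 6_000_000)
# ...while a marginal scheme supports none.
marginal = max_planning_obligation(5_000_000, 4_500_000)
```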
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, development viability appraisals consist of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against threshold land value, and therefore the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, it is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value, testing them against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high value/low cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
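The residual valuation and threshold benchmarking described in this abstract can be illustrated with a minimal sketch. All inputs, percentages and names below are hypothetical, not the paper's model.

```python
# Minimal sketch of the residual valuation at the heart of area-wide
# viability appraisal. All figures and percentages are illustrative.

def residual_land_value(gdv, build_cost, fees_pct=0.10,
                        profit_pct=0.20, obligations=0.0):
    """Residual value = gross development value (GDV) minus build costs,
    fees, developer profit and the proposed planning obligations."""
    costs = build_cost * (1 + fees_pct) + gdv * profit_pct + obligations
    return gdv - costs

def policy_is_viable(gdv, build_cost, obligations, threshold_land_value):
    """The policy target is viable if the residual land value still meets
    the threshold at which a landowner would be prepared to sell."""
    rlv = residual_land_value(gdv, build_cost, obligations=obligations)
    return rlv >= threshold_land_value
```

A high value/low cost site absorbs the obligation and clears the threshold; a marginal site with the same obligation does not, which is the regressivity the paper highlights.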
Abstract:
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
Abstract:
This paper discusses concepts of space within the planning literature, the issues they give rise to and the gaps they reveal. It then introduces the notion of 'fractals' borrowed from complexity theory and illustrates how it unconsciously appears in planning practice. It then moves on to abstract the core dynamics through which fractals can be consciously applied and illustrates their working through a reinterpretation of the People's Planning Campaign of Kerala, India. Finally, it highlights the key contribution of the fractal concept and the advantages that this conceptualisation brings to planning.
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
Planning of autonomous vehicles in the absence of speed lanes is a less-researched problem. However, it is an important step toward extending the possibility of autonomous vehicles to countries where speed lanes are not followed. The advantages of having nonlane-oriented traffic include larger traffic bandwidth and more overtaking, features that are highlighted when vehicles vary in terms of speed and size. In the most general case, the road would be filled with a complex grid of static obstacles and vehicles of varying speeds. The optimal travel plan consists of a set of maneuvers that enables a vehicle to avoid obstacles and to overtake vehicles in an optimal manner and, in turn, enables other vehicles to overtake. The desired characteristics of this planning scenario include near completeness and near optimality in real time in an unstructured environment, with vehicles essentially displaying a high degree of cooperation and enabling every possible (safe) overtaking procedure to be completed as soon as possible. Challenges addressed in this paper include a (fast) method for initial path generation using an elastic strip, (re-)defining the notion of completeness specific to the problem, and inducing the notion of cooperation in the elastic strip. Using this approach, vehicular behaviors of overtaking, cooperation, vehicle following, obstacle avoidance, etc., are demonstrated.
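As a rough illustration of the elastic-strip idea mentioned in this abstract, a path can be relaxed iteratively under internal contraction and obstacle repulsion. The gains, clearance radius and force model below are simplified assumptions for a 2D sketch, not the paper's formulation.

```python
import math

def elastic_strip(path, obstacles, clearance=1.0,
                  k_contract=0.5, k_repel=1.0, iters=100):
    """Relax a polyline: each interior waypoint is pulled toward the
    midpoint of its neighbours (keeping the strip taut) and pushed away
    from obstacles inside the clearance radius. Endpoints stay fixed."""
    pts = [list(p) for p in path]
    for _ in range(iters):
        for i in range(1, len(pts) - 1):
            # Contraction toward the neighbour midpoint.
            fx = k_contract * ((pts[i-1][0] + pts[i+1][0]) / 2 - pts[i][0])
            fy = k_contract * ((pts[i-1][1] + pts[i+1][1]) / 2 - pts[i][1])
            # Repulsion from obstacles closer than the clearance.
            for ox, oy in obstacles:
                dx, dy = pts[i][0] - ox, pts[i][1] - oy
                d = math.hypot(dx, dy)
                if 1e-9 < d < clearance:
                    fx += k_repel * (clearance - d) * dx / d
                    fy += k_repel * (clearance - d) * dy / d
            pts[i][0] += fx
            pts[i][1] += fy
    return pts

# A path skirting a single static obstacle at (1.5, 0).
smoothed = elastic_strip([(0, 0), (1, 0.2), (2, 0.2), (3, 0)],
                         obstacles=[(1.5, 0)])
```

The interior waypoints settle where contraction and repulsion balance, bending the strip smoothly around the obstacle, which is the qualitative behaviour an overtaking maneuver requires.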
Abstract:
This ethnographic inquiry examines how family language policies are planned and developed in ten Chinese immigrant families in Quebec, Canada, with regard to their children's language and literacy education in three languages: Chinese, English, and French. The focus is on how multilingualism is perceived and valued, and how these three languages are linked to particular linguistic markets. The parental ideology that underpins the family language policy, the invisible language planning, is the central focus of analysis. The results suggest that family language policies are strongly influenced by socio-political and economic factors. In addition, the study confirms that the parents' educational background, their immigration experiences and their cultural disposition, in this case pervaded by Confucian thinking, contribute significantly to parental expectations and aspirations and thus to the family language policies.
Abstract:
This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration’s Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast-forward modeling of prior clear-sky radiances using numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown here, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
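The nighttime retrieval described above, a linear combination of 3.9- and 11-μm brightness temperatures, can be illustrated by fitting retrieval coefficients to synthetic matchup data by least squares. The simulated atmosphere, noise levels and resulting coefficients below are illustrative, not the operational ones.

```python
import numpy as np

# Synthetic matchups: both channels see the surface temperature minus an
# atmospheric (water-vapour) depression that is stronger at 11 um.
# An SST of the form a0 + a1*T39 + a2*T11 is then fitted.
rng = np.random.default_rng(0)
true_sst = rng.uniform(271.0, 303.0, 500)   # in-situ SSTs (K)
vapour = rng.uniform(0.5, 3.0, 500)         # crude absorber amount
T39 = true_sst - 0.5 * vapour + rng.normal(0.0, 0.1, 500)
T11 = true_sst - 1.5 * vapour + rng.normal(0.0, 0.1, 500)

# Least-squares fit of the retrieval coefficients against the matchups.
A = np.column_stack([np.ones_like(T39), T39, T11])
coeffs, *_ = np.linalg.lstsq(A, true_sst, rcond=None)
retrieved = A @ coeffs
rmse = float(np.sqrt(np.mean((retrieved - true_sst) ** 2)))
```

Because the two channels respond differently to the absorber, their linear combination cancels most of the atmospheric depression, which is the same principle the split-window algorithm exploits with the 11- and 12-μm pair.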
Abstract:
NOAA's National Environmental Satellite, Data, and Information Service (NESDIS) has generated sea surface temperature (SST) products from Geostationary Operational Environmental Satellite (GOES)-East (E) and GOES-West (W) on an operational basis since December 2000. Since that time, a process of continual development has produced steady improvements in product accuracy. Recent improvements extended the capability to permit generation of operational SST retrievals from the Japanese Multifunction Transport Satellite (MTSAT)-1R and the European Meteosat Second Generation (MSG) satellite, thereby extending spatial coverage. The four geostationary satellites (at longitudes of 75°W, 135°W, 140°E, and 0°) provide high temporal SST retrievals for most of the tropics and midlatitudes, with the exception of a region between 60° and 80°E. Because of ongoing development, the quality of these retrievals now approaches that of SST products from the polar-orbiting Advanced Very High Resolution Radiometer (AVHRR). These products from GOES provide hourly regional imagery, 3-hourly hemispheric imagery, 24-h merged composites, a GOES SST level 2 preprocessed product every 1/2 h for each hemisphere, and a match-up data file for each product. The MTSAT and the MSG products include hourly, 3-hourly, and 24-h merged composites. These products provide the user community with a reliable source of SST observations, with improved accuracy and increased coverage in important oceanographic, meteorological, and climatic regions.
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged SSTs is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
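The per-pixel application of Bayes' theorem described above can be sketched as follows. The Gaussian clear-sky likelihood, flat cloudy PDF and prior value below are illustrative stand-ins for the forward-modelled and empirically derived distributions the paper uses.

```python
import math

def p_clear(obs_bt, clear_mean, clear_sd, cloudy_pdf, prior_clear=0.9):
    """Bayes' theorem for one pixel: combine a prior probability of clear
    sky with likelihoods of the observed brightness temperature (BT)
    under clear-sky (forward-modelled) and cloudy (empirical) conditions."""
    z = (obs_bt - clear_mean) / clear_sd
    p_obs_clear = math.exp(-0.5 * z * z) / (clear_sd * math.sqrt(2 * math.pi))
    p_obs_cloud = cloudy_pdf(obs_bt)
    num = p_obs_clear * prior_clear
    return num / (num + p_obs_cloud * (1.0 - prior_clear))

# Stand-in empirical cloudy PDF: flat over a broad, cold-biased BT range.
def cloudy_pdf(bt):
    return 1.0 / 40.0 if 250.0 <= bt <= 290.0 else 1e-6

# A pixel near the forward-modelled clear-sky BT scores as clear;
# a much colder pixel is flagged as almost certainly cloudy.
near_clear = p_clear(287.8, clear_mean=288.0, clear_sd=0.5, cloudy_pdf=cloudy_pdf)
very_cold = p_clear(272.0, clear_mean=288.0, clear_sd=0.5, cloudy_pdf=cloudy_pdf)
```

The narrow clear-sky likelihood is what the NWP-based forward modelling buys: observations consistent with the predicted clear-sky radiance are confidently retained, while cold, cloud-contaminated pixels are rejected.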
Abstract:
Urban metabolism considers a city as a system with flows of energy and material between it and the environment. Recent advances in bio-physical sciences provide methods and models to estimate local scale energy, water, carbon and pollutant fluxes. However, good communication is required to provide this new knowledge and its implications to end-users (such as urban planners, architects and engineers). The FP7 project BRIDGE (sustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aimed to address this gap by illustrating the advantages of considering these issues in urban planning. The BRIDGE Decision Support System (DSS) aids the evaluation of the sustainability of urban planning interventions. The Multi Criteria Analysis approach adopted provides a method to cope with the complexity of urban metabolism. In consultation with targeted end-users, objectives were defined in relation to the interactions between the environmental elements (fluxes of energy, water, carbon and pollutants) and socioeconomic components (investment costs, housing, employment, etc.) of urban sustainability. The tool was tested in five case study cities: Helsinki, Athens, London, Florence and Gliwice; and sub-models were evaluated using selected flux data. This overview of the BRIDGE project covers the methods and tools used to measure and model the physical flows, the selected set of sustainability indicators, the methodological framework for evaluating urban planning alternatives and the resulting DSS prototype.
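The Multi Criteria Analysis at the core of such a DSS can be illustrated with a minimal weighted-sum sketch. The criteria, weights and scores below are hypothetical, not BRIDGE's actual indicator set.

```python
# Weighted-sum scoring of planning alternatives against normalised
# sustainability indicators (0-1, higher is better). All values are
# made up for illustration.

def mca_score(indicators, weights):
    """Weighted sum over a common set of criteria."""
    assert set(indicators) == set(weights)
    return sum(indicators[k] * weights[k] for k in indicators)

weights = {"energy_flux": 0.3, "water_flux": 0.2,
           "co2_flux": 0.3, "investment_cost": 0.2}

alternatives = {
    "green roofs": {"energy_flux": 0.8, "water_flux": 0.7,
                    "co2_flux": 0.6, "investment_cost": 0.4},
    "business as usual": {"energy_flux": 0.5, "water_flux": 0.5,
                          "co2_flux": 0.4, "investment_cost": 0.9},
}

# Rank the planning interventions by their aggregate score.
ranked = sorted(alternatives.items(),
                key=lambda kv: mca_score(kv[1], weights), reverse=True)
```

Eliciting the weights from end-users is what ties the environmental fluxes to the socioeconomic components: the same flux estimates can rank alternatives differently under different stakeholder priorities.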
Abstract:
Purpose – The creation of a target market strategy is integral to developing an effective business strategy. The concept of market segmentation is often cited as pivotal to establishing a target market strategy, yet all too often business-to-business marketers utilise little more than trade sectors or product groups as the basis for their groupings of customers, rather than customers' characteristics and buying behaviour. The purpose of this paper is to offer a solution for managers, focusing on customer purchasing behaviour, which evolves from the organisation's existing criteria used for grouping its customers. Design/methodology/approach – One of the underlying reasons managers fail to embrace best practice market segmentation is their inability to manage the transition from how target markets in an organisation are currently described to how they might look when based on customer characteristics, needs, purchasing behaviour and decision-making. Any attempt to develop market segments should reflect the inability of organisations to ignore their existing customer group classification schemes and associated customer-facing operational practices, such as distribution channels and sales force allocations. Findings – A straightforward process has been derived and applied, enabling organisations to practice market segmentation in an evolutionary manner, facilitating the transition to customer-led target market segments. This process also ensures commitment from the managers responsible for implementing the eventual segmentation scheme. This paper outlines the six stages of this process and presents an illustrative example from the agrichemicals sector, supported by other cases. 
Research implications – The process presented in this paper for embarking on market segmentation focuses on customer purchasing behaviour rather than business sectors or product group classifications - which is true to the concept of market segmentation - but in a manner that participating managers find non-threatening. The resulting market segments have their basis in the organisation's existing customer classification schemes and are an iteration to which most managers readily buy in. Originality/value – Despite the size of the market segmentation literature, very few papers offer step-by-step guidance for developing customer-focused market segments in business-to-business marketing. The analytical tool for assessing customer purchasing deployed in this paper was originally created to assist in marketing planning programmes, but has since proved its worth as the foundation for creating segmentation schemes in business marketing, as described in this paper.
Abstract:
Following trends in operational weather forecasting, where ensemble prediction systems (EPS) are now increasingly the norm, flood forecasters are beginning to experiment with using similar ensemble methods. Most of the effort to date has focused on the substantial technical challenges of developing coupled rainfall-runoff systems to represent the full cascade of uncertainties involved in predicting future flooding. As a consequence much less attention has been given to the communication and eventual use of EPS flood forecasts. Drawing on interviews and other research with operational flood forecasters from across Europe, this paper highlights a number of challenges to communicating and using ensemble flood forecasts operationally. It is shown that operational flood forecasters understand the skill, operational limitations, and informational value of EPS products in a variety of different and sometimes contradictory ways. Despite the efforts of forecasting agencies to design effective ways to communicate EPS forecasts to non-experts, operational flood forecasters were often skeptical about the ability of forecast recipients to understand or use them appropriately. It is argued that better training and closer contacts between operational flood forecasters and EPS system designers can help ensure the uncertainty captured by EPS forecasts is represented in ways that are most appropriate and meaningful for their intended consumers. Some fundamental political and institutional challenges to using ensembles are also highlighted, such as differing attitudes to false alarms and to responsibility for managing blame in the event of poor or mistaken forecasts. Copyright © 2010 Royal Meteorological Society.
Abstract:
This paper highlights some communicative and institutional challenges to using ensemble prediction systems (EPS) in operational flood forecasting, warning, and civil protection. Focusing in particular on the Swedish experience, as part of the PREVIEW FP6 project, of applying EPS to operational flood forecasting, the paper draws on a wider set of site visits, interviews, and participant observation with flood forecasting centres and civil protection authorities (CPAs) in Sweden and 15 other European states to reflect on the comparative success of Sweden in enabling CPAs to make operational use of EPS for flood risk management. From that experience, the paper identifies four broader lessons for other countries interested in developing the operational capacity to make, communicate, and use EPS for flood forecasting and civil protection. We conclude that effective training and clear communication of EPS, while clearly necessary, are by no means sufficient to ensure effective use of EPS. Attention must also be given to overcoming the institutional obstacles to their use and to identifying operational choices for which EPS is seen to add value rather than uncertainty to operational decision making by CPAs.