81 results for time-optimal trajectory planning
Abstract:
In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. It is argued that current approaches are regressive and fail to reflect how the ability of sites to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise rather than replace the existing practice of development viability appraisal. It is based upon the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is good enough to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a simpler, cheaper, more durable, more equitable and more consistent method for policy formation regarding planning obligations.
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, development viability appraisals consist of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against a threshold land value, so the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, the threshold is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value, testing them against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high value/low cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
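The residual valuation described above can be expressed as a short calculation: development value less all costs and the developer's return gives a residual land value, and a policy target is viable where that residual meets the threshold land value. The sketch below is a minimal, hypothetical illustration only; the figures, variable names and the simple percentage-based developer's return are assumptions, not the appraisal model used in the paper.

```python
# Minimal, hypothetical residual-valuation sketch (illustrative figures only).
def residual_land_value(development_value, build_cost, other_costs,
                        developer_return_rate, planning_obligations):
    """Development value less costs, developer's return and planning obligations."""
    developer_return = developer_return_rate * development_value
    return (development_value - build_cost - other_costs
            - developer_return - planning_obligations)

def is_viable(residual, threshold_land_value):
    """A policy target is viable if the residual land value meets the threshold."""
    return residual >= threshold_land_value

# Example: a hypothetical site appraised at a given policy level of obligations.
residual = residual_land_value(
    development_value=10_000_000,   # gross development value
    build_cost=5_500_000,
    other_costs=1_000_000,          # fees, finance, marketing, etc.
    developer_return_rate=0.20,     # 20% of development value
    planning_obligations=750_000,   # policy target being tested
)
print(is_viable(residual, threshold_land_value=600_000))
```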
Abstract:
A Lagrangian model of photochemistry and mixing is described (CiTTyCAT, stemming from the Cambridge Tropospheric Trajectory model of Chemistry And Transport), which is suitable for transport and chemistry studies throughout the troposphere. Over the last five years, the model has been developed in parallel at several different institutions and here those developments have been incorporated into one "community" model and documented for the first time. The key photochemical developments include a new scheme for biogenic volatile organic compounds and updated emissions schemes. The key physical development is to evolve composition following an ensemble of trajectories within neighbouring air-masses, including a simple scheme for mixing between them via an evolving "background profile", both within the boundary layer and free troposphere. The model runs along trajectories pre-calculated using winds and temperature from meteorological analyses. In addition, boundary layer height and precipitation rates, output from the analysis model, are interpolated to trajectory points and used as inputs to the mixing and wet deposition schemes. The model is most suitable in regimes when the effects of small-scale turbulent mixing are slow relative to advection by the resolved winds so that coherent air-masses form with distinct composition and strong gradients between them. Such air-masses can persist for many days while stretching, folding and thinning. Lagrangian models offer a useful framework for picking apart the processes of air-mass evolution over inter-continental distances, without being hindered by the numerical diffusion inherent to global Eulerian models. The model, including different box and trajectory modes, is described and some output for each of the modes is presented for evaluation. The model is available for download from a Subversion-controlled repository by contacting the corresponding authors.
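The mixing scheme is described only qualitatively above. One common way to realise such a scheme is to relax each air parcel's composition toward the evolving background profile on a prescribed mixing timescale; the sketch below illustrates that idea only and is not the CiTTyCAT implementation. The relaxation form, the timescale, and the use of the ensemble mean as the background update are all assumptions.

```python
import numpy as np

# Illustrative relaxation-mixing sketch (not the CiTTyCAT scheme itself):
# each ensemble member's mixing ratio relaxes toward a background value,
# which in turn evolves with the ensemble.
def step_mixing(concentrations, background, tau_mix, dt):
    """Advance mixing ratios one time step, relaxing toward the background."""
    mixed = concentrations + dt * (background - concentrations) / tau_mix
    new_background = mixed.mean()  # background profile updated from the ensemble
    return mixed, new_background

ensemble = np.array([48.0, 52.0, 55.0, 60.0])  # e.g. O3 (ppbv) on neighbouring trajectories
background = ensemble.mean()
for _ in range(24):                             # 24 hourly steps, 5-day mixing timescale
    ensemble, background = step_mixing(ensemble, background,
                                       tau_mix=5 * 24 * 3600.0, dt=3600.0)
```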
Abstract:
During long-range transport, many distinct processes including photochemistry, deposition, emissions and mixing contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than on ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is 5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air-masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
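In the notation above, the budget framework separates net chemical from net physical processing of ozone. A minimal restatement of the decomposition is given below; the paper's full framework additionally propagates measurement uncertainty and initial air-mass variability through the budget.

```latex
\Delta\mathrm{O_3} \;=\; \Delta\mathrm{O_3^{chem}} \;+\; \Delta\mathrm{O_3^{phys}}
```

Here ΔO3chem is the net photochemical production or destruction along the trajectory ensemble, and ΔO3phys collects the physical terms (mixing, deposition and emissions-related changes).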
Abstract:
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with an associated characterization of uncertainty. Two cases, on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa, show how the relative utility of capacity vs. impact approaches to adaptation planning differs with the level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
Abstract:
This article reviews the use of complexity theory in planning theory, using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background to the trajectory of development of complexity theory and discusses the rationale for using the theory of metaphors to evaluate the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case-study reviews of two articles to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question: can complexity theory contribute to planning? The concluding section discusses the employment of the theory of metaphors for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer via the metaphorical route.
Abstract:
Optimal estimation (OE) and probabilistic cloud screening were developed to provide lake surface water temperature (LSWT) estimates from the series of (advanced) along-track scanning radiometers (ATSRs). Variations in physical properties such as elevation, salinity, and atmospheric conditions are accounted for through the forward modelling of observed radiances. Therefore, the OE retrieval scheme developed is generic (i.e., applicable to all lakes). LSWTs were obtained for 258 of Earth's largest lakes from ATSR-2 and AATSR imagery from 1995 to 2009. Comparison to in situ observations from several lakes yields satellite minus in situ differences of 0.2 ± 0.7 K for daytime and 0.1 ± 0.5 K for nighttime observations (mean ± standard deviation). This compares with 0.05 ± 0.8 K for daytime and 0.1 ± 0.9 K for nighttime observations for previous methods based on operational sea surface temperature algorithms. The new approach also increases coverage (reducing misclassification of clear sky as cloud) and exhibits greater consistency between retrievals using different channel-view combinations. Empirical orthogonal function (EOF) techniques were applied to the LSWT retrievals (which contain gaps due to cloud cover) to reconstruct spatially and temporally complete time series of LSWT. The new LSWT observations and the EOF-based reconstructions offer benefits for numerical weather prediction and lake model validation, and improve our knowledge of the climatology of lakes globally. Both observations and reconstructions are publicly available from http://hdl.handle.net/10283/88.
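EOF-based gap filling of the kind mentioned above is often implemented as an iterative truncated-SVD reconstruction. The sketch below is a generic version of that idea on synthetic data; the initial guess, truncation level and fixed iteration count are assumptions and not the paper's exact procedure.

```python
import numpy as np

def eof_reconstruct(data, n_modes=3, n_iter=50):
    """Fill gaps (NaNs) in a space-time matrix by iterative truncated-SVD (EOF) reconstruction."""
    filled = np.array(data, dtype=float)
    missing = np.isnan(filled)
    filled[missing] = np.nanmean(data)           # initial guess: overall mean
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        approx = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes, :]
        filled[missing] = approx[missing]        # update only the gaps
    return filled

# Example: a small synthetic LSWT-like field (time x lakes) with cloud gaps.
rng = np.random.default_rng(0)
field = 280.0 + 5.0 * np.sin(np.linspace(0, 6 * np.pi, 120))[:, None] \
        + rng.normal(0, 0.3, (120, 10))
field[rng.random(field.shape) < 0.2] = np.nan    # simulate cloud-cover gaps
reconstructed = eof_reconstruct(field, n_modes=2)
```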
Abstract:
Optimal estimation (OE) is applied as a technique for retrieving sea surface temperature (SST) from thermal imagery obtained by the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) on Meteosat 9. OE requires simulation of observations as part of the retrieval process, and this is done here using numerical weather prediction fields and a fast radiative transfer model. Bias correction of the simulated brightness temperatures (BTs) is found to be a necessary step before retrieval, and is achieved by filtered averaging of simulations minus observations over a time period of 20 days and a spatial scale of 2.5° in latitude and longitude. Throughout this study, BT observations are clear-sky averages over cells of size 0.5° in latitude and longitude. Results for the OE SST are compared to results using a traditional non-linear retrieval algorithm (NLSST), both validated against a set of 30,108 night-time matches with drifting buoy observations. For the OE SST the mean difference with respect to drifter SSTs is 0.01 K and the standard deviation is 0.47 K, compared to 0.38 K and 0.70 K respectively for the NLSST algorithm. Perhaps more importantly, systematic biases in NLSST with respect to geographical location, atmospheric water vapour and satellite zenith angle are greatly reduced for the OE SST. However, the OE SST is calculated to have a lower sensitivity of retrieved SST to true SST variations than the NLSST. This feature would be a disadvantage for observing SST fronts and diurnal variability, and raises questions as to how best to exploit OE techniques at SEVIRI's full spatial resolution.
Abstract:
Optimal estimation (OE) improves sea surface temperature (SST) estimated from satellite infrared imagery in the split-window, in comparison to SST retrieved using the usual multi-channel (MCSST) or non-linear (NLSST) estimators. This is demonstrated using three months of observations of the Advanced Very High Resolution Radiometer (AVHRR) on the first Meteorological Operational satellite (Metop-A), matched in time and space to drifter SSTs collected on the global telecommunications system. There are 32,175 matches. The prior for the OE is forecast atmospheric fields from the Météo-France global numerical weather prediction system (ARPEGE), the forward model is RTTOV8.7, and a reduced state vector comprising SST and total column water vapour (TCWV) is used. Operational NLSST coefficients give a mean and standard deviation (SD) of the difference between satellite and drifter SSTs of 0.00 and 0.72 K. The best possible NLSST and MCSST coefficients, empirically regressed on the data themselves, give zero mean difference and SDs of 0.66 K and 0.73 K respectively. Significant contributions to the global SD arise from regional systematic errors (biases) of several tenths of a kelvin in the NLSST. With no bias corrections to either prior fields or forward model, the SSTs retrieved by OE minus drifter SSTs have a mean and SD of 0.16 and 0.49 K respectively. The reduction in SD below the best possible regression results shows that OE deals with structural limitations of the NLSST and MCSST algorithms. Using simple empirical bias corrections to improve the OE, retrieved minus drifter SSTs are obtained with a mean and SD of 0.06 and 0.44 K respectively. Regional biases are greatly reduced, such that the absolute bias is less than 0.1 K in 61% of 10° latitude by 30° longitude cells. OE also allows a statistic of the agreement between modelled and measured brightness temperatures to be calculated. We show that this measure is more efficient than the current system of confidence levels at identifying reliable retrievals, and that the best 75% of satellite SSTs by this measure have negligible bias and a retrieval error of order 0.25 K.
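The OE step common to the retrievals above can be written as a single linear update of the reduced state vector (SST, TCWV) away from the prior, weighted by the prior and observation-error covariances. The sketch below is a generic illustration of that standard optimal-estimation update; all matrices and numbers are placeholders, and in the operational schemes the Jacobian and simulated brightness temperatures come from the radiative transfer model (e.g. RTTOV), not from hand-written values.

```python
import numpy as np

def oe_update(x_a, S_a, y_obs, y_sim, K, S_e):
    """One linear optimal-estimation update: prior state x_a with covariance S_a,
    observed and simulated brightness temperatures, Jacobian K, BT error covariance S_e."""
    S_a_inv = np.linalg.inv(S_a)
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ S_e_inv @ K + S_a_inv)      # posterior covariance
    x_hat = x_a + S_hat @ K.T @ S_e_inv @ (y_obs - y_sim)   # retrieved state
    return x_hat, S_hat

# Placeholder values for a reduced state vector [SST (K), TCWV (kg m^-2)].
x_a = np.array([288.0, 30.0])                      # prior from NWP fields
S_a = np.diag([1.0**2, 6.0**2])                    # assumed prior uncertainties
y_obs = np.array([286.1, 285.4, 284.2])            # observed BTs (K), three channels
y_sim = np.array([286.4, 285.8, 284.9])            # BTs simulated from the prior
K = np.array([[0.9, -0.05],
              [0.85, -0.08],
              [0.8, -0.12]])                       # d(BT)/d(state), illustrative only
S_e = np.diag([0.15**2, 0.15**2, 0.2**2])          # assumed BT error covariance
retrieved_state, posterior_cov = oe_update(x_a, S_a, y_obs, y_sim, K, S_e)
```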
Abstract:
A Guide to Office Clerical Time Standards is an instructional performance piece based on a corporate manual from 1960. The pamphlet is focused on the time necessary for the accomplishment of minute labour procedures in the office, from the depressing and releasing of typewriter keys to the opening and closing of filing cabinet drawers. In the performance, seven costumed performers represent the different levels of management and employment while performing the actions described in the guide, accompanied by a live musical score. There has been much discussion of the changes to work in the west following the decline of Fordism and the rise of post-Fordist service sector jobs. These increasingly emphasise the specificity of employees' knowledge and cognitive skill. However, this greater flexibility and creativity at work has been accompanied by an opposite trajectory. The proletarianisation of white-collar work has given rise to more bureaucracy, target assessment and control for workers in previously looser creative professions, from academia to the arts. The mid-century office is the meeting point of these cultures, where the assembly-line efficiency management of the factory meets the quantifying control of the knowledge economy. A Guide to Office Clerical Time Standards explores the survival of one regime into its successor, following the lines of combined and uneven development that have turned the emancipatory promise of immaterial labour into the perma-temp hell of the cognitariat. The movement is accompanied by a score of guitar, bass and drums, the components of the rock 'n' roll music that rose from the car factories of the Motor City and the cotton fields of the southern states to represent the same junction of expression and control.
Abstract:
The authors model retail rents in the United Kingdom with the use of vector-autoregressive and time-series models. Two retail rent series are used, compiled by LaSalle Investment Management and CB Hillier Parker, and the emphasis is on forecasting. The results suggest that the vector-autoregression and time-series models used in this paper can pick up important features of the data that are useful for forecasting purposes. The relative forecasting performance of the models appears to depend on the length of the forecast time-horizon. The results also show that the variables appropriate for inclusion in the vector-autoregression systems differ between the two rent series, suggesting that the structure of optimal models for predicting retail rents could be specific to the rent index used. Ex ante forecasts from our time-series models suggest that both LaSalle Investment Management and CB Hillier Parker real retail rents will exhibit an annual growth rate above their long-term mean.
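A vector-autoregression forecasting workflow of the kind described above can be sketched with a standard library. The example below is a generic illustration on synthetic data; the series, driver variable, lag order and forecast horizon are placeholders, not the rent series or specifications used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic stand-ins for a real retail rent series and one driver variable;
# the paper's actual variable sets differ between the two rent indices.
rng = np.random.default_rng(1)
n = 80
driver = np.cumsum(rng.normal(0.5, 1.0, n))
rent = 0.6 * driver + np.cumsum(rng.normal(0.2, 0.8, n))
data = pd.DataFrame({
    "real_rent_growth": np.diff(rent),     # work with growth rates (differences)
    "driver_growth": np.diff(driver),
})

model = VAR(data)
results = model.fit(2)                                              # VAR with 2 lags
forecast = results.forecast(data.values[-results.k_ar:], steps=8)   # 8-period ex ante forecast
```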
Abstract:
One of the most challenging tasks in financial management for large governmental and industrial organizations is Planning and Budgeting (P&B). The processes involved in P&B are cost- and time-intensive, especially when dealing with uncertainties and budget adjustments during the planning horizon. This work builds on our previous research, in which we proposed and evaluated a fuzzy approach that allows the budget to be optimized interactively beyond the initial planning stage. In this research we propose an extension that handles financial stress (i.e. drastic budget cuts) occurring during the budget period. This is done by introducing fuzzy stress parameters which are used to re-distribute the budget in order to minimize the negative impact of the financial stress. The benefits and possible issues of this approach are analyzed critically using a real-world case study from the Nuremberg Institute of Technology (NIT). Additionally, ongoing and future research directions are presented.
Abstract:
Consultation on the Reform of the Planning System in Northern Ireland commenced on 6 July 2009 with the publication of the long-awaited proposals paper: 'Reform of the Planning System in Northern Ireland: Your chance to influence change'. A 12-week consultation period followed, during which a series of consultation roadshow events was undertaken. This report is an account of that strand of the reform consultation and the discussions that took place at the roadshows during a three-week period in September 2009. The roadshow events formed the central part of a process of encouraging engagement with, and response to, the Reform Proposals before the closing date of 2 October 2009. They were organised and facilitated by a team of event managers and independent planners who, together with key Planning Service personnel, attended a mixture of day and evening events in each of the eleven new council areas to hear the views and opinions of those who came along. Aside from being publicly advertised, over 1,500 invitations (written and e-invites) were issued to a wide range of sectors, including the business community, environmentalists, councils, community and voluntary groups and other organisations, and 1,000 fliers were issued to libraries, leisure centres, council offices and civic centres. In total, almost 500 people took up the invitation and came along to one or more of the events.
Abstract:
This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect-reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to decompose the signals optimally in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
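The decompose-then-classify pipeline can be sketched as follows. This is a generic illustration using a standard Daubechies wavelet, i.e. the kind of baseline filter bank the paper compares against; the synthetic beat, decomposition level and feature layout are assumptions, and the paper's numerically optimized filter bank would take the place of the standard wavelet.

```python
import numpy as np
import pywt

def wavelet_features(ecg_beat, wavelet="db4", level=4):
    """Decompose an ECG beat with a discrete wavelet transform and
    concatenate the coefficients into a feature vector for a classifier."""
    coeffs = pywt.wavedec(ecg_beat, wavelet, level=level)
    return np.concatenate(coeffs)

# Synthetic stand-in for a single segmented ECG beat; a real pipeline would use
# normalized beats, and the feature vectors would train a neural network classifier.
t = np.linspace(0, 1, 256)
beat = np.exp(-((t - 0.5) ** 2) / 0.002) + 0.1 * np.sin(2 * np.pi * 5 * t)
features = wavelet_features(beat)

# A custom (e.g. numerically optimized) filter bank could replace 'db4', for example:
# custom = pywt.Wavelet("optimized", filter_bank=[dec_lo, dec_hi, rec_lo, rec_hi])
```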
Abstract:
At present, there is a clarion call for action on climate change across the global health landscape. At the recent WHO-sponsored conference on health and climate (held in Geneva, Switzerland, on Aug 27-29, 2014) and the UN Climate Summit (New York, USA, on Sept 23, 2014), participants were encouraged to act decisively to change the current trajectory of climate disruption. Health inequalities, including those related to infectious diseases, have now been pushed to centre stage. This approach represents a step-change in thinking. But as we are urged toward collective action, is it time to rethink our approach to research, especially in relation to climate change and infectious disease?