81 results for Metric


Relevance:

10.00%

Publisher:

Abstract:

1. Species-based indices are frequently employed as surrogates for wider biodiversity health and measures of environmental condition. Species selection is crucial in determining an indicator's metric value and hence the validity of the interpretation of ecosystem condition and function it provides, yet an objective process to identify appropriate indicator species is frequently lacking. 2. An effective indicator needs to (i) be representative, reflecting the status of wider biodiversity; (ii) be reactive, acting as an early-warning system for detrimental changes in environmental conditions; and (iii) respond to change in a predictable way. We present an objective, niche-based approach for species selection, founded on a coarse categorisation of species' niche space and key resource requirements, which ensures the resultant indicator has these key attributes. 3. We use UK farmland birds as a case study to demonstrate this approach, identifying an optimal indicator set containing 12 species. In contrast to the 19 species included in the farmland bird index (FBI), a key UK biodiversity indicator that contributes to one of the UK Government's headline indicators of sustainability, the niche space occupied by these 12 species fully encompasses that occupied by the wider community of 62 species. 4. We demonstrate that the response of these 12 species to land-use change correlates strongly with that of the wider farmland bird community. Furthermore, the temporal dynamics of an index based on their population trends closely match the population dynamics of the wider community. However, in both analyses, the magnitude of the change in our indicator was significantly greater, allowing it to act as an early-warning system. 5. Ecological indicators are embedded in environmental management, sustainable development and biodiversity conservation policy and practice, where they act as metrics against which progress towards national, regional and global targets can be measured. Adopting this niche-based approach for the objective selection of indicator species will facilitate the development of sensitive and representative indices for a range of taxonomic groups, habitats and spatial scales.
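
Composite bird indicators of this kind are typically constructed as a geometric mean of species-level population indices relative to a base year; the minimal sketch below (with an invented `populations` array standing in for survey-derived abundances of the selected indicator species) illustrates that calculation, not the paper's niche-based selection step:

```python
import numpy as np

# Hypothetical annual population counts for a small set of indicator species
# (rows = species, columns = years); a real index would use survey estimates
# for the 12 selected species.
populations = np.array([
    [120, 110, 105,  98,  95],
    [ 60,  62,  58,  55,  50],
    [200, 190, 185, 180, 170],
])

# Index each species relative to the first (base) year, then combine with a
# geometric mean so that proportional changes in common and scarce species
# carry equal weight.
species_index = populations / populations[:, [0]]
composite_index = np.exp(np.log(species_index).mean(axis=0))

print(np.round(100 * composite_index, 1))  # composite indicator, base year = 100
```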

Relevance:

10.00%

Publisher:

Abstract:

In an era dominated by climate change debate and environmentalism there is a real danger that the important ‘social’ pillar of sustainability drops out of our vocabulary. This can happen at a variety of scales, from business level through to building and neighbourhood level regeneration and development. Social sustainability should be at the heart of all housing and mixed-use development but, for a variety of reasons, tends to be frequently underplayed. The recent English city riots have brought this point back sharply into focus. The relationships between people, places and the local economy all matter, and this is as true today as it was in the late 19th century when Patrick Geddes, the great pioneering town planner and ecologist, wrote of ‘place-work-folk’. This paper, commissioned from Tim Dixon, explains what is meant by social sustainability (and how it is linked to concepts such as social capital and social cohesion); why it matters at a time when ‘localism’ dominates political debate; and what is inhibiting its growth and its measurement. The paper reviews best practice in post-occupancy social sustainability metric systems, based on recent research undertaken by the author on Dockside Green in Vancouver, and identifies some of the key operational issues in mainstreaming the concept within major mixed-use projects. The paper concludes by offering a framework for the key challenges faced in setting strategic corporate goals and objectives; prioritising and selecting the most appropriate investments; and measuring social sustainability performance by identifying the required data sources.

Relevance:

10.00%

Publisher:

Abstract:

Ensemble clustering (EC) can arise in data assimilation with ensemble square root filters (EnSRFs) using non-linear models: an M-member ensemble splits into a single outlier and a cluster of M−1 members. The stochastic Ensemble Kalman Filter does not present this problem. Modifications to the EnSRFs by a periodic resampling of the ensemble through random rotations have been proposed to address it. We introduce a metric to quantify the presence of EC and present evidence to dispel the notion that EC leads to filter failure. Starting from a univariate model, we show that EC is not a permanent but a transient phenomenon; it occurs intermittently in non-linear models. We perform a series of data assimilation experiments using a standard EnSRF and an EnSRF modified by periodic resampling through random rotations. The modified EnSRF alleviates issues associated with EC, but at the cost of the traceability of individual ensemble trajectories, and it cannot use some of the algorithms that enhance the performance of the standard EnSRF. In the non-linear regimes of low-dimensional models, the analysis root mean square error of the standard EnSRF slowly grows with ensemble size if the size is larger than the dimension of the model state. However, we do not observe this problem in a more complex model that uses an ensemble size much smaller than the dimension of the model state, along with inflation and localisation. Overall, we find that transient EC does not handicap the performance of the standard EnSRF.
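
The paper's EC metric is not reproduced here; one plausible way to flag a single-outlier-plus-cluster split, assuming a one-dimensional ensemble stored in a NumPy array, is to compare the widest gap between sorted members with the spread of the remaining cluster:

```python
import numpy as np

def clustering_metric(ensemble):
    """Rough indicator of a one-outlier / (M-1)-cluster split in a 1-D ensemble.

    Returns the largest gap between adjacent sorted members divided by the
    spread of the larger remaining group; large values suggest ensemble
    clustering. (Illustrative only; not the metric defined in the paper.)
    """
    x = np.sort(np.asarray(ensemble, dtype=float))
    gaps = np.diff(x)
    i = np.argmax(gaps)                       # location of the widest gap
    left, right = x[: i + 1], x[i + 1:]       # split the ensemble at that gap
    cluster = left if left.size >= right.size else right
    spread = np.ptp(cluster) + 1e-12          # avoid division by zero
    return gaps[i] / spread

# Example: nine members in a tight cluster plus one outlier
print(clustering_metric([0.10, 0.12, 0.11, 0.09, 0.10, 0.13, 0.08, 0.11, 0.10, 1.5]))
```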

Relevance:

10.00%

Publisher:

Abstract:

Metrics are often used to compare the climate impacts of emissions from various sources, sectors or nations. These are usually based on global-mean input, and so there is the potential that important information on smaller scales is lost. Assuming a non-linear dependence of the climate impact on local surface temperature change, we explore the loss of information about regional variability that results from using global-mean input in the specific case of heterogeneous changes in ozone, methane and aerosol concentrations resulting from emissions from road traffic, aviation and shipping. Results from equilibrium simulations with two general circulation models are used. An alternative metric for capturing the regional climate impacts is investigated. We find that the application of a metric that is first calculated locally and then averaged globally captures a more complete and informative signal of climate impact than one that uses global-mean input. The loss of information when heterogeneity is ignored is largest in the case of aviation. Further investigation of the spatial distribution of temperature change indicates that although the pattern of temperature response does not closely match the pattern of the forcing, the forcing pattern still influences the response pattern on a hemispheric scale. When the short-lived transport forcing is superimposed on present-day anthropogenic CO2 forcing, the heterogeneity in the temperature response to CO2 dominates. This suggests that the importance of including regional climate impacts in global metrics depends on whether small sectors are considered in isolation or as part of the overall climate change.
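
As a toy illustration of why the order of operations matters when the impact function is non-linear, the sketch below (assuming a quadratic impact function and an invented field of local temperature changes, not the GCM output used in the paper) compares a metric computed from the global-mean temperature change with one computed locally and then averaged:

```python
import numpy as np

# Made-up local surface temperature changes (K) on a coarse grid
delta_T = np.array([0.10, 0.30, 0.20, 0.90, 0.50, 0.15, 0.70, 0.25])
weights = np.full(delta_T.size, 1.0 / delta_T.size)   # equal-area weights here

def impact(dT):
    return dT ** 2        # assumed non-linear (quadratic) dependence on warming

metric_global_mean_input = impact(np.sum(weights * delta_T))   # f(mean of T)
metric_local_then_global = np.sum(weights * impact(delta_T))   # mean of f(T)

# By Jensen's inequality, a convex impact function makes the locally evaluated
# metric at least as large, and the gap grows with regional heterogeneity.
print(metric_global_mean_input, metric_local_then_global)
```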

Relevance:

10.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to un-initialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies. However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) Most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how to best approach forecast uncertainty.
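
One deterministic comparison of the kind described, measuring initialized hindcasts against uninitialized projections, is a mean-squared-error skill score; a minimal sketch with synthetic data (the variable names and numbers are illustrative, not CMIP5 output):

```python
import numpy as np

def mse_skill_score(forecast, reference, obs):
    """MSE-based skill of `forecast` relative to `reference`, verified against `obs`.
    Positive values mean the forecast beats the reference; 1 is a perfect score."""
    mse_f = np.mean((forecast - obs) ** 2)
    mse_r = np.mean((reference - obs) ** 2)
    return 1.0 - mse_f / mse_r

# Toy example: 20 hindcast start dates of a decadal-mean temperature anomaly
rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 20)
initialized = obs + rng.normal(0.0, 0.5, 20)     # hindcasts that track the observations
uninitialized = rng.normal(0.0, 1.0, 20)         # projections with no initial-state information

print(mse_skill_score(initialized, uninitialized, obs))
```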

Relevance:

10.00%

Publisher:

Abstract:

The Fredholm properties of Toeplitz operators on the Bergman space A^2 have been well known for continuous symbols since the 1970s. We investigate the case p = 1 with continuous symbols under a mild additional condition, namely that of logarithmic vanishing mean oscillation in the Bergman metric. Most differences are related to boundedness properties of Toeplitz operators acting on A^p that arise when we no longer have 1 < p < ∞.
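
For context, the standard textbook definitions (not statements taken from this paper) of the objects involved, on the unit disc with normalized area measure, are:

```latex
% Standard background on the unit disc \mathbb{D} with normalized area measure dA.
% The Bergman projection P maps L^p onto the Bergman space A^p:
\[
  (Pg)(z) \;=\; \int_{\mathbb{D}} \frac{g(w)}{(1 - z\bar{w})^{2}}\, dA(w),
\]
% and the Toeplitz operator with symbol f is
\[
  T_f g \;=\; P(fg), \qquad g \in A^p .
\]
% P is bounded on L^p precisely for 1 < p < \infty, which is why boundedness of
% T_f on A^1 requires extra conditions on the symbol.
```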

Relevance:

10.00%

Publisher:

Abstract:

The global temperature response to increasing atmospheric CO2 is often quantified by metrics such as equilibrium climate sensitivity and transient climate response [1]. These approaches, however, do not account for carbon cycle feedbacks and therefore do not fully represent the net response of the Earth system to anthropogenic CO2 emissions. Climate–carbon modelling experiments have shown that: (1) the warming per unit CO2 emitted does not depend on the background CO2 concentration [2]; (2) the total allowable emissions for climate stabilization do not depend on the timing of those emissions [3,4,5]; and (3) the temperature response to a pulse of CO2 is approximately constant on timescales of decades to centuries [3,6,7,8]. Here we generalize these results and show that the carbon–climate response (CCR), defined as the ratio of temperature change to cumulative carbon emissions, is approximately independent of both the atmospheric CO2 concentration and its rate of change on these timescales. From observational constraints, we estimate CCR to be in the range 1.0–2.1 °C per trillion tonnes of carbon (Tt C) emitted (5th to 95th percentiles), consistent with twenty-first-century CCR values simulated by climate–carbon models. Uncertainty in land-use CO2 emissions and aerosol forcing, however, means that higher observationally constrained values cannot be excluded. The CCR, when evaluated from climate–carbon models under idealized conditions, represents a simple yet robust metric for comparing models, which aggregates both climate feedbacks and carbon cycle feedbacks. CCR is also likely to be a useful concept for climate change mitigation and policy; by combining the uncertainties associated with climate sensitivity, carbon sinks and climate–carbon feedbacks into a single quantity, the CCR allows CO2-induced global mean temperature change to be inferred directly from cumulative carbon emissions.
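
The CCR definition makes the headline calculation a one-liner: warming is approximately CCR times cumulative emissions. A minimal sketch using the quoted 5th–95th percentile range (the emissions value is illustrative):

```python
# Carbon-climate response: warming ~= CCR x cumulative carbon emissions.
ccr_range = (1.0, 2.1)          # degC per TtC (5th-95th percentile quoted above)
cumulative_emissions = 1.0      # TtC emitted (illustrative value)

low, high = (ccr * cumulative_emissions for ccr in ccr_range)
print(f"Implied warming: {low:.1f}-{high:.1f} degC")
```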

Relevance:

10.00%

Publisher:

Abstract:

Multi-gas approaches to climate change policies require a metric establishing ‘equivalences’ among emissions of various species. Climate scientists and economists have proposed four kinds of such metrics and debated their relative merits. We present a unifying framework that clarifies the relationships among them. We show, as have previous authors, that the global warming potential (GWP), used in international law to compare emissions of greenhouse gases, is a special case of the global damage potential (GDP), assuming (1) a finite time horizon, (2) a zero discount rate, (3) constant atmospheric concentrations, and (4) impacts that are proportional to radiative forcing. Both the GWP and GDP follow naturally from a cost–benefit framing of the climate change issue. We show that the global temperature change potential (GTP) is a special case of the global cost potential (GCP), assuming a (slight) fall in the global temperature after the target is reached. We show how the four metrics should be generalized if there are intertemporal spillovers in abatement costs, distinguishing between private (e.g., capital stock turnover) and public (e.g., induced technological change) spillovers. Both the GTP and GCP follow naturally from a cost-effectiveness framing of the climate change issue. We also argue that if (1) damages are zero below a threshold and (2) infinitely large above a threshold, then cost-effectiveness analysis and cost–benefit analysis lead to identical results. Therefore, the GCP is a special case of the GDP. The UN Framework Convention on Climate Change uses the GWP, a simplified cost–benefit concept. The UNFCCC is framed around the ultimate goal of stabilizing greenhouse gas concentrations. Once a stabilization target has been agreed under the convention, implementation is clearly a cost-effectiveness problem. It would therefore be more consistent to use the GCP or its simplification, the GTP.
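
As a reminder of the GWP construction that underlies this comparison, the sketch below computes a GWP as the ratio of time-integrated radiative forcing from a unit pulse of a gas to that of CO2. The radiative efficiencies and decay constants are illustrative stand-ins, not IPCC values, and CO2 is crudely approximated by a single long timescale rather than a multi-timescale impulse-response function:

```python
import numpy as np

def agwp(radiative_efficiency, lifetime, horizon_years, n=10_000):
    """Absolute GWP: integral of radiative forcing from a 1 kg pulse, assuming
    a single exponential decay with the given e-folding lifetime (years)."""
    t, dt = np.linspace(0.0, horizon_years, n, retstep=True)
    forcing = radiative_efficiency * np.exp(-t / lifetime)
    return forcing.sum() * dt          # simple rectangle-rule integration

H = 100  # time horizon (years)

# Illustrative per-kg radiative efficiencies and lifetimes only.
agwp_gas = agwp(radiative_efficiency=1.3e-13, lifetime=12.0, horizon_years=H)
agwp_co2 = agwp(radiative_efficiency=1.8e-15, lifetime=150.0, horizon_years=H)

print(f"GWP-{H} (illustrative): {agwp_gas / agwp_co2:.0f}")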

Relevance:

10.00%

Publisher:

Abstract:

We examine the effect of ozone damage to vegetation as caused by anthropogenic emissions of ozone precursor species and quantify it in terms of its impact on terrestrial carbon stores. A simple climate model is then used to assess the expected changes in global surface temperature from the resulting perturbations to atmospheric concentrations of carbon dioxide, methane, and ozone. The global temperature change potential (GTP) metric, which relates the global average surface temperature change induced by the pulse emission of a species to that induced by a unit mass of carbon dioxide, is used to characterize the impact of changes in emissions of ozone precursors on surface temperature as a function of time. For NOx emissions, the longer-timescale methane perturbation is of the opposite sign to the perturbations in ozone and carbon dioxide, so NOx emissions are warming in the short term but cooling in the long term. For volatile organic compound (VOC), CO, and methane emissions, all the terms are warming for an increase in emissions. The GTPs for the 20 year time horizon are strong functions of emission location, with a large component of the variability owing to the different vegetation responses on different continents. At this time horizon, the induced change in the carbon cycle is the largest single contributor to the GTP metric for NOx and VOC emissions. For NOx emissions, we estimate a GTP20 of −9 (cooling) to +24 (warming), depending on assumptions about the sensitivity of vegetation types to ozone damage.
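
Unlike the GWP, the GTP evaluates the temperature change at the end of the horizon rather than the integrated forcing. A minimal sketch of that construction, convolving an exponentially decaying forcing with a single-timescale climate impulse response (all parameter values are illustrative stand-ins, not those used in this study, which also includes carbon-cycle and vegetation terms):

```python
import numpy as np

def agtp(radiative_efficiency, lifetime, horizon_years,
         climate_sens=0.9, response_time=8.4, n=20_000):
    """Absolute GTP: temperature change at the horizon from a 1 kg pulse,
    from convolving exponential forcing decay with a one-box climate response."""
    t, dt = np.linspace(0.0, horizon_years, n, retstep=True)
    forcing = radiative_efficiency * np.exp(-t / lifetime)
    kernel = (climate_sens / response_time) * np.exp(-t / response_time)
    temperature = np.convolve(forcing, kernel)[:n] * dt
    return temperature[-1]

H = 20
# Illustrative stand-in values for a short-lived gas and CO2:
gtp20 = agtp(1.3e-13, 12.0, H) / agtp(1.8e-15, 150.0, H)
print(f"GTP-{H} (illustrative): {gtp20:.0f}")
```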

Relevance:

10.00%

Publisher:

Abstract:

How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, the clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were about half as large (1.1, 3, and 5 million generations), perhaps owing to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes. Keywords: haldanes, biological time, scaling, pedomorphosis
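
The quoted minima translate directly into per-generation rates of change in log body mass; this back-of-envelope conversion is ours, not the paper's clade-maximum-rate computation, which works in the spirit of haldanes and also standardises by trait variability:

```python
import numpy as np

# Per-generation rates implied by the terrestrial minima quoted in the abstract
fold_changes = np.array([100, 1_000, 5_000])
generations = np.array([1.6e6, 5.1e6, 10e6])

rates = np.log10(fold_changes) / generations   # log10 body-mass units per generation
for f, g, r in zip(fold_changes, generations, rates):
    print(f"{f:>5}-fold in {g:.1e} generations -> {r:.2e} log10 units per generation")
```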

Relevance:

10.00%

Publisher:

Abstract:

An important feature of agribusiness promotion programs is their lagged impact on consumption. Efficient investment in advertising requires reliable estimates of these lagged responses, and it is desirable from both applied and theoretical standpoints to have a flexible method for estimating them. This note derives an alternative Bayesian methodology for estimating lagged responses when investments occur intermittently within a time series. The method exploits a latent-variable extension of the natural-conjugate, normal-linear model, Gibbs sampling and data augmentation. It is applied to a monthly time series on Turkish pasta consumption (1993:5–1998:3) and three nonconsecutive promotion campaigns (1996:3, 1997:3, 1997:10). The results suggest that responses were greatest to the second campaign, which allocated its entire budget to television media; that its impact peaked in the sixth month following expenditure; and that the rate of return (measured in metric tons of additional consumption per thousand dollars expended) was around a factor of 20.
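
The paper's latent-variable and data-augmentation machinery is not reproduced here, but the natural-conjugate normal-linear core of such a Gibbs sampler is short. A minimal sketch for a distributed-lag regression of consumption on intermittent promotion spending, with synthetic data and vague conjugate priors (all names, lag lengths and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly data: consumption responds to spending over a 3-month lag profile
T, L = 60, 3
spend = np.zeros(T)
spend[[10, 25, 40]] = 1.0                      # three intermittent campaigns
true_beta = np.array([5.0, 0.5, 1.5, 0.8])     # intercept + lags 0..2
X = np.column_stack([np.ones(T)] + [np.roll(spend, k) for k in range(L)])
y = X @ true_beta + rng.normal(0.0, 0.3, T)

# Gibbs sampler for (beta, sigma^2) under a normal-inverse-gamma prior
b0, B0_inv = np.zeros(L + 1), np.eye(L + 1) * 1e-2   # vague normal prior on beta
a0, d0 = 2.0, 1.0                                    # inverse-gamma prior on sigma^2
sigma2, draws = 1.0, []
XtX, Xty = X.T @ X, X.T @ y

for it in range(2000):
    # beta | sigma^2, y ~ Normal
    post_cov = np.linalg.inv(B0_inv + XtX / sigma2)
    post_mean = post_cov @ (B0_inv @ b0 + Xty / sigma2)
    beta = rng.multivariate_normal(post_mean, post_cov)
    # sigma^2 | beta, y ~ Inverse-Gamma
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + T / 2.0, 1.0 / (d0 + 0.5 * resid @ resid))
    if it >= 500:                                    # discard burn-in
        draws.append(beta)

print(np.mean(draws, axis=0))   # posterior means, close to true_beta
```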

Relevance:

10.00%

Publisher:

Abstract:

Emissions of exhaust gases and particles from oceangoing ships are a significant and growing contributor to the total emissions from the transportation sector. We present an assessment of the contribution of gaseous and particulate emissions from oceangoing shipping to anthropogenic emissions and air quality. We also assess the degradation in human health and climate change created by these emissions. Regulating ship emissions requires comprehensive knowledge of current fuel consumption and emissions, understanding of their impact on atmospheric composition and climate, and projections of potential future evolutions and mitigation options. Nearly 70% of ship emissions occur within 400 km of coastlines, causing air quality problems through the formation of ground-level ozone, sulphur emissions and particulate matter in coastal areas and harbours with heavy traffic. Furthermore, ozone and aerosol precursor emissions, as well as their derivative species from ships, may be transported in the atmosphere over several hundreds of kilometres, and thus contribute to air quality problems further inland, even though they are emitted at sea. In addition, ship emissions impact climate. Recent studies indicate that the cooling due to altered clouds far outweighs the warming effects from greenhouse gases such as carbon dioxide (CO2) or ozone from shipping, overall causing a negative present-day radiative forcing (RF). Current efforts to reduce sulphur and other pollutants from shipping may modify this. However, given the short residence time of sulphate compared to CO2, the climate response from sulphate is of the order of decades, while that of CO2 is of the order of centuries. The climatic trade-off between positive and negative radiative forcing is still a topic of scientific research, but from what is currently known, a simple cancellation of global mean forcing components is potentially inappropriate and a more comprehensive assessment metric is required. The CO2-equivalent emissions using the global temperature change potential (GTP) metric indicate that after 50 years the net global mean effect of current emissions is close to zero, through cancellation of warming by CO2 and cooling by sulphate and nitrogen oxides.

Relevance:

10.00%

Publisher:

Abstract:

A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered as an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally we show how using the latest estimates of φ from climate models with a mean value of 1.6—as opposed to previously reported values of 1.4—can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15 %. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65 %, equivalent to almost 200 trillion dollars over 200 years.
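
The core statistical step is an ordinary multiple linear regression across models, fitted grid point by grid point. A minimal sketch for a single location, with synthetic per-model values standing in for the CMIP3 output (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic values for 22 models: TCR (K), land-sea warming ratio phi, and the
# local summer warming at one grid point (all invented, not CMIP3 data).
tcr = rng.normal(1.8, 0.4, 22)
phi = rng.normal(1.6, 0.15, 22)
local_warming = 0.5 + 1.2 * tcr + 1.0 * phi + rng.normal(0.0, 0.1, 22)

# Least-squares fit of local warming onto an intercept, TCR and phi
X = np.column_stack([np.ones_like(tcr), tcr, phi])
coef, *_ = np.linalg.lstsq(X, local_warming, rcond=None)

pred = X @ coef
r2 = 1.0 - np.sum((local_warming - pred) ** 2) / np.sum((local_warming - local_warming.mean()) ** 2)
print(coef, r2)   # coefficients and fraction of inter-model variance explained
```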

Relevance:

10.00%

Publisher:

Abstract:

This research presents a novel multi-functional system for medical Imaging-enabled Assistive Diagnosis (IAD). Although the IAD demonstrator has focused on abdominal images and supports the clinical diagnosis of kidneys using CT/MRI imaging, it can be adapted to work on image delineation, annotation and 3D real-size volumetric modelling of other organ structures such as the brain, spine, etc. The IAD provides advanced real-time 3D visualisation and measurements with fully automated functionalities, as developed in two stages. In the first stage, via the clinically driven user interface, specialist clinicians use CT/MRI imaging datasets to accurately delineate and annotate the kidneys and their possible abnormalities, thus creating “3D Golden Standard Models”. Based on these models, in the second stage, clinical support staff, i.e. medical technicians, interactively define model-based rules and parameters for the integrated “Automatic Recognition Framework” to achieve results which are closest to those of the clinicians. These specific rules and parameters are stored in “Templates” and can later be used by any clinician to automatically identify organ structures, i.e. kidneys, and their possible abnormalities. The system also supports the transmission of these “Templates” to another expert for a second opinion. A 3D model of the body, the organs and their possible pathology with real metrics is also integrated. The automatic functionality was tested on eleven MRI datasets (comprising 286 images) and the 3D models were validated by comparing them with the metrics from the corresponding “3D Golden Standard Models”. The system provides metrics for the evaluation of the results, in terms of Accuracy, Precision, Sensitivity, Specificity and Dice Similarity Coefficient (DSC), so as to enable benchmarking of its performance. The first IAD prototype has produced promising results: its accuracy on the most widely deployed evaluation metric, DSC, is 97% for the recognition of kidneys and 96% for their abnormalities, whilst across all the above evaluation metrics its performance ranges between 96% and 100%. Further development of the IAD system is in progress to extend and evaluate its clinical diagnostic support capability through the development and integration of additional algorithms to offer fully computer-aided identification of other organs and their abnormalities based on CT/MRI/ultrasound imaging.
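
The headline evaluation metric, the Dice Similarity Coefficient, has the standard definition 2|A∩B| / (|A| + |B|). A minimal sketch for binary segmentation masks, with invented toy arrays rather than IAD outputs:

```python
import numpy as np

def dice(pred, truth):
    """Dice Similarity Coefficient between two binary masks (1 = perfect overlap)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())

# Toy 2-D masks standing in for an automatic kidney segmentation and its gold standard
truth = np.zeros((8, 8), dtype=int)
truth[2:6, 2:6] = 1
pred = np.zeros((8, 8), dtype=int)
pred[2:6, 3:7] = 1

print(f"DSC = {dice(pred, truth):.2f}")
```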

Relevance:

10.00%

Publisher:

Abstract:

Single-carrier frequency division multiple access (SC-FDMA) has appeared to be a promising technique for high data rate uplink communications. Aimed at SC-FDMA applications, a cyclic prefixed version of the offset quadrature amplitude modulation based OFDM (OQAM-OFDM) is first proposed in this paper. We show that cyclic prefixed OQAM-OFDM (CP-OQAM-OFDM) can be realized within the framework of the standard OFDM system, and the perfect recovery condition in the ideal channel is derived. We then apply CP-OQAM-OFDM to SC-FDMA transmission in frequency selective fading channels. A signal model and joint widely linear minimum mean square error (WLMMSE) equalization using a priori information with low complexity are developed. Compared with the existing DFTS-OFDM based SC-FDMA, the proposed SC-FDMA can significantly reduce the envelope fluctuation (EF) of the transmitted signal while maintaining the bandwidth efficiency. The inherent structure of CP-OQAM-OFDM enables low-complexity joint equalization in the frequency domain to combat both the multiple access interference and the intersymbol interference. The joint WLMMSE equalization using a priori information guarantees optimal MMSE performance and supports a turbo receiver for improved bit error rate (BER) performance. Simulation results confirm the effectiveness of the proposed SC-FDMA in terms of EF (including peak-to-average power ratio, instantaneous-to-average power ratio and cubic metric) and BER performance.
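
The envelope-fluctuation figures of merit mentioned at the end are easy to compute from a baseband waveform. A minimal sketch of the peak-to-average power ratio and a raw cubic metric for a synthetic multicarrier block (the 3GPP cubic metric additionally subtracts a reference value and divides by an empirical slope, which is omitted here; the waveform is a generic OFDM stand-in, not the proposed CP-OQAM-OFDM signal):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic OFDM-like baseband block: random QPSK symbols on 256 subcarriers
symbols = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
x = np.fft.ifft(symbols) * np.sqrt(256)          # time-domain waveform, unit average power

power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())

# Raw cubic metric: RMS of the normalized cubed envelope, in dB
v = np.abs(x) / np.sqrt(power.mean())
rcm_db = 20 * np.log10(np.sqrt(np.mean(v ** 6)))

print(f"PAPR = {papr_db:.1f} dB, raw cubic metric = {rcm_db:.1f} dB")
```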