209 results for ROBUST ESTIMATES


Relevance: 20.00%

Abstract:

In order to influence global policy effectively, conservation scientists need to be able to provide robust predictions of the impact of alternative policies on biodiversity and measure progress towards goals using reliable indicators. We present a framework for using biodiversity indicators predictively to inform policy choices at a global level. The approach is illustrated with two case studies in which we project forwards the impacts of feasible policies on trends in biodiversity and in relevant indicators. The policies are based on targets agreed at the Convention on Biological Diversity (CBD) meeting in Nagoya in October 2010. The first case study compares protected area policies for African mammals, assessed using the Red List Index; the second example uses the Living Planet Index to assess the impact of a complete halt, versus a reduction, in bottom trawling. In the protected areas example, we find that the indicator can aid in decision-making because it is able to differentiate between the impacts of the different policies. In the bottom trawling example, the indicator exhibits some counter-intuitive behaviour, due to over-representation of some taxonomic and functional groups in the indicator, and contrasting impacts of the policies on different groups caused by trophic interactions. Our results support the need for further research on how to use predictive models and indicators to credibly track trends and inform policy. To be useful and relevant, scientists must make testable predictions about the impact of global policy on biodiversity to ensure that targets such as those set at Nagoya catalyse effective and measurable change.
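The Living Planet Index used in the second case study aggregates population trends as a geometric mean of interannual abundance ratios. A minimal sketch of that aggregation, with hypothetical population data and a single aggregation level (the operational index adds taxonomic and geographic weighting that is omitted here):

```python
import numpy as np

def living_planet_index(populations):
    """Simplified LPI: geometric mean of year-on-year abundance ratios
    across populations, chained into an index with value 1 in year 0."""
    pops = np.asarray(populations, dtype=float)      # (n_populations, n_years)
    rates = np.diff(np.log10(pops), axis=1)          # log10 annual rates of change
    mean_rates = rates.mean(axis=0)                  # geometric-mean aggregation
    return 10 ** np.concatenate(([0.0], np.cumsum(mean_rates)))

# two hypothetical monitored populations
pops = [[100, 90, 81],     # declining 10% per year
        [100, 110, 121]]   # growing 10% per year
print(living_planet_index(pops))
```

With one population declining and one growing by the same factor, the index ends slightly below 1, reflecting the geometric rather than arithmetic averaging.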

Relevance: 20.00%

Abstract:

We present a new technique for correcting errors in radar estimates of rainfall due to attenuation which is based on the fact that any attenuating target will itself emit, and that this emission can be detected by the increased noise level in the radar receiver. The technique is being installed on the UK operational network, and for the first time, allows radome attenuation to be monitored using the increased noise at the higher beam elevations. This attenuation has a large azimuthal dependence but for an old radome can be up to 4 dB for rainfall rates of just 2–4 mm/h. This effect has been neglected in the past, but may be responsible for significant errors in rainfall estimates and in radar calibrations using gauges. The extra noise at low radar elevations provides an estimate of the total path integrated attenuation of nearby storms; this total attenuation can then be used as a constraint for gate-by-gate or polarimetric correction algorithms.
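The technique rests on the standard radiometric relation between emission and absorption: a single attenuating layer at physical temperature T_med raises the receiver noise temperature by T_med(1 − exp(−τ)), which can be inverted for the optical depth τ and hence the path attenuation. A hedged sketch of that inversion (the single-layer assumption and the 275 K layer temperature are illustrative choices, not values from the paper):

```python
import math

def path_attenuation_db(noise_increase_K, medium_temp_K=275.0):
    """Estimate one-way path-integrated attenuation from the rise in
    receiver noise temperature caused by emission from the attenuating
    medium, assuming one homogeneous layer at medium_temp_K:
        T_noise = T_med * (1 - exp(-tau))  =>  tau = -ln(1 - T_noise/T_med)
    One-way attenuation in dB is 10*log10(e)*tau, about 4.343*tau."""
    frac = noise_increase_K / medium_temp_K
    if not 0 <= frac < 1:
        raise ValueError("noise increase must be below the medium temperature")
    tau = -math.log(1.0 - frac)
    return 10.0 * math.log10(math.e) * tau

# a 50 K rise in receiver noise against a 275 K rain layer
print(round(path_attenuation_db(50.0), 2))
```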

Relevance: 20.00%

Abstract:

This paper presents a video surveillance framework that robustly and efficiently detects abandoned objects in surveillance scenes. The framework is based on a novel threat assessment algorithm which combines the concept of ownership with automatic understanding of social relations in order to infer abandonment of objects. Implementation is achieved through development of a logic-based inference engine in Prolog. Threat detection performance is evaluated by testing against a range of datasets describing realistic situations, and demonstrates a reduction in the number of false alarms generated. The proposed system represents the approach employed in the EU SUBITO project (Surveillance of Unattended Baggage and the Identification and Tracking of the Owner).
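The ownership and social-relation reasoning can be illustrated with a toy rule set. The predicates and the 30-second threshold below are hypothetical, and the sketch is written in Python rather than the Prolog engine the paper uses:

```python
# Hypothetical predicates illustrating the ownership + social-relation
# style of abandonment reasoning (the paper encodes its rules in Prolog).
def is_abandoned(obj, owners, related, present, away_seconds, threshold=30):
    """Raise an alarm for obj if its owner has left the scene, nobody
    socially related to the owner is present, and the owner has been
    away longer than `threshold` seconds."""
    owner = owners.get(obj)
    if owner is None:
        return True                          # unattended, no known owner
    if owner in present:
        return False                         # owner is still in the scene
    if related.get(owner, set()) & present:
        return False                         # a companion attends the object
    return away_seconds[obj] > threshold

owners = {"bag1": "personA"}
related = {"personA": {"personB"}}
print(is_abandoned("bag1", owners, related, {"personB"}, {"bag1": 60}))  # False
print(is_abandoned("bag1", owners, related, set(), {"bag1": 60}))        # True
```

The companion rule is what reduces false alarms: a bag left with a socially related person never triggers the alert.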

Relevance: 20.00%

Abstract:

Commercial kitchens often leave a large carbon footprint. A new dataset of energy performance metrics from a leading industrial partner is presented; categorising these types of buildings is challenging. Electricity use has been analysed using data from automated meter readings (AMR) for the purpose of benchmarking, and is discussed in terms of factors such as size and food output. From the analysed results, consumption is found to be almost double the previous sector estimate of 6480 million kWh per year. Recommendations are made to further improve the current benchmarks in order to attain robust, reliable and transparent figures, such as the introduction of normalised performance indicators to account for kitchen size (m2) and kWh per thousand-pound turnover.
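The recommended normalised performance indicators are simple ratios. A sketch with made-up figures for a single site (not the partner's data):

```python
def performance_indicators(annual_kwh, kitchen_m2, turnover_gbp):
    """Normalised benchmarking indicators of the kind recommended:
    electricity per unit kitchen floor area and per thousand pounds
    of annual turnover."""
    return {
        "kWh_per_m2": annual_kwh / kitchen_m2,
        "kWh_per_1000gbp": annual_kwh / (turnover_gbp / 1000.0),
    }

# hypothetical site: 120,000 kWh/yr, 150 m2 kitchen, £800k turnover
print(performance_indicators(annual_kwh=120_000, kitchen_m2=150, turnover_gbp=800_000))
```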

Relevance: 20.00%

Abstract:

Climate models consistently predict a strengthened Brewer–Dobson circulation in response to greenhouse gas (GHG)-induced climate change. Although the predicted circulation changes are clearly the result of changes in stratospheric wave drag, the mechanism behind the wave-drag changes remains unclear. Here, simulations from a chemistry–climate model are analyzed to show that the changes in resolved wave drag are largely explainable in terms of a simple and robust dynamical mechanism, namely changes in the location of critical layers within the subtropical lower stratosphere, which are known from observations to control the spatial distribution of Rossby wave breaking. In particular, the strengthening of the upper flanks of the subtropical jets that is robustly expected from GHG-induced tropospheric warming pushes the critical layers (and the associated regions of wave drag) upward, allowing more wave activity to penetrate into the subtropical lower stratosphere. Because the subtropics represent the critical region for wave driving of the Brewer–Dobson circulation, the circulation is thereby strengthened. Transient planetary-scale waves and synoptic-scale waves generated by baroclinic instability are both found to play a crucial role in this process. Changes in stationary planetary wave drag are not so important because they largely occur away from subtropical latitudes.

Relevance: 20.00%

Abstract:

We investigate the sensitivity of Northern Hemisphere polar ozone recovery to a scenario in which there is rapid loss of Arctic summer sea ice in the first half of the 21st century. The issue is addressed by coupling a chemistry climate model to an ocean general circulation model and performing simulations of ozone recovery with, and without, an external perturbation designed to cause a rapid and complete loss of summertime Arctic sea ice. Under this extreme perturbation, the stratospheric response takes the form of a springtime polar cooling which is dynamical rather than radiative in origin, and is caused by reduced wave forcing from the troposphere. The response lags the onset of the sea-ice perturbation by about one decade and lasts for more than two decades, and is associated with an enhanced weakening of the North Atlantic meridional overturning circulation. The stratospheric dynamical response leads to a 10 DU reduction in polar column ozone, which is statistically robust. While this represents a modest loss, it has the potential to induce a delay of roughly one decade in Arctic ozone recovery estimates made in the 2006 Scientific Assessment of Ozone Depletion.

Relevance: 20.00%

Abstract:

We investigate Fréchet differentiability of the scattered field with respect to variation in the boundary in the case of time-harmonic acoustic scattering by an unbounded, sound-soft, one-dimensional rough surface. We rigorously prove the differentiability of the scattered field and derive a characterization of the Fréchet derivative as the solution to a Dirichlet boundary value problem. As an application of these results we give rigorous error estimates for first-order perturbation theory, justifying small perturbation methods that have a long history in the engineering literature. As an application of our rigorous estimates we show that a plane acoustic wave incident on a sound-soft rough surface can produce an unbounded scattered field.

Relevance: 20.00%

Abstract:

Quantitative estimates of temperature and precipitation change during the late Pleistocene and Holocene have been difficult to obtain for much of the lowland Neotropics. Using two published lacustrine pollen records and a climate-vegetation model based on the modern abundance distributions of 154 Neotropical plant families, we demonstrate how family-level counts of fossil pollen can be used to quantitatively reconstruct tropical paleoclimate and provide needed information on historic patterns of climatic change. With this family-level analysis, we show that one area of the lowland tropics, northeastern Bolivia, experienced cooling (1–3 °C) and drying (400 mm/yr), relative to present, during the late Pleistocene (50,000–12,000 calendar years before present [cal. yr B.P.]). Immediately prior to the Last Glacial Maximum (LGM, ca. 21,000 cal. yr B.P.), we observe a distinct transition from cooler temperatures and variable precipitation to a period of warmer temperatures and relative dryness that extends to the middle Holocene (5000–3000 cal. yr B.P.). This prolonged reduction in precipitation occurs against the backdrop of increasing atmospheric CO2 concentrations, indicating that the presence of mixed savanna and dry-forest communities in northeastern Bolivia during the LGM was not solely the result of low CO2 levels, as suggested previously, but also lower precipitation. The results of our analysis demonstrate the potential for using the distribution and abundance structure of modern Neotropical plant families to infer paleoclimate from the fossil pollen record.
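Reconstructions of this general kind can be sketched with a weighted-averaging transfer function. This is a standard simplification, not the paper's climate-vegetation model, and the counts and temperatures below are hypothetical:

```python
import numpy as np

def weighted_averaging_reconstruct(modern_counts, modern_climate, fossil_counts):
    """Minimal weighted-averaging transfer function: each family's climate
    optimum is the abundance-weighted mean of the climate at the modern
    sites where it occurs, and a fossil sample's climate estimate is the
    abundance-weighted mean of the optima of the families it contains."""
    optima = (modern_counts.T @ modern_climate) / modern_counts.sum(axis=0)
    return (fossil_counts @ optima) / fossil_counts.sum(axis=1)

# hypothetical data: 3 modern sites x 2 plant families, temperature in deg C
modern_counts = np.array([[10.0, 0.0],
                          [5.0, 5.0],
                          [0.0, 10.0]])
modern_temp = np.array([18.0, 22.0, 26.0])
fossil = np.array([[8.0, 2.0]])   # family-level pollen counts, one sample
print(weighted_averaging_reconstruct(modern_counts, modern_temp, fossil))
```

A fossil sample dominated by the cool-site family is reconstructed as cooler than one dominated by the warm-site family, which is the intuition behind family-level inference.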

Relevance: 20.00%

Abstract:

In this paper we study convergence of the L2-projection onto the space of polynomials up to degree p on a simplex in Rd, d >= 2. Optimal error estimates are established in the case of Sobolev regularity and illustrated on several numerical examples. The proof is based on the collapsed coordinate transform and the expansion into various polynomial bases involving Jacobi polynomials and their antiderivatives. The results of the present paper generalize corresponding estimates for cubes in Rd from [P. Houston, C. Schwab, E. Süli, Discontinuous hp-finite element methods for advection-diffusion-reaction problems. SIAM J. Numer. Anal. 39 (2002), no. 6, 2133-2163].
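The qualitative behaviour of the projection error can be illustrated in the simplest setting, the interval [-1, 1] (a one-dimensional stand-in for the simplex): for a function of limited Sobolev regularity, the L2-projection error decays algebraically in the degree p. A sketch using Gauss-Legendre quadrature:

```python
import numpy as np
from numpy.polynomial import legendre

def l2_projection_error(f, p, nquad=200):
    """L2(-1,1) error of the projection of f onto polynomials of degree
    at most p, evaluated with Gauss-Legendre quadrature."""
    x, w = legendre.leggauss(nquad)
    fx = f(x)
    # orthonormalised Legendre coefficients: c_k = (k + 1/2) * int f P_k
    coeffs = [(k + 0.5) * np.sum(w * fx * legendre.Legendre.basis(k)(x))
              for k in range(p + 1)]
    proj = sum(c * legendre.Legendre.basis(k)(x) for k, c in enumerate(coeffs))
    return np.sqrt(np.sum(w * (fx - proj) ** 2))

# f has limited Sobolev regularity at x = 0, so convergence in p is algebraic
f = lambda x: np.abs(x) ** 1.5
for p in (4, 8, 16):
    print(p, l2_projection_error(f, p))
```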

Relevance: 20.00%

Abstract:

We obtain sharp estimates for multidimensional generalisations of Vinogradov’s mean value theorem for arbitrary translation-dilation invariant systems, achieving constraints on the number of variables approaching those conjectured to be the best possible. Several applications of our bounds are discussed.
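For intuition, the quantity bounded in the classical one-dimensional case is the number of solutions J_{s,k}(X) of the Vinogradov system, which can be counted directly for tiny instances. A brute-force sketch (illustrative only; the paper's results concern asymptotic bounds, not enumeration):

```python
from collections import Counter
from itertools import product

def vinogradov_count(s, k, X):
    """Brute-force count of solutions of the Vinogradov system
        x_1^j + ... + x_s^j = y_1^j + ... + y_s^j   (1 <= j <= k)
    with 1 <= x_i, y_i <= X, by grouping x-tuples on their vector of
    power sums and summing the squared group sizes."""
    counts = Counter()
    for xs in product(range(1, X + 1), repeat=s):
        counts[tuple(sum(x ** j for x in xs) for j in range(1, k + 1))] += 1
    return sum(c * c for c in counts.values())

# For s = k = 2 only "diagonal" solutions (equal multisets) survive,
# giving 2*X**2 - X; at X = 4 that is 28.
print(vinogradov_count(2, 2, 4))
```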

Relevance: 20.00%

Abstract:

Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
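The semi-automatic construction can be sketched in a toy Gaussian model: pilot simulations are used to regress the parameter on the data, and the fitted regression is then used as the ABC summary statistic. The model, prior, and tolerance below are illustrative choices, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=5):
    """Toy model: n iid observations from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n)

# Pilot stage: simulate (theta, data) pairs and regress theta on the data.
# The fitted linear predictor approximates the posterior mean E[theta | data]
# and is then used as the ABC summary statistic.
pilot_theta = rng.uniform(-5, 5, size=2000)
pilot_data = np.array([simulate(t) for t in pilot_theta])
X = np.column_stack([np.ones(len(pilot_theta)), pilot_data])
beta, *_ = np.linalg.lstsq(X, pilot_theta, rcond=None)
summary = lambda y: beta[0] + y @ beta[1:]

# Rejection ABC using the learned summary statistic.
y_obs = simulate(2.0)                  # "observed" data, true theta = 2
s_obs = summary(y_obs)
theta_prop = rng.uniform(-5, 5, size=20000)
s_sim = np.array([summary(simulate(t)) for t in theta_prop])
accepted = theta_prop[np.abs(s_sim - s_obs) < 0.1]
print(accepted.size, accepted.mean())
```

In this linear-Gaussian toy case the learned summary is close to the sample mean, so the accepted parameter values concentrate around it, matching the theoretical result that the posterior mean is the optimal summary.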

Relevance: 20.00%

Abstract:

Bioaccessibility tests can be used to improve contaminated land risk assessments. For organic pollutants a ‘sink’ is required within these tests to better mimic their desorption under the physiological conditions prevailing in the intestinal tract, where a steep diffusion gradient for the removal of organic pollutants from the soil matrix would exist. This is currently ignored in most PBET systems. By combining the CEPBET bioaccessibility test with an infinite sink, the removal of PAH from spiked solutions was monitored. Less than 10% of spiked PAH remained in the stomach media after 1 h, 10% by 4 h in the small intestine compartment and c.15% after 16 h in the colon. The addition of the infinite sink increased bioaccessibility estimates for field soils by a factor of 1.2–2.8, confirming its importance for robust PBET tests. TOC or BC were not the only factors controlling desorption of the PAH from the soils.

Relevance: 20.00%

Abstract:

Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
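Two steps of such a pipeline, log-transform normalisation and multidimensional scaling, can be sketched in a few lines. The data are simulated and the normalisation is deliberately minimal; the paper's actual pipeline is an R function with more elaborate steps:

```python
import numpy as np

def normalise_log2(signal, floor=1.0):
    """Log2-transform raw intensities (flooring near-zero values) and
    median-centre each array: one deliberately minimal normalisation."""
    logged = np.log2(np.maximum(signal, floor))
    return logged - np.median(logged, axis=1, keepdims=True)

def classical_mds(samples, k=2):
    """Classical multidimensional scaling of samples (rows) from their
    pairwise squared Euclidean distances."""
    d2 = np.sum((samples[:, None, :] - samples[None, :, :]) ** 2, axis=-1)
    n = d2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ d2 @ J
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]           # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

rng = np.random.default_rng(1)
# 8 simulated arrays x 500 genes; arrays 4-7 up-regulate the first 100 genes,
# so the two groups should separate along the first MDS axis
raw = rng.lognormal(mean=6.0, sigma=0.25, size=(8, 500))
raw[4:, :100] *= 64.0
coords = classical_mds(normalise_log2(raw))
print(coords.shape)
```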

Relevance: 20.00%

Abstract:

In this communication, we describe a new method which has enabled the first patterning of human neurons (derived from the human teratocarcinoma cell line (hNT)) on parylene-C/silicon dioxide substrates. We detail the nanofabrication processes, cell differentiation and culturing protocols necessary to successfully pattern hNT neurons, each a key aspect of this new method. Patterning human neurons on silicon chips using an accessible cell line and a robust patterning technology is of widespread value: combining technologies in this way will facilitate the detailed study of the pathological human brain at both the single-cell and network level.