135 results for quantifying


Relevance: 10.00%

Abstract:

The ‘trophic level enrichment’ between diet and body results in an overall increase in nitrogen isotopic values as the food chain is ascended. Quantifying the diet–body Δ15N spacing has proved difficult, particularly for humans. The value is usually assumed to be +3–5‰ in the archaeological literature. We report here the first (to our knowledge) data from humans on isotopically known diets, comparing dietary intake and a body tissue sample, that of red blood cells. Samples were taken from 11 subjects on controlled diets for a 30-d period, where the controlled diets were designed to match each individual’s habitual diet, thus reducing problems with short-term changes in diet causing isotopic changes in the body pool. The Δ15N(diet–RBC) was measured as +3.5‰. Using measured offsets from other studies, we estimate the human Δ15N(diet–keratin) as +5.0–5.3‰, which is in good agreement with values derived from the two other studies using individual diet records. We also estimate a value for Δ15N(diet–collagen) of ≈6‰, again in combination with measured offsets from other studies. This value is larger than usually assumed in palaeodietary studies, which suggests that the proportion of animal protein in prehistoric human diet may have often been overestimated in isotopic studies of palaeodiet.
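As a rough aid to the arithmetic above: the tissue estimates follow by adding tissue-to-tissue offsets to the measured diet-to-RBC spacing. The RBC-to-keratin and RBC-to-collagen offsets shown below are simply the differences implied by the quoted numbers, not independently reported measurements.

\[
\Delta^{15}\mathrm{N}_{\text{diet--keratin}} \approx \Delta^{15}\mathrm{N}_{\text{diet--RBC}} + \Delta^{15}\mathrm{N}_{\text{RBC--keratin}} \approx 3.5\text{‰} + (1.5\text{ to }1.8)\text{‰} = 5.0\text{ to }5.3\text{‰}
\]
\[
\Delta^{15}\mathrm{N}_{\text{diet--collagen}} \approx \Delta^{15}\mathrm{N}_{\text{diet--RBC}} + \Delta^{15}\mathrm{N}_{\text{RBC--collagen}} \approx 3.5\text{‰} + 2.5\text{‰} \approx 6\text{‰}
\]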

Relevance: 10.00%

Abstract:

We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km−2 yr−1) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage areas of 475 to 70,000 km2), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to over-predict export in less agricultural, more forested watersheds and to under-predict it in more agricultural basins. The magnitude of these biases differed appreciably among the models. Models having more detailed descriptions of nitrogen sources, land and water attenuation of nitrogen, and water flow paths had considerably lower bias and higher precision in their predictions of nitrogen export.
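A minimal sketch (not the paper's error-modelling technique) of the bias diagnostic described above, i.e. relating relative prediction error to cultivated land area; the per-watershed values and variable names are hypothetical placeholders.

import numpy as np

# Hypothetical per-watershed predictions and observations of total nitrogen
# export (kg km-2 yr-1), plus the cultivated fraction of each watershed.
pred = np.array([850.0, 1200.0, 430.0, 980.0])
obs = np.array([700.0, 1500.0, 300.0, 1100.0])
cultivated_frac = np.array([0.10, 0.45, 0.05, 0.40])

# Relative prediction error; positive values indicate over-prediction.
rel_err = (pred - obs) / obs

# A negative correlation indicates over-prediction in forested (low-agriculture)
# basins and under-prediction in agricultural basins, the pattern reported above.
r = np.corrcoef(cultivated_frac, rel_err)[0, 1]
print("relative errors:", np.round(rel_err, 2))
print("correlation with cultivated fraction:", round(r, 2))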

Relevance: 10.00%

Abstract:

The butanol-HCl spectrophotometric assay is widely used for quantifying extractable and insoluble condensed tannins (CT, syn. proanthocyanidins) in foods, feeds, and foliage of herbaceous and woody plants, but the method underestimates total CT content when applied directly to plant material. To improve CT quantitation, we tested various cosolvents with butanol-HCl and found that acetone increased anthocyanidin yields from two forage Lotus species having contrasting procyanidin and prodelphinidin compositions. A butanol-HCl-iron assay run with 50% (v/v) acetone gave linear responses with Lotus CT standards and increased estimates of total CT in Lotus herbage and leaves by up to 3.2-fold over the conventional method run without acetone. The use of thiolysis to determine the purity of CT standards further improved quantitation. Gel-state 13C and 1H–13C HSQC NMR spectra of insoluble residues collected after butanol-HCl assays revealed that acetone increased anthocyanidin yields by facilitating complete solubilization of CT from tissue.

Relevance: 10.00%

Abstract:

Climate models predict a large range of possible future temperatures for a particular scenario of future emissions of greenhouse gases and other anthropogenic forcings of climate. Given that further warming in coming decades could bring increasing risks of climatic disruption, it is important to determine whether model projections are consistent with temperature changes already observed. This can be achieved by quantifying the extent to which increases in well-mixed greenhouse gases and changes in other anthropogenic and natural forcings have already altered temperature patterns around the globe. Here, for the first time, we combine multiple climate models into a single synthesized estimate of future warming rates consistent with past temperature changes. We show that the observed evolution of near-surface temperatures appears to indicate lower ranges (5–95%) for warming (0.35–0.82 K and 0.45–0.93 K by the 2020s (2020–2029) relative to 1986–2005 under the RCP4.5 and RCP8.5 scenarios, respectively) than the equivalent ranges projected by the CMIP5 climate models (0.48–1.00 K and 0.51–1.16 K, respectively). Our results indicate that for each RCP the upper end of the range of CMIP5 climate model projections is inconsistent with past warming.
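The abstract does not spell out the statistical machinery. One common way to make projections 'consistent with past temperature changes' is to regress observations onto the model-simulated historical response, obtain a scaling factor, and carry that factor forward to the projections; the sketch below illustrates only that general idea with synthetic data and should not be read as the study's actual method or uncertainty treatment.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2011)

# Hypothetical multi-model-mean historical warming, and "observations" that
# warm about 80% as fast plus noise standing in for internal variability.
model_hist = 0.02 * (years - years[0])
obs = 0.8 * model_hist + rng.normal(0.0, 0.05, size=years.size)

# Least-squares scaling factor (regression through the origin): how much the
# simulated response must be scaled to best match the observed record.
beta = np.sum(obs * model_hist) / np.sum(model_hist ** 2)

# Apply the same scaling to a hypothetical projected warming for the 2020s
# (relative to 1986-2005) to obtain an observationally constrained estimate.
projected_2020s = 0.7  # K, placeholder value in the CMIP5 range quoted above
print(f"beta = {beta:.2f}, constrained warming ~ {beta * projected_2020s:.2f} K")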

Relevance: 10.00%

Abstract:

To validate the reported precision of space-based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single-profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short-term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE-FTS). Tropical scatter in the ACE-FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single-profile measurements.
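The scatter statistic is not specified in the abstract; as an illustration of why robust statistics matter here, the snippet below compares a classical standard deviation with a scaled median absolute deviation for a synthetic set of tropical measurements containing a few outliers, and checks both against an assumed reported random error. All values are hypothetical.

import numpy as np

rng = np.random.default_rng(1)

reported_random_error = 0.15  # hypothetical single-profile RRE (ppmv)
vals = 5.0 + rng.normal(0.0, reported_random_error, 300)  # synthetic tropical sample
vals[:3] += 2.0  # a few outliers, as can occur in real retrievals

# Robust 1-sigma scatter: median absolute deviation, scaled for consistency
# with the standard deviation of a normal distribution.
robust_sigma = 1.4826 * np.median(np.abs(vals - np.median(vals)))

print(f"classical std dev: {np.std(vals, ddof=1):.3f}")
print(f"robust scatter:    {robust_sigma:.3f} (compare with RRE {reported_random_error})")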

Relevance: 10.00%

Abstract:

During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than on ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and their sensitivity to process parameters.
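In the notation above, the deconstruction amounts to splitting the observed Lagrangian change into chemical and physical contributions,

\[
\Delta \mathrm{O_3} = \Delta \mathrm{O_3}^{\mathrm{chem}} + \Delta \mathrm{O_3}^{\mathrm{phys}},
\]

where the chemical term is the net photochemical production or loss along the trajectory and the physical term collects the processes (such as mixing and deposition) acting directly on O3; this grouping paraphrases the abstract rather than reproducing the paper's exact budget terms.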

Relevance: 10.00%

Abstract:

Measurements from ground-based magnetometers and riometers at auroral latitudes have demonstrated that energetic (~30–300 keV) electron precipitation can be modulated in the presence of magnetic field oscillations at ultra-low frequencies. It has previously been proposed that an ultra-low frequency (ULF) wave would modulate field and plasma properties near the equatorial plane, thus modifying the growth rates of whistler-mode waves. In turn, the resulting whistler-mode waves would mediate the pitch-angle scattering of electrons resulting in ionospheric precipitation. In this paper, we investigate this hypothesis by quantifying the changes to the linear growth rate expected due to a slow change in the local magnetic field strength for parameters typical of the equatorial region around 6.6 RE radial distance. To constrain our study, we determine the largest possible ULF wave amplitudes from measurements of the magnetic field at geosynchronous orbit. Using nearly ten years of observations from two satellites, we demonstrate that the variation in magnetic field strength due to oscillations at 2 mHz does not exceed ±10% of the background field. Modifications to the plasma density and temperature anisotropy are estimated using idealised models. For low temperature anisotropy, there is little change in the whistler-mode growth rates even for the largest ULF wave amplitude. Only for large temperature anisotropies can whistler-mode growth rates be modulated sufficiently to account for the changes in electron precipitation measured by riometers at auroral latitudes.
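For context, and as standard linear theory rather than a result of this paper, the Kennel–Petschek condition indicates why the temperature anisotropy controls whether whistler-mode waves of frequency ω can grow at all, and hence why only large anisotropies leave room for ULF-driven modulation:

\[
A \equiv \frac{T_\perp}{T_\parallel} - 1 \;>\; \frac{1}{\Omega_e/\omega - 1},
\]

where Ωe is the local electron gyrofrequency (proportional to the magnetic field strength, so a ±10% ULF field perturbation enters here) and T⊥, T∥ are the perpendicular and parallel electron temperatures.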

Relevance: 10.00%

Abstract:

We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
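The graph-based algorithm itself is not described in the abstract. As a minimal illustration of a similarity measure that tolerates small time shifts, the sketch below lets each half-hourly reading in one profile match the closest reading of the other profile within a window of a few slots; the function, window size and example profiles are hypothetical and are not the paper's method.

import numpy as np

def shift_tolerant_distance(a, b, w=2):
    """Mean absolute difference, allowing each point of `a` to match the
    closest point of `b` within +/- w positions (smaller = more similar)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(a)
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        total += np.min(np.abs(b[lo:hi] - a[i]))
    return total / n

# Two hypothetical daily profiles (kWh per half-hour), identical except that
# the evening peak in the second household occurs one slot later.
p1 = [0.2, 0.2, 0.3, 1.5, 0.4, 0.3]
p2 = [0.2, 0.2, 0.3, 0.4, 1.5, 0.3]
print(shift_tolerant_distance(p1, p2, w=1))            # small despite the shifted peak
print(np.mean(np.abs(np.array(p1) - np.array(p2))))    # plain pointwise distance is larger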

Relevance: 10.00%

Abstract:

Understanding how species and ecosystems respond to climate change has become a major focus of ecology and conservation biology. Modelling approaches provide important tools for making future projections, but current models of the climate-biosphere interface remain overly simplistic, undermining the credibility of projections. We identify five ways in which substantial advances could be made in the next few years: (i) improving the accessibility and efficiency of biodiversity monitoring data, (ii) quantifying the main determinants of the sensitivity of species to climate change, (iii) incorporating community dynamics into projections of biodiversity responses, (iv) accounting for the influence of evolutionary processes on the response of species to climate change, and (v) improving the biophysical rule sets that define functional groupings of species in global models.

Relevance: 10.00%

Abstract:

A new record of sea surface temperature (SST) for climate applications is described. This record provides independent corroboration of global variations estimated from SST measurements made in situ. Infrared imagery from Along-Track Scanning Radiometers (ATSRs) is used to create a 20-year time series of SST at 0.1° latitude-longitude resolution, in the ATSR Reprocessing for Climate (ARC) project. A very high degree of independence of in situ measurements is achieved via physics-based techniques. Skin SST and SST estimated for 20 cm depth are provided, with grid cell uncertainty estimates. Comparison with in situ data sets establishes that ARC SSTs generally have bias of order 0.1 K or smaller. The precision of the ARC SSTs is 0.14 K during 2003 to 2009, from three-way error analysis. Over the period 1994 to 2010, ARC SSTs are stable, with better than 95% confidence, to within 0.005 K yr−1 (demonstrated for tropical regions). The data set appears useful for cleanly quantifying interannual variability in SST and major SST anomalies. The ARC SST global anomaly time series is compared to the in situ-based Hadley Centre SST data set version 3 (HadSST3). Within known uncertainties in bias adjustments applied to in situ measurements, the independent ARC record and HadSST3 present the same variations in global marine temperature since 1996. Since the in situ observing system evolved significantly in its mix of measurement platforms and techniques over this period, ARC SSTs provide an important corroboration that HadSST3 accurately represents recent variability and change in this essential climate variable.
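The 'three-way error analysis' mentioned above is commonly implemented as triple collocation: given three collocated, bias-adjusted SST estimates whose errors are mutually independent, each estimate's error variance can be recovered from cross-differences. The sketch below illustrates only that general idea with synthetic numbers; it is not the ARC analysis itself.

import numpy as np

rng = np.random.default_rng(2)
n = 100_000
truth = 20.0 + rng.normal(0.0, 0.5, n)      # "true" SST variability (degrees C)

sat = truth + rng.normal(0.0, 0.14, n)      # satellite retrieval, sigma ~ 0.14 K
buoy = truth + rng.normal(0.0, 0.20, n)     # drifting buoy
ship = truth + rng.normal(0.0, 0.60, n)     # ship observation

def tc_sigma(a, b, c):
    # Error standard deviation of `a`, assuming independent errors in a, b, c:
    # the expectation of (a - b)(a - c) equals the error variance of a.
    return np.sqrt(np.mean((a - b) * (a - c)))

print(f"satellite error ~ {tc_sigma(sat, buoy, ship):.3f} K")
print(f"buoy error      ~ {tc_sigma(buoy, sat, ship):.3f} K")
print(f"ship error      ~ {tc_sigma(ship, sat, buoy):.3f} K")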

Relevance: 10.00%

Abstract:

Affymetrix GeneChip® arrays are used widely to study transcriptional changes in response to developmental and environmental stimuli. GeneChip® arrays comprise multiple 25-mer oligonucleotide probes per gene and retain certain advantages over direct sequencing. For plants, there are several public GeneChip® arrays whose probes are localised primarily in 3′ exons. Plant whole-transcript (WT) GeneChip® arrays are not yet publicly available, although WT resolution is needed to study complex crop genomes such as Brassica, which are typified by segmental duplications containing paralogous genes and/or allopolyploidy. Available sequence data were sampled from the Brassica A and C genomes, and 142,997 gene models identified. The assembled gene models were then used to establish a comprehensive public WT exon array for transcriptomics studies. The Affymetrix GeneChip® Brassica Exon 1.0 ST Array is a 5 µm feature size array containing 2.4 million 25-base oligonucleotide probes representing 135,201 gene models, with 15 probes per gene distributed among exons. Discrimination of the gene models was based on an E-value cut-off of 1E−5, with ≤98% sequence identity. The 135k Brassica Exon Array was validated by quantifying transcriptome differences between leaf and root tissue from a reference Brassica rapa line (R-o-18), with categorisation by Gene Ontologies (GO) based on gene orthology with Arabidopsis thaliana. Technical validation involved comparison of the exon array with a 60-mer array platform using the same starting RNA samples. The 135k Brassica Exon Array is a robust platform. All data relating to the array design and probe identities are available in the public domain and are curated within the BrassEnsembl genome viewer at http://www.brassica.info/BrassEnsembl/index.html.

Relevance: 10.00%

Abstract:

BACKGROUND: Methyl benzimidazole carbamate (MBC) fungicides are used to control the oilseed rape pathogen Pyrenopeziza brassicae. Resistance to MBCs has been reported in P. brassicae, but the molecular mechanism(s) associated with reductions in sensitivity have not been verified in this species. Elucidation of the genetic changes responsible for resistance, hypothesised to be target-site mutations in β-tubulin, will enable resistance diagnostics and thereby inform resistance management strategies.

RESULTS: P. brassicae isolates were classified as sensitive, moderately resistant or resistant to MBCs. Crossing P. brassicae isolates of different MBC sensitivities indicated that resistance was conferred by a single gene. The MBC target-encoding gene, β-tubulin, was cloned and sequenced. Reduced MBC sensitivity of field isolates correlated with the β-tubulin amino acid substitutions L240F and E198A. The highest level of MBC resistance was measured for isolates carrying E198A. Negative cross-resistance between MBCs and the fungicides diethofencarb and zoxamide was only measured in E198A isolates. PCR-RFLP was used to screen isolates for the presence of L240F and E198A. The substitutions E198G and F200Y were also detected in DNA samples from P. brassicae populations after cloning and sequencing of PCR products. The frequencies of L240F and E198A in different P. brassicae populations were quantified by pyrosequencing. There were no differences in the frequencies of these alleles between P. brassicae populations sampled from different locations or after fungicide treatment regimes.

CONCLUSIONS: The molecular mechanisms affecting sensitivity to MBCs in P. brassicae have been identified. Pyrosequencing assays are a powerful tool for quantifying fungicide-resistant alleles in pathogen populations.

Relevance: 10.00%

Abstract:

Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy to decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms have recently been introduced to reduce overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms, and proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms while better avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
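For reference, the J-measure referred to above, as defined by Smyth and Goodman, scores a rule ‘IF Y = y THEN X = x’ by weighting the information gained about the consequent by the probability that the rule fires:

\[
J(X; Y{=}y) = p(y)\left[ p(x \mid y)\,\log_2\frac{p(x \mid y)}{p(x)} + \bigl(1 - p(x \mid y)\bigr)\,\log_2\frac{1 - p(x \mid y)}{1 - p(x)} \right],
\]

where p(y) is the coverage of the rule's antecedent, p(x) the prior probability of the predicted class, and p(x | y) the rule's accuracy. The three pruning algorithms discussed above differ in how they use this quantity during rule construction.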

Relevance: 10.00%

Abstract:

Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr−1 in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m−2 with 90% uncertainty bounds of (+0.08, +1.27) W m−2. Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m−2. Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m−2 with 90% uncertainty bounds of +0.17 to +2.1 W m−2. Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m−2, is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m−2 during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m−2 with 90% uncertainty bounds of −1.45 to +1.29 W m−2). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility of near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.

Relevance: 10.00%

Abstract:

A Guide to Office Clerical Time Standards is an instructional performance piece based on a corporate manual from 1960. The pamphlet is focused on the time necessary for the accomplishment of minute labour procedures in the office, from the depressing and releasing of typewriter keys to the opening and closing of filing cabinet drawers. In the performance, seven costumed performers represent the different levels of management and employment while performing the actions described in the guide, accompanied by a live musical score. There has been much discussion of the changes to work in the west following the decline of post-Fordist service sector jobs. These increasingly emphasise the specificity of employees’ knowledge and cognitive skill. However, this greater flexibility and creativity at work has been accompanied by an opposite trajectory. The proletarisation of white collar work has given rise to more bureaucracy, target assessment and control for workers in previously looser creative professions, from academia to the arts. The midcentury office is the meeting point of these cultures, where the assembly line efficiency management of the factory meets the quantifying control of the knowledge economy. A Guide to Office Clerical Time Standards explores the survival of one regime into its successor following the lines of combined and uneven development that have turned the emancipatory promise of immaterial labour into the perma-temp hell of the cognitariat. The movement is accompanied by a score of guitar, bass and drums, the components of the rock ‘n’ roll music that rose from the car factories of the motor city and the cotton fields of the southern states to represent the same junction of expression and control.