971 results for landfill gas emission measurements


Relevance:

30.00%

Publisher:

Abstract:

Objective: The PEM Flex Solo II (Naviscan, Inc., San Diego, CA) is currently the only commercially available positron emission mammography (PEM) scanner. This scanner does not apply corrections for count rate effects, attenuation or scatter during image reconstruction, potentially affecting the quantitative accuracy of images. This work measures the overall quantitative accuracy of the PEM Flex system, and determines the contributions of error due to count rate effects, attenuation and scatter. Materials and Methods: Gelatin phantoms were designed to simulate breasts of different sizes (4–12 cm thick) with varying uniform background activity concentration (0.007–0.5 μCi/cc), cysts and lesions (2:1, 5:1, 10:1 lesion-to-background ratios). The overall error was calculated from ROI measurements in the phantoms with a clinically relevant background activity concentration (0.065 μCi/cc). The error due to count rate effects was determined by comparing the overall error at multiple background activity concentrations to the error at 0.007 μCi/cc. A point source and cold gelatin phantoms were used to assess the errors due to attenuation and scatter. The maximum pixel values in gelatin and in air were compared to determine the effect of attenuation. Scatter was evaluated by comparing the sum of all pixel values in gelatin and in air. Results: The overall error in the background was found to be negative in phantoms of all thicknesses, with the exception of the 4-cm thick phantoms (0%±7%), and it increased with thickness (-34%±6% for the 12-cm phantoms). All lesions exhibited large negative error (-22% for the 2:1 lesions in the 4-cm phantom), which increased in magnitude with thickness and with lesion-to-background ratio (-85% for the 10:1 lesions in the 12-cm phantoms). The error due to count rate in phantoms with 0.065 μCi/cc background was negative (-23%±6% for 4-cm thickness) and decreased in magnitude with thickness (-7%±7% for 12 cm). Attenuation was a substantial source of negative error and increased with thickness (-51%±10% to -77%±4% in 4 to 12 cm phantoms, respectively). Scatter contributed a relatively constant amount of positive error (+23%±11%) for all thicknesses. Conclusion: Applying corrections for count rate, attenuation and scatter will be essential for the PEM Flex Solo II to be able to produce quantitatively accurate images.
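The quantification-error metric used throughout the Results section — the measured ROI value relative to the known true activity concentration — can be sketched in a few lines of Python; the numbers below are hypothetical illustrations, not the paper's data:

```python
def percent_error(measured, true):
    """Quantification error as a percentage of the true value; negative
    values mean the scanner underestimates the activity concentration."""
    return 100.0 * (measured - true) / true

# Hypothetical example: a background region filled to 0.065 uCi/cc that
# reconstructs at 0.043 uCi/cc shows a roughly -34% bias, the magnitude
# reported above for the thickest (12 cm) phantoms.
bias = percent_error(measured=0.043, true=0.065)
```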

Relevance:

30.00%

Publisher:

Abstract:

The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10^-15 yr W m^-2 per kg CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10^-15 yr W m^-2 per kg CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions than for present day, and lower for smaller pulses than for larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices.
Although choices of pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2, and hence the GWP, is the time horizon.
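A minimal sketch of the AGWP computation described above, using the kind of multi-exponential impulse-response fit published with this intercomparison; the radiative efficiency constant below is an assumed round value, so the result is illustrative only:

```python
import math

# Multi-exponential fit to a multi-model mean CO2 impulse response
# (coefficients of the form reported by this intercomparison); RE is an
# assumed radiative efficiency in W m^-2 per kg CO2.
A0, A = 0.2173, (0.2240, 0.2824, 0.2763)   # fractions (A0 is the long tail)
TAU = (394.4, 36.54, 4.304)                # e-folding timescales in years
RE = 1.77e-15

def integrated_irf(horizon):
    """Analytic time integral of the CO2 impulse response (in years)."""
    return A0 * horizon + sum(
        a * tau * (1.0 - math.exp(-horizon / tau)) for a, tau in zip(A, TAU))

def agwp(horizon=100.0):
    """Absolute Global Warming Potential in yr W m^-2 per kg CO2."""
    return RE * integrated_irf(horizon)

# agwp(100.0) lands close to the 92.5e-15 best estimate quoted above.
```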

Relevance:

30.00%

Publisher:

Abstract:

Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resultant budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios, which differ in fossil fuel and microbial emissions, to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions, combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.

Relevance:

30.00%

Publisher:

Abstract:

For atmospheric CO2 reconstructions using ice cores, the technique used to release the trapped air from the ice samples is essential for the precision and accuracy of the measurements. We present here a new dry extraction technique in combination with a new gas analytical system that together show significant improvements over current systems. Ice samples (3–15 g) are pulverised using a novel centrifugal ice microtome (CIM) that shaves the ice in a cooled vacuum chamber (−27 °C) in which no friction occurs, thanks to the use of magnetic bearings. Neither the shaving principle of the CIM nor the use of magnetic bearings has previously been applied in this field. Shaving the ice samples produces finer ice powder and releases a minimum of 90% of the trapped air, compared to 50%–70% when needle crushing is employed. In addition, the friction-free motion, together with an optimized design that reduces contamination of the inner surfaces of the device, results in a reduced system offset of about 2.0 ppmv compared to 4.9 ppmv. The gas analytical part shows a precision higher by a factor of two than that of the corresponding part of our previous system, and all processes except the loading and cleaning of the CIM now run automatically. Compared to our previous system, the complete system shows a 3 times better measurement reproducibility of about 1.1 ppmv (1σ), which is similar to the best reproducibility of other systems applied in this field. With this high reproducibility, replicate measurements are no longer required for most future measurement campaigns, resulting in a possible output of 12–20 measurements per day compared to a maximum of 6 with other systems.

Relevance:

30.00%

Publisher:

Abstract:

An accurate and coherent chronological framework is essential for the interpretation of climatic and environmental records obtained from deep polar ice cores. Until now, one common ice core age scale had been developed based on an inverse dating method (Datice), combining glaciological modelling with absolute and stratigraphic markers between four ice cores covering the last 50 ka (thousands of years before present) (Lemieux-Dudon et al., 2010). In this paper, together with the companion paper of Veres et al. (2013), we present an extension of this work back to 800 ka for the NGRIP, TALDICE, EDML, Vostok and EDC ice cores using an improved version of the Datice tool. The AICC2012 (Antarctic Ice Core Chronology 2012) chronology includes numerous new gas and ice stratigraphic links as well as an improved evaluation of background and associated variance scenarios. This paper concentrates on the long timescales between 120 and 800 ka. In this framework, new measurements of δ18Oatm over Marine Isotope Stage (MIS) 11–12 on EDC and a complete δ18Oatm record of the TALDICE ice core permit us to derive additional orbital gas age constraints. The coherency of the different orbitally deduced ages (from δ18Oatm, δO2/N2 and air content) has been verified before implementation in AICC2012. The new chronology is now independent of other archives and shows only small differences, most of the time within the original uncertainty range calculated by Datice, when compared with the previous ice core reference age scale EDC3, the Dome F chronology, or a comparison between speleothems and methane. For instance, the largest deviation between AICC2012 and EDC3 (5.4 ka) is obtained around MIS 12. Despite significant modifications of the chronological constraints around MIS 5, now independent of speleothem records in AICC2012, the date of Termination II is very close to the EDC3 one.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. METHODS To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function, which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating 68Ge/68Ga filled spheres was developed. To iteratively determine and represent such point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function for the virtual PET. The Hann window's apodization properties suppressed high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems into nearly identical data sets, as they would be imaged by the virtual PET system. RESULTS The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The largest difference in measured activity concentration between the two PET systems, 18.2%, was found in spheres of 2 ml volume. Transconvolution reduced this difference to 1.6%. In addition to reestablishing comparability, the new method, with its parameterization of point spread functions, allowed a full characterization of the imaging properties of the examined tomographs. CONCLUSIONS By matching different tomographs to a virtual standardized imaging system, Transconvolution opens a new comprehensive method for cross calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and defined partial volume effect.
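The core Fourier-domain idea — convolving one point spread function with the inverse of the other, under the boundary condition that the target system suppresses high spatial frequencies — can be sketched in 1-D with assumed Gaussian PSFs. This is an illustration of the principle, not the paper's implementation:

```python
import numpy as np

def gaussian_psf(n, sigma):
    """Unit-area Gaussian point spread function centred at n // 2."""
    x = np.arange(n) - n // 2
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def transconvolve(image, psf_src, psf_dst, eps=1e-12):
    """Re-image `image` (acquired with psf_src) as a system with the
    broader psf_dst would see it. Valid only while the ratio of transfer
    functions stays bounded -- the role the Hann window plays above."""
    H_src = np.fft.fft(np.fft.ifftshift(psf_src))
    H_dst = np.fft.fft(np.fft.ifftshift(psf_dst))
    kernel = H_dst / (H_src + eps)            # the Transconvolution function
    return np.real(np.fft.ifft(np.fft.fft(image) * kernel))

n = 256
point = np.zeros(n)
point[n // 2] = 1.0                                             # point source
img_a = np.convolve(point, gaussian_psf(n, 2.0), mode="same")   # system A
img_b = transconvolve(img_a, gaussian_psf(n, 2.0), gaussian_psf(n, 4.0))
# img_b now closely matches a direct acquisition by the broader system B.
```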

Relevance:

30.00%

Publisher:

Abstract:

Methane is a strong greenhouse gas, and large uncertainties exist concerning the future evolution of its atmospheric abundance. Analyzing methane mixing ratios and stable isotope ratios in air trapped in polar ice sheets helps in reconstructing the evolution of its sources and sinks in the past. This is important for improving predictions of future atmospheric CH4 mixing ratios under a changing climate. The aim of this study is to assess whether past atmospheric δ13C(CH4) variations can be reliably reconstructed from firn air measurements. Isotope reconstructions obtained with a state-of-the-art firn model from different individual sites show unexpectedly large discrepancies and are mutually inconsistent. We show that small changes in the diffusivity profiles at individual sites lead to strong differences in the firn fractionation, which can explain a large part of these discrepancies. Using slightly modified diffusivities for some sites, and neglecting samples for which the firn fractionation signals are strongest, a combined multi-site inversion can be performed, which returns an isotope reconstruction that is consistent with the firn data. However, the isotope trends are lower than what has been concluded from Southern Hemisphere (SH) archived air samples and high-accumulation ice core data. We conclude that with the current datasets and understanding of firn air transport, a high-precision reconstruction of δ13C of CH4 from firn air samples is not possible, because reconstructed atmospheric trends over the last 50 yr of 0.3–1.5 ‰ are of the same magnitude as the inherent uncertainties of the method: the firn fractionation correction (up to ~2 ‰ at individual sites), the Kr isobaric interference (up to ~0.8 ‰, system dependent), inter-laboratory calibration offsets (~0.2 ‰) and uncertainties in past CH4 levels (~0.5 ‰).

Relevance:

30.00%

Publisher:

Abstract:

The destruction of tropical forests continues at an alarming rate, contributing a substantial fraction of overall greenhouse gas emissions. In recent years, much hope has been vested in the emerging REDD+ framework under the UN Framework Convention on Climate Change (UNFCCC), which aims at creating an international incentive system to reduce emissions from deforestation and forest degradation. This paper argues that in the absence of an international consensus on the design of results-based payments, “bottom-up” initiatives should take the lead and explore new avenues. It suggests that a call for tender for REDD+ credits might both help leverage private investment and spend scarce public funds in a cost-efficient manner. The paper discusses the pros and cons of results-based approaches, provides an overview of the goals and principles that govern public procurement, and discusses their relevance for the purchase of REDD+ credits, in particular within the ambit of the European Union.

Relevance:

30.00%

Publisher:

Abstract:

We performed surface and borehole ground penetrating radar (GPR) tests, together with moisture probe measurements and direct gas sampling to detect areas of biogenic gas accumulation in a northern peatland. The main findings are: (1) shadow zones (signal scattering) observed in surface GPR correlate with areas of elevated CH4 and CO2 concentration; (2) high velocities in zero offset profiles and lower water content inferred from moisture probes correlate with surface GPR shadow zones; (3) zero offset profiles depict depth variable gas accumulation from 0-10% by volume; (4) strong reflectors may represent confining layers restricting upward gas migration. Our results have implications for defining the spatial distribution, volume and movement of biogenic gas in peatlands at multiple scales.
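A minimal sketch of how a zero-offset-profile velocity maps to an inferred water content, as in findings (2) and (3). The Topp et al. (1980) petrophysical relation and all numbers are assumptions for illustration; peatland studies often use site-calibrated relations instead:

```python
C = 0.2998  # speed of light in m/ns

def zop_velocity(separation_m, traveltime_ns):
    """One-way radar velocity between boreholes in a zero-offset profile."""
    return separation_m / traveltime_ns

def topp_water_content(kappa):
    """Volumetric water content from relative dielectric permittivity via
    the empirical Topp et al. (1980) relation -- an assumption here."""
    return -5.3e-2 + 2.92e-2 * kappa - 5.5e-4 * kappa**2 + 4.3e-6 * kappa**3

# Hypothetical reading: 5 m borehole separation, 130 ns travel time. A
# free-phase gas accumulation raises the velocity and lowers kappa, which
# maps to a lower inferred water content.
v = zop_velocity(5.0, 130.0)       # m/ns
kappa = (C / v) ** 2               # relative dielectric permittivity
theta = topp_water_content(kappa)  # volumetric water content (fraction)
```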

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE Fundus autofluorescence (FAF) can be characterized not only by its intensity or emission spectrum, but also by its lifetime. As the lifetime of a fluorescent molecule is sensitive to its local microenvironment, this technique may provide more information than fundus autofluorescence intensity imaging. We report here the characteristics and repeatability of FAF lifetime measurements of the human macula using a new fluorescence lifetime imaging ophthalmoscope (FLIO). METHODS A total of 31 healthy phakic subjects aged 22 to 61 years were included in this study. For image acquisition, a fluorescence lifetime ophthalmoscope based on a Heidelberg Engineering Spectralis system was used. Fluorescence lifetime maps of the retina were recorded in a short- (498–560 nm) and a long- (560–720 nm) spectral channel. For quantification of fluorescence lifetimes, a standard ETDRS grid was used. RESULTS Mean fluorescence lifetimes were shortest in the fovea: 208 picoseconds for the short-spectral channel and 239 picoseconds for the long-spectral channel. Fluorescence lifetimes increased from the central area to the outer ring of the ETDRS grid. The test-retest reliability of FLIO was very high for all ETDRS areas (Spearman's ρ = 0.80 for the short- and 0.97 for the long-spectral channel, P < 0.0001). Fluorescence lifetimes increased with age. CONCLUSIONS The FLIO allows reproducible measurement of fluorescence lifetimes of the macula in healthy subjects. Using custom-built software, we were able to quantify fluorescence lifetimes within the ETDRS grid. Establishing a clinically accessible standard against which to measure FAF lifetimes within the retina is a prerequisite for future studies of retinal disease.
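The "lifetime" quantity being mapped can be illustrated with a toy mono-exponential fit to a decay histogram. Real FLIO analysis fits multi-exponential models convolved with the instrument response, so this is only a sketch with synthetic, noise-free data:

```python
import numpy as np

def fit_lifetime(t_ps, counts):
    """Lifetime tau of a mono-exponential decay counts ~ A * exp(-t/tau),
    estimated by a log-linear least-squares fit."""
    mask = counts > 0                      # log is undefined at zero counts
    slope, _ = np.polyfit(t_ps[mask], np.log(counts[mask]), 1)
    return -1.0 / slope

t = np.linspace(0.0, 2000.0, 200)          # time bins in picoseconds
decay = 1e4 * np.exp(-t / 208.0)           # synthetic decay, tau = 208 ps
tau_hat = fit_lifetime(t, decay)           # recovers ~208 ps
```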

Relevance:

30.00%

Publisher:

Abstract:

The mismatching of alveolar ventilation and perfusion (VA/Q) is the major determinant of impaired gas exchange. The gold standard for measuring VA/Q distributions is based on measurements of the elimination and retention of infused inert gases. The conventional multiple inert gas elimination technique (MIGET) uses gas chromatography (GC) to measure the inert gas partial pressures, which requires tonometry of blood samples with a gas that can then be injected into the chromatograph. The method is laborious and requires meticulous care. A newer technique based on micropore membrane inlet mass spectrometry (MMIMS) simplifies the handling of blood and gas samples and provides nearly real-time analysis. In this study we compared MIGET by GC and by MMIMS in 10 piglets: three with healthy lungs, four with oleic acid injury, and three with isolated left lower lobe ventilation. The different protocols ensured a large range of normal and abnormal VA/Q distributions. Eight inert gases (SF6, krypton, ethane, cyclopropane, desflurane, enflurane, diethyl ether, and acetone) were infused; six of these gases were measured with MMIMS, and six with GC. We found close agreement between GC and MMIMS in the retention and excretion of the gases and in the constructed VA/Q distributions, and PaO2 predicted from both methods compared well with measured PaO2. VA/Q by GC produced more widely dispersed modes than MMIMS, explained in part by differences in the algorithms used to calculate VA/Q distributions. In conclusion, MMIMS enables faster measurement of VA/Q, is less demanding than GC, and produces comparable results.
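The retention measurement at the heart of MIGET follows a simple steady-state relation, R = λ/(λ + VA/Q), where λ is the blood-gas partition coefficient of the infused gas. The sketch below uses rough textbook-magnitude partition coefficients and a hypothetical three-compartment lung, not values from this study:

```python
def retention(lam, va_q):
    """Steady-state arterial retention of an inert gas with blood-gas
    partition coefficient lam in a compartment with ratio va_q."""
    return lam / (lam + va_q)

def total_retention(lam, va_q, q_frac):
    """Perfusion-weighted retention over a VA/Q distribution."""
    return sum(qf * retention(lam, v) for v, qf in zip(va_q, q_frac))

# Rough textbook-magnitude partition coefficients (assumptions, not the
# study's calibration values):
lambdas = {"SF6": 0.005, "ethane": 0.1, "cyclopropane": 0.5,
           "diethyl ether": 12.0, "acetone": 300.0}
va_q = [0.1, 1.0, 10.0]        # hypothetical three-compartment lung
q_frac = [0.2, 0.7, 0.1]       # perfusion fractions (sum to 1)
R = {gas: total_retention(lam, va_q, q_frac) for gas, lam in lambdas.items()}
# Poorly soluble gases (SF6) are barely retained; highly soluble gases
# (acetone) are almost completely retained.
```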

Relevance:

30.00%

Publisher:

Abstract:

The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached from detailed measurements of hadron emission from a target identical to that used by T2K, exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007–2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper, details of the experiment, data taking, the data analysis method and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.

Relevance:

30.00%

Publisher:

Abstract:

Noble gas radionuclides, including 81Kr (t1/2 = 229,000 years), 85Kr (t1/2 = 10.8 years), and 39Ar (t1/2 = 269 years), possess nearly ideal chemical and physical properties for studies of Earth and environmental processes. Recent advances in Atom Trap Trace Analysis (ATTA), a laser-based atom counting method, have enabled routine measurements of the radiokrypton isotopes, as well as the demonstration of the ability to measure 39Ar in environmental samples. Here we provide an overview of the ATTA technique and a survey of recent progress made in several laboratories worldwide. We review the application of noble gas radionuclides in the geosciences and discuss how ATTA can help advance these fields, specifically: determination of groundwater residence times using 81Kr, 85Kr, and 39Ar; dating old glacial ice using 81Kr; and an 39Ar survey of the main water masses of the oceans, to study circulation pathways and estimate mean residence times. Other scientific questions involving a deeper circulation of fluids in the Earth's crust and mantle are also within the scope of future applications. We conclude that the geoscience community would greatly benefit from an ATTA facility dedicated to this field, with instrumentation for routine measurements, as well as for research on further development of ATTA methods.
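Tracer dating with these radionuclides reduces to the decay law: once ATTA measures an isotope ratio relative to the modern atmospheric value, the residence time follows directly. A minimal sketch (the sample ratio is hypothetical):

```python
import math

# Half-lives of the tracers named above, in years.
HALF_LIFE_YR = {"Kr-81": 229_000.0, "Ar-39": 269.0, "Kr-85": 10.8}

def age_from_ratio(ratio, t_half):
    """Elapsed time for a measured-to-modern isotope ratio, from the
    decay law N/N0 = exp(-ln 2 * t / t_half)."""
    return -t_half / math.log(2.0) * math.log(ratio)

# Hypothetical sample: groundwater retaining 50% of the modern Ar-39
# activity is one half-life old, i.e. about 269 years.
t_sample = age_from_ratio(0.5, HALF_LIFE_YR["Ar-39"])
```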

Relevance:

30.00%

Publisher:

Abstract:

Four different literature parameterizations for the formation and evolution of urban secondary organic aerosol (SOA) frequently used in 3-D models are evaluated using a 0-D box model representing the Los Angeles metropolitan region during the California Research at the Nexus of Air Quality and Climate Change (CalNex) 2010 campaign. We constrain the model predictions with measurements from several platforms and compare predictions with particle- and gas-phase observations from the CalNex Pasadena ground site. That site provides a unique opportunity to study aerosol formation close to anthropogenic emission sources with limited recirculation. The model SOA that formed only from the oxidation of VOCs (V-SOA) is insufficient to explain the observed SOA concentrations, even when using SOA parameterizations with multi-generation oxidation that produce much higher yields than have been observed in chamber experiments, or when increasing yields to their upper limit estimates accounting for recently reported losses of vapors to chamber walls. The Community Multiscale Air Quality (WRF-CMAQ) model (version 5.0.1) provides excellent predictions of secondary inorganic particle species but underestimates the observed SOA mass by a factor of 25 when an older VOC-only parameterization is used, which is consistent with many previous model–measurement comparisons for pre-2007 anthropogenic SOA modules in urban areas. Including SOA from primary semi-volatile and intermediate-volatility organic compounds (P-S/IVOCs) following the parameterizations of Robinson et al. (2007), Grieshop et al. (2009), or Pye and Seinfeld (2010) improves model–measurement agreement for mass concentration. The results from the three parameterizations show large differences (e.g., a factor of 3 in SOA mass) and are not well constrained, underscoring the current uncertainties in this area. 
Our results strongly suggest that other precursors besides VOCs, such as P-S/IVOCs, are needed to explain the observed SOA concentrations in Pasadena. All the recent parameterizations overpredict urban SOA formation at long photochemical ages (3 days) compared to observations from multiple sites, which can lead to problems in regional and especially global modeling. However, reducing IVOC emissions by one-half in the model to better match recent IVOC measurements improves SOA predictions at these long photochemical ages. Among the explicitly modeled VOCs, the precursor compounds that contribute the greatest SOA mass are methylbenzenes. Measured polycyclic aromatic hydrocarbons (naphthalenes) contribute 0.7% of the modeled SOA mass. The amounts of SOA mass from diesel vehicles, gasoline vehicles, and cooking emissions are estimated to be 16–27%, 35–61%, and 19–35%, respectively, depending on the parameterization used, which is consistent with the observed fossil fraction of urban SOA, 71 ± 3%. The relative contribution of each source is uncertain by almost a factor of 2 depending on the parameterization used. In-basin biogenic VOCs are predicted to contribute only a few percent to SOA. A regional SOA background of approximately 2.1 μg m^-3 is also present due to the long-distance transport of highly aged OA, likely with a substantial contribution from regional biogenic SOA. The percentage of SOA from diesel vehicle emissions is the same, within the estimated uncertainty, as reported in previous work that analyzed the weekly cycles in OA concentrations (Bahreini et al., 2012; Hayes et al., 2013). However, the modeling work presented here suggests a strong anthropogenic source of modern carbon in SOA, due to cooking emissions, which was not accounted for in those previous studies and which is higher on weekends. Lastly, this work adapts a simple two-parameter model to predict SOA concentration and O/C from urban emissions. This model successfully predicts SOA concentration, and the optimal parameter combination is very similar to that found for Mexico City. This approach provides a computationally inexpensive method for predicting urban SOA in global and climate models. We estimate pollution SOA to account for 26 Tg yr^-1 of SOA globally, or 17% of global SOA, one third of which is likely to be non-fossil.

Relevance:

30.00%

Publisher:

Abstract:

Eight surface observation sites providing quasi-continuous measurements of atmospheric methane mixing ratios have been operated since the mid-2000s in Siberia. For the first time in a single work, we assimilate 1 year of these in situ observations in an atmospheric inversion. Our objective is to quantify methane surface fluxes from anthropogenic and wetland sources at the mesoscale in the Siberian lowlands for the year 2010. To do so, we first inquire into the way the inversion uses the observations and the way the fluxes are constrained by the observation sites. As atmospheric inversions at the mesoscale suffer from mis-quantified sources of uncertainty, we follow recent innovations in inversion techniques and use a new inversion approach which quantifies the uncertainties more objectively than previous inversion systems. We find that, owing to errors in the representation of atmospheric transport and to redundant pieces of information, only one observation every few days is found valuable by the inversion. The remaining high-resolution quasi-continuous signal is representative of very local emission patterns that are difficult to analyse with a mesoscale system. An analysis of the use of information by the inversion also reveals that the observation sites constrain methane emissions within a radius of 500 km. More observation sites than those currently in operation are therefore necessary to constrain the whole Siberian lowlands. Still, the fluxes within the constrained areas are quantified with objectively derived uncertainties. Finally, the tolerance intervals for posterior methane fluxes are roughly 20% (resp. 50%) of the fluxes for anthropogenic (resp. wetland) sources. About 50–70% of Siberian lowland emissions are constrained by the inversion on average on an annual basis. Extrapolating the figures from the constrained areas to the whole Siberian lowlands, we find a regional methane budget of 5–28 Tg CH4 for the year 2010, i.e. 1–5% of the global methane emissions. As very few in situ observations are available in the region of interest, observations of methane total columns from the Greenhouse gases Observing SATellite (GOSAT) are tentatively used for evaluation of the inversion results, but they exhibit only a marginal signal from the fluxes within the region of interest.
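The Gaussian flux inversion described above — a prior flux estimate updated by observations through a linearized transport operator — can be sketched with a toy two-region example; all matrices are made-up illustrations, not the study's configuration:

```python
import numpy as np

def invert(x_prior, B, H, y, R):
    """Posterior mean and covariance for the linear-Gaussian model
    y = H x + noise, with prior N(x_prior, B) and obs errors N(0, R)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)
    A = (np.eye(len(x_prior)) - K @ H) @ B          # posterior covariance
    return x_post, A

x_prior = np.array([10.0, 5.0])          # prior fluxes (e.g. Tg CH4 / yr)
B = np.diag([4.0, 2.0])                  # prior error covariance
H = np.array([[0.8, 0.2],                # hypothetical transport
              [0.1, 0.9]])               # sensitivities (obs x flux)
R = np.diag([0.5, 0.5])                  # observation error covariance
y = np.array([12.0, 7.0])                # observed atmospheric signal
x_post, A = invert(x_prior, B, H, y, R)
# The posterior variances (diagonal of A) are smaller than the prior
# variances, reflecting the constraint brought by the observations.
```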