27 results for Elemental analyses measurements
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The reactions of 4,4′-bipyridine with selected trinuclear triangular copper(II) complexes, [Cu3(μ3-OH)(μ-pz)3(RCOO)2Lx] [pz = pyrazolate anion, R = CH3(CH2)n (2 ≤ n ≤ 5); L = H2O, MeOH, EtOH], yielded a series of 1D coordination polymers (CPs) built from [Cu3(μ3-OH)(μ-pz)3] secondary building units joined by bipyridine. The CPs were characterized by conventional analytical methods (elemental analyses, ESI-MS, IR spectra) and single-crystal XRD determinations. The products comprise an unprecedented 1D CP in which bipyridine bridges hexanuclear copper cluster moieties, two 1D CPs with close structural analogies, and two one-dimensional tapes with almost exactly superimposable structures. In one case, the crystal packing reveals small, unconnected pores accounting for ca. 6% of the free cell volume.
Online radiocarbon measurements of small samples using Elemental Analyzer and MICADAS gas ion source
Abstract:
Mass spectrometric analysis of elemental and isotopic compositions of several NIST standards is performed by a miniature laser ablation/ionisation reflectron-type time-of-flight mass spectrometer (LMS) using a fs-laser ablation ion source (775 nm, 190 fs, 1 kHz). The results of the mass spectrometric studies indicate that, in a defined range of laser irradiance (fluence) and for a certain number of accumulations of single laser shot spectra, isotope abundances can be measured with an accuracy at the per mill level for isotope concentrations above 100 ppm and at the per cent level for concentrations below 100 ppm. Elemental analysis can also be performed with good accuracy. The LMS instrument combined with a fs-laser ablation ion source exhibits similar detection efficiency for both metallic and non-metallic elements. Relative sensitivity coefficients were determined and found to be close to one, which is of considerable importance for the development of standard-less instruments. Negligible thermal effects and sample damage, together with the excellent characteristics of the fs-laser beam, are thought to be the main reasons for the substantial improvement in instrumental performance compared to other laser ablation mass spectrometers.
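As a hedged aside (the abstract does not give this formula), a relative sensitivity coefficient is commonly defined as the measured concentration of an element divided by its certified concentration in the reference material,

\mathrm{RSC}(X) = \frac{c_X^{\text{measured}}}{c_X^{\text{certified}}},

so that values close to one mean that raw ion intensities can be converted to concentrations without matrix-matched calibration, which is what makes standard-less quantification feasible.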
Abstract:
Radiocarbon (14C) analysis is a unique tool to distinguish fossil from nonfossil sources of carbonaceous aerosols. We present 14C measurements of organic carbon (OC) and total carbon (TC) on highly time-resolved filter samples (3–4 h, whereas typically 12 h or longer have been reported) from 7 days of sampling during the California Research at the Nexus of Air Quality and Climate Change (CalNex) 2010 campaign in Pasadena. Average nonfossil contributions of 58% ± 15% and 51% ± 15% were found for OC and TC, respectively. Results indicate that nonfossil carbon is a major constituent of the background aerosol, evidenced by its nearly constant concentration (2–3 μgC m^−3). Cooking is estimated to contribute at least 25% to nonfossil OC, underlining the importance of urban nonfossil OC sources. In contrast, fossil OC concentrations have prominent and consistent diurnal profiles, with significant afternoon enhancements (~3 μgC m^−3), following the arrival of the western Los Angeles (LA) basin plume with the sea breeze. A corresponding increase in semivolatile oxygenated OC and in organic vehicular emission markers and their photochemical reaction products occurs. This suggests that the increasing OC is mostly fresh anthropogenic secondary OC (SOC) formed in the western LA basin plume, mainly from fossil precursors. We note that in several European cities where the diesel passenger-car fraction is higher, SOC is 20% less fossil despite 2–3 times higher elemental carbon concentrations, suggesting that SOC formation from gasoline emissions most likely dominates over diesel in the LA basin. This would have significant implications for our understanding of the on-road vehicle contribution to ambient aerosols and merits further study.
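For context, the standard relation used in 14C source apportionment (the exact reference value adopted in any given study may differ) expresses the non-fossil fraction of a carbon fraction as

f_{\mathrm{NF}} = \frac{F^{14}\mathrm{C}_{\text{sample}}}{F^{14}\mathrm{C}_{\text{NF}}},

where F14C is the measured fraction modern and F14C_NF is the value assumed for purely non-fossil (contemporary) carbon, typically taken slightly above 1 to account for the bomb-14C excess; fossil carbon has F14C = 0.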
Abstract:
While several studies have investigated wintertime air pollution over a wide range of concentration levels, hardly any results are available for longer time periods covering several winter-smog episodes at various locations; often only a few weeks from a single winter are investigated. Here, we present source apportionment results for winter-smog episodes from 16 air pollution monitoring stations across Switzerland over five consecutive winters. Radiocarbon (14C) analyses of the elemental (EC) and organic (OC) carbon fractions, as well as levoglucosan, major water-soluble ionic species, and gas-phase pollutant measurements were used to characterize the different sources of PM10. The most important contributions to PM10 during winter-smog episodes in Switzerland were on average the secondary inorganic constituents (sum of nitrate, sulfate and ammonium = 41 ± 15%), followed by organic matter (OM) (34 ± 13%) and EC (5 ± 2%). The non-fossil fractions of OC (fNF,OC) ranged on average from 69 to 85% and from 80 to 95% for stations north and south of the Alps, respectively, showing that traffic contributes on average only up to ~30% of OC. The non-fossil fraction of EC (fNF,EC), entirely attributable to primary wood burning, was on average 42 ± 13% and 49 ± 15% north and south of the Alps, respectively. While a high correlation was observed between fossil EC and nitrogen oxides, both primarily emitted by traffic, these species did not significantly correlate with fossil OC (OCF), which suggests that a considerable amount of OCF is secondary, formed from fossil precursors. Elevated fNF,EC and fNF,OC values and the high correlation of the latter with other wood-burning markers, including levoglucosan and water-soluble potassium (K+), indicate that residential wood burning is the major source of carbonaceous aerosols during winter-smog episodes in Switzerland. Inspection of the non-fossil OC and EC levels and their relation to levoglucosan and water-soluble K+ shows different ratios for stations north and south of the Alps, most likely because of differences in burning technologies between these two regions of Switzerland.
Abstract:
Elemental carbon (EC), or black carbon (BC), in the atmosphere has a strong influence on both climate and human health. In this study, radiocarbon (14C) based source apportionment is used to distinguish between fossil-fuel and biomass-burning sources of EC isolated from aerosol filter samples collected in Beijing from June 2010 to May 2011. The 14C results demonstrate that EC is consistently dominated by fossil-fuel combustion throughout the whole year, with a mean contribution of 79% ± 6% (ranging from 70% to 91%), though EC has higher mean and peak concentrations in the cold season. The seasonal molecular pattern of hopanes (a class of organic markers mainly emitted during the combustion of different fossil fuels) indicates that traffic-related emissions are the most important fossil source in the warm period and that coal combustion emissions increase significantly in the cold season. By combining the 14C-based source apportionment results with picene concentrations (an organic marker for coal emissions), the relative contributions of coal combustion (mainly residential bituminous coal) and vehicular emissions to EC in the cold period were estimated as 25 ± 4% and 50 ± 7%, respectively, whereas the coal combustion contribution was negligible or very small in the warm period.
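Read as a simple mass balance (a plausible sketch of this kind of tracer-based split, not the authors' stated equations), the fossil EC from 14C is divided into coal and vehicular parts using picene as a coal-combustion tracer:

\mathrm{EC}_{\text{fossil}} = \mathrm{EC}_{\text{coal}} + \mathrm{EC}_{\text{vehicle}}, \qquad \mathrm{EC}_{\text{coal}} \approx \frac{[\text{picene}]}{(\text{picene}/\mathrm{EC})_{\text{coal emissions}}},

where the picene-to-EC emission ratio for residential coal burning is an assumed external input (e.g. from emission studies), not a value given in the abstract.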
Abstract:
This paper describes informatics for cross-sample analysis with comprehensive two-dimensional gas chromatography (GCxGC) and high-resolution mass spectrometry (HRMS). GCxGC-HRMS analysis produces large data sets that are rich with information but highly complex. The size of the data and the volume of information require automated processing for comprehensive cross-sample analysis, but the complexity poses a challenge for developing robust methods. The approach developed here analyzes GCxGC-HRMS data from multiple samples to extract a feature template that comprehensively captures the pattern of peaks detected in the retention-time plane. Then, for each sample chromatogram, the template is geometrically transformed to align with the detected peak pattern and generate a set of feature measurements for cross-sample analyses such as sample classification and biomarker discovery. The approach avoids the intractable problem of comprehensive peak matching by using a few reliable peaks for alignment and peak-based retention-plane windows to define comprehensive features that can be reliably matched across samples. The informatics are demonstrated with a set of 18 samples from breast-cancer tumors, each from a different individual, six each for Grades 1-3. The features allow classification that matches grading by a cancer pathologist with 78% success in leave-one-out cross-validation experiments. The HRMS signatures of the features of interest can be examined to determine elemental compositions and identify compounds.
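As an illustration only (the function names, the affine-transform choice, and the rectangular windows below are assumptions, not the authors' implementation), the windowed-feature idea can be sketched as follows: a few reliable anchor peaks fix a geometric transform from template to sample retention times, and each template window then yields one feature value per sample.

import numpy as np

def fit_affine(template_anchors, sample_anchors):
    # Least-squares affine map from template retention times (rt1, rt2) to
    # sample retention times, fitted on a few reliably matched anchor peaks.
    # Both inputs are arrays of shape (n_anchors, 2).
    A = np.hstack([template_anchors, np.ones((len(template_anchors), 1))])
    coef, *_ = np.linalg.lstsq(A, sample_anchors, rcond=None)
    return coef  # shape (3, 2)

def feature_intensities(sample_peaks, template_windows, coef):
    # Sum detected-peak intensity inside each transformed template window.
    # sample_peaks: array of (rt1, rt2, intensity) rows for one chromatogram;
    # template_windows: list of (rt1_lo, rt1_hi, rt2_lo, rt2_hi) rectangles in
    # template coordinates. Assumes a near-identity transform, so mapped window
    # corners keep their ordering.
    features = []
    for lo1, hi1, lo2, hi2 in template_windows:
        corners = np.array([[lo1, lo2], [hi1, hi2]], dtype=float)
        mapped = np.hstack([corners, np.ones((2, 1))]) @ coef
        inside = ((sample_peaks[:, 0] >= mapped[0, 0]) & (sample_peaks[:, 0] <= mapped[1, 0]) &
                  (sample_peaks[:, 1] >= mapped[0, 1]) & (sample_peaks[:, 1] <= mapped[1, 1]))
        features.append(sample_peaks[inside, 2].sum())
    return np.array(features)  # one feature vector per sample, e.g. for classification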
Abstract:
The mechanism underlying the mineralization of bone is well studied, yet it remains controversial. The inherent difficulties of imaging mineralized tissues, and the aqueous solubility of calcium and phosphate, the two ions that combine to form bone mineral crystals, limit current electron microscopy analyses of labile, diffusible, amorphous, and crystalline intermediates. To improve the retention of calcium and phosphorus, we developed a pseudo-nonaqueous processing approach and used it to characterize biomineralization foci, extracellular sites of hydroxyapatite deposition in osteoblastic cell cultures. Since mineralization of UMR106-01 osteoblasts is temporally synchronized and begins 78 h after plating, we used these cultures to evaluate the effectiveness of our method when applied to cells just prior to the formation of the first mineral crystals. Our approach combines, for the first time, three well-established methods with a fourth: dry ultrathin sectioning. Dry ultrathin sectioning with an oscillating diamond knife was used to produce electron spectroscopic images of mineralized biomineralization foci that had been high-pressure frozen and freeze-substituted. For comparison, cultures were also treated with conventional processing and wet sectioning. The results show that only pseudo-nonaqueous processing was able to detect extracellular sites of early calcium and phosphorus enrichment at 76 h, several hours prior to the detection of mineral crystals within biomineralization foci.
Abstract:
We present results from the international field campaign DAURE (Determination of the sources of atmospheric Aerosols in Urban and Rural Environments in the Western Mediterranean), with the objective of apportioning the sources of fine carbonaceous aerosols. Submicron fine particulate matter (PM1) samples were collected during February-March 2009 and July 2009 at an urban background site in Barcelona (BCN) and at a forested regional background site in Montseny (MSY). We present radiocarbon (14C) analyses for elemental and organic carbon (EC and OC) and source apportionment for these data. We combine the results with those from component analysis of aerosol mass spectrometer (AMS) measurements, and compare to levoglucosan-based estimates of biomass burning OC, source apportionment of filter data with inorganic composition + EC + OC, submicron bulk potassium (K) concentrations, and gaseous acetonitrile concentrations. At BCN, 87% and 91% of the EC on average, in winter and summer, respectively, had a fossil origin, whereas at MSY these fractions were 66% and 79%. The contribution of fossil sources to organic carbon (OC) at BCN was 40% and 48%, in winter and summer, respectively, and 31% and 25% at MSY. The combination of results obtained using the 14C technique, AMS data, and the correlations between fossil OC and fossil EC implies that the fossil OC at Barcelona is ~47% primary, whereas at MSY the fossil OC is mainly secondary (~85%). Day-to-day variation in total carbonaceous aerosol loading and the relative contributions of different sources predominantly depended on the meteorological transport conditions. The estimated biogenic secondary OC at MSY increased by only ~40% between winter and summer, compared to the order-of-magnitude increase observed for biogenic volatile organic compounds (VOCs), which highlights the uncertainties in the estimation of that component. Biomass burning contributions estimated using the 14C technique ranged from similar to slightly higher than those estimated using other techniques, and the different estimates were highly or moderately correlated. Differences can be explained by the contribution of secondary organic matter (not included in the primary biomass burning source estimates), and/or by an overestimation of the biomass burning OC contribution by the 14C technique if the estimated biomass burning EC/OC ratio used for the calculations is too high for this region. Acetonitrile concentrations correlate well with the biomass burning EC determined by 14C. K is a noisy tracer for biomass burning.
Abstract:
Microarrays have become established as instrumental tools for bacterial detection, identification, and genotyping, as well as for transcriptomic studies. For gene expression analyses using limited numbers of bacteria (derived from in vivo or ex vivo material, for example), RNA amplification is often required prior to labeling and hybridization onto microarrays. Evaluating the fidelity of the amplification methods is crucial for the robustness and reproducibility of microarray results. We report here the first use of random primers and the highly processive Phi29 phage polymerase to amplify material for transcription profiling analyses. We compared two commercial amplification methods (GenomiPhi and MessageAmp kits) with direct reverse transcription as the reference method, focusing on the robustness of mRNA quantification using either microarrays or quantitative RT-PCR. Both amplification methods, using either poly-A tailing followed by in vitro transcription or a direct strand-displacement polymerase, showed appreciable linearity. The strand-displacement technique was particularly affordable compared to in vitro transcription-based (IVT) amplification methods and consisted of a single-tube reaction leading to high amplification yields. Real-time measurements using low-, medium-, and highly expressed genes revealed that this simple method provided linear amplification, with results for relative messenger abundance equivalent to those obtained by conventional direct reverse transcription.
Abstract:
Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
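A minimal sketch of the idea, assuming a single detection limit and a lognormal exposure distribution (the robust ROS implementation used in practice handles more general cases), contrasts imputation from a fitted distribution with the naive substitution described above:

import numpy as np
from scipy import stats

def simple_ros(detected, n_nondetect):
    # Simplified regression on order statistics (ROS) for a single detection
    # limit, assuming lognormally distributed exposures (an assumption of this
    # sketch, not necessarily of the QUALIFEX analysis).
    # detected: measurements above the detection limit; n_nondetect: number of
    # censored observations. Returns detected plus imputed values, from which
    # summary statistics (mean, percentiles) can be computed.
    detected = np.sort(np.asarray(detected, dtype=float))
    n = len(detected) + n_nondetect
    # Plotting positions for all ranks; with a single detection limit the
    # censored observations occupy the lowest ranks.
    pp = np.arange(1, n + 1) / (n + 1.0)
    z = stats.norm.ppf(pp)
    # Regress log(detected) on the normal quantiles of their ranks.
    slope, intercept, *_ = stats.linregress(z[n_nondetect:], np.log(detected))
    imputed = np.exp(intercept + slope * z[:n_nondetect])
    return np.concatenate([imputed, detected])

# Naive alternative for comparison: represent every non-detect by the
# detection limit itself, e.g.
# np.concatenate([np.full(n_nondetect, detection_limit), detected])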
Abstract:
Quantitative meta-analyses of randomized clinical trials investigating the specific therapeutic efficacy of homeopathic remedies yielded statistically significant differences compared to placebo. Since the remedies used contained mostly only very low concentrations of pharmacologically active compounds, these effects cannot be accounted for within the framework of current pharmacology. Theories to explain clinical effects of homeopathic remedies are partially based upon changes in diluent structure. To investigate the latter, we measured for the first time high-field (600/500 MHz) 1H T1 and T2 nuclear magnetic resonance relaxation times of H2O in homeopathic preparations, with concurrent contamination control by inductively coupled plasma mass spectrometry (ICP-MS). Homeopathic preparations of quartz (10c–30c, n = 21, corresponding to iterative dilutions of 100^−10–100^−30), sulfur (13x–30x, n = 18, 10^−13–10^−30), and copper sulfate (11c–30c, n = 20, 100^−11–100^−30) were compared to n = 10 independent controls each (analogously agitated dilution medium) in randomized and blinded experiments. In none of the samples did the concentration of any element analyzed by ICP-MS exceed 10 ppb. In the first measurement series (600 MHz), there was a significant increase in T1 for all samples as a function of time, and there were no significant differences between homeopathic potencies and controls. In the second measurement series (500 MHz), 1 year after preparation, we observed statistically significantly increased T1 relaxation times for homeopathic sulfur preparations compared to controls. Fifteen out of 18 correlations between sample triplicates were higher for controls than for homeopathic preparations. No conclusive explanation for these phenomena can be given at present. Possible hypotheses involve differential leaching from the measurement vessel walls or a change in water molecule dynamics, i.e., in rotational correlation time and/or diffusion. Homeopathic preparations thus may exhibit specific physicochemical properties that need to be determined in detail in future investigations.
Abstract:
The ActiGraph accelerometer is commonly used to measure physical activity in children. Count cut-off points are needed when using accelerometer data to determine the time a person spent in moderate or vigorous physical activity. For the GT3X accelerometer, no cut-off points for young children have been published yet. The aim of the current study was thus to develop and validate count cut-off points for young children. Thirty-two children aged 5 to 9 years performed four locomotor and four play activities. Activity classification into the light-, moderate-, or vigorous-intensity category was based on energy expenditure measurements with indirect calorimetry. Vertical-axis as well as vector-magnitude cut-off points were determined through receiver operating characteristic curve analyses with the data of two thirds of the study group and validated with the data of the remaining third. The vertical-axis cut-off points were 133 counts per 5 s for moderate-to-vigorous physical activity (MVPA), 193 counts for vigorous physical activity (VPA) at a metabolic threshold of 5 MET, and 233 counts for VPA at 6 MET. The vector-magnitude cut-off points were 246 counts per 5 s for MVPA, 316 counts for VPA at 5 MET, and 381 counts for VPA at 6 MET. When validated, the cut-off points generally showed high recognition rates for each category, high sensitivity and specificity values, and moderate agreement in terms of the Kappa statistic. These results were similar for vertical-axis and vector-magnitude cut-off points. The current cut-off points adequately reflect MVPA and VPA in young children. Cut-off points based on vector-magnitude counts did not appear to reflect the intensity categories better than cut-off points based on vertical-axis counts alone.
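As a hedged sketch of how such cut-off points can be derived from receiver operating characteristic curves (the Youden criterion used here is a common choice, not necessarily the exact criterion of the study, and the names are illustrative):

import numpy as np
from sklearn.metrics import roc_curve

def youden_cutoff(counts_per_5s, above_threshold):
    # Pick the count threshold maximizing sensitivity + specificity - 1
    # (the Youden index) on the ROC curve.
    # counts_per_5s: vertical-axis or vector-magnitude counts per 5-s epoch;
    # above_threshold: boolean labels from indirect calorimetry, e.g. >= 3 MET
    # for MVPA or >= 5/6 MET for VPA.
    fpr, tpr, thresholds = roc_curve(above_threshold, counts_per_5s)
    best = np.argmax(tpr - fpr)
    # Return the cut-off together with its sensitivity and specificity.
    return thresholds[best], tpr[best], 1.0 - fpr[best]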
Abstract:
We present a conceptual prototype model of a focal plane array unit for the STEAMR instrument, highlighting the challenges presented by the instrument's required high relative beam proximity and focusing on how edge-diffraction effects contribute to the array's performance. The analysis was carried out as a comparative process using both PO & PTD and MoM techniques. We first highlight general differences between these computational techniques, with the discussion focusing on diffractive edge effects for near-field imaging reflectors with high truncation. We then present the results of in-depth modeling analyses of the STEAMR focal plane array, followed by near-field antenna measurements of a breadboard model of the array. The results of these near-field measurements agree well with both simulation techniques, although MoM shows slightly higher complex beam coupling to the measurements than PO & PTD.