7 results for two-step chemical reaction model
in Helda - Digital Repository of the University of Helsinki
Abstract:
Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, the capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnet motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed, allowing basic, comparison and identification operations. Basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry. In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal-phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searching and analyst knowledge remain invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the PAH separation is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly in the presence of matrix, even when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily outweigh the minor drawbacks of the technique.
The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
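The quantification approach described above, calibration against the summed peak areas of the modulated slices, can be illustrated with a short sketch. This is a hypothetical illustration, not the data analysis software developed in the thesis; the peak areas, concentrations and units are placeholder values.

```python
# Illustrative sketch (not the thesis software): GC×GC quantification by
# summing the areas of an analyte's modulated peak slices and applying an
# external-standard linear calibration. All numbers are hypothetical.
import numpy as np

def summed_area(slice_areas):
    """Total GC×GC response of one analyte = sum of its modulated slice areas."""
    return float(np.sum(slice_areas))

# Hypothetical calibration standards: concentration (ng/uL) vs. summed area.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([1.1e4, 2.2e4, 4.5e4, 1.1e5, 2.3e5])
slope, intercept = np.polyfit(conc, area, 1)      # linear calibration fit

# Quantify an unknown from the summed areas of its modulated slices.
unknown_slices = [3.0e3, 1.2e4, 2.6e4, 1.4e4, 4.0e3]
unknown_conc = (summed_area(unknown_slices) - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.2f} ng/uL")
```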
Abstract:
Miniaturization of analytical instrumentation is attracting growing interest in response to the explosive demand for rapid yet sensitive analytical methods and low-cost, highly automated instruments for pharmaceutical and bioanalyses and environmental monitoring. Microfabrication technology, in particular, has enabled the fabrication of low-cost microdevices with a high degree of integrated functions, such as sample preparation, chemical reaction, separation, and detection, on a single microchip. These miniaturized total chemical analysis systems (microTAS or lab-on-a-chip) can also be arrayed for parallel analyses in order to accelerate sample throughput. Other motivations include reduced sample consumption and waste production as well as increased speed of analysis. One of the most promising hyphenated techniques in analytical chemistry is the combination of a microfluidic separation chip and a mass spectrometer (MS). In this work, emerging polymer microfabrication techniques, ultraviolet lithography in particular, were exploited to develop a capillary electrophoresis (CE) separation chip which incorporates a monolithically integrated electrospray ionization (ESI) emitter for efficient coupling with MS. The epoxy photoresist SU-8 was adopted as the structural material and characterized with respect to its physicochemical properties relevant to chip-based CE and ESI/MS, namely surface charge, surface interactions, heat transfer, and solvent compatibility. As a result, SU-8 was found to be a favorable material to substitute for the more commonly used glass and silicon in microfluidic applications. In addition, infrared (IR) thermography was introduced as a direct, non-intrusive method to examine heat transfer and thermal gradients during microchip-CE. The IR data were validated through numerical modeling. The analytical performance of SU-8-based microchips was established for qualitative and quantitative CE-ESI/MS analysis of small drug compounds, peptides, and proteins. The CE separation efficiency was found to be similar to that of commercial glass microchips and conventional CE systems. Typical analysis times were only 30-90 s per sample, indicating feasibility for high-throughput analysis. Moreover, a mass detection limit at the low-attomole level, as low as 10^5 molecules, was achieved utilizing MS detection. The SU-8 microchips developed in this work could also be mass-produced at low cost and with nearly identical performance from chip to chip. Before this work, attempts to combine CE separation with ESI in a chip-based system amenable to batch fabrication and capable of high, reproducible analytical performance had not been successful. Thus, the CE-ESI chip developed in this work is a substantial step toward lab-on-a-chip technology.
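To put the quoted detection limit in perspective, a standard Avogadro-constant conversion (basic arithmetic, not an additional result of the work) relates molecule counts to amounts of substance:

```latex
% Relating the quoted detection limit to an amount of substance:
\[
  1~\text{amol} = 10^{-18}~\text{mol} \times 6.022\times10^{23}~\text{mol}^{-1}
  \approx 6.0\times10^{5}~\text{molecules},
\]
\[
  10^{5}~\text{molecules} \approx \frac{10^{5}}{6.022\times10^{23}~\text{mol}^{-1}}
  \approx 0.17~\text{amol},
\]
```
which is consistent with a detection limit described as low-attomole.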
Abstract:
The objectives of this study were to determine secular trends in diabetes prevalence in China and to develop simple risk assessment algorithms for screening individuals at high risk of diabetes or with undiagnosed diabetes in Chinese and Indian adults. Two consecutive population-based surveys in Chinese adults and a prospective study in Mauritian Indians were involved in this study. The Chinese surveys were conducted in randomly selected populations aged 20-74 years in 2001-2002 (n=14 592) and 35-74 years in 2006 (n=4416). A two-step screening strategy using fasting capillary plasma glucose (FCG) as the first-line screening test, followed by standard 2-hour 75 g oral glucose tolerance tests (OGTTs), was applied to 12 436 individuals in 2001, while OGTTs were administered to all participants together with FCG in 2006 and to 2156 subjects in 2002. In Mauritius, two consecutive population-based surveys were conducted in Mauritian Indians aged 20-65 years in 1987 and 1992; 3094 Indians (1141 men), who had not been diagnosed with diabetes at baseline, were re-examined with OGTTs in 1992 and/or 1998. Diabetes and pre-diabetes were defined following the 2006 World Health Organization/International Diabetes Federation criteria. The age-standardized, as well as age- and sex-specific, prevalence of diabetes and pre-diabetes in adult Chinese increased significantly from 12.2% and 15.4% in 2001 to 16.0% and 21.2% in 2006, respectively. A simple Chinese diabetes risk score was developed based on the data of the 2001-2002 Chinese survey and validated in the population of the 2006 survey. The risk scores, based on β coefficients derived from the final logistic regression model, ranged from 3 to 32. When the score was applied to the population of the 2006 survey, the area under the receiver operating characteristic curve (AUC) of the score for screening undiagnosed diabetes was 0.67 (95% CI, 0.65-0.70), which was lower than the AUC of FCG (0.76 [0.74-0.79]) but similar to that of HbA1c (0.68 [0.65-0.71]). At a cut-off point of 14, the sensitivity and specificity of the risk score in screening undiagnosed diabetes were 0.84 (0.81-0.88) and 0.40 (0.38-0.41), respectively. In Mauritian Indians, body mass index (BMI), waist girth, family history of diabetes (FH), and glucose were confirmed to be independent risk predictors for developing diabetes. Predicted probabilities of developing diabetes, derived from a simple Cox regression model fitted with sex, FH, BMI and waist girth, ranged from 0.05 to 0.64 in men and from 0.03 to 0.49 in women. For predicting the onset of diabetes, the AUC of the predicted probabilities was 0.62 (95% CI, 0.56-0.68) in men and 0.64 (0.59-0.69) in women. At a cut-off point of 0.12, the sensitivity and specificity were 0.72 (0.71-0.74) and 0.47 (0.45-0.49) in men, and 0.77 (0.75-0.78) and 0.50 (0.48-0.52) in women, respectively. In conclusion, there was a rapid increase in the prevalence of diabetes in Chinese adults from 2001 to 2006. The simple risk assessment algorithms based on age, obesity and family history of diabetes showed moderate discrimination of diabetes from non-diabetes and may be used as a first-line screening tool for diabetes and pre-diabetes, and for health promotion purposes, in Chinese and Indian populations.
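The way such a points-based score is applied at a cut-off and evaluated by sensitivity and specificity can be sketched as follows. This is not the study's scoring algorithm; the scores, outcomes and resulting values are hypothetical placeholders, with only the cut-off of 14 taken from the abstract.

```python
# Illustrative sketch, not the study's scoring algorithm: applying a
# points-based diabetes risk score at a cut-off (14, as quoted above) and
# evaluating it by sensitivity and specificity against OGTT-confirmed
# diabetes status. All scores and outcomes below are hypothetical.
import numpy as np

def sensitivity_specificity(scores, has_diabetes, cutoff):
    scores = np.asarray(scores)
    truth = np.asarray(has_diabetes, dtype=bool)
    flagged = scores >= cutoff                     # screen-positive at the cut-off
    tp = np.sum(flagged & truth)                   # true positives
    fn = np.sum(~flagged & truth)                  # missed cases
    tn = np.sum(~flagged & ~truth)                 # true negatives
    fp = np.sum(flagged & ~truth)                  # false alarms
    return tp / (tp + fn), tn / (tn + fp)

scores = [8, 15, 22, 11, 17, 30, 6, 14, 19, 9]     # hypothetical risk scores (range 3-32)
diabetes = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]          # hypothetical OGTT outcomes
sens, spec = sensitivity_specificity(scores, diabetes, cutoff=14)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```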
Abstract:
This dissertation examines the short- and long-run impacts of timber prices and other factors affecting nonindustrial private forest (NIPF) owners' timber harvesting and timber stocking decisions. The utility-based Faustmann model provides testable hypotheses about the exogenous variables retained in the timber supply analysis. The timber stock function, derived from a two-period biomass harvesting model, is estimated using a two-step GMM estimator based on balanced panel data from 1983 to 1991. Timber supply functions are estimated using a Tobit model adjusted for heteroscedasticity and non-normality of errors, based on panel data from 1994 to 1998. The results show that if specification analysis of the Tobit model is ignored, inconsistency and bias can have a marked effect on the parameter estimates. The empirical results show that the owner's age is the single most important factor determining timber stock, while timber price is the single most important factor in the harvesting decision. The results of the timber supply estimations can be interpreted using a utility-based Faustmann model of a forest owner who values growing timber in situ.
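A minimal sketch of the censored-regression (Tobit) likelihood that underlies such a timber supply estimation is given below, under the simplifying assumptions of homoscedastic normal errors and left-censoring at zero; the thesis's adjustments for heteroscedasticity and non-normality are not reproduced here, and the regressors and data are hypothetical.

```python
# Minimal Tobit (left-censored at zero) log-likelihood sketch with
# homoscedastic normal errors. The thesis additionally adjusts for
# heteroscedasticity and non-normality, which this sketch omits; the
# regressors and data below are hypothetical.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def tobit_negloglik(params, y, X):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                      # keep sigma positive
    xb = X @ beta
    censored = y <= 0                              # non-harvesting observations
    ll = np.where(
        censored,
        norm.logcdf(-xb / sigma),                  # P(latent harvest <= 0)
        norm.logpdf((y - xb) / sigma) - np.log(sigma),
    )
    return -ll.sum()

# Hypothetical cross-section: constant, timber price, owner age.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(50, 10, n), rng.normal(55, 12, n)])
latent = X @ np.array([-40.0, 1.2, -0.3]) + rng.normal(0, 10, n)
y = np.maximum(latent, 0.0)                        # observed harvest, censored at zero

x0 = np.append(np.zeros(X.shape[1]), np.log(y.std() + 1.0))
res = minimize(tobit_negloglik, x0, args=(y, X), method="BFGS")
print("beta estimates:", res.x[:-1], "sigma:", np.exp(res.x[-1]))
```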
Abstract:
Rare-gas chemistry is of growing interest, and recent advances include the "insertion" of a Xe atom into OH and water in the rare-gas hydrides HXeO and HXeOH. The insertion of Xe atoms into the H-C bonds of hydrocarbons was also demonstrated for HXeCC, HXeCCH and HXeCCXeH, the last of which was the first rare-gas hydride containing two rare-gas atoms. We describe the preparation and characterization of a new rare-gas compound, HXeOXeH. HXeOXeH was prepared in solid xenon by photolysis of a suitable precursor, for example water, and subsequent mobilization of the photoproducts. The experimental identification was carried out by FTIR spectroscopy, isotopic substitution and the use of various precursors. The photolytic and thermal stability of the new rare-gas hydride was also studied. The experimental work was supported by extensive quantum chemical calculations provided by our co-workers. HXeOXeH forms in a cryogenic xenon matrix from neutral O and H atoms in a two-step diffusion-controlled process involving HXeO as an intermediate [reactions (1) and (2)]. This formation mechanism is unique in that a rare-gas hydride is formed from another rare-gas hydride.
H + Xe + O → HXeO (1)
HXeO + Xe + H → HXeOXeH (2)
Like other rare-gas hydrides, HXeOXeH has a strongly IR-active H-Xe stretching vibration, allowing its spectral detection at 1379.3 cm-1. HXeOXeH is a very high-energy metastable species, yet it is thermally more stable than many other rare-gas hydrides. The calculated bending barrier of 0.57 eV is not enough to explain the observed stability, and HXeOXeH might benefit from additional stabilization by the solid xenon environment. Chemical bonding between xenon and environmentally abundant species like water is of particular importance due to the "missing-xenon" problem. The relatively high thermal stability of HXeOXeH compared to other oxygen-containing rare-gas compounds is relevant in this respect. Our work also raises the possibility of polymeric (–Xe–O)n networks, analogous to the computationally studied (XeCC)n polymers.
Abstract:
Among the most striking natural phenomena affecting ozone are solar proton events (SPEs), during which high-energy protons precipitate into the middle atmosphere in the polar regions. Ionisation caused by the protons results in changes in the lower ionosphere and in the production of neutral odd nitrogen and odd hydrogen species, which then destroy ozone in well-known catalytic chemical reaction chains. Large SPEs are able to decrease the ozone concentration of the upper stratosphere and mesosphere, but they are not expected to significantly affect the ozone layer at 15-30 km altitude. In this work we have used the Sodankylä Ion and Neutral Chemistry Model (SIC) in studies of the short-term effects caused by SPEs. The model results were found to be in good agreement with ionospheric observations from incoherent scatter radars, riometers, and VLF radio receivers, as well as with measurements from the GOMOS/Envisat satellite instrument. For the first time, GOMOS was able to observe the SPE effects on odd nitrogen and ozone in the winter polar region. Ozone observations from GOMOS were validated against those from the MIPAS/Envisat instrument, and good agreement was found throughout the middle atmosphere. For the SPE of October/November 2003, long-term ozone depletion was observed in the upper stratosphere. The depletion was further enhanced by the descent of odd nitrogen from the mesosphere inside the polar vortex, until recovery occurred in late December. During the event, substantial diurnal variation of the ozone depletion was seen in the mesosphere, caused mainly by the strong diurnal cycle of the odd hydrogen species. In the lower ionosphere, SPEs increase the electron density, which is very low in normal conditions; SPEs therefore make radar observations easier. For the SPE of October 1989, we studied the sunset transition of negative charge from electrons to ions, a long-standing problem. The observed phenomenon, which is controlled by the amount of solar radiation, was successfully explained by considering twilight changes in both the rate of photodetachment of negative ions and the concentrations of minor neutral species. Changes in the magnetic field of the Earth control the extent of the SPE-affected area. For the SPE of November 2001, the results indicated that for low and middle levels of geomagnetic disturbance, the cosmic radio noise absorption levels estimated from a magnetic field model are in good agreement with ionospheric observations. For high levels of disturbance, the model overestimates the stretching of the geomagnetic field and the geographical extent of the SPE-affected area. This work shows the importance of ionosphere-atmosphere interaction for SPE studies. By using both ionospheric and atmospheric observations, we have been able to cover, for the most part, the whole chain of SPE-triggered processes, from proton-induced ionisation to the depletion of ozone.
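As an example of the well-known catalytic reaction chains referred to above, the textbook odd-nitrogen (NOx) cycle destroys odd oxygen without consuming NOx; an analogous odd-hydrogen (HOx) cycle operates in the mesosphere. This is standard atmospheric chemistry, shown only for context, not a result specific to this work:

```latex
% Standard NOx catalytic ozone-loss cycle (textbook chemistry, shown only to
% illustrate the "well-known catalytic chemical reaction chains" in the text):
\begin{align*}
  \mathrm{NO} + \mathrm{O_3} &\rightarrow \mathrm{NO_2} + \mathrm{O_2} \\
  \mathrm{NO_2} + \mathrm{O} &\rightarrow \mathrm{NO} + \mathrm{O_2} \\
  \text{net:}\quad \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}
```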
Abstract:
Atmospheric particles affect the radiation balance of the Earth and thus the climate. New particle formation by nucleation has been observed in diverse atmospheric conditions, but the actual formation path is still unknown. The prevailing conditions can be exploited to evaluate proposed formation mechanisms. This study aims to improve our understanding of new particle formation from the viewpoint of atmospheric conditions. The role of atmospheric conditions in particle formation was studied by atmospheric measurements, theoretical model simulations and simulations based on observations. Two separate column models were further developed for aerosol and chemical simulations. The model simulations allowed us to expand the study from local conditions to varying conditions in the atmospheric boundary layer, while the long-term measurements described especially the characteristic mean conditions associated with new particle formation. The observations show a statistically significant difference in meteorological and background aerosol conditions between observed event and non-event days. New particle formation above the boreal forest is associated with strong convective activity, low humidity and a low condensation sink. The probability of a particle formation event is predicted by an equation formulated for upper boundary layer conditions. The model simulations call into question whether kinetic sulphuric-acid-induced nucleation is the primary particle formation mechanism in the presence of organic vapours. At the same time, the simulations show that ignoring spatial and temporal variation in new particle formation studies may lead to faulty conclusions. On the other hand, the theoretical simulations indicate that short-scale variations in temperature and humidity are unlikely to have a significant effect on the mean binary water-sulphuric acid nucleation rate. The study emphasizes the significance of mixing and fluxes in particle formation studies, especially in the atmospheric boundary layer. The further developed models allow extensive aerosol physical and chemical studies in the future.
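For context, the kinetic sulphuric-acid-induced nucleation mechanism questioned above is commonly parameterised in the aerosol literature as a rate quadratic in the sulphuric acid concentration. The expression below is the standard literature form, not a formula quoted from this thesis, and K is an empirical coefficient:

```latex
% Common literature parameterisation of kinetic sulphuric-acid nucleation
% (for context only; K is an empirical prefactor):
\[
  J_{\mathrm{kin}} = K\,[\mathrm{H_2SO_4}]^{2}
\]
```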