10 results for radiation absorption analysis
in Helda - Digital Repository of the University of Helsinki
Abstract:
It is essential to have a thorough understanding of the sources and sinks of oxidized nitrogen (NOy) in the atmosphere, since it strongly influences tropospheric chemistry and the eutrophication of ecosystems. One unknown component in the balance of gaseous oxidized nitrogen is vegetation. Plants absorb nitrogenous species from the air via the stomata, but it is not clear whether plants can also emit them at low ambient concentrations. The possible emissions are small and difficult to measure. The aim of this thesis was to analyse an observation made in southern Finland at the SMEAR II station: solar ultraviolet (UV) radiation induced NOy emissions in chambers measuring the gas exchange of Scots pine (Pinus sylvestris L.) shoots. Both measuring and modelling approaches were used in the study. The measurements were performed under uncontrolled field conditions at low ambient NOy concentrations. The chamber blank, i.e. the artefact NOy emission from the chamber walls, was dependent on the UV irradiance and increased with time after renewing the Teflon film on the chamber surfaces. The contribution of each pine shoot to the total NOy emissions in the chambers was determined by testing whether the emissions decreased when the shoots were removed from their chambers. The emissions did decrease, but only when the chamber interior was exposed to UV radiation. It was concluded that the pine shoots themselves also emit NOy. The possible effects of transpiration on the chamber blank are discussed in the summary part of the thesis, based on previously unpublished data. The possible processes underlying the UV-induced NOy emissions were reviewed. Surface reactions were more likely than metabolic processes. Photolysis of nitrate deposited on the needles may have generated the NOy emissions; the measurements supported this hypothesis. In that case, the emissions would apparently consist mainly of nitrogen dioxide (NO2), nitric oxide (NO) and nitrous acid (HONO). In studies on the NOy exchange of plants, the most frequently studied gases are NO2 and NO (together NOx). In the present work, the implications of the emissions for the NOx exchange of pine were analysed with a model including both NOy emissions and NOy absorption. The model suggested that if the emissions exist, pines can act as an NOx source rather than a sink, even under relatively high ambient concentrations.
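The balance between a UV-driven surface emission and stomatal uptake can be illustrated with a simple big-leaf flux sketch. The Python example below is not the model used in the thesis; it merely assumes a generic formulation in which uptake scales with a bulk conductance and the ambient concentration while the emission term scales with UV irradiance, and it solves for the compensation point where the net flux changes sign. All parameter values and units are hypothetical placeholders.

# Minimal illustrative big-leaf NOx exchange sketch (not the thesis model).
# Net flux F = E(UV) - g_s * C_a: positive F means the shoot is a net source.
# All parameter values below are hypothetical placeholders.

def net_nox_flux(c_ambient_ppb, uv_wm2, g_s=0.5, e_per_uv=0.02):
    """Net NOx flux (arbitrary units per unit leaf area).

    c_ambient_ppb : ambient NOx mixing ratio
    uv_wm2        : UV irradiance driving the surface emission
    g_s           : bulk stomatal/deposition conductance (uptake term)
    e_per_uv      : emission per unit UV irradiance (surface term)
    """
    emission = e_per_uv * uv_wm2
    uptake = g_s * c_ambient_ppb
    return emission - uptake

def compensation_point(uv_wm2, g_s=0.5, e_per_uv=0.02):
    """Ambient concentration at which emission and uptake cancel (F = 0)."""
    return e_per_uv * uv_wm2 / g_s

if __name__ == "__main__":
    for uv in (5.0, 20.0, 40.0):
        c_comp = compensation_point(uv)
        print(f"UV {uv:5.1f} W/m2 -> compensation point {c_comp:.2f} ppb, "
              f"flux at 1 ppb = {net_nox_flux(1.0, uv):+.3f}")

Above the compensation point the uptake term dominates and the shoot acts as a sink; below it, the UV-driven emission dominates and the shoot acts as a source.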
Abstract:
Radioactive particles from three locations were investigated for elemental composition, oxidation states of matrix elements, and origin. The instrumental techniques applied to the task were scanning electron microscopy, X-ray and gamma-ray spectrometry, secondary ion mass spectrometry, and synchrotron radiation-based microanalytical techniques comprising X-ray fluorescence spectrometry, X-ray fluorescence tomography, and X-ray absorption near-edge structure spectroscopy. Uranium-containing low-activity particles collected from Irish Sea sediments were characterized in terms of the composition and distribution of matrix elements and the oxidation states of uranium. Indications of the origin were obtained from the intensity ratios and the presence of thorium, uranium, and plutonium. Uranium in the particles was found to exist mostly as U(IV). Studies on plutonium particles from Runit Island (Marshall Islands) soil indicated that the samples were weapon fuel fragments originating from two separate detonations: a safety test and a low-yield test. The plutonium in the particles was found to be of similar age. The distribution and oxidation states of uranium and plutonium in the matrix of weapon fuel particles from Thule (Greenland) sediments were investigated. The variations in intensity ratios observed with different techniques indicated more than one origin. Uranium in the particle matrices was mostly U(IV), but plutonium existed in some particles mainly as Pu(IV) and in others mainly as oxidized Pu(VI). The results demonstrated that the various techniques could be effectively applied to the characterization of environmental radioactive particles. An on-line method was developed for separating americium from environmental samples. The procedure utilizes extraction chromatography to separate americium from light lanthanides, and cation exchange to concentrate americium before the final separation on an ion chromatography column. The separated, radiochemically pure americium fraction is measured by alpha spectrometry. The method was tested with certified sediment and soil samples and found to be applicable to the analysis of environmental samples containing a wide range of Am-241 activity. Proceeding from the on-line method developed for americium, a method was also developed for separating plutonium and americium. Plutonium is reduced to Pu(III) and separated together with Am(III) throughout the procedure. Pu(III) and Am(III) are eluted from the ion chromatography column as anionic dipicolinate and oxalate complexes, respectively, and measured by alpha spectrometry.
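For context, an alpha-spectrometric measurement of a separated fraction is normally converted to a massic activity with the standard counting relation A = N / (eps * Y * t * m). The sketch below applies this textbook formula with invented numbers; it is not the calibration or data reduction of the on-line method itself.

# Generic alpha-spectrometry activity calculation (standard counting formula,
# hypothetical numbers; not the procedure-specific calibration of the thesis).

def activity_bq_per_kg(net_counts, live_time_s, counting_eff, chem_yield, sample_kg):
    """Massic activity (Bq/kg) from a net peak area in an alpha spectrum.

    net_counts   : background-corrected counts in the Am-241 peak
    counting_eff : detector counting efficiency (fraction)
    chem_yield   : radiochemical yield of the separation (fraction)
    """
    count_rate = net_counts / live_time_s
    return count_rate / (counting_eff * chem_yield * sample_kg)

if __name__ == "__main__":
    a = activity_bq_per_kg(net_counts=1200, live_time_s=86400,
                           counting_eff=0.25, chem_yield=0.80, sample_kg=0.010)
    print(f"Am-241 activity ~ {a:.1f} Bq/kg")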
Abstract:
Several studies link the consumption of whole-grain products to a lowered risk of chronic diseases such as certain types of cancer, type II diabetes, and cardiovascular diseases. However, the exact protective mechanisms remain unclear, partly due to the lack of a suitable biomarker for whole-grain cereal intake. Alkylresorcinols (AR) are phenolic lipids abundant in the outer parts of wheat and rye grains, usually with homologues bearing C15:0-C25:0 alkyl chains, and have been suggested to function as whole-grain biomarkers. The mammalian lignan enterolactone (ENL) has also previously been studied as a potential whole-grain biomarker. In the present work a quantitative gas chromatography-mass spectrometry method for the analysis of AR in plasma, erythrocytes, and lipoproteins was developed. The method was used to determine human and pig plasma AR concentrations after the intake of whole-grain wheat and rye products compared with low-fibre wheat bread diets, to assess the usability of AR as biomarkers of whole-grain intake. Plasma AR concentrations were compared with serum ENL concentrations. AR absorption and elimination kinetics were investigated in a pig model. AR occurrence in human erythrocyte membranes and plasma lipoproteins was determined, and the distribution of AR in blood was evaluated. AR seem to be absorbed from the small intestine via the lymphatic system, like many other lipophilic compounds. Their apparent elimination half-life is relatively short and similar to that of tocopherols, which have a similar chemical structure. Plasma AR concentrations increased significantly after a one- to eight-week intake of whole-grain wheat, and further with whole-grain rye bread. The concentrations were also higher after a habitual Finnish diet than after a diet with low-fibre bread. Inter-individual variation after a one-week intake of the same amount of bread was high, but the mean plasma AR concentrations increased with increasing AR intake. AR are incorporated into erythrocyte membranes and plasma lipoproteins, and VLDL and HDL were the main AR carriers in human plasma. Based on these studies, plasma AR could function as specific biomarkers of dietary whole-grain products. AR are found exclusively in whole grains and are more suitable as specific biomarkers of whole-grain intake than the previously investigated mammalian lignan enterolactone, which is formed from several dietary plants other than cereals. The plasma AR C17:0/C21:0 ratio could distinguish whether the whole-grain products in the diet are mainly wheat or rye. AR could be used in epidemiological studies to determine whole-grain intake and to better assess the role of whole grains in disease prevention.
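The statement that the apparent elimination half-life of plasma AR is short can be made concrete with textbook first-order elimination kinetics. The sketch below is a generic one-compartment illustration, not the kinetic analysis performed in the pig study; the initial concentration and half-life are invented for illustration only.

# Generic one-compartment, first-order elimination sketch (illustrative only).
# C(t) = C0 * exp(-k * t), with half-life t_1/2 = ln(2) / k.
import math

def plasma_concentration(c0, half_life_h, t_h):
    """Concentration remaining after t_h hours, given initial c0 and half-life."""
    k = math.log(2) / half_life_h
    return c0 * math.exp(-k * t_h)

if __name__ == "__main__":
    c0 = 100.0          # hypothetical initial plasma AR concentration (nmol/L)
    half_life_h = 5.0   # hypothetical apparent half-life (hours)
    for t in (0, 5, 10, 24):
        print(f"t = {t:2d} h -> C = {plasma_concentration(c0, half_life_h, t):6.1f} nmol/L")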
Abstract:
The structure and operation of CdTe, CdZnTe and Si pixel detectors, based on crystalline semiconductors, bump bonding and CMOS technology and developed mainly at Oy Simage Ltd. and Oy Ajat Ltd., Finland, for X-ray and gamma-ray imaging, are presented. This detector technology evolved from the development of Si strip detectors at the Finnish Research Institute for High Energy Physics (SEFT), which later merged with other physics research units to form the Helsinki Institute of Physics (HIP). General issues of X-ray imaging, such as the benefits of direct conversion of X-rays to signal charge in comparison with the indirect method, and the pros and cons of photon counting vs. charge integration, are discussed. A novel design of Si and CdTe pixel detectors and the analysis of their imaging performance in terms of signal-to-noise ratio (SNR), modulation transfer function (MTF), detective quantum efficiency (DQE) and dynamic range are presented in detail. The analysis shows that directly converting crystalline semiconductor pixel detectors operated in the charge-integration mode can be used in X-ray imaging very close to the theoretical performance limits in terms of efficiency and resolution. Examples of the application of the developed imaging technology to dental intra-oral and panoramic imaging and to real-time X-ray imaging are given. A CdTe photon-counting gamma imager is introduced. A physical model to calculate the photopeak efficiency of photon-counting CdTe pixel detectors is developed and described in detail. Simulation results indicate that the charge-sharing phenomenon due to diffusion of signal charge carriers limits the pixel size of photon-counting detectors to about 250 μm. Radiation hardness issues related to gamma- and X-ray imaging detectors are discussed.
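The role of diffusion in charge sharing can be illustrated with a back-of-the-envelope estimate of how far a charge cloud spreads laterally while drifting across the detector. The sketch below assumes a uniform field, room temperature, the Einstein relation and a nominal hole mobility; it is only an order-of-magnitude illustration, not the physical model developed in the thesis.

# Back-of-the-envelope charge-diffusion estimate for a planar detector
# (illustrative, not the thesis model). Drift time t = d / (mu * E),
# lateral diffusion sigma = sqrt(2 * D * t) with D = mu * kT / q.
import math

K_T_OVER_Q = 0.0259       # thermal voltage at ~300 K (V)

def lateral_sigma_um(thickness_um, bias_v, mobility_cm2_vs):
    """RMS lateral spread (um) of a charge cloud drifting across the detector."""
    d_cm = thickness_um * 1e-4
    e_field = bias_v / d_cm                            # V/cm, uniform-field assumption
    drift_time = d_cm / (mobility_cm2_vs * e_field)    # s
    diffusion = mobility_cm2_vs * K_T_OVER_Q           # cm^2/s (Einstein relation)
    sigma_cm = math.sqrt(2.0 * diffusion * drift_time)
    return sigma_cm * 1e4

if __name__ == "__main__":
    # Hypothetical example: 1 mm thick CdTe, 500 V bias, hole mobility ~100 cm^2/Vs.
    sigma = lateral_sigma_um(thickness_um=1000.0, bias_v=500.0, mobility_cm2_vs=100.0)
    print(f"lateral diffusion sigma ~ {sigma:.1f} um "
          f"(events within a few sigma of a pixel boundary share charge)")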
Abstract:
The Transition Radiation Tracker (TRT) of the ATLAS experiment at the LHC is part of the Inner Detector. It is designed as a robust and powerful gaseous detector that provides tracking through individual drift tubes (straws) as well as particle identification via transition radiation (TR) detection. The straw tubes are operated with Xe-CO2-O2 70/27/3, a gas that combines the advantages of efficient TR absorption, a short electron drift time and minimal ageing effects. The modules of the barrel part of the TRT were built in the United States, while the end-cap wheels are assembled at two Russian institutes. Acceptance tests of barrel modules and end-cap wheels are performed at CERN before assembly and integration with the Semiconductor Tracker (SCT) and the Pixel Detector. This thesis first describes simulations of the TRT straw tube. The argon-based acceptance gas mixture as well as two xenon-based operating gases are examined for their properties. Drift velocities and Townsend coefficients are computed with the help of the program Magboltz and used to study electron drift and multiplication in the straw using the software Garfield. The inclusion of Penning transfers in the avalanche process leads to remarkable agreement with experimental data. A high level of cleanliness in the TRT's acceptance-test gas system is indispensable. To monitor gas purity, a small straw tube detector has been constructed and extensively used to study the ageing behaviour of the straw tube in Ar-CO2. A variety of ageing tests are presented and discussed. Acceptance tests for the TRT survey dimensions, wire tension, gas-tightness, high-voltage stability and gas-gain uniformity along each individual straw. The thesis gives details on acceptance criteria and measurement methods in the case of the end-cap wheels. Special focus is put on wire tension and straw straightness. The effect of geometrically deformed straws on gas gain and energy resolution is examined in an experimental setup and compared to simulation studies. An overview of the most important results from the end-cap wheels tested up to this point is presented.
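In a straw, i.e. a cylindrical proportional counter, the gas gain follows from the first Townsend coefficient integrated along the radial field down to the anode wire, G = exp(integral of alpha dr). The sketch below evaluates this integral numerically for a hypothetical exponential parameterization of alpha(E); it is not the Magboltz/Garfield calculation of the thesis, and the wire radius, tube radius, voltage and Townsend constants are placeholders chosen only to give a plausible order of magnitude.

# Gas gain of a cylindrical proportional counter, G = exp( integral of alpha dr ),
# evaluated numerically (illustrative sketch; not a Magboltz/Garfield result).
import math

def gas_gain(voltage_v, r_wire_cm, r_tube_cm, alpha_of_e, n_steps=20000):
    """Sum the Townsend coefficient alpha(E) over the radial path wire <-> wall.

    alpha_of_e : callable giving alpha (1/cm) as a function of the field E (V/cm);
                 a realistic parameterization would come from Magboltz-type data.
    """
    log_gain = 0.0
    dr = (r_tube_cm - r_wire_cm) / n_steps
    for i in range(n_steps):
        r = r_wire_cm + (i + 0.5) * dr
        e_field = voltage_v / (r * math.log(r_tube_cm / r_wire_cm))  # V/cm
        log_gain += alpha_of_e(e_field) * dr
    return math.exp(log_gain)

if __name__ == "__main__":
    # Hypothetical Townsend parameterization alpha = A * exp(-B / E) (1/cm),
    # with constants loosely resembling an argon-based mixture at 1 atm.
    alpha = lambda e: 1.1e4 * math.exp(-1.4e5 / e)
    g = gas_gain(voltage_v=1400.0, r_wire_cm=15e-4, r_tube_cm=0.2, alpha_of_e=alpha)
    print(f"estimated gas gain ~ {g:.2e}")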
Abstract:
This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object orientation with software engineering, Monte Carlo simulation, cluster computing technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation toolkit, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle. Typically, a Geant4 computer experiment is used to understand test beam measurements. Thus another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of neural network methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
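As a purely illustrative counterpart to the ANN-based tagging described above, the sketch below trains a small multilayer perceptron to separate two synthetic classes ("signal" vs. "background") from a few jet-like input variables. It uses scikit-learn rather than the ROOT-based tools of the thesis, and the features and dataset are invented; it only demonstrates the general neural-network signal/background methodology.

# Illustrative NN signal/background separation on synthetic data
# (scikit-learn sketch; not the ROOT/CMS analysis described in the thesis).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000

# Hypothetical discriminating variables: signal events are drawn from
# slightly shifted, broader distributions than the background.
background = rng.normal(loc=0.0, scale=1.0, size=(n, 3))
signal = rng.normal(loc=0.8, scale=1.2, size=(n, 3))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
# A cut on clf.predict_proba(X_test)[:, 1] would play the role of the tagger output.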
Abstract:
A novel method for functional lung imaging was introduced by adapting the K-edge subtraction (KES) method to in vivo studies of small animals. In this method two synchrotron radiation energies, which bracket the K-edge of the contrast agent, are used for simultaneous recording of absorption-contrast images. Stable xenon gas is used as the contrast agent, and imaging is performed in projection or computed tomography (CT) mode. Subtraction of the two images yields the distribution of xenon, while removing practically all features due to other structures, and the xenon density can be calculated quantitatively. Because the images are recorded simultaneously, there are no movement artifacts in the subtraction image. The time resolution for a series of CT images is one image per second, which allows functional studies. The voxel size is 0.1 mm³, which is an order of magnitude better than in traditional lung imaging methods. The KES imaging technique was used in studies of ventilation distribution and the effects of histamine-induced airway narrowing in healthy, mechanically ventilated, and anaesthetized rabbits. First, the effect of tidal volume on ventilation was studied, and the results show that an increase in tidal volume without an increase in minute ventilation results in a proportional increase in regional ventilation. Second, spiral CT was used to quantify the airspace volumes in the lungs under normal conditions and after histamine aerosol inhalation, and the results showed large patchy filling defects in the peripheral lungs following histamine provocation. Third, the kinetics of the proximal and distal airway response to histamine aerosol were examined, and the findings show that the distal airways react immediately to histamine and start to recover, while the reaction and the recovery in the proximal airways are slower. Fourth, the fractal dimension of the lungs was studied, and it was found to be higher in the apical part of the lungs than in the basal part, indicating structural differences between the apical and basal lung levels. These results provide new insights into lung function and into the effects of drug challenge. Currently the technique is available at synchrotron radiation facilities, but compact synchrotron radiation sources are being developed, and in the relatively near future the method may be used in hospitals.
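The quantitative nature of K-edge subtraction comes from the step in the contrast agent's mass attenuation coefficient across its K-edge: taking the logarithm of the transmitted intensities at the two energies and subtracting removes the (nearly energy-independent) contribution of other tissues and leaves a term proportional to the xenon areal density. The sketch below shows this standard two-energy algebra on a per-pixel basis; the attenuation coefficients and intensities are nominal placeholders, not the calibration used at the beamline.

# Per-pixel K-edge subtraction sketch (standard two-energy algebra, not the
# beamline's calibrated reconstruction). Above/below refer to the Xe K-edge.
import numpy as np

def xenon_areal_density(i_above, i_below, i0_above, i0_below,
                        mu_rho_above, mu_rho_below):
    """Xenon areal density (g/cm^2) per pixel from two simultaneous images.

    i_*      : transmitted intensity images at the two energies
    i0_*     : corresponding flat-field (incident) intensities
    mu_rho_* : xenon mass attenuation coefficients (cm^2/g) at the two energies
    """
    log_above = np.log(i0_above / i_above)
    log_below = np.log(i0_below / i_below)
    # The non-xenon background cancels in the difference because its attenuation
    # changes only slightly between the two closely spaced energies.
    return (log_above - log_below) / (mu_rho_above - mu_rho_below)

if __name__ == "__main__":
    # Tiny synthetic example with hypothetical numbers.
    i0 = 1.0e5
    i_below = np.full((2, 2), 0.6e5)
    i_above = np.full((2, 2), 0.4e5)
    rho_t = xenon_areal_density(i_above, i_below, i0, i0,
                                mu_rho_above=30.0, mu_rho_below=7.0)
    print(rho_t)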
Abstract:
This thesis describes current and past n-in-one methods and presents three early experimental studies using mass spectrometry and the triple quadrupole instrument on the application of n-in-one in drug discovery. The n-in-one strategy pools and mixes samples in drug discovery prior to measurement or analysis. This allows the most promising compounds to be rapidly identified and then analysed. Nowadays the properties of drugs are characterised earlier and in parallel with pharmacological efficacy. The studies presented here use in vitro methods such as Caco-2 cells and immobilized artificial membrane chromatography for drug absorption and lipophilicity measurements. The high sensitivity and selectivity of liquid chromatography-mass spectrometry are especially important for new analytical methods using n-in-one. In the first study, the fragmentation patterns of ten nitrophenoxy benzoate compounds, a homologous series, were characterised and the presence of the compounds was determined in a combinatorial library. The influence of one or two nitro substituents and of alkyl chain lengths from methyl to pentyl on collision-induced fragmentation was studied, and interesting structure-fragmentation relationships were detected. Compounds with two nitro groups fragmented more than those with one, whereas less fragmentation was noted in molecules with a longer alkyl chain. The most abundant product ions were nitrophenoxy ions, which were also tested in the precursor ion screening of the combinatorial library. In the second study, the immobilized artificial membrane chromatographic method was transferred from ultraviolet detection to mass spectrometric analysis and a new method was developed. Mass spectra were scanned and the chromatographic retention of the compounds was analysed using extracted ion chromatograms. When detectors and buffers were changed and n-in-one was included in the method, the results showed good correlation. Finally, the results demonstrated that mass spectrometric detection with gradient elution can provide a rapid and convenient n-in-one method for ranking the lipophilic properties of several structurally diverse compounds simultaneously. In the final study, a new method was developed for Caco-2 samples. Compounds were separated by liquid chromatography and quantified by selected reaction monitoring using mass spectrometry. The method was used for Caco-2 samples in which the absorption of ten chemically and physiologically different compounds was screened using both single and n-in-one approaches. These three studies used mass spectrometry for compound identification, method transfer and quantitation in the area of mixture analysis. Different mass spectrometric scanning modes of the triple quadrupole instrument were used in each method. Early drug discovery with n-in-one is an area where mass spectrometric analysis, its possibilities and its proper use, is especially important.
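Caco-2 absorption screening results of the kind described above are conventionally expressed as an apparent permeability coefficient, Papp = (dQ/dt) / (A * C0), where dQ/dt is the rate of appearance of the compound in the receiver compartment, A the monolayer area and C0 the initial donor concentration. The sketch below applies this generic textbook formula with hypothetical numbers; it is not the assay protocol or data treatment of the thesis.

# Generic Caco-2 apparent permeability calculation (textbook formula,
# hypothetical numbers; not the assay protocol of the thesis).

def apparent_permeability(dq_dt_nmol_s, area_cm2, c0_umol_per_ml):
    """Papp in cm/s from the receiver-side appearance rate dQ/dt.

    dq_dt_nmol_s   : transport rate into the receiver compartment (nmol/s)
    area_cm2       : monolayer surface area (cm^2)
    c0_umol_per_ml : initial donor concentration (umol/mL)
    """
    c0_nmol_per_cm3 = c0_umol_per_ml * 1000.0   # 1 mL = 1 cm^3, 1 umol = 1000 nmol
    return dq_dt_nmol_s / (area_cm2 * c0_nmol_per_cm3)

if __name__ == "__main__":
    papp = apparent_permeability(dq_dt_nmol_s=2.0e-3, area_cm2=1.12,
                                 c0_umol_per_ml=0.1)
    print(f"Papp ~ {papp:.2e} cm/s")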
Abstract:
X-ray synchrotron radiation was used to study the nanostructure of cellulose in Norway spruce stem wood and powders of cobalt nanoparticles in a cellulose support. Furthermore, the growth of metallic clusters was modelled and simulated on the mesoscopic size scale. Norway spruce was characterized with X-ray microanalysis at beamline ID18F of the European Synchrotron Radiation Facility in Grenoble. The average dimensions and the orientation of the cellulose crystallites were determined using X-ray microdiffraction. In addition, the nutrient element content was determined using X-ray fluorescence spectroscopy. Diffraction patterns and fluorescence spectra were acquired simultaneously. Cobalt nanoparticles in cellulose support were characterized with X-ray absorption spectroscopy at beamline X1 of the Deutsches Elektronen-Synchrotron in Hamburg, complemented by home-lab experiments including X-ray diffraction, electron microscopy and measurement of magnetic properties with a vibrating sample magnetometer. Extended X-ray absorption fine structure spectroscopy (EXAFS) and X-ray diffraction were used to solve the atomic arrangement of the cobalt nanoparticles. Scanning and transmission electron microscopy were used to image the surfaces of the cellulose fibrils, where the growth of nanoparticles takes place. The EXAFS experiment was complemented by computational coordination-number calculations on ideal spherical nanocrystals. The growth process of metallic nanoclusters on a cellulose matrix is assumed to be rather complicated, affected not only by the properties of the clusters themselves but also essentially by the cluster-fiber interfaces as well as the morphology of the fiber surfaces. The final favored average size for nanoclusters, if such exists, is most probably a consequence of these two competing tendencies towards size selection, one governed by pore sizes, the other by the cluster properties. In this thesis, a mesoscopic model for the growth of metallic nanoclusters on porous cellulose fiber (or inorganic) surfaces is developed. The first step in the modelling was to evaluate the special case of how the growth proceeds on flat or wedged surfaces.
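The idea behind coordination-number calculations on ideal spherical nanocrystals is that the average first-shell coordination number drops below the bulk value as the particle diameter shrinks, because an increasing fraction of the atoms sits at the surface. The sketch below illustrates the general construction (build an fcc lattice, carve out a sphere, count nearest neighbours) with a placeholder lattice constant; it is not the calculation performed in the thesis.

# Average first-shell coordination number of an ideal spherical fcc cluster
# (illustrative sketch; the lattice constant is a placeholder, not a fitted value).
import itertools
import numpy as np

def mean_coordination(diameter_nm, a_nm=0.354):
    """Mean number of nearest neighbours for atoms inside a sphere.

    a_nm : fcc lattice constant (placeholder close to bulk fcc cobalt).
    """
    radius = diameter_nm / 2.0
    n_cells = int(np.ceil(radius / a_nm)) + 1
    basis = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
    cells = np.array(list(itertools.product(range(-n_cells, n_cells + 1), repeat=3)))
    atoms = (cells[:, None, :] + basis[None, :, :]).reshape(-1, 3) * a_nm
    atoms = atoms[np.linalg.norm(atoms, axis=1) <= radius]

    nn_dist = a_nm / np.sqrt(2.0)               # fcc nearest-neighbour distance
    # Brute-force neighbour count (fine for small clusters).
    d = np.linalg.norm(atoms[:, None, :] - atoms[None, :, :], axis=-1)
    neighbours = (np.abs(d - nn_dist) < 0.05 * nn_dist).sum(axis=1)
    return neighbours.mean(), len(atoms)

if __name__ == "__main__":
    for dia in (1.0, 2.0, 3.0):
        cn, n_atoms = mean_coordination(dia)
        print(f"d = {dia:.1f} nm: {n_atoms:4d} atoms, mean CN = {cn:.2f} (bulk fcc = 12)")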
Abstract:
In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can only take discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and of snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was not known, and we were forced simply to measure the agreement between the different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometer observations from the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods. In this vein, we discuss tentatively how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that they are of reasonable quality and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored. The AutoClass algorithm was used to construct compact representations of synoptic fog conditions at Finnish airports.
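The bootstrap construction of confidence intervals for a categorical verification score can be illustrated with a short resampling sketch. The example below computes a percentile interval for the agreement rate of a binary (e.g. cloud / no cloud) product against a reference; the data are synthetic, and a moving-block resampling variant would be needed to respect the spatial and temporal correlation discussed above.

# Percentile-bootstrap confidence interval for a simple categorical score
# (synthetic data; a moving-block bootstrap would be used for correlated series).
import numpy as np

def bootstrap_ci(product, reference, n_boot=2000, alpha=0.05, seed=0):
    """95% percentile CI for the proportion of agreeing (binary) pixels."""
    rng = np.random.default_rng(seed)
    agree = (product == reference).astype(float)
    n = agree.size
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resample pixels with replacement
        stats[b] = agree[idx].mean()
    lower, upper = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return agree.mean(), lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    reference = rng.integers(0, 2, size=1000)              # e.g. ceilometer: cloud yes/no
    noise = rng.random(1000) < 0.15                        # 15% disagreement
    product = np.where(noise, 1 - reference, reference)    # e.g. satellite cloud mask
    score, lo, up = bootstrap_ci(product, reference)
    print(f"agreement {score:.3f}, 95% CI [{lo:.3f}, {up:.3f}]")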