281 results for teoreettinen fysiikka


Abstract:

The Transition Radiation Tracker (TRT) of the ATLAS experiment at the LHC is part of the Inner Detector. It is designed as a robust and powerful gaseous detector that provides tracking through individual drift tubes (straws) as well as particle identification via transition radiation (TR) detection. The straw tubes are operated with Xe-CO2-O2 70/27/3, a gas that combines the advantages of efficient TR absorption, a short electron drift time and minimal ageing effects. The modules of the barrel part of the TRT were built in the United States, while the end-cap wheels are assembled at two Russian institutes. Acceptance tests of barrel modules and end-cap wheels are performed at CERN before assembly and integration with the Semiconductor Tracker (SCT) and the Pixel Detector. This thesis first describes simulations of the TRT straw tube. The argon-based acceptance gas mixture as well as two xenon-based operating gases are examined for their properties. Drift velocities and Townsend coefficients are computed with the program Magboltz and used to study electron drift and multiplication in the straw with the software Garfield. The inclusion of Penning transfers in the avalanche process leads to remarkable agreement with experimental data. A high level of cleanliness in the TRT's acceptance test gas system is indispensable. To monitor gas purity, a small straw tube detector was constructed and extensively used to study the ageing behaviour of the straw tube in Ar-CO2. A variety of ageing tests are presented and discussed. Acceptance tests for the TRT survey dimensions, wire tension, gas-tightness, high-voltage stability and gas gain uniformity along each individual straw. The thesis gives details on acceptance criteria and measurement methods for the end-cap wheels. Special focus is put on wire tension and straw straightness. The effect of geometrically deformed straws on gas gain and energy resolution is examined in an experimental setup and compared to simulation studies. An overview of the most important results from the end-cap wheels tested up to this point is presented.
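As an illustration of the simulated quantities, the sketch below estimates the gas gain of a single cylindrical straw from the first Townsend coefficient, G = exp(integral of alpha dr) along the radial field. The Korff-type parameterisation and all numerical values (radii, voltage, gas constants) are placeholders for illustration only, not the Magboltz/Garfield results of the thesis.

    import numpy as np

    # Illustrative straw geometry and gas parameters (placeholders, not thesis values)
    a, b = 15.5e-4, 0.2       # anode wire and straw radii [cm]
    V = 1400.0                # anode voltage [V]
    p = 760.0                 # gas pressure [Torr]
    A, B = 14.0, 180.0        # assumed Korff constants [1/(cm Torr)], [V/(cm Torr)]

    r = np.linspace(a, b, 20000)
    E = V / (r * np.log(b / a))            # radial field of a coaxial tube [V/cm]
    alpha = p * A * np.exp(-B * p / E)     # first Townsend coefficient [1/cm]
    gain = np.exp(np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(r)))
    print(f"estimated gas gain: {gain:.1e}")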

Abstract:

Differentiation of various types of soft tissues is of high importance in medical imaging, because changes in soft tissue structure are often associated with pathologies, such as cancer. However, the densities of different soft tissues may be very similar, making it difficult to distinguish them in absorption images. This is especially true when the consideration of patient dose limits the available signal-to-noise ratio. Refraction is more sensitive than absorption to changes in density, and small-angle x-ray scattering, on the other hand, contains information about the macromolecular structure of the tissues. Both of these can be used as potential sources of contrast when soft tissues are imaged, but little is known about the visibility of the signals in realistic imaging situations. In this work the visibility of small-angle scattering and refraction in the context of medical imaging has been studied using computational methods. The work focuses on the study of analyzer-based imaging, where the information about the sample is recorded in the rocking curve of the analyzer crystal. Computational phantoms based on simple geometrical shapes with differing material properties are used. The objects have realistic dimensions and attenuation properties that could be encountered in real imaging situations. The scattering properties mimic various features of measured small-angle scattering curves. Ray-tracing methods are used to calculate the refraction and attenuation of the beam, and a scattering halo is accumulated, including the effect of multiple scattering. The changes in the shape of the rocking curve are analyzed with different methods, including diffraction enhanced imaging (DEI), extended DEI (E-DEI) and multiple image radiography (MIR). A wide-angle DEI, called W-DEI, is introduced and its performance is compared with that of the established methods. The results indicate that the differences in scattered intensities from healthy and malignant breast tissues are distinguishable to some extent with a reasonable dose. In particular, the fraction of total scattering differs enough to serve as a useful source of contrast. The peaks related to the macromolecular structure appear at rather large angles and have intensities that are only a small fraction of the total scattered intensity. Such peaks therefore seem to have only limited usefulness in medical imaging. It is also found that W-DEI performs rather well when most of the intensity remains in the direct beam, indicating that dark field imaging methods may produce the best results when scattering is weak. Altogether, it is found that the analysis of scattered intensity is a viable option even in medical imaging where the patient dose is the limiting factor.
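For reference, a minimal sketch of the standard two-image DEI reconstruction, which solves for an apparent-absorption image and a refraction-angle image from exposures taken on the low- and high-angle slopes of the rocking curve. The rocking-curve values, slopes and pixel intensities used here are made-up placeholders; the thesis also analyses the full rocking curve with E-DEI, MIR and W-DEI, which this sketch does not cover.

    import numpy as np

    # Made-up rocking-curve reflectivities R and slopes dR/dtheta at the two
    # working points (low- and high-angle half-slope positions).
    R_L, R_H = 0.5, 0.5
    dR_L, dR_H = 2.0e5, -2.0e5         # [1/rad]

    # Example pixel intensities measured at the two analyzer positions
    I_L, I_H = np.array([0.30, 0.25]), np.array([0.20, 0.28])

    # Solve I_L = I_R*(R_L + dR_L*dtheta), I_H = I_R*(R_H + dR_H*dtheta) per pixel
    dtheta = (I_H * R_L - I_L * R_H) / (I_L * dR_H - I_H * dR_L)   # refraction angle [rad]
    I_R = (I_L * dR_H - I_H * dR_L) / (R_L * dR_H - R_H * dR_L)    # apparent absorption
    print(dtheta, I_R)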

Abstract:

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important as the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations for radiotherapy dosimetry are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was found to be less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for epithermal neutron beams for the determination of the dose components is presented and applied in an international dosimetric intercomparison. The quantities measured by the groups in the intercomparison (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centers. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. The presented results exclude the severe thermal neutron sensitivity changes that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, comparison with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to the dose maximum measured by the MAGIC polymer gel was found to agree well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
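The total dose in BNCT is commonly reported as a weighted sum of its components. The sketch below shows that bookkeeping only; the weighting factors and the dose values are illustrative placeholders, not those used clinically or in the thesis.

    # Weighted (photon-equivalent) dose as a sum of BNCT dose components:
    # D_w = w_gamma*D_gamma + w_N*D_N + w_fast*D_fast + w_B*D_B
    components = {          # absorbed doses [Gy], made-up example values
        "gamma": 1.0,
        "nitrogen (thermal neutron)": 0.4,
        "fast neutron": 0.2,
        "boron": 2.5,
    }
    weights = {             # illustrative RBE/CBE-type weighting factors
        "gamma": 1.0,
        "nitrogen (thermal neutron)": 3.2,
        "fast neutron": 3.2,
        "boron": 3.8,
    }
    D_w = sum(weights[k] * components[k] for k in components)
    print(f"weighted dose: {D_w:.1f} Gy (W)")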

Abstract:

The methods for estimating patient exposure in x-ray imaging are based on the measurement of radiation incident on the patient. In digital imaging, the useful dose range of the detector is large and excessive doses may remain undetected. Therefore, real-time monitoring of radiation exposure is important. According to international recommendations, the measurement uncertainty should be lower than 7% (95% confidence level). The kerma-area product (KAP) is a measurement quantity used for monitoring patient exposure to radiation. A field KAP meter is typically attached to an x-ray device, and it is important to recognize the effect of this measurement geometry on the response of the meter. In the tandem calibration method introduced in this study, a field KAP meter is used in its clinical position and calibration is performed with a reference KAP meter. This method provides a practical way to calibrate field KAP meters. However, the reference KAP meters require comprehensive calibration. In the calibration laboratory it is recommended to use standard radiation qualities. These qualities do not entirely cover the large range of clinical radiation qualities. In this work, the energy dependence of the response of different KAP meter types was examined. According to our findings, the recommended accuracy in KAP measurements is difficult to achieve with conventional KAP meters because of their strong energy dependence. The energy dependence of the response of a novel large KAP meter was found to be much lower than that of a conventional KAP meter. The accuracy of the tandem method can be improved by using this meter type as a reference meter. A KAP meter cannot be used to determine the radiation exposure of patients in mammography, in which part of the radiation beam is always aimed directly at the detector without attenuation by the tissue. This work assessed whether pixel values from this detector area could be used to monitor the radiation beam incident on the patient. The results were consistent with the tube output calculation, which is the method generally used for this purpose. The recommended accuracy can be achieved with the studied method. New optimization of radiation qualities and dose levels is needed when other detector types are introduced. In this work, the optimal selections were examined with one direct digital detector type. For this device, the use of radiation qualities with higher energies was recommended, and appropriate image quality was achieved by increasing the low dose level of the system.
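A minimal sketch of the tandem idea: the field KAP meter is calibrated in its clinical position against a reference KAP meter placed in the same beam. All readings and coefficients below are hypothetical numbers for illustration only.

    # Tandem calibration of a field KAP meter against a reference KAP meter.
    N_ref = 1.02          # calibration coefficient of the reference meter [Gy*cm^2 per unit reading]
    M_ref = 4.90          # reference meter reading in the clinical beam [arbitrary units]
    M_field = 5.20        # simultaneous field KAP meter reading [Gy*cm^2, uncorrected]

    KAP_true = N_ref * M_ref              # kerma-area product from the reference meter
    k_field = KAP_true / M_field          # calibration (correction) factor of the field meter
    print(f"KAP = {KAP_true:.2f} Gy*cm^2, field-meter calibration factor = {k_field:.3f}")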

Abstract:

It is widely accepted that the global climate is heating up due to human activities, such as the burning of fossil fuels. Therefore we find ourselves forced to make decisions on what measures, if any, need to be taken to decrease our warming effect on the planet before irreversible damage occurs. Research is being conducted in a variety of fields to better understand all relevant processes governing the Earth's climate, and to assess the relative roles of anthropogenic and biogenic emissions into the atmosphere. One of the least well quantified problems is the impact of small aerosol particles (both of anthropogenic and biogenic origin) on climate, through the reflection of solar radiation and through their ability to act as condensation nuclei for cloud droplets. In this thesis, the compounds driving the biogenic formation of new particles in the atmosphere have been examined through detailed measurements. As directly measuring the composition of these newly formed particles is extremely difficult, the approach was to study their characteristics indirectly by measuring the hygroscopicity (water uptake) and volatility (evaporation) of particles between 10 and 50 nm. To study the first steps of the formation process in the sub-3 nm range, where gaseous precursors nucleate into small clusters, the chemical composition of ambient naturally charged ions was measured. The ion measurements were performed with a newly developed mass spectrometer, which was first characterized in the laboratory before being deployed at a boreal forest measurement site. It was also successfully compared to similar, low-resolution instruments. The ambient measurements showed that sulfuric acid clusters dominate the negative ion spectrum during new particle formation events. Sulfuric acid/ammonia clusters were detected in ambient air for the first time in this work. Even though sulfuric acid is believed to be the most important gas-phase precursor driving the initial cluster formation, measurements of the hygroscopicity and volatility of growing 10-50 nm particles in Hyytiälä showed an increasing role of organic vapors of a variety of oxidation levels. This work has provided additional insights into the compounds participating both in the initial formation and in the subsequent growth of atmospheric new aerosol particles. It will hopefully prove an important step in understanding atmospheric gas-to-particle conversion, which, by influencing cloud properties, can have important climate impacts. All available knowledge needs to be constantly updated, summarized, and brought to the attention of our decision-makers. Only by increasing our understanding of all the relevant processes can we build reliable models to predict the long-term effects of decisions made today.
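Hygroscopicity measurements of this kind are commonly summarised as a growth factor and a single hygroscopicity parameter. The sketch below shows that conversion under the usual kappa-Köhler simplification (Kelvin effect neglected); the diameters and humidity are invented example numbers, not the Hyytiälä results of the thesis.

    # Hygroscopic growth factor and kappa from a hypothetical HTDMA-type measurement.
    D_dry = 50.0e-9       # dry mobility diameter [m]
    D_wet = 62.0e-9       # diameter after humidification [m]
    RH = 0.90             # relative humidity, taken as water activity (Kelvin effect neglected)

    GF = D_wet / D_dry
    kappa = (GF**3 - 1.0) * (1.0 - RH) / RH     # simplified kappa-Koehler relation
    print(f"growth factor = {GF:.2f}, kappa = {kappa:.2f}")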

Abstract:

Thin film applications have become increasingly important in our search for multifunctional and economically viable technological solutions of the future. Thin film coatings can be used for a multitude of purposes, ranging from a basic enhancement of aesthetic attributes to the addition of a complex surface functionality. Anything from electronic or optical properties to an increased catalytic or biological activity can be added or enhanced by the deposition of a thin film, with a thickness of only a few atomic layers at best, on an already existing surface. Thin films offer both a means of saving materials and the possibility of improving properties without a critical enlargement of devices. Nanocluster deposition is a promising new method for the growth of structured thin films. Nanoclusters are small aggregates of atoms or molecules, ranging in size from only a few nanometers up to several hundreds of nanometers in diameter. Due to their large surface-to-volume ratio, and the confinement of atoms and electrons in all three dimensions, nanoclusters exhibit a wide variety of exotic properties that differ notably from those of both single atoms and bulk materials. Nanoclusters are a completely new type of building block for thin film deposition. As preformed entities, clusters provide a new means of tailoring the properties of thin films before their growth, simply by changing the size or composition of the clusters that are to be deposited. In contrast to contemporary methods of thin film growth, which mainly rely on the deposition of single atoms, cluster deposition also allows for a more precise assembly of thin films, as the configuration of single atoms with respect to each other is already predetermined in clusters. Nanocluster deposition offers the possibility of coating virtually any material with a nanostructured thin film, and thereby enhancing already existing physical or chemical properties or adding some exciting new feature. A clearer understanding of cluster-surface interactions, and of the growth of thin films by cluster deposition, must, however, be achieved if clusters are to be successfully used in thin film technologies. Using a combination of experimental techniques and molecular dynamics simulations, both the deposition of nanoclusters and the growth and modification of cluster-assembled thin films are studied in this thesis. Emphasis is placed on understanding the interaction between metal clusters and surfaces, and thereby the behaviour of these clusters during deposition and thin film growth. The behaviour of single metal clusters as they impact on clean metal surfaces is analysed in detail, showing that there exists a limit, dependent on cluster size and deposition energy, below which epitaxial alignment occurs. If larger clusters are deposited at low energies, or cluster-surface interactions are weaker, non-epitaxial deposition takes place, resulting in the formation of nanocrystalline structures. The effect of cluster size and deposition energy on the morphology of cluster-assembled thin films is also determined, showing that nanocrystalline cluster-assembled films are porous. Modification of these thin films, with the purpose of enhancing their mechanical properties and durability without destroying their nanostructure, is presented. Irradiation with heavy ions is introduced as a feasible method for increasing the density, and thereby the mechanical stability, of cluster-assembled thin films without critically destroying their nanocrystalline properties. The results of this thesis demonstrate that nanocluster deposition is a suitable technique for the growth of nanostructured thin films. The interactions between nanoclusters and their supporting surfaces must, however, be carefully considered if a controlled growth of cluster-assembled thin films with precisely tailored properties is to be achieved.
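As a back-of-the-envelope companion to the deposition simulations, the sketch below converts a cluster size and deposition energy per atom into total kinetic energy and impact velocity. The copper parameters and chosen values are illustrative only; the actual size and energy limit for epitaxial alignment is determined in the thesis by molecular dynamics, not by this simple bookkeeping.

    import numpy as np

    # Kinetic bookkeeping for a deposited metal cluster (illustrative values only).
    N = 1000                   # number of atoms in the cluster
    E_per_atom_eV = 0.5        # deposition energy per atom [eV]
    m_Cu = 63.55 * 1.6605e-27  # mass of a Cu atom [kg]
    eV = 1.602e-19             # [J]

    E_total = N * E_per_atom_eV * eV                 # total kinetic energy [J]
    v = np.sqrt(2.0 * E_per_atom_eV * eV / m_Cu)     # impact velocity [m/s]
    print(f"E_total = {E_total:.2e} J, impact velocity = {v:.0f} m/s")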

Abstract:

Aerosol particles play a role in the Earth's ecosystem and affect human health. A significant pathway for producing aerosol particles in the atmosphere is new particle formation, in which condensable vapours nucleate and the newly formed clusters grow by condensation and coagulation. However, this phenomenon is still not fully understood. This thesis brings insight into new particle formation from an experimental point of view. Laboratory experiments were conducted both on the nucleation process and on physicochemical properties related to new particle formation. Nucleation rate measurements are used to test nucleation theories. These theories, in turn, are used to predict nucleation rates in atmospheric conditions. However, nucleation rate measurements have proven quite difficult to conduct, as different devices can yield nucleation rates that differ by several orders of magnitude for the same substances. In this thesis, work has been done to gain a better understanding of nucleation measurements, especially those conducted in a laminar flow diffusion chamber. Systematic studies of nucleation were also made for future verification of nucleation theories. Surface tensions and densities of substances related to atmospheric new particle formation were measured. The ternary sulphuric acid + ammonia + water system is a proposed candidate for participating in atmospheric nucleation. Surface tensions of an alternative candidate for nucleation in boreal forest areas, sulphuric acid + dimethylamine + water, were also measured. Binary mixtures of organic acids and water are possible candidates for participating in the early growth of freshly nucleated particles. All the measured surface tensions and densities were fitted with equations, thermodynamically consistent where possible, so that they can easily be applied to atmospheric model calculations of nucleation and the subsequent evolution of particle size.
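To illustrate what fitting the measured properties looks like in practice, the sketch below performs a simple linear fit of surface tension against temperature; the data points are invented placeholders, not the measured values reported in the thesis, and the actual fits used there may take more elaborate, thermodynamically consistent forms.

    import numpy as np

    # Hypothetical surface tensions of an aqueous mixture [mN/m] vs temperature [K]
    T = np.array([278.15, 288.15, 298.15, 308.15])
    sigma = np.array([74.1, 72.8, 71.2, 69.9])

    # Linear fit sigma(T) = a + b*T, a common first approximation over a narrow T range
    b, a = np.polyfit(T, sigma, 1)
    print(f"sigma(T) ~ {a:.1f} + ({b:.3f})*T  [mN/m]")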

Abstract:

This thesis reports investigations into the paper wetting process and its effects on the surface roughness and the out-of-plane (ZD) stiffness of machine-made paper. The aim of this work was to test the feasibility of employing air-borne ultrasound methods to determine the surface roughness (by reflection) and ZD stiffness (by through transmission) of paper during the penetration of distilled water, isopropanol and their mixtures. Air-borne ultrasound provides a non-contacting way to evaluate sample structure and mechanics during the liquid penetration event. Unlike liquid immersion techniques, an air-borne measurement allows the study of partial wetting of paper. In addition, two optical methods were developed to reveal the liquid location in paper during wetting. The laser light through-transmission method was developed to monitor the liquid location in partially wetted paper. The white light reflection method was primarily used to monitor the penetration of the liquid front in the thickness direction. In the latter experiment the paper was fully wetted. The main results of the thesis were: 1) liquid-penetration-induced surface roughening was quantified by monitoring the ultrasound reflection from the paper surface; 2) liquid-penetration-induced stiffness alteration in the ZD of paper could be followed by measuring the change in the ultrasound ZD resonance of the paper; 3) through-transmitted light revealed the liquid location in partially wetted paper; 4) liquid movement in the ZD of the paper could be observed by light reflection. The results imply that the presented ultrasonic means can measure, without contact, the alteration of paper roughness and stiffness during liquid transport. These methods can help avoid over-engineering the paper, which reduces raw material and energy consumption in paper manufacturing. The presented optical means can estimate paper-specific wetting properties, such as liquid penetration speed, transport mechanisms and liquid location within the paper structure. In process monitoring, these methods allow process tuning and the manufacturing of paper with engineered liquid transport characteristics. With such knowledge the paper behaviour during printing can be predicted. These findings provide new methods for paper printing, surface sizing, and paper coating research.
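As an example of how a through-transmission measurement relates to ZD stiffness, the sketch below uses the fundamental half-wave thickness resonance of a plate, f1 = v/(2d), together with C = rho*v^2. The thickness, density and frequency values are illustrative placeholders, not measured paper data from the thesis.

    # ZD stiffness estimate from an ultrasound thickness resonance (illustrative numbers).
    d = 100e-6       # paper thickness [m]
    rho = 800.0      # apparent density [kg/m^3]
    f1 = 1.6e6       # fundamental thickness resonance frequency [Hz]

    v = 2.0 * d * f1            # out-of-plane sound speed [m/s], from f1 = v/(2d)
    C_zd = rho * v**2           # ZD elastic stiffness [Pa]
    print(f"v = {v:.0f} m/s, ZD stiffness = {C_zd/1e6:.0f} MPa")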

Abstract:

Among the most striking natural phenomena affecting ozone are solar proton events (SPE), during which high-energy protons precipitate into the middle atmosphere in the polar regions. Ionisation caused by the protons results in changes in the lower ionosphere, and in the production of neutral odd nitrogen and odd hydrogen species which then destroy ozone in well-known catalytic chemical reaction chains. Large SPEs are able to decrease the ozone concentration of the upper stratosphere and mesosphere, but are not expected to significantly affect the ozone layer at 15-30 km altitude. In this work we have used the Sodankylä Ion and Neutral Chemistry Model (SIC) in studies of the short-term effects caused by SPEs. The model results were found to be in good agreement with ionospheric observations from incoherent scatter radars, riometers, and VLF radio receivers, as well as with measurements from the GOMOS/Envisat satellite instrument. For the first time, GOMOS was able to observe the SPE effects on odd nitrogen and ozone in the winter polar region. Ozone observations from GOMOS were validated against those from the MIPAS/Envisat instrument, and good agreement was found throughout the middle atmosphere. For the case of the SPE of October/November 2003, long-term ozone depletion was observed in the upper stratosphere. The depletion was further enhanced by the descent of odd nitrogen from the mesosphere inside the polar vortex, until recovery occurred in late December. During the event, substantial diurnal variation of the ozone depletion was seen in the mesosphere, caused mainly by the strong diurnal cycle of the odd hydrogen species. In the lower ionosphere, SPEs increase the electron density, which is very low in normal conditions. Therefore, SPEs make radar observations easier. In the case of the SPE of October 1989, we studied the sunset transition of negative charge from electrons to ions, a long-standing problem. The observed phenomenon, which is controlled by the amount of solar radiation, was successfully explained by considering twilight changes in both the rate of photodetachment of negative ions and the concentrations of minor neutral species. Changes in the magnetic field of the Earth control the extent of the SPE-affected area. For the SPE of November 2001, the results indicated that for low and middle levels of geomagnetic disturbance the estimated cosmic radio noise absorption levels based on a magnetic field model are in good agreement with ionospheric observations. For high levels of disturbance, the model overestimates the stretching of the geomagnetic field and the geographical extent of the SPE-affected area. This work shows the importance of ionosphere-atmosphere interaction for SPE studies. By using both ionospheric and atmospheric observations, we have been able to cover for the most part the whole chain of SPE-triggered processes, from proton-induced ionisation to the depletion of ozone.
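For orientation, a minimal sketch of the simplified lower-ionosphere charge balance often quoted alongside detailed ion-chemistry models such as SIC: in quasi-equilibrium the proton-induced ionisation rate q is balanced by recombination, q = alpha_eff*(1+lambda)*Ne^2, where lambda is the ratio of negative ions to electrons. All numbers below are illustrative, and the thesis results rely on the full SIC chemistry rather than this approximation.

    import numpy as np

    # Quasi-equilibrium D-region electron density under proton-induced ionisation.
    q = 1.0e3            # ion-pair production rate [cm^-3 s^-1], illustrative SPE value
    alpha_eff = 3.0e-7   # effective recombination coefficient [cm^3 s^-1], illustrative
    lam = 0.5            # ratio of negative ions to electrons, illustrative daytime-like value

    Ne = np.sqrt(q / (alpha_eff * (1.0 + lam)))
    print(f"electron density ~ {Ne:.1e} cm^-3")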

Abstract:

Radiation therapy (RT) currently plays a significant role in the curative treatment of several cancers. External beam RT is carried out mostly using megavoltage beams from linear accelerators. Tumour eradication and normal tissue complications correlate with the dose absorbed in the tissues. This dependence is normally steep, and it is therefore crucial that the actual dose delivered to the patient accurately corresponds to the planned dose. All factors in an RT procedure contain uncertainties, requiring strict quality assurance. From the hospital physicist's point of view, technical quality control (QC), dose calculations and methods for verifying the correct treatment location are the most important subjects. The most important factor in technical QC is the verification that the radiation production of an accelerator, called the output, is within narrow acceptable limits. The output measurements are carried out according to a locally chosen dosimetric QC program defining the measurement time interval and action levels. Dose calculation algorithms need to be configured for the accelerators using measured beam data. The uncertainty of such data sets the limit for the best achievable calculation accuracy. All these dosimetric measurements require considerable experience, are laborious, take up resources needed for treatments, and are prone to several random and systematic sources of error. Appropriate verification of the treatment location is more important in intensity modulated radiation therapy (IMRT) than in conventional RT. This is due to the steep dose gradients produced within or close to healthy tissues located only a few millimetres from the targeted volume. The thesis concentrated on investigating the quality of dosimetric measurements, the efficacy of dosimetric QC programs, the verification of measured beam data, and the effect of positional errors on the dose received by the major salivary glands in head and neck IMRT. A method was developed for estimating the effect of using different dosimetric QC programs on the overall uncertainty of the dose, and data were provided to facilitate the choice of a sufficient QC program. The method takes into account the local output stability and the reproducibility of the dosimetric QC measurements. A method based on model fitting of the QC measurement results was proposed for the estimation of both of these factors. The reduction of random measurement errors and the optimization of the QC procedure were also investigated, and a method and suggestions were presented for these purposes. The accuracy of beam data was evaluated in Finnish RT centres, and a sufficient accuracy level was estimated for the beam data. A method based on the use of reference beam data was developed for the QC of beam data. Dosimetric and geometric accuracy requirements were evaluated for head and neck IMRT when the function of the major salivary glands is intended to be spared; these criteria are based on the dose response obtained for the glands. Random measurement errors could be reduced, enabling the lowering of action levels and the prolongation of the measurement time interval from 1 month to as much as 6 months while maintaining dose accuracy. The combined effect of the proposed methods, suggestions and criteria was found to help avoid maximal dose errors of up to about 8%. In addition, their use may make the strictest recommended overall dose accuracy level of 3% (1 SD) achievable.
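A minimal sketch of the model-fitting idea for dosimetric QC results, assuming a simple linear output drift: the fitted slope characterises output stability and the residual scatter the measurement reproducibility. The monthly output values below are invented for illustration, and the analysis in the thesis is more complete than this sketch.

    import numpy as np

    # Hypothetical monthly accelerator output QC results (ratio to baseline, %)
    months = np.arange(12)
    output = np.array([100.0, 100.2, 99.9, 100.3, 100.4, 100.1,
                       100.5, 100.4, 100.6, 100.5, 100.8, 100.7])

    slope, intercept = np.polyfit(months, output, 1)     # output drift per month
    residual_sd = np.std(output - (slope * months + intercept), ddof=2)
    print(f"drift = {slope:.2f} %/month, reproducibility (1 SD) = {residual_sd:.2f} %")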

Abstract:

Aerosol particles in the atmosphere are known to significantly influence ecosystems, to change air quality and to exert negative health effects. Atmospheric aerosols influence climate by cooling the atmosphere and the underlying surface through the scattering of sunlight, by warming the atmosphere through the absorption of sunlight and of thermal radiation emitted by the Earth's surface, and by acting as cloud condensation nuclei. Aerosols are emitted from both natural and anthropogenic sources. Depending on their size, they can be transported over significant distances while undergoing considerable changes in their composition and physical properties. Their lifetime in the atmosphere varies from a few hours to a week. New particle formation is a result of gas-to-particle conversion. Once formed, atmospheric aerosol particles may grow by condensation or coagulation, or be removed by deposition processes. In this thesis we describe analyses of air masses, meteorological parameters and synoptic situations to reveal conditions favourable for new particle formation in the atmosphere. We studied the concentration of ultrafine particles in different types of air masses, and the role of atmospheric fronts and cloudiness in the formation of atmospheric aerosol particles. The dominant role of Arctic and Polar air masses in causing new particle formation was clearly observed at Hyytiälä, Southern Finland, during all seasons, as well as at other measurement stations in Scandinavia. In all seasons and on a multi-year average, Arctic and North Atlantic areas were the sources of nucleation mode particles. In contrast, concentrations of accumulation mode particles and condensation sink values in Hyytiälä were highest in continental air masses arriving at Hyytiälä from Eastern Europe and Central Russia. The most favourable situation for new particle formation during all seasons was cold air advection after cold-front passages. Such a period could last a few days until the next front reached Hyytiälä. The frequency of aerosol particle formation is related to the frequency of days with a low cloud amount in Hyytiälä. Cloudiness of less than 5 octas is one of the factors favouring new particle formation, whereas cloudiness above 4 octas appears to be an important factor preventing particle growth, due to the decrease of solar radiation, which is one of the important meteorological parameters in atmospheric particle formation and growth. Keywords: atmospheric aerosols, particle formation, air mass, atmospheric front, cloudiness
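For reference, the condensation sink mentioned above is computed from the measured particle size distribution. A minimal sketch follows, assuming the commonly used Fuchs-Sutugin transition-regime correction; the two-bin size distribution and the vapour properties are invented placeholders, not Hyytiälä data.

    import numpy as np

    # Condensation sink CS = 4*pi*D * sum(beta_i * r_i * N_i) over the size distribution.
    D = 1.0e-5            # vapour diffusion coefficient [m^2/s], illustrative
    mfp = 1.0e-7          # vapour mean free path [m], illustrative
    alpha = 1.0           # mass accommodation coefficient

    r = np.array([50e-9, 200e-9])        # particle radii [m], made-up two-bin distribution
    N = np.array([1.0e9, 1.0e8])         # number concentrations [m^-3]

    Kn = mfp / r                         # Knudsen number per bin
    beta = (1.0 + Kn) / (1.0 + (4.0/(3.0*alpha) + 0.377)*Kn + (4.0/(3.0*alpha))*Kn**2)
    CS = 4.0 * np.pi * D * np.sum(beta * r * N)
    print(f"condensation sink = {CS:.1e} 1/s")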

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields, ranging from atmospheric sciences to nanotechnology and even to cosmology. This is due to the fact that in most phase transitions the new phase is separated from the mother phase by a free energy barrier. This barrier is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as well as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place at given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few or some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to the Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory by Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that the Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension and the planar surface tension directly from cluster simulations. We further show that the size dependence of the cluster surface tension at the equimolar surface is a function of virial coefficients, a result confirmed by our cluster simulations.
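For reference, the liquid-drop work of formation used by the Classical Nucleation Theory is W(n) = -n*kT*ln(S) + (36*pi)^(1/3)*v_l^(2/3)*sigma*n^(2/3). The sketch below evaluates it and locates the critical cluster; the water-like property values and the saturation ratio are illustrative, and the Monte Carlo corrections developed in the thesis are not included.

    import numpy as np

    # CNT liquid-drop work of formation and critical cluster (illustrative, water-like values).
    kB = 1.380649e-23      # Boltzmann constant [J/K]
    T = 298.15             # temperature [K]
    S = 4.0                # saturation ratio
    sigma = 0.072          # planar surface tension [N/m]
    v_l = 3.0e-29          # molecular volume in the liquid [m^3]

    n = np.arange(2, 400)
    W = -n * kB * T * np.log(S) + (36.0 * np.pi)**(1/3) * v_l**(2/3) * sigma * n**(2/3)
    i_star = np.argmax(W)
    print(f"critical size n* = {n[i_star]}, barrier = {W[i_star]/(kB*T):.1f} kT")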

Abstract:

Volatile organic compounds emitted by nature, especially from forests, can affect local and regional air quality because they react in the atmosphere. Their reaction products can also participate in the formation and growth of new particles, which may influence the radiation balance of the atmosphere and thereby also the climate. Particles absorb and scatter solar radiation and the thermal radiation of the Earth, and in addition they affect the radiative properties, amount and lifetime of clouds. On a global scale, biogenic hydrocarbon emissions exceed the emissions caused by human activity many times over. The assessment of natural emissions is therefore important when effective air quality and climate strategies are to be developed. This study deals with the hydrocarbon emissions of the boreal forest. The boreal forest, the northern coniferous forest, is the largest terrestrial ecosystem, extending as an almost continuous belt around the entire Northern Hemisphere. It is characterised by a relatively small number of tree species and by strong seasonal variations in conditions and growth. This work studied the seasonal variation of the hydrocarbon emissions of Scots pine, the most common boreal tree in Finland, as well as the dependence of the emissions on temperature and light. The results were used, together with emission measurements on other boreal trees, in an emission model developed for Finnish forests. The model is additionally based on land-use data, a classification developed for Finnish forests, and meteorological data, from which it calculates the hydrocarbon emissions of the forests over the growing season. Throughout the growing season, the emissions of Finnish forests consist largely of alpha- and beta-pinene and delta-carene. In summer and autumn the emissions also contain large amounts of sabinene, which originates especially from deciduous trees. The emissions follow the average variation of temperature, are largest in the southern parts of the country and decrease steadily towards the north. The isoprene emission of the forest is relatively small: in Finland the most important isoprene-emitting tree is Norway spruce, a low emitter, because the high-emitting willow and aspen make up only a very small fraction of the leaf biomass of the forest. This work also produced the first estimate of the sesquiterpene emissions of the forest. The sesquiterpene emissions begin after Midsummer and are of the same order of magnitude as the isoprene emissions during the growing season. On an annual level, the hydrocarbon emissions of Finnish forests are about twice as large as the emissions caused by human activity.
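The temperature dependence measured for monoterpene emissions is commonly described with a Guenther-type exponential algorithm, E = E_s*exp(beta*(T - T_s)). A minimal sketch follows, in which the standardised emission potential E_s and the example temperatures are placeholder values rather than the results reported in the thesis.

    import numpy as np

    # Guenther-type temperature-dependent monoterpene emission algorithm (illustrative values).
    E_s = 1.5        # standardised emission potential [ug g^-1 h^-1], placeholder
    beta = 0.09      # temperature sensitivity [1/K], commonly used value
    T_s = 303.15     # standard temperature [K]

    T = np.array([283.15, 293.15, 303.15])        # example needle temperatures [K]
    E = E_s * np.exp(beta * (T - T_s))            # emission rate at each temperature
    for Ti, Ei in zip(T, E):
        print(f"T = {Ti - 273.15:.0f} C -> E = {Ei:.2f} ug/(g h)")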

Abstract:

Nucleation is the first step in the formation of a new phase inside a mother phase. Two main forms of nucleation can be distinguished. In homogeneous nucleation, the new phase is formed in a uniform substance. In heterogeneous nucleation, on the other hand, the new phase emerges on a pre-existing surface (nucleation site). Nucleation is the source of about 30% of all atmospheric aerosol, which in turn has noticeable health effects and a significant impact on climate. Nucleation can be observed in the atmosphere, studied experimentally in the laboratory, and is the subject of ongoing theoretical research. This thesis attempts to be a link between experiment and theory. By comparing simulation results to experimental data, the aim is to (i) better understand the experiments and (ii) determine where the theory needs improvement. Computational fluid dynamics (CFD) tools were used to simulate homogeneous one-component nucleation of n-alcohols in argon and helium as carrier gases, homogeneous nucleation in the water-sulfuric acid system, and heterogeneous nucleation of water vapor on silver particles. In the nucleation of n-alcohols, vapor depletion, the carrier gas effect and the carrier gas pressure effect were evaluated, with a special focus on the pressure effect, whose dependence on vapor and carrier gas properties could be specified. The investigation of nucleation in the water-sulfuric acid system included a thorough analysis of the experimental setup, determining the flow conditions, vapor losses and the nucleation zone. Experimental nucleation rates were compared to various theoretical approaches. We found that none of the considered theoretical descriptions of nucleation captured the role of water in the process at all relative humidities. Heterogeneous nucleation was studied in the activation of silver particles in a TSI 3785 particle counter, which uses water as its working fluid. The role of the contact angle was investigated, and the influence of incoming particle concentrations and homogeneous nucleation on the counting efficiency was determined.
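For reference on the role of the contact angle in heterogeneous nucleation, the sketch below evaluates the classical Fletcher geometric factor f(m, x) that scales the homogeneous nucleation barrier for a spherical seed particle. The contact angles and size ratio are illustrative, and this is not the CFD model used in the thesis.

    import numpy as np

    def fletcher_f(theta_deg, x):
        """Fletcher (1958) geometric factor for nucleation on a spherical seed.
        theta_deg: contact angle [deg]; x: seed radius / critical cluster radius."""
        m = np.cos(np.radians(theta_deg))
        g = np.sqrt(1.0 + x**2 - 2.0*m*x)
        return 0.5 * (1.0 + ((1.0 - m*x)/g)**3
                      + x**3 * (2.0 - 3.0*(x - m)/g + ((x - m)/g)**3)
                      + 3.0*m*x**2 * ((x - m)/g - 1.0))

    # Barrier reduction: dG*_het = f * dG*_hom
    print(fletcher_f(30.0, 10.0))   # small contact angle, large seed -> strong barrier reduction
    print(fletcher_f(90.0, 10.0))   # 90 deg contact angle -> roughly half of the homogeneous barrier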

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, cluster computing technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation toolkit, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled cycle of simulation and data analysis; typically, a Geant4 computer experiment is used to understand test beam measurements. Thus another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, the full CMS detector description, and event reconstruction. Using the ROOT data analysis framework we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
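As an illustration of the ANN-based tagging approach, a minimal sketch follows using scikit-learn as a stand-in for the ROOT-based implementation of the thesis; the two discriminating variables and the synthetic signal and background samples are invented for the example.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-ins for two b-jet discriminating variables
    # (e.g. impact-parameter significance and secondary-vertex mass).
    rng = np.random.default_rng(0)
    n = 5000
    signal = np.column_stack([rng.normal(3.0, 1.5, n), rng.normal(2.0, 0.8, n)])
    background = np.column_stack([rng.normal(0.5, 1.0, n), rng.normal(0.8, 0.6, n)])
    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    nn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
    nn.fit(X_tr, y_tr)                      # train the signal/background classifier
    print(f"test accuracy: {nn.score(X_te, y_te):.2f}")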