25 results for DOUBLE-SLIT EXPERIMENTS
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
In this thesis, three experiments with atomic hydrogen (H) at low temperatures T < 1 K are presented. The experiments were carried out with two-dimensional (2D) and three-dimensional (3D) H gas, and with H atoms trapped in a solid H2 matrix. The main focus of this work is on interatomic interactions, which have certain specific features in the three systems considered. A common feature is the very high density of atomic hydrogen; the systems are close to quantum degeneracy. Short-range interactions in collisions between atoms are important in gaseous H. The system of H in H2 differs dramatically because the atoms remain fixed in the H2 lattice and the properties are governed by long-range interactions with the solid matrix and with other H atoms. The main tools in our studies were the methods of magnetic resonance, with electron spin resonance (ESR) at 128 GHz used as the principal detection method. For the first time in experiments with H in high magnetic fields and at low temperatures, we combined ESR and NMR to perform electron-nuclear double resonance (ENDOR) as well as coherent two-photon spectroscopy. This allowed us to distinguish between different types of interactions in the magnetic resonance spectra. Experiments with 2D H gas utilized the thermal compression method in a homogeneous magnetic field, developed in our laboratory. In this work, methods were developed for direct studies of 3D H at high density and for creating high-density samples of H in H2. We measured magnetic resonance line shifts due to collisions in the 2D and 3D H gases. First we observed that the cold collision shift in 2D H gas composed of atoms in a single hyperfine state is much smaller than predicted by mean-field theory. This motivated us to carry out similar experiments with 3D H. In 3D H the cold collision shift was found to be an order of magnitude smaller for atoms in a single hyperfine state than for a mixture of atoms in two different hyperfine states. The collisional shifts were found to be in fair agreement with the theory that takes into account symmetrization of the wave functions of the colliding atoms. The origin of the small shift in 2D H composed of single-hyperfine-state atoms is not yet understood. The measurement of the shift in 3D H provides an experimental determination of the difference of the scattering lengths of ground-state atoms. The experiment with H atoms captured in an H2 matrix at temperatures below 1 K originated from our work with H gas. We found that samples of H in H2 were formed during recombination of gas-phase H, enabling sample preparation at temperatures below 0.5 K. Alternatively, we created the samples by electron-impact dissociation of H2 molecules in situ in the solid. By the latter method we reached the highest densities of H atoms reported so far, 3.5(5)×10^19 cm^-3. The H atoms were found to be stable for weeks at temperatures below 0.5 K. The observation of dipolar interaction effects provides a verification of the density measurement. Our results point to two different sites for H atoms in the H2 lattice. The steady-state nuclear polarizations of the atoms were found to be non-thermal. The possibility of further increasing the impurity H density is considered. At higher densities and lower temperatures it might be possible to observe phenomena related to quantum degeneracy in the solid.
Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of the materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features present in a complete solid/liquid separation process. Furthermore, most filtration theories require experimental work to be performed in order to obtain the critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter. It is also possible to create experimental designs for cases where the variables are entirely user defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of its practical applications forms the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when performing filtration tests.
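The kind of workflow the abstract describes, generating an experimental design and fitting a first-order regression model to the measured responses, can be illustrated with a short sketch. This is not the LTDoE/LTRead software itself; the factor names, coded levels and response values below are assumed purely for illustration.

```python
import itertools
import numpy as np

# Minimal sketch (assumed example, not the LTDoE/LTRead software):
# a two-level full factorial design for three hypothetical filtration
# variables, and a first-order linear regression model for one response
# (e.g. cake moisture) fitted by least squares.

factors = ["pressure", "slurry_concentration", "filtration_time"]   # assumed names
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # coded levels

# hypothetical measured responses, one per experimental run
y = np.array([18.2, 16.9, 17.5, 16.1, 15.8, 14.9, 15.2, 14.0])

# model: y = b0 + b1*x1 + b2*x2 + b3*x3
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print("intercept:", coef[0])
for name, b in zip(factors, coef[1:]):
    print(f"effect of {name}: {2 * b:.2f}")   # main effect = 2 * coded coefficient
```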
Abstract:
Summary: Experimental arrangements for simulating the effects of elevated temperature and CO2 levels on field crops in Finland
Abstract:
The future of high-technology welded constructions will be characterised by higher-strength materials and improved weld quality with respect to fatigue resistance. The expected implementation of high-quality, high-strength steel welds will require that more attention be given to the issues of crack initiation and mechanical mismatching. Experiments and finite element analyses were performed within the framework of continuum damage mechanics to investigate the effect of mismatching of welded joints on void nucleation and coalescence during monotonic loading. It was found that the damage of undermatched joints mainly occurred in the sandwich layer, and the damage resistance of the joints decreases as the sandwich layer width decreases. The damage of overmatched joints mainly occurred in the base metal adjacent to the sandwich layer, and the damage resistance of the joints increases as the sandwich layer width decreases. The mechanisms of initiation of the micro voids/cracks were found to be cracking of the inclusions or the embrittled second phase, and debonding of the inclusions from the matrix. Experimental fatigue crack growth rate testing showed that the fatigue life of undermatched centre-cracked panel specimens is longer than that of overmatched and evenmatched specimens. Further investigation by elastic-plastic finite element analysis indicated that fatigue crack closure, which originated from the inhomogeneous yielding adjacent to the crack tip, played an important role in fatigue crack propagation. The applicability of the J-integral concept to the mismatched specimens with crack extension under cyclic loading was assessed. The concept of fatigue class used by the International Institute of Welding was introduced in the parametric numerical analysis of several welded joints. The effect of weld geometry and load condition on the fatigue strength of ferrite-pearlite steel joints was systematically evaluated based on linear elastic fracture mechanics. Joint types included lap joints, angle joints and butt joints. Various combinations of tensile and bending loads were considered during the evaluation, with emphasis on the existence of both root and toe cracks. For a lap joint with a small lack of penetration, a reasonably large weld leg and a smaller flank angle are recommended in engineering practice in order to achieve higher fatigue strength. It was found that the fatigue strength of the angle joint depended strongly on the location and orientation of the pre-existing crack-like welding defects, even if the joint was welded with full penetration. It is commonly believed that double-sided butt welds can have significantly higher fatigue strength than single-sided welds, but fatigue crack initiation and propagation can originate from the weld root if the welding procedure results in partial penetration. It is clearly shown that the fatigue strength of the butt joint can be improved remarkably by ensuring full penetration. Nevertheless, increasing the fatigue strength of a butt joint by increasing the size of the weld is an uneconomical alternative.
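Fatigue evaluations based on linear elastic fracture mechanics, as mentioned above, are commonly built on integrating a crack growth law over the crack size. The sketch below integrates the Paris law numerically; the material constants, stress range, geometry factor and crack sizes are assumed illustrative values, not data from the thesis.

```python
import numpy as np

# Illustrative sketch only: integrate the Paris law da/dN = C * (dK)^m,
# the kind of LEFM relation used when assessing fatigue life of welded joints.
# All numbers below are assumptions for the example.

C, m = 3e-12, 3.0            # Paris constants (units consistent with MPa*sqrt(m))
delta_sigma = 100.0          # applied stress range [MPa], assumed
Y = 1.12                     # geometry factor for an edge crack, assumed constant
a0, af = 0.5e-3, 10e-3       # initial and final crack depths [m], assumed

def delta_K(a):
    """Stress intensity factor range for crack depth a."""
    return Y * delta_sigma * np.sqrt(np.pi * a)

# numerical integration of N = integral from a0 to af of da / (C * dK^m)
a = np.linspace(a0, af, 10_000)
dN_da = 1.0 / (C * delta_K(a) ** m)
N = np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a))   # trapezoidal rule
print(f"estimated cycles to grow crack from {a0*1e3:.1f} mm to {af*1e3:.1f} mm: {N:.3e}")
```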
Abstract:
Digitized 12 October 2007.
Abstract:
Solid-state silicon detectors have replaced conventional ones in almost all recent high-energy physics experiments. Pixel silicon sensors have no alternative in the region near the interaction point because of their high resolution and fast operation speed. However, present detectors can hardly withstand high radiation doses. The forthcoming upgrade of the LHC in 2014 requires the development of a new generation of pixel detectors able to operate under a tenfold increase in luminosity. The planar fabrication technique has physical limitations: improving the radiation hardness reduces the sensitivity of the detector. A 3D pixel detector therefore seems to be the most promising device to overcome these difficulties. The objective of this work was to model the structure of a 3D stripixel detector and to simulate the electrical characteristics of the device. The Silvaco Atlas software was used for these purposes. The structures of single- and double-sided dual-column detectors with active edges were described using the dedicated command language. Simulations of these detectors have shown that the electric field inside the active area has a more uniform distribution than in the planar structure. A smaller inter-electrode spacing leads to a stronger field and also decreases the collection time, which makes the new type of detector more radiation resistant. Other advantages found are a lower full depletion voltage and an increased charge collection efficiency. The 3D stripixel detectors have thus demonstrated improved characteristics and will be a suitable replacement for planar ones.
Abstract:
This dissertation is based on five articles dealing with the reaction mechanisms of the following industrially important organic reactions: (1) dehydrocyclization of n-butylbenzene to produce naphthalene; (2) dehydrocyclization of 1-(p-tolyl)-2-methylbutane (MB) to produce 2,6-dimethylnaphthalene; (3) esterification of neopentyl glycol (NPG) with different carboxylic acids to produce monoesters; and (4) skeletal isomerization of 1-pentene to produce 2-methyl-1-butene and 2-methyl-2-butene. The results of initial- and integral-rate experiments on n-butylbenzene dehydrocyclization over a self-made chromia/alumina catalyst were applied when investigating reaction 2. Reaction 2 was performed using commercial chromia/alumina catalysts of different acidity, platinum on silica, and vanadium/calcium/alumina as catalysts. On all catalysts used for the dehydrocyclization, the major reactions were fragmentation of MB and of 1-(p-tolyl)-2-methylbutenes (MBes), dehydrogenation of MB, double bond transfer, hydrogenation, and 1,6-cyclization of MBes. Minor reactions were 1,5-cyclization of MBes and methyl group fragmentation of the 1,6-cyclization products. Esterification reactions of NPG were performed using three different carboxylic acids: propionic, isobutyric and 2-ethylhexanoic acid. Commercial heterogeneous gellular (Dowex 50WX2) and macroreticular (Amberlyst 15) resins and homogeneous para-toluene sulfonic acid were used as catalysts. First, NPG reacted with a carboxylic acid to form the corresponding monoester and water. The monoester then esterified with the carboxylic acid to form the corresponding diester. In the disproportionation reaction, two monoester molecules formed NPG and the corresponding diester. All three reactions can attain equilibrium. Concerning esterification, water was removed from the reactor in order to prevent the backward reaction. Skeletal isomerization experiments on 1-pentene were performed over an HZSM-22 catalyst. Isomerization reactions of three different kinds were detected: double bond, cis-trans and skeletal isomerization. Minor side reactions were dimerization and fragmentation. Monomolecular and bimolecular reaction mechanisms for skeletal isomerization explained the experimental results almost equally well. Pseudo-homogeneous kinetic parameters of reactions 1 and 2 were estimated by ordinary least-squares fitting. For reactions 3 and 4 the kinetic parameters were also estimated by the least-squares method, but in addition the possible cross-correlation and identifiability of the parameters were determined using the Markov chain Monte Carlo (MCMC) method. Finally, using the MCMC method, the estimation of model parameters and predictions were performed according to the Bayesian paradigm. According to the fitting results, the suggested reaction mechanisms explained the experimental results rather well. When the possible cross-correlation and identifiability of parameters (reactions 3 and 4) were determined using the MCMC method, the parameters were well identified and no pathological cross-correlation was seen between any parameter pair.
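As a rough illustration of the estimation strategy described above, a least-squares point estimate followed by MCMC sampling to probe identifiability and cross-correlation, the sketch below fits a toy two-parameter esterification-like model. The rate expressions, rate constants and data are invented for the example and do not reproduce the kinetic models of the thesis.

```python
import numpy as np
from scipy.optimize import least_squares

def simulate(k, t, c0):
    """Integrate a toy two-step kinetic model (NPG + acid -> monoester -> diester)
    with a simple Euler scheme; returns the monoester concentration profile."""
    k1, k2 = k
    npg, acid, mono = c0
    out, dt = [], t[1] - t[0]
    for _ in t:
        out.append(mono)
        r1 = k1 * npg * acid          # formation of monoester
        r2 = k2 * mono * acid         # formation of diester
        npg -= r1 * dt
        mono += (r1 - r2) * dt
        acid -= (r1 + r2) * dt
    return np.array(out)

# synthetic "measurements" generated from assumed true parameters
t = np.linspace(0.0, 10.0, 50)
c0 = (1.0, 1.2, 0.0)
y_obs = simulate([0.30, 0.05], t, c0) + np.random.default_rng(1).normal(0, 0.01, t.size)

# 1) point estimate by least squares
res = least_squares(lambda k: simulate(k, t, c0) - y_obs, x0=[0.1, 0.1], bounds=(0, np.inf))
k_hat = res.x

# 2) Metropolis MCMC around the fit to probe identifiability / cross-correlation
def log_post(k, sigma=0.01):
    if np.any(k <= 0):
        return -np.inf
    r = simulate(k, t, c0) - y_obs
    return -0.5 * np.sum((r / sigma) ** 2)

rng = np.random.default_rng(2)
chain, k_cur, lp_cur = [], k_hat.copy(), log_post(k_hat)
for _ in range(5000):
    k_prop = k_cur + rng.normal(0, 0.005, 2)
    lp_prop = log_post(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        k_cur, lp_cur = k_prop, lp_prop
    chain.append(k_cur.copy())
chain = np.array(chain)

print("least-squares estimate:", k_hat)
print("posterior correlation of k1, k2:", np.corrcoef(chain.T)[0, 1])
```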
Abstract:
Particle Image Velocimetry (PIV) is an optical measuring technique for obtaining velocity information from a flow of interest. With PIV it is possible to obtain two- or three-dimensional velocity vector fields from a measurement area instead of from a single point in the flow. The measured flow can be either liquid or gas. PIV is nowadays widely applied to flow field studies. In this work, PIV is needed to obtain validation data for the Computational Fluid Dynamics calculation programs that have been used to model blowdown experiments in the PPOOLEX test facility at Lappeenranta University of Technology. In this thesis, PIV and its theoretical background are presented. All the subsystems that can be considered part of a PIV system are also presented in detail. Emphasis is also put on the mathematics behind the image evaluation. The work also included the selection and successful testing of a PIV system, as well as planning of its installation in the PPOOLEX facility. Already in the preliminary testing, PIV was found to be a good addition to the measuring equipment of the Nuclear Safety Research Unit of LUT. The installation in the PPOOLEX facility was successful even though it was subject to many restrictions. All parts of the PIV system worked and were found to be appropriate for the planned use. The results and observations presented in this thesis form a good basis for further PIV use.
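The image evaluation mathematics referred to above centres on cross-correlating pairs of interrogation windows to find the most probable particle displacement between two exposures. A minimal sketch of FFT-based cross-correlation is given below, with a synthetic particle pattern standing in for real PIV images; it is an illustration of the general technique, not the software used in the thesis.

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the mean particle displacement between two interrogation
    windows via FFT-based circular cross-correlation (the core step of
    PIV image evaluation)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # cross-correlation computed through the frequency domain
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    dy, dx = np.array(peak) - center
    return dx, dy   # pixel displacement; velocity = displacement * scale / dt

# synthetic test: a random particle pattern shifted by (dx, dy) = (3, -2) pixels
rng = np.random.default_rng(0)
frame1 = rng.random((32, 32))
frame2 = np.roll(frame1, shift=(-2, 3), axis=(0, 1))
print(piv_displacement(frame1, frame2))   # expected roughly (3, -2)
```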
Abstract:
The purpose of this report is to disseminate best practices in the organization, implementation and development of double degree programmes between Russian and European universities. The findings reveal good developments in the field of double degree cooperation between Russian and European universities and high motivation on both sides. The report describes different models for building a joint curriculum and organizing academic mobility. Improving the foreign language skills of students and university staff, involving international companies, and pursuing a joint strategy and actions in marketing and quality assurance are some of the development points recommended in the report.
Abstract:
Protein engineering aims to improve the properties of enzymes and affinity reagents through genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and to subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for tailoring protein-based products used in a wide range of applications, from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms in order to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen binding fragment (Fab) display on filamentous phages in the third article, and in the fourth study, novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (scFv), in order to find the most suitable display platform for the library at hand. As a result of these studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed. The method increases the mutagenesis frequency in the final library to close to 100% and increases the number of transformants more than 100-fold compared with traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. The conclusion from the comparison of display formats was that the truncated capsid protein three (p3Δ) of filamentous phage was superior to the full-length p3 and protein nine (p9) in obtaining a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire as scFv-p3Δ provided the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how of directed evolution and contains useful information to support scientists working in the field.