32 results for kinetic method


Relevance: 20.00%

Abstract:

The ever-increasing demand for faster computers in areas ranging from consumer electronics to computational science is pushing the semiconductor industry towards its limits in decreasing the sizes of electronic devices based on conventional materials. According to the famous law by Gordon E. Moore, a co-founder of the world's largest semiconductor company Intel, transistor sizes must shrink to the atomic level within the next few decades to maintain the present rate of growth in computational power. Because leakage currents become a problem for traditional silicon-based devices already at the nanometer scale, an approach other than further miniaturization is needed to meet the needs of future electronics. A relatively recently proposed route to further progress in electronics is to replace silicon with carbon, another element from the same group of the periodic table. Carbon is an especially interesting material for nanometer-sized devices because it naturally forms different nanostructures, some of which have unique properties. The most widely suggested carbon allotrope for electronics is a tubular molecule whose atomic structure resembles that of graphite. These carbon nanotubes are popular both among scientists and in industry because of a wide range of exciting properties: for example, carbon nanotubes are electronically unique and have an uncommonly high strength-to-mass ratio, which has resulted in a multitude of proposed applications in several fields. In fact, owing to remaining difficulties in the large-scale production of nanotube-based electronic devices, fields other than electronics have been faster to develop profitable nanotube applications. In this thesis, the possibility of using low-energy ion irradiation to ease the route towards nanotube applications is studied through atomistic simulations at different levels of theory.
Specifically, molecular dynamics simulations with analytical interaction models are used to follow the irradiation of nanotubes and the introduction of different impurity atoms into these structures, in order to gain control over their electronic character. Ion irradiation is shown to be a very efficient method for replacing carbon atoms with boron or nitrogen impurities in single-walled nanotubes. Furthermore, potassium irradiation of multi-walled and fullerene-filled nanotubes is demonstrated to produce small potassium clusters in the hollow parts of these structures. Molecular dynamics simulations are further used to give an example of using irradiation to improve contacts between a nanotube and a silicon substrate. Methods based on density-functional theory are used to gain insight into the defect structures inevitably created during the irradiation. Finally, a new simulation code utilizing the kinetic Monte Carlo method is introduced to follow the time evolution of irradiation-induced defects in carbon nanotubes on macroscopic time scales. Overall, the molecular dynamics simulations presented in this thesis show that ion irradiation is a promising method for tailoring nanotube properties in a controlled manner. The calculations made with density-functional-theory-based methods indicate that it is energetically favorable for even relatively large defects to transform so as to keep the atomic configuration as close to that of the pristine nanotube as possible. The kinetic Monte Carlo studies reveal that elevated temperatures during processing significantly enhance the self-healing of nanotubes, ensuring low defect concentrations after treatment with energetic ions. Thereby, nanotubes can retain their desired properties even after irradiation.
Throughout the thesis, atomistic simulations combining different levels of theory are demonstrated to be an important tool for determining the optimal conditions for irradiation experiments, because the atomic-scale processes at short time scales are extremely difficult to study by any other means.
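The kinetic Monte Carlo approach mentioned above can be illustrated with a minimal sketch. Everything below is a toy stand-in, not the thesis code: defects hop on a 1-D lattice along the tube axis with an assumed Arrhenius rate (the barrier, attempt frequency, and lattice size are all hypothetical), and two defects that meet recombine, mimicking thermally enhanced self-healing.

```python
import math
import random

def kmc_anneal(n_defects, length, barrier_ev, temp_k, t_max, seed=1):
    """Toy KMC run: return the number of defects surviving after t_max."""
    rng = random.Random(seed)
    k_b = 8.617e-5                  # Boltzmann constant in eV/K
    nu0 = 1e13                      # attempt frequency in 1/s (assumed)
    rate = nu0 * math.exp(-barrier_ev / (k_b * temp_k))  # Arrhenius hop rate
    sites = {rng.randrange(length) for _ in range(n_defects)}
    t = 0.0
    while sites:
        # Exponentially distributed residence time for the next event.
        t += -math.log(rng.random()) / (rate * len(sites))
        if t > t_max:
            break                   # next event falls outside the run
        d = rng.choice(sorted(sites))
        sites.discard(d)
        new = (d + rng.choice((-1, 1))) % length
        if new in sites:
            sites.discard(new)      # two defects meet -> recombination
        else:
            sites.add(new)
    return len(sites)

# Elevated temperature -> faster hopping -> more self-healing within the
# same processing time, qualitatively as the KMC studies found.
cold = kmc_anneal(40, 200, 1.0, 300.0, 1e-6)
hot = kmc_anneal(40, 200, 1.0, 1000.0, 1e-6)
```

The rejection-free event loop (pick an event, advance the clock by an exponentially distributed residence time) is what lets KMC reach macroscopic time scales that molecular dynamics cannot.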

Relevance: 20.00%

Abstract:

Aerosols impact the planet and our daily lives through various effects, perhaps most notably their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remain a mystery. This thesis consists of studies on new particle formation specifically from the point of view of numerical modeling. The formation rate of 3 nm particles has been observed to depend on the sulphuric acid concentration raised to the power of 1-2. This suggests that the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words a mechanism based on activation or on kinetic collision of clusters. However, model studies have had difficulties in replicating the small exponents observed in nature. The work done in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the new and more accurate method presented here for determining the exponent indicates high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because they cover a large proportion of the Earth's surface, oceans could prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was then applied in a marine scenario.
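The exponent in a power-law dependence such as J = k[H2SO4]^n is commonly estimated by fitting log J against log [H2SO4]; the sketch below shows this standard least-squares procedure on synthetic data generated with n = 1.5 (the concentrations and rate constant are made up for illustration, not measurements from the thesis).

```python
import math

def fit_exponent(conc, rate):
    """Least-squares slope of log(rate) vs. log(conc) = power-law exponent."""
    xs = [math.log(c) for c in conc]
    ys = [math.log(j) for j in rate]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

conc = [1e6, 3e6, 1e7, 3e7, 1e8]              # molecules/cm^3 (synthetic)
rate = [2e-3 * c ** 1.5 / 1e9 for c in conc]  # exact power law with n = 1.5
exponent = fit_exponent(conc, rate)           # recovers 1.5
```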
The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate the concentrations of condensing organic vapors was developed. The algorithm utilizes a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one produced by an aerosol dynamics process model. The evaluation indicated excellent agreement with model data, and initial results with field data appear sound as well.
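The core idea of the Markov chain Monte Carlo estimation can be sketched with a deliberately simple one-parameter example: a random-walk Metropolis sampler infers a vapor concentration by comparing an "observed" quantity with the output of a forward model. The linear forward model, noise level, and starting point below are all assumptions for illustration; the thesis compares full particle size distributions against an aerosol dynamics model.

```python
import math
import random

def forward_model(vapor_conc):
    # Assumed toy model: growth rate proportional to vapor concentration.
    return 2.0e-7 * vapor_conc  # nm/h per (molecules/cm^3)

def log_likelihood(vapor_conc, observed, sigma):
    r = forward_model(vapor_conc) - observed
    return -0.5 * (r / sigma) ** 2

def mcmc(observed, sigma, n_steps=20000, seed=7):
    """Random-walk Metropolis sampling of the vapor concentration."""
    rng = random.Random(seed)
    x = 1.0e7                            # starting guess, molecules/cm^3
    ll = log_likelihood(x, observed, sigma)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, 5.0e5)         # random-walk proposal
        if prop > 0.0:
            ll_prop = log_likelihood(prop, observed, sigma)
            if math.log(rng.random()) < ll_prop - ll:
                x, ll = prop, ll_prop            # accept the move
        samples.append(x)
    return samples

true_conc = 2.5e7
samples = mcmc(observed=forward_model(true_conc), sigma=0.5)
burned = samples[5000:]                 # discard burn-in
posterior_mean = sum(burned) / len(burned)
```

The posterior mean recovers the concentration used to generate the "observation"; in the real application the same accept/reject machinery explores a multi-dimensional space of vapor concentrations.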

Relevance: 20.00%

Abstract:

Nucleation is the first step of the process by which gas molecules in the atmosphere condense to form liquid or solid particles. Despite the importance of atmospheric new-particle formation for both climate- and health-related issues, little information exists on its precise molecular-level mechanisms. In this thesis, potential nucleation mechanisms involving sulfuric acid together with either water and ammonia or reactive biogenic molecules are studied using quantum chemical methods. Quantum chemistry calculations are based on the numerical solution of Schrödinger's equation for a system of atoms and electrons, subject to various sets of approximations whose precise details give rise to a large number of model chemistries. A comparison of several different model chemistries indicates that the computational method must be chosen with care if accurate results for sulfuric acid - water - ammonia clusters are desired. Specifically, binding energies are incorrectly predicted by some popular density functionals, and vibrational anharmonicity must be accounted for if quantitatively reliable formation free energies are desired. The calculations reported in this thesis show that a combination of different high-level energy corrections and advanced thermochemical analysis can quantitatively replicate experimental results concerning the hydration of sulfuric acid. The role of ammonia in sulfuric acid - water nucleation was revealed by a series of calculations on molecular clusters of increasing size with respect to all three coordinates: sulfuric acid, water and ammonia. As indicated by experimental measurements, ammonia significantly assists cluster growth along the sulfuric acid coordinate. The calculations presented in this thesis predict that in atmospheric conditions this effect becomes important as the number of acid molecules increases from two to three.
On the other hand, small molecular clusters are unlikely to contain more than one ammonia molecule per sulfuric acid. This implies that the average NH3:H2SO4 mole ratio of small molecular clusters in atmospheric conditions is likely to be between 1:3 and 1:1. Calculations on charged clusters confirm the experimental result that the HSO4- ion is much more strongly hydrated than neutral sulfuric acid. Preliminary calculations on HSO4- NH3 clusters indicate that ammonia is likely to play at most a minor role in ion-induced nucleation in the sulfuric acid - water system. Calculations of thermodynamic and kinetic parameters for the reaction of stabilized Criegee Intermediates with sulfuric acid demonstrate that quantum chemistry is a powerful tool for investigating chemically complicated nucleation mechanisms. The calculations indicate that if the biogenic Criegee Intermediates have sufficiently long lifetimes in atmospheric conditions, the studied reaction may be an important source of nucleation precursors.
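Formation free energies of the kind computed in the thesis translate into equilibrium cluster populations through the law of mass action: the abundance of a hydrate with n waters relative to the bare acid scales as (water activity)^n · exp(-ΔG_n/RT). The sketch below uses made-up placeholder free energies (not thesis values) and assumes the water activity equals the relative humidity, purely to show the bookkeeping.

```python
import math

def hydrate_fractions(dg_kcal, rel_humidity, temp_k=298.15):
    """Equilibrium fractions of A, A(H2O), A(H2O)2, ... from cumulative
    formation free energies (kcal/mol) via the law of mass action."""
    r_gas = 1.987e-3             # gas constant in kcal/(mol K)
    activity = rel_humidity      # assumed water activity = RH
    weights = [1.0]              # bare acid, n = 0
    for n, dg in enumerate(dg_kcal, start=1):
        weights.append(activity ** n * math.exp(-dg / (r_gas * temp_k)))
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical cumulative free energies (kcal/mol) for 1..3 waters:
fractions = hydrate_fractions([-2.0, -3.0, -3.5], rel_humidity=0.5)
```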

Relevance: 20.00%

Abstract:

We present a search for associated production of the standard model (SM) Higgs boson and a $Z$ boson where the $Z$ boson decays to two leptons and the Higgs decays to a pair of $b$ quarks in $p\bar{p}$ collisions at the Fermilab Tevatron. We use event probabilities based on SM matrix elements to construct a likelihood function of the Higgs content of the data sample. In a CDF data sample corresponding to an integrated luminosity of 2.7 fb$^{-1}$ we see no evidence of a Higgs boson with a mass between 100 GeV$/c^2$ and 150 GeV$/c^2$. We set 95% confidence level (C.L.) upper limits on the cross-section for $ZH$ production as a function of the Higgs boson mass $m_H$; the limit is 8.2 times the SM prediction at $m_H = 115$ GeV$/c^2$.
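The likelihood construction described above can be sketched as follows: each event contributes a mixture of signal and background probability densities, and the sample likelihood is scanned as a function of the signal fraction. The per-event probabilities below are arbitrary toy numbers standing in for the matrix-element-based event probabilities, and the grid scan is a simplified stand-in for the actual fit.

```python
import math

def neg_log_likelihood(s, p_sig, p_bkg):
    """Negative log-likelihood of signal fraction s for the event sample."""
    return -sum(math.log(s * ps + (1.0 - s) * pb)
                for ps, pb in zip(p_sig, p_bkg))

# Toy per-event probability densities (arbitrary units, illustrative only):
p_sig = [0.9, 0.1, 0.7, 0.2, 0.8]
p_bkg = [0.2, 0.8, 0.3, 0.9, 0.1]

# Scan the signal fraction on a grid and pick the maximum-likelihood value;
# confidence limits would come from the shape of this likelihood curve.
grid = [i / 100.0 for i in range(101)]
best_s = min(grid, key=lambda s: neg_log_likelihood(s, p_sig, p_bkg))
```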

Relevance: 20.00%

Abstract:

Sulfotransferases (SULTs) and UDP-glucuronosyltransferases (UGTs) are important detoxification enzymes that contribute to the bioavailability and elimination of many drugs. SULT1A3 is an extrahepatic enzyme responsible for the sulfonation of dopamine, which is often used as its probe substrate. A new high-performance liquid chromatography method for analyzing dopamine-3-O-sulfate and dopamine-4-O-sulfate was developed, and the enzyme kinetic parameters for their formation were determined using purified recombinant human SULT1A3. The results show that SULT1A3 strongly favors the 3-hydroxy group of dopamine, which indicates that it may be the major enzyme responsible for the difference between the circulating levels of the two dopamine sulfates in human blood. All 19 known human UGTs were expressed as recombinant enzymes in baculovirus-infected insect cells and their activities toward dopamine and estradiol were studied. UGT1A10 was identified as the only UGT capable of dopamine glucuronidation at a substantial level. These results were supported by studies with human intestinal and liver microsomes. The affinity was low, indicating that UGT1A10 is not an important enzyme in dopamine metabolism in vivo. Despite the low affinity, dopamine is a potential new probe substrate for UGT1A10 due to its selectivity. Dopamine was used to study the importance of phenylalanines 90 and 93 in UGT1A10. The results revealed distinct effects that depend on the size of the side chains and on their positions within the protein. Examination of twelve mutants revealed lower activity in all of them. However, enzyme kinetic studies of four mutants showed that their affinities were similar to that of wild-type UGT1A10, suggesting that F90 and F93 are not directly involved in dopamine binding in the active site.
The glucuronidation of β-estradiol and epiestradiol (α-estradiol) was studied to elucidate how the orientation of the 17-OH group affects conjugation at the 3-OH or the 17-OH of either diastereomer. The results show clear differences in the regio- and stereoselectivities of the UGTs. The most active isoforms, UGT1A10 and UGT2B7, demonstrated opposite regioselectivities. The stereoselectivities of the UGT2Bs were more complex than those of the UGT1As. The amino acid sequences of the human UGTs 1A9 and 1A10 are 93% identical, yet there are large differences in their activity and substrate selectivity. Several mutants were constructed to identify the residues responsible for these differences. The results revealed that the residues between Leu86 and Tyr176 of UGT1A9 determine the differences between UGT1A9 and UGT1A10. Phe117 of UGT1A9 participated in 1-naphthol binding, and the residues at positions 152 and 169 contributed to the higher glucuronidation rates of UGT1A10. In summary, the results emphasize that the substrate selectivities, including the regio- and stereoselectivities, of UGTs are complex and controlled by many amino acids rather than a single critical residue.
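Enzyme kinetic parameters such as those determined for SULT1A3 and the UGT mutants are classically estimated from rate measurements at several substrate concentrations. A common route is the Hanes-Woolf linearization of the Michaelis-Menten equation, [S]/v = [S]/Vmax + Km/Vmax; the sketch below fits synthetic data generated with Km = 50 µM and Vmax = 12 (arbitrary units), not measurements from the study.

```python
def hanes_fit(s_conc, rates):
    """Estimate (Km, Vmax) from a linear fit of [S]/v against [S]."""
    xs = s_conc
    ys = [s / v for s, v in zip(s_conc, rates)]   # [S]/v
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / slope
    km = intercept * vmax
    return km, vmax

km_true, vmax_true = 50.0, 12.0                 # µM; nmol/min/mg (synthetic)
s_conc = [5.0, 10.0, 25.0, 50.0, 100.0, 250.0]
rates = [vmax_true * s / (km_true + s) for s in s_conc]   # exact MM curve
km_est, vmax_est = hanes_fit(s_conc, rates)     # recovers 50.0 and 12.0
```

With noisy data a nonlinear least-squares fit of the Michaelis-Menten equation itself is usually preferred, since linearizations distort the error structure.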

Relevance: 20.00%

Abstract:

A precision measurement of the top quark mass m_t is obtained using a sample of ttbar events from ppbar collisions at the Fermilab Tevatron with the CDF II detector. Selected events require an electron or muon, large missing transverse energy, and exactly four high-energy jets, at least one of which is tagged as coming from a b quark. A likelihood is calculated using a matrix element method with quasi-Monte Carlo integration, taking into account finite detector resolution and jet mass effects. The event likelihood is a function of m_t and a parameter DJES used to calibrate the jet energy scale in situ. Using a total of 1087 events, a value of m_t = 173.0 +/- 1.2 GeV/c^2 is measured.
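Quasi-Monte Carlo integration of the kind used in such matrix-element methods replaces pseudo-random sample points with a low-discrepancy sequence, which typically converges faster for smooth integrands. The sketch below integrates a simple stand-in function (not a real |M|^2) over the unit square with a 2-D Halton sequence.

```python
import math

def halton(index, base):
    """The index-th element of the Halton low-discrepancy sequence."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def qmc_integrate(func, n):
    """Quasi-Monte Carlo estimate of the integral of func over [0,1]^2."""
    total = 0.0
    for i in range(1, n + 1):
        x, y = halton(i, 2), halton(i, 3)   # coprime bases per dimension
        total += func(x, y)
    return total / n

# Integral of x*y over the unit square is exactly 1/4.
estimate = qmc_integrate(lambda x, y: x * y, 4096)
```

The error of such estimates shrinks roughly like (log n)^d / n, compared with 1/sqrt(n) for plain Monte Carlo.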

Relevance: 20.00%

Abstract:

We report a measurement of the top quark mass, m_t, obtained from ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron using the CDF II detector. We analyze a sample corresponding to an integrated luminosity of 1.9 fb^-1. We select events with an electron or muon, large missing transverse energy, and exactly four high-energy jets in the central region of the detector, at least one of which is tagged as coming from a b quark. We calculate a signal likelihood using a matrix element integration method, with effective propagators to take into account assumptions on event kinematics. Our event likelihood is a function of m_t and a parameter JES that determines in situ the calibration of the jet energies. We use a neural network discriminant to distinguish signal from background events. We also apply a cut on the peak value of each event likelihood curve to reduce the contribution of background and badly reconstructed events. Using the 318 events that pass all selection criteria, we find m_t = 172.7 +/- 1.8 (stat. + JES) +/- 1.2 (syst.) GeV/c^2.

Relevance: 20.00%

Abstract:

We present a measurement of the top quark mass with t-tbar dilepton events produced in p-pbar collisions at the Fermilab Tevatron at $\sqrt{s}$=1.96 TeV and collected by the CDF II detector. A sample of 328 events with a charged electron or muon and an isolated track, corresponding to an integrated luminosity of 2.9 fb$^{-1}$, is selected as t-tbar candidates. To account for the unconstrained event kinematics, we scan over the phase space of the azimuthal angles ($\phi_{\nu_1},\phi_{\nu_2}$) of neutrinos and reconstruct the top quark mass for each $\phi_{\nu_1},\phi_{\nu_2}$ pair by minimizing a $\chi^2$ function in the t-tbar dilepton hypothesis. We assign $\chi^2$-dependent weights to the solutions in order to build a preferred mass for each event. Preferred mass distributions (templates) are built from simulated t-tbar and background events, and parameterized in order to provide continuous probability density functions. A likelihood fit to the mass distribution in data as a weighted sum of signal and background probability density functions gives a top quark mass of $165.5^{+{3.4}}_{-{3.3}}$(stat.)$\pm 3.1$(syst.) GeV/$c^2$.
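The scan-and-weight construction of the preferred mass can be sketched as follows. Both the reconstructed mass and the chi^2 at each grid point come from toy formulas here (a real analysis would obtain them from a kinematic fit of the dilepton hypothesis); the weighting of solutions by exp(-chi^2/2) mirrors the scheme described in the abstract.

```python
import math

def preferred_mass(reco, n_grid=36):
    """Scan a grid of neutrino azimuthal angles and combine the solutions
    into one preferred mass with weights w = exp(-chi^2 / 2)."""
    num = den = 0.0
    for i in range(n_grid):
        for j in range(n_grid):
            phi1 = 2.0 * math.pi * i / n_grid
            phi2 = 2.0 * math.pi * j / n_grid
            mass, chi2 = reco(phi1, phi2)
            w = math.exp(-0.5 * chi2)
            num += w * mass
            den += w
    return num / den

def toy_reco(phi1, phi2):
    # Toy "kinematic fit": the best solution sits near phi1=1.0, phi2=4.0
    # with mass 165.5; chi^2 grows as the scan point moves away from it.
    d = (phi1 - 1.0) ** 2 + (phi2 - 4.0) ** 2
    return 165.5 + 5.0 * (phi1 - 1.0), 10.0 * d

m = preferred_mass(toy_reco)
```

Because poor solutions get exponentially small weights, the preferred mass is dominated by the kinematically consistent region of the scan.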

Relevance: 20.00%

Abstract:

This paper describes a new flexible delexicalization method based on a glottal-excited parametric speech synthesis scheme. The system utilizes inverse-filtered glottal flow and all-pole modelling of the vocal tract. The method provides a possibility to retain and manipulate all relevant prosodic features of any kind of speech. Most importantly, the features include voice quality, which has not been properly modeled in earlier delexicalization methods. The functionality of the new method was tested in a prosodic tagging experiment aimed at providing word prominence data for a text-to-speech synthesis system. The experiment confirmed the usefulness of the method and further corroborated earlier evidence that linguistic factors influence the perception of prosodic prominence.
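The all-pole vocal tract modelling step mentioned above is standard linear predictive coding: predictor coefficients are estimated from a windowed frame by the autocorrelation method and the Levinson-Durbin recursion. The sketch below applies it to a synthetic damped sinusoid standing in for a real speech frame; frame length and model order are arbitrary choices for illustration.

```python
import math

def autocorr(x, max_lag):
    """Autocorrelation of the frame for lags 0..max_lag."""
    return [sum(x[i] * x[i + k] for i in range(len(x) - k))
            for k in range(max_lag + 1)]

def levinson_durbin(r, order):
    """Solve the normal equations for the LPC polynomial A(z);
    returns the coefficients [1, a1, ..., a_order] and residual energy."""
    a = [0.0] * (order + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, order + 1):
        acc = sum(a[j] * r[m - j] for j in range(m))
        k = -acc / err                      # reflection coefficient
        new_a = a[:]
        for j in range(1, m):
            new_a[j] = a[j] + k * a[m - j]
        new_a[m] = k
        a = new_a
        err *= (1.0 - k * k)                # prediction error update
    return a, err

# Synthetic "speech" frame: a single damped resonance.
frame = [math.exp(-0.01 * n) * math.sin(0.2 * math.pi * n)
         for n in range(400)]
r = autocorr(frame, 10)
coeffs, residual_energy = levinson_durbin(r, 10)
```

For a strongly resonant frame the residual energy is a small fraction of the frame energy r[0]; in a glottal-excitation scheme, filtering the speech through A(z) yields the excitation signal that carries the voice-quality information.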

Relevance: 20.00%

Abstract:

In this study we explore the concurrent, combined use of three research methods, statistical corpus analysis and two psycholinguistic experiments (a forced-choice and an acceptability rating task), using verbal synonymy in Finnish as a case in point. In addition to supporting conclusions from earlier studies concerning the relationships between corpus-based and experimental data (e.g., Featherston 2005), we show that each method adds to our understanding of the studied phenomenon in a way which could not be achieved through any single method by itself. Most importantly, whereas relative rareness in a corpus is associated with dispreference in selection, such infrequency does not categorically entail substantially lower acceptability. Furthermore, we show that forced-choice and acceptability rating tasks pertain to distinct linguistic processes, with category-wise incommensurable scales of measurement, and should therefore be merged with caution, if at all.

Relevance: 20.00%

Abstract:

When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journal on very incomplete information about how well the journals serve the authors' purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals that provides more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and the service provided by the journal during the review and publication process. The method uses data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rate). Some important parameters (for instance, average time from submission to publication, or the regional spread of authorship) can be calculated but require a considerable amount of work. It can also be difficult to obtain reasonable response rates to author surveys. All in all, we believe that the proposed method, which takes a "service to authors" perspective as the basis for benchmarking scientific journals, is useful and can provide valuable information to prospective authors in selected scientific disciplines.
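One of the calculable parameters, the average time from submission to publication, reduces to simple date arithmetic once the per-article dates have been collected (the laborious part in practice). The dates below are made up for illustration, standing in for those printed in many journals' article headers.

```python
from datetime import date

def average_delay_days(articles):
    """Mean number of days between submission and publication dates."""
    delays = [(published - submitted).days
              for submitted, published in articles]
    return sum(delays) / len(delays)

# Hypothetical (submission, publication) date pairs for three articles:
articles = [
    (date(2009, 1, 15), date(2009, 9, 30)),
    (date(2009, 3, 2),  date(2010, 1, 20)),
    (date(2009, 6, 10), date(2010, 2, 1)),
]
avg_days = average_delay_days(articles)
```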