33 results for batch method
Abstract:
The main obstacle to the application of high-quality diamond-like carbon (DLC) coatings has been the lack of adhesion to the substrate as the coating thickness is increased. The aim of this study was to improve the filtered pulsed arc discharge (FPAD) method, with which the high DLC coating thicknesses necessary for practical applications can be achieved. The energy of the carbon ions was measured with an optoelectronic time-of-flight method. An in situ cathode polishing system used for stabilizing the process yield and the carbon ion energies is presented; simultaneously, the quality of the coatings can be controlled. To optimise the quality of the deposition process, a simple, fast and inexpensive method using silicon wafers as test substrates was developed. This method was used for evaluating the suitability of a simplified arc-discharge set-up for the deposition of the adhesion layer of DLC coatings. An entirely new group of materials discovered by our research group, the diamond-like carbon polymer hybrid (DLC-p-h) coatings, is also presented. The parent polymers used in these novel coatings were polydimethylsiloxane (PDMS) and polytetrafluoroethylene (PTFE). The energy of the plasma ions was found to increase when the anode-cathode distance and the arc voltage were increased. A constant deposition rate for continuous coating runs was obtained with the in situ cathode polishing system. The novel DLC-p-h coatings were found to be water- and oil-repellent and harder than any polymer. The lowest sliding angle ever measured on a solid surface, 0.15 ± 0.03°, was measured on a DLC-PDMS-h coating. In the FPAD system, carbon ions can be accelerated to the high energies (≈ 1 keV) necessary for optimal adhesion (the substrate itself fractures in the adhesion and quality test) of ultra-thick (up to 200 µm) DLC coatings by increasing the anode-cathode distance and using high voltages (up to 4 kV).
Excellent adhesion can also be obtained with the simplified arc-discharge device. To maintain a high process yield (5 µm/h over a surface area of 150 cm²) and to stabilize the carbon ion energies and the high quality (sp³ fraction up to 85%) of the resulting coating, an in situ cathode polishing system must be used. The DLC-PDMS-h coating is the superior candidate coating material for anti-soiling applications where hardness is also required.
Abstract:
By detecting leading protons produced in the Central Exclusive Diffractive process, p+p → p+X+p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes possible precision studies of gluons at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision of the leading proton momentum measurement and the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is on developing the radiation-hard precision detector technology necessary for coping with the extremely demanding experimental environment of the LHC. This is achieved with a 3D silicon detector design which, in addition to radiation hardness up to 5×10^15 neutrons/cm², offers a high signal-to-noise ratio, fast signal response to radiation and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but conserves the properties of the 3D detector design required at the LHC and in other imaging applications.
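The missing-mass measurement rests on standard two-proton kinematics; as a reminder (general relations, not results specific to this thesis):

```latex
% Missing mass of the centrally produced system X, from the
% four-momenta of the incoming protons (p_1, p_2) and the
% outgoing leading protons (p_1', p_2'):
M_X^2 = (p_1 + p_2 - p_1' - p_2')^2
% In terms of the fractional momentum losses \xi_1, \xi_2
% of the two protons:
M_X \approx \sqrt{\xi_1 \xi_2 s}
% Example: at \sqrt{s} = 14~\mathrm{TeV}, measuring
% \xi_1 = \xi_2 = 10^{-2} corresponds to M_X \approx 140~\mathrm{GeV},
% i.e. the mass range relevant for a light Higgs boson.
```

This is why precise tracking of the leading protons along the beam line translates directly into missing-mass resolution.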
Abstract:
We present a search for associated production of the standard model (SM) Higgs boson and a $Z$ boson where the $Z$ boson decays to two leptons and the Higgs decays to a pair of $b$ quarks in $p\bar{p}$ collisions at the Fermilab Tevatron. We use event probabilities based on SM matrix elements to construct a likelihood function of the Higgs content of the data sample. In a CDF data sample corresponding to an integrated luminosity of 2.7 fb$^{-1}$ we see no evidence of a Higgs boson with a mass between 100 GeV$/c^2$ and 150 GeV$/c^2$. We set 95% confidence level (C.L.) upper limits on the cross-section for $ZH$ production as a function of the Higgs boson mass $m_H$; the limit is 8.2 times the SM prediction at $m_H = 115$ GeV$/c^2$.
Abstract:
A precision measurement of the top quark mass m_t is obtained using a sample of ttbar events from ppbar collisions at the Fermilab Tevatron with the CDF II detector. Selected events require an electron or muon, large missing transverse energy, and exactly four high-energy jets, at least one of which is tagged as coming from a b quark. A likelihood is calculated using a matrix element method with quasi-Monte Carlo integration, taking into account finite detector resolution and jet mass effects. The event likelihood is a function of m_t and a parameter DJES used to calibrate the jet energy scale in situ. Using a total of 1087 events, a value of m_t = 173.0 +/- 1.2 GeV/c^2 is measured.
Abstract:
We report a measurement of the top quark mass, m_t, obtained from ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron using the CDF II detector. We analyze a sample corresponding to an integrated luminosity of 1.9 fb^-1. We select events with an electron or muon, large missing transverse energy, and exactly four high-energy jets in the central region of the detector, at least one of which is tagged as coming from a b quark. We calculate a signal likelihood using a matrix element integration method, with effective propagators to take into account assumptions on event kinematics. Our event likelihood is a function of m_t and a parameter JES that determines in situ the calibration of the jet energies. We use a neural network discriminant to distinguish signal from background events. We also apply a cut on the peak value of each event likelihood curve to reduce the contribution of background and badly reconstructed events. Using the 318 events that pass all selection criteria, we find m_t = 172.7 +/- 1.8 (stat. + JES) +/- 1.2 (syst.) GeV/c^2.
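To make the fitting strategy concrete, here is a toy numerical sketch in Python. It is not the CDF analysis: the per-event matrix-element likelihoods are replaced by hypothetical Gaussian stand-ins, and only the sample size (318 events) and the central values quoted above are taken from the abstract.

```python
import numpy as np

# Toy sketch of a joint likelihood in (m_t, JES), maximized over the
# sample. Per-event matrix-element likelihood curves are emulated by
# Gaussians around assumed true values m_t = 172.7 GeV/c^2, JES = 1.0.
rng = np.random.default_rng(0)
n_events = 318  # number of selected events quoted in the abstract

m_meas = rng.normal(172.7, 12.0, n_events)   # hypothetical per-event mass estimates
jes_meas = rng.normal(1.0, 0.05, n_events)   # hypothetical per-event JES estimates

m_grid = np.linspace(160.0, 185.0, 251)      # hypothesis grid in m_t
jes_grid = np.linspace(0.90, 1.10, 101)      # hypothesis grid in JES
M, J = np.meshgrid(m_grid, jes_grid, indexing="ij")

# Joint negative log-likelihood: sum of per-event Gaussian terms.
nll = np.zeros_like(M)
for m_i, j_i in zip(m_meas, jes_meas):
    nll += 0.5 * ((M - m_i) / 12.0) ** 2 + 0.5 * ((J - j_i) / 0.05) ** 2

# The minimum of the joint NLL gives the fitted (m_t, JES) pair.
i, j = np.unravel_index(np.argmin(nll), nll.shape)
print(f"fitted m_t = {m_grid[i]:.1f} GeV/c^2, JES = {jes_grid[j]:.3f}")
```

The in situ JES calibration corresponds to fitting both parameters at once: the jet-energy information in the data constrains JES, which in turn reduces the jet-energy-scale uncertainty on m_t.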
Abstract:
We present a measurement of the top quark mass with t-tbar dilepton events produced in p-pbar collisions at the Fermilab Tevatron at $\sqrt{s}=1.96$ TeV and collected by the CDF II detector. A sample of 328 events with a charged electron or muon and an isolated track, corresponding to an integrated luminosity of 2.9 fb$^{-1}$, is selected as t-tbar candidates. To account for the unconstrained event kinematics, we scan over the phase space of the azimuthal angles ($\phi_{\nu_1},\phi_{\nu_2}$) of the neutrinos and reconstruct the top quark mass for each ($\phi_{\nu_1},\phi_{\nu_2}$) pair by minimizing a $\chi^2$ function under the t-tbar dilepton hypothesis. We assign $\chi^2$-dependent weights to the solutions in order to build a preferred mass for each event. Preferred mass distributions (templates) are built from simulated t-tbar and background events and parameterized to provide continuous probability density functions. A likelihood fit to the mass distribution in data as a weighted sum of signal and background probability density functions gives a top quark mass of $165.5^{+3.4}_{-3.3}$(stat.)$\pm 3.1$(syst.) GeV/$c^2$.
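The neutrino-angle scan can be illustrated with a toy Python sketch. The kinematic reconstruction is replaced by a hypothetical stand-in function; only the scan-and-weight logic (a grid over the two neutrino azimuths, $\chi^2$-dependent weights, a weighted preferred mass per event) follows the description above.

```python
import numpy as np

def toy_solution(phi1, phi2, m_true=165.5):
    """Hypothetical stand-in for the kinematic reconstruction:
    returns (chi2, reconstructed mass) for one neutrino-angle pair."""
    chi2 = 2.0 + 3.0 * (1 - np.cos(phi1)) + 3.0 * (1 - np.cos(phi2))
    m_rec = m_true + 8.0 * np.sin(phi1) * np.sin(phi2)
    return chi2, m_rec

# Scan a grid over the two unconstrained azimuthal angles.
phis = np.linspace(-np.pi, np.pi, 60, endpoint=False)
weights, masses = [], []
for phi1 in phis:
    for phi2 in phis:
        chi2, m_rec = toy_solution(phi1, phi2)
        weights.append(np.exp(-0.5 * chi2))  # chi^2-dependent weight
        masses.append(m_rec)

# Weighted combination of all solutions -> preferred mass for the event.
weights = np.array(weights)
masses = np.array(masses)
m_preferred = np.sum(weights * masses) / np.sum(weights)
print(f"preferred mass for this event: {m_preferred:.1f} GeV/c^2")
```

In the actual analysis these per-event preferred masses fill the templates that are fitted to data; the toy merely shows how the weighting collapses a two-dimensional family of solutions into one number per event.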
New Method for Delexicalization and its Application to Prosodic Tagging for Text-to-Speech Synthesis
Abstract:
This paper describes a new, flexible delexicalization method based on a glottal excited parametric speech synthesis scheme. The system utilizes inverse-filtered glottal flow and all-pole modelling of the vocal tract. The method provides a possibility to retain and manipulate all relevant prosodic features of any kind of speech. Most importantly, the features include voice quality, which has not been properly modeled in earlier delexicalization methods. The functionality of the new method was tested in a prosodic tagging experiment aimed at providing word prominence data for a text-to-speech synthesis system. The experiment confirmed the usefulness of the method and further corroborated earlier evidence that linguistic factors influence the perception of prosodic prominence.
Abstract:
In this study we explore the concurrent, combined use of three research methods, statistical corpus analysis and two psycholinguistic experiments (a forced-choice task and an acceptability rating task), using verbal synonymy in Finnish as a case in point. In addition to supporting conclusions from earlier studies concerning the relationships between corpus-based and experimental data (e.g., Featherston 2005), we show that each method adds to our understanding of the studied phenomenon in a way that could not be achieved through any single method by itself. Most importantly, whereas relative rareness in a corpus is associated with dispreference in selection, such infrequency does not always entail substantially lower acceptability. Furthermore, we show that forced-choice and acceptability rating tasks pertain to distinct linguistic processes, with category-wise incommensurable scales of measurement, and should therefore be merged with caution, if at all.
Abstract:
When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journal on very incomplete information about how well the journals serve the authors’ purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals that provides more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and the service provided by the journal during the review and publication process. The method draws on data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rate). Some important parameters (for instance, average time from submission to publication or regional spread of authorship) can be calculated, but doing so requires a substantial amount of work. It can also be difficult to obtain reasonable response rates to author surveys. All in all, we believe that the method we propose, taking a “service to authors” perspective as a basis for benchmarking scientific journals, is useful and can provide information that is valuable to prospective authors in selected scientific disciplines.
Abstract:
Radiometric determination methods such as alpha spectrometry require long counting times when low activities are to be determined. Mass spectrometric techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS), Thermal Ionisation Mass Spectrometry (TIMS) and Accelerator Mass Spectrometry (AMS) have shown several advantages over traditional methods for measuring long-lived radionuclides. Mass spectrometric methods for determining very low concentrations of elemental isotopes, and thereby isotopic ratios, have been developed using a variety of ion sources. Although primarily applied to the determination of the lighter stable element isotopes and of radioactive isotopes in geological studies, these techniques can equally well be applied to measuring the activity concentrations of long-lived low-level radionuclides in various samples using isotope dilution methods such as those applied in ICP-MS. Owing to their low specific activity, many long-lived radionuclides are more conveniently detected by mass spectrometric techniques. Mass spectrometry also enables the individual determination of Pu-239 and Pu-240, which cannot be obtained by alpha spectrometry. ICP-MS is a rapidly growing technique for the ultra-trace analytical determination of stable and long-lived isotopes and has wide potential within environmental science, including ecosystem tracer and radioecological studies. Such instrumentation of course requires good radiochemical separation to give its best performance.
The objectives of the project are to identify current needs and problems in the low-level determination of long-lived radioisotopes by ICP-MS, to perform intercalibration and the development and improvement of ICP-MS methods for the measurement of radionuclides and isotope ratios, and to develop new methods based on modified separation chemistry applied to new auxiliary equipment.
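As a reminder of how isotope dilution turns an ICP-MS ratio measurement into an activity, here is a minimal worked example in Python. The numbers and the pure-spike simplification are hypothetical textbook choices, not values from the project; only the Pu-239 half-life is a physical constant.

```python
import math

# Illustrative isotope-dilution calculation (textbook form). A known
# amount of a pure spike isotope (here Pu-242, hypothetical numbers)
# is added to the sample; the Pu-239 amount then follows from the
# 239/242 atom ratio measured by ICP-MS in the blended sample.
n_spike = 2.0e-12      # mol of Pu-242 spike added (hypothetical)
R_measured = 1.6       # measured 239/242 atom ratio in the mixture (hypothetical)

# With a pure spike and no native Pu-242 in the sample, the general
# isotope-dilution equation reduces to a simple proportion:
n_analyte = n_spike * R_measured          # mol of Pu-239 in the sample

# Convert to activity to illustrate why mass spectrometry is preferred
# for long-lived nuclides: A = N * ln(2) / t_half.
N_A = 6.02214076e23                       # Avogadro constant, 1/mol
t_half_s = 24110 * 365.25 * 86400         # Pu-239 half-life (24110 y) in s
activity_bq = n_analyte * N_A * math.log(2) / t_half_s
print(f"Pu-239: {n_analyte:.2e} mol, {activity_bq:.2f} Bq")
```

Picomole amounts of Pu-239 correspond to only a few becquerels, i.e. counting times of days for alpha spectrometry, whereas the same ions are measured in minutes by ICP-MS.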
Abstract:
Drug-induced liver injury is one of the most frequent reasons for the removal of a drug from the market. In recent years there has been pressure to develop more cost-efficient, faster and easier ways to investigate drug-induced toxicity, in order to recognize hepatotoxic drugs in the earlier phases of drug development. A High Content Screening (HCS) instrument is an automated microscope equipped with image analysis software. It makes image analysis faster and decreases the risk of human error by always analyzing the images in the same way. Because less drug and less time are needed in the analysis, and multiple parameters can be analyzed from the same cells, the method should be more sensitive, more effective and cheaper than conventional cytotoxicity assays. Liver cells are rich in mitochondria, and many drugs direct their toxicity at hepatocyte mitochondria. Mitochondria produce the majority of the cell's ATP through oxidative phosphorylation; they maintain biochemical homeostasis in the cell and participate in cell death. A mitochondrion is divided into two compartments by the inner and outer mitochondrial membranes, and oxidative phosphorylation takes place in the inner mitochondrial membrane. When released, cytochrome c, a protein of the respiratory chain, activates caspase cascades, which leads to apoptosis. The aim of this study was to implement, optimize and compare mitochondrial-toxicity HCS assays in live and fixed cells in two cellular models: the human HepG2 hepatoma cell line and rat primary hepatocytes. Three hepato- and mitochondria-toxic drugs (staurosporine, rotenone and tolcapone) were used. Cells were treated with the drugs and incubated with the fluorescent probes, and the images were then analyzed using a Cellomics ArrayScan VTI reader.
Finally, the results obtained with the optimized methods were compared to each other and to the results of the conventional cytotoxicity assays, the ATP and LDH measurements. After optimization, the live-cell method and rat primary hepatocytes were selected for the experiments. Staurosporine was the most toxic of the three drugs and damaged the cells most quickly. Rotenone was less toxic, but its results were more reproducible, so it would serve as a good positive control in screening. Tolcapone was the least toxic. In this study the conventional cytotoxicity analyses worked better than the HCS methods; more optimization is needed to make the HCS method more sensitive, which was not possible here owing to time constraints.