42 results for Precision Xtra®
Abstract:
ALICE (A Large Ion Collider Experiment) is the LHC (Large Hadron Collider) experiment devoted to investigating the strongly interacting matter created in nucleus-nucleus collisions at LHC energies. The ALICE Inner Tracking System (ITS) consists of six cylindrical layers of silicon detectors using three different technologies; in the outward direction: two layers of pixel detectors, two layers of drift detectors, and two layers of strip detectors. The number of parameters to be determined in the spatial alignment of the 2198 sensor modules of the ITS is about 13,000. The target alignment precision is well below 10 micron in some cases (pixels). The sources of alignment information include survey measurements and tracks reconstructed from cosmic rays and from proton-proton collisions. The main track-based alignment method uses the Millepede global approach; an iterative local method was developed and used as well. We present the results obtained for the ITS alignment using about 10^5 charged tracks from cosmic rays collected during summer 2008, with the ALICE solenoidal magnet switched off.
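For orientation, the core of the Millepede approach can be written as a single simultaneous least-squares problem. The schematic objective below (our notation, not the collaboration's exact formulation) shows why the roughly 13,000 global parameters and the per-track parameters must be fitted together rather than track by track:

```latex
% Schematic Millepede objective: one simultaneous least-squares fit.
% p    = ~13,000 global alignment parameters (module positions, rotations)
% q_j  = local parameters of track j (e.g. slopes and intercepts)
% r_ij, sigma_ij = residual and measurement error of hit i on track j
\chi^2(\mathbf{p}, \{\mathbf{q}_j\}) =
  \sum_{j \in \mathrm{tracks}} \; \sum_{i \in \mathrm{hits}(j)}
  \frac{r_{ij}^2(\mathbf{p}, \mathbf{q}_j)}{\sigma_{ij}^2}
```

Minimizing over the global and local parameters at once is what distinguishes the global method from the iterative local approach also mentioned above.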
Abstract:
The International Large Detector (ILD) is a concept for a detector at the International Linear Collider (ILC). The ILC will collide electrons and positrons at energies of initially 500 GeV, upgradeable to 1 TeV. The ILC has an ambitious physics program, which will extend and complement that of the Large Hadron Collider (LHC). A hallmark of physics at the ILC is precision: the clean initial state and the comparatively benign environment of a lepton collider are ideally suited to high-precision measurements. Taking full advantage of the physics potential of the ILC places great demands on the detector performance, and the design of ILD is driven by these requirements. Excellent calorimetry and tracking are combined to obtain the best possible overall event reconstruction, including the capability to reconstruct individual particles within jets for particle flow calorimetry. This requires excellent spatial resolution for all detector systems. A highly granular calorimeter system is combined with a central tracker which stresses redundancy and efficiency. In addition, efficient reconstruction of secondary vertices and excellent momentum resolution for charged particles are essential for an ILC detector. The interaction region of the ILC is designed to host two detectors, which can be moved into the beam position with a push-pull scheme. The mechanical design of ILD and the overall integration of subdetectors take these operational conditions into account.
Abstract:
We report a measurement of the top quark mass $M_t$ in the dilepton decay channel $t\bar{t}\to b\ell'^{+}\nu_{\ell'}\bar{b}\ell^{-}\bar{\nu}_{\ell}$. Events are selected with a neural network which has been directly optimized for statistical precision on the top quark mass using neuroevolution, a technique modeled on biological evolution. The top quark mass is extracted from per-event probability densities that are formed by the convolution of leading-order matrix elements and detector resolution functions. The joint probability is the product of the probability densities from 344 candidate events in 2.0 fb$^{-1}$ of $p\bar{p}$ collisions collected with the CDF II detector, yielding a measurement of $M_t = 171.2 \pm 2.7(\textrm{stat.}) \pm 2.9(\textrm{syst.})~\mathrm{GeV}/c^2$.
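The per-event probability construction described above has a standard schematic form in the matrix-element method; the symbols below (phase-space measure $d\Phi$, transfer function $W$) are generic notation for this class of analyses, not the exact CDF implementation:

```latex
% Schematic per-event probability density: a leading-order matrix
% element convolved with detector resolution functions W(x|y),
% normalized by the cross section; the joint likelihood is the
% product over the 344 candidate events.
P(x \mid M_t) = \frac{1}{\sigma(M_t)} \int \! d\Phi(y)\,
  \left| \mathcal{M}_{\mathrm{LO}}(y; M_t) \right|^2 W(x \mid y),
\qquad
\mathcal{L}(M_t) = \prod_{k=1}^{344} P(x_k \mid M_t)
```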
Abstract:
Thin films are the basis of much recent technological advance, ranging from coatings with mechanical or optical benefits to platforms for nanoscale electronics. In the latter, semiconductors have been the norm ever since silicon became the main construction material for a multitude of electronic components. The range of characteristics of silicon-based systems can be widened by manipulating the structure of the thin films at the nanoscale, for instance by making them porous. The characteristics of different films can then, to some extent, be combined by simple superposition. Thin films can be manufactured using many different methods. One emerging field is cluster beam deposition, where aggregates of hundreds or thousands of atoms are deposited one by one to form a layer whose characteristics depend on the parameters of deposition. One critical parameter is the deposition energy, which dictates how porous, if at all, the layer becomes. Other parameters, such as the sputtering rate and aggregation conditions, affect the size and consistency of the individual clusters. Understanding nanoscale processes, which cannot be observed experimentally, is fundamental to optimizing experimental techniques and opening new possibilities for advances at this scale. Atomistic computer simulations offer a window to the world of nanometers and nanoseconds in a way unparalleled by the most accurate of microscopes, and transmission electron microscope image simulations can bridge the gap by providing a tangible link between the simulated and the experimental. In this thesis, the entire process of cluster beam deposition is explored using molecular dynamics and image simulations. The process begins with the formation of the clusters, which is investigated for Si/Ge in an Ar atmosphere. The structure of the clusters is optimized to bring it as close to the experimental ideal as possible. Then, clusters are deposited, one by one, onto a substrate until a sufficiently thick layer has been produced. Finally, the concept is expanded by further deposition with different parameters, resulting in multiple superimposed layers of different porosities. This work demonstrates that the aggregation of clusters is not entirely understood within the scope of the approximations used in the simulations; it also shows how the continued deposition of clusters with a varying deposition energy can lead to a novel kind of nanostructured thin film: a multielemental porous multilayer. According to theory, these new structures have characteristics that can be tailored for a variety of applications, with precision heretofore unseen in conventional multilayer manufacture.
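As a concrete illustration of the central deposition-energy parameter, the sketch below converts a target energy per atom into the impact speed a cluster would be given in a simulation run. It is a generic back-of-the-envelope helper under simple kinematics, not code from the thesis:

```python
import math

# Convert a chosen deposition energy per atom (eV/atom) into the
# initial downward speed given to a cluster in an MD deposition run.
# Illustrative only; constants are exact SI conversion factors.
EV_TO_J = 1.602176634e-19
AMU_TO_KG = 1.66053906660e-27

def impact_speed(energy_ev_per_atom: float, mass_amu: float) -> float:
    """Speed (m/s) at which kinetic energy per atom equals the target."""
    e = energy_ev_per_atom * EV_TO_J
    m = mass_amu * AMU_TO_KG
    return math.sqrt(2.0 * e / m)

# Example: a Si cluster (28.085 amu/atom) deposited at 1 eV/atom lands
# at roughly 2.6 km/s; lower energies favour more porous layers.
print(f"{impact_speed(1.0, 28.085):.0f} m/s")
```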
Abstract:
In this thesis, the possibility of extending Dirac's quantization condition for magnetic monopoles to noncommutative space-time is investigated. The three publications on which this thesis is based are all directly linked to this investigation. Noncommutative solitons have been found within certain noncommutative field theories, but it is not known whether they possess only topological charge or also magnetic charge. This is a consequence of the fact that the noncommutative topological charge need not coincide with the noncommutative magnetic charge, although the two are equivalent in the commutative context. The aim of this work is to begin to fill this gap in knowledge. The method of investigation is perturbative and leaves open the question of whether a nonperturbative source for the magnetic monopole can be constructed, although some aspects of such a generalization are indicated. The main result is that while the noncommutative Aharonov-Bohm effect can be formulated in a gauge-invariant way, the quantization condition of Dirac is not satisfied in the case of a perturbative source for the point-like magnetic monopole.
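For reference, the commutative condition whose noncommutative fate is at stake reads as follows; this is the standard textbook form, and the thesis investigates whether its analogue survives for a perturbative monopole source:

```latex
% Dirac quantization condition (units with \hbar = c = 1): a point
% source of magnetic charge g forces its product with any electric
% charge e to be a half-integer.
e\,g = \frac{n}{2}, \qquad n \in \mathbb{Z}
```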
Abstract:
Pre-eclampsia is a pregnancy complication that affects about 5% of all pregnancies. It is known to be associated with alterations in angiogenesis-related factors, such as vascular endothelial growth factor (VEGF). An excess of antiangiogenic substances, especially the soluble receptor-1 of VEGF (sVEGFR-1), has been observed in the maternal circulation after the onset of the disease, probably reflecting their increased placental production. Smoking reduces circulating concentrations of sVEGFR-1 in non-pregnant women, and in pregnant women it reduces the risk of pre-eclampsia. Soluble VEGFR-1 acts as a natural antagonist of VEGF and placental growth factor (PlGF) in the human circulation, holding promise for potential therapeutic use. In fact, it has been used as a model to generate a fusion protein, VEGF Trap, which has been found effective in the anti-angiogenic treatment of certain tumors and ocular diseases. In the present study, we evaluated the potential use of maternal serum sVEGFR-1, angiopoietin-2 (Ang-2) and endostatin, three central anti-angiogenic markers, in the early prediction of subsequent pre-eclampsia. We also studied whether smoking affects circulating sVEGFR-1 concentrations in pregnant women or their first-trimester placental secretion and expression in vitro. Last, in order to allow future discussion of potential therapy based on sVEGFR-1, we determined the biological half-life of endogenous sVEGFR-1 in the human circulation and measured the concomitant changes in free VEGF concentrations. Blood or placental samples were collected from a total of 268 pregnant women between 2001 and 2007 at Helsinki University Central Hospital for the purposes above. The biomarkers were measured using commercially available enzyme-linked immunosorbent assays (ELISA). For the analyses of sVEGFR-1, Ang-2 and endostatin, a total of 3,240 pregnant women in the Helsinki area were recruited for blood sample collection during two routine ultrasound screening visits, at 13.7 ± 0.5 (mean ± SD) and 19.2 ± 0.6 weeks of gestation. Of them, 49 women who later developed pre-eclampsia were included in the study; their disease was further classified as mild in 29 and severe in 20 patients. Isolated early-onset intrauterine growth retardation (IUGR) was diagnosed in 16 women with otherwise normal medical histories and uncomplicated pregnancies. Fifty-nine women who remained normotensive and non-proteinuric and finally gave birth to normal-weight infants were selected to serve as the control population of the study. Maternal serum concentrations of Ang-2, endostatin and sVEGFR-1 were increased already at 16-20 weeks of pregnancy, about 13 weeks before the clinical manifestation of pre-eclampsia. In addition, these biomarkers could be used to identify women at risk with moderate precision. However, larger patient series are needed to determine whether these markers could be applied clinically to predict pre-eclampsia. Intrauterine growth retardation, especially if noted at early stages of pregnancy and not secondary to any other pregnancy complication, has been suggested to be a form of pre-eclampsia that compromises only placental sufficiency and the fetus without affecting the maternal endothelium. In fact, IUGR and pre-eclampsia have been proposed to share a common vascular etiology in which factors regulating early placental angiogenesis are likely to play a central role; thus, these factors have been suggested to be involved in the pathogenesis of IUGR.
However, circulating sVEGFR-1, Ang-2 and endostatin concentrations in the early second trimester were unaffected by subsequent IUGR. Furthermore, smoking was not associated with alterations in maternal circulating sVEGFR-1 or its placental production. The elimination of endogenous sVEGFR-1 after pregnancy was calculated from serial samples of eight pregnant women undergoing elective Caesarean section. As is typical for proteins in human compartments, the elimination of sVEGFR-1 was biphasic, with a rapid half-life of 3.4 h and a slow one of 29 h. The decline in sVEGFR-1 concentrations after mid-trimester legal termination of pregnancy was accompanied by a simultaneous increase in the serum levels of free VEGF, so that within a few days after pregnancy VEGF dominated in the maternal circulation. Our study provides novel information on the kinetics of endogenous sVEGFR-1, which may serve as a potential tool in the development of new strategies against diseases associated with angiogenic imbalance and alterations in VEGF signaling.
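A minimal sketch of the biphasic elimination reported above models the concentration as a sum of two exponentials with the measured half-lives; the fast-phase amplitude fraction is an assumed illustrative value, not a figure from the study:

```python
import math

# Two-phase (biexponential) elimination with the reported half-lives:
# 3.4 h for the rapid phase and 29 h for the slow phase.
T_HALF_FAST, T_HALF_SLOW = 3.4, 29.0   # hours
A_FAST = 0.8                           # hypothetical fast-phase fraction

def svegfr1_fraction(t_hours: float) -> float:
    """Fraction of the initial sVEGFR-1 concentration remaining at t."""
    k_fast = math.log(2) / T_HALF_FAST
    k_slow = math.log(2) / T_HALF_SLOW
    return (A_FAST * math.exp(-k_fast * t_hours)
            + (1 - A_FAST) * math.exp(-k_slow * t_hours))

for t in (0, 6, 24, 72):
    print(f"t = {t:3d} h: {svegfr1_fraction(t):.2f} of initial level")
```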
Abstract:
The methods of secondary wood processing are assumed to evolve over time and to affect the requirements set for the wood material and its suppliers. The study aimed to analyse the industrial operating modes applied by joinery and furniture manufacturers as sawnwood users. Industrial operating mode was defined as a pattern of important decisions and actions taken by a company that describes the company's level of adjustment in the late-industrial transition. A non-probabilistic sample of 127 companies was interviewed, including companies from Denmark, Germany, the Netherlands, and Finland. Fifty-two of the firms were furniture manufacturers and the other 75 produced windows and doors. Variables related to business philosophy, production operations, and supplier choice criteria were measured and used as a basis for a customer typology; variables related to wood usage and perceived sawmill performance were measured to profile the customer types. Factor analysis was used to determine the latent dimensions of industrial operating mode. Canonical correlation analysis was applied in developing the final basis for classifying the observations, and non-hierarchical cluster analysis was employed to build a five-group typology of secondary wood processing firms, ranging from traditional mass producers to late-industrial flexible manufacturers. There is a clear connection between the number of late-industrial elements in a company and the share of special and customised sawnwood it uses. Joinery and furniture manufacturers that are more late-industrial are also likely to use more component-type wood material and to appreciate customer-oriented technical precision. The results show that the change is towards the use of late-industrial sawnwood materials and late-industrial supplier relationships.
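The analysis pipeline (latent dimensions, then a non-hierarchical five-group clustering) can be sketched as follows. The data, variable count and choice of k-means as the non-hierarchical method are illustrative assumptions, and the canonical-correlation step is omitted for brevity:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

# Random data stands in for the 127 interviewed firms; the 12
# operating-mode variables are invented placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(127, 12))

# Latent dimensions via factor analysis, then a five-group
# non-hierarchical clustering of the factor scores.
scores = FactorAnalysis(n_components=4, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))   # firms per operating-mode type
```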
Abstract:
Human sports doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze for them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel-based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation the compounds were analyzed directly, without the need for hydrolysis, solvent transfer, evaporation or reconstitution. Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing comparable or better performance than other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with the plasma volume expanders (PVE) dextran and hydroxyethyl starch (HES). The specificity of the assay was improved by removing interfering matrix compounds with size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore LC-TOFMS can be exploited widely in doping control, reducing the need for several separate analysis techniques.
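A reverse database search of the kind described, in which each expected compound is looked up in the measured peak list within retention-time and mass-accuracy windows, can be sketched as below. The compound list, expected retention times and RT window are invented for illustration; only the 1 mDa mass tolerance is taken from the text:

```python
# (name, expected [M+H]+ m/z, expected RT in min) - illustrative entries
DB = [
    ("morphine",                286.1438, 1.9),
    ("codeine",                 300.1594, 3.1),
    ("morphine-3-glucuronide",  462.1759, 1.2),
]
MZ_TOL_DA, RT_TOL_MIN = 0.001, 0.2   # 1 mDa; hypothetical RT window

def screen(peaks):
    """Return database entries matched by any measured (mz, rt) peak."""
    hits = []
    for name, mz0, rt0 in DB:
        for mz, rt in peaks:
            if abs(mz - mz0) <= MZ_TOL_DA and abs(rt - rt0) <= RT_TOL_MIN:
                hits.append((name, (mz - mz0) * 1000))  # error in mDa
                break
    return hits

print(screen([(286.1441, 1.85), (412.2001, 5.0)]))  # morphine flagged
```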
Abstract:
Neurons can be divided into various classes according to their location, morphology, neurochemical identity and electrical properties. They form complex interconnected networks with precise roles for each cell type. GABAergic neurons expressing the calcium-binding protein parvalbumin (Pv) are mainly interneurons, which serve a coordinating function: Pv-cells modulate the activity of principal cells with high temporal precision. Abnormalities of Pv-interneuron activity in cortical areas have been linked to neuropsychiatric illnesses such as schizophrenia. Cerebellar Purkinje cells are known to be central to motor learning, and they are the sole output from the layered cerebellar cortex to the deep cerebellar nuclei. There are still many open questions about the precise roles of Pv-neurons and Purkinje cells, many of which could be answered if one could achieve rapid, reversible, cell-type-specific modulation of the activity of these neurons and observe the subsequent changes at the whole-animal level. The aim of these studies was to develop a novel method for the modulation of Pv-neurons and Purkinje cells in vivo and to use this method to investigate the significance of inhibition in these neuronal types with a variety of behavioral experiments, in addition to tissue autoradiography, electrophysiology and immunohistochemistry. The GABA(A) receptor γ2 subunit was ablated from Pv-neurons and Purkinje cells in four separate mouse lines. Pv-Δγ2 mice showed wide-ranging behavioral alterations and increased GABA-insensitive binding, indicative of an altered GABA(A) receptor composition, particularly in midbrain areas. PC-Δγ2 mice experienced little or no motor impairment despite the lack of inhibition in Purkinje cells. In Pv-Δγ2-partial-rescue mice, a reversal of motor and cognitive deficits was observed, in addition to restoration of the wild-type γ2F77 subunit in the reticular nucleus of the thalamus and the cerebellar molecular layer. In PC-Δγ2-swap mice, zolpidem sensitivity was restored to Purkinje cells, and the administration of systemic zolpidem evoked a transient motor impairment. On the basis of these results, it is concluded that this new method of cell-type-specific modulation is a feasible way to modulate the activity of selected neuronal types. The importance of Purkinje cells for motor control is supported, in line with previous studies, and the crucial involvement of Pv-neurons in a range of behavioral modalities is confirmed.
Abstract:
The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements, and the reliability of the results depends on reliable methods at every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies, in order to increase our understanding of the factors that affect the measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge of spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of the land use, the variation of microbiological characteristics appeared not to have a predictable spatial structure at scales of 0.5-10 m. Temporal and depth-related patterns were studied in relation to plant growth in cropped soil; the results showed that most enzyme activities and the microbial biomass have a clear decreasing trend in the top 40 cm of the soil profile and a temporal pattern during the growing season. A new procedure for the sampling of soil microbiological characteristics, based on stratified sampling and pre-characterisation of samples, was developed, and a practical example demonstrated its potential to reduce the analysis effort involved in laborious microbiological measurements without loss of precision. The investigation of the storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well, whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of the DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or the DNA-based quantification approach. The main advantages of the new method are its simple protocol and the possibility to analyse a large number of samples and replicates simultaneously.
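Behind any MPN count is a maximum-likelihood estimate: the organism density that best explains the observed pattern of positive and negative tubes across dilutions. A minimal sketch of that estimate, with invented dilution data (not the thesis protocol):

```python
import math

# (volume inoculated in ml, tubes inoculated, tubes positive)
dilutions = [
    (1.0,  5, 5),
    (0.1,  5, 3),
    (0.01, 5, 1),
]

def score(lam):
    """Derivative of the log-likelihood in lam; zero at the MPN estimate."""
    s = 0.0
    for v, n, g in dilutions:
        s += g * v * math.exp(-lam * v) / (1 - math.exp(-lam * v))
        s -= (n - g) * v
    return s

# The score decreases monotonically in lam, so bisect for its root.
lo, hi = 1e-6, 1e6
for _ in range(200):
    mid = math.sqrt(lo * hi)
    lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
print(f"MPN = {lo:.1f} organisms/ml")  # ~11/ml, matching standard tables
```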
Abstract:
According to the most prevalent view, there are 3-4 fixed "slots" in visual working memory for temporary storage. Recently this view has been challenged by a theory of dynamic resources, which are limited in their totality but can be freely allocated. The aim of this study is to clarify which of the two theories better describes performance in visual working memory tasks with contour shapes; the interest is thus in both the number of recalled stimuli and the precision of the memory representations. The stimuli in the experiments were radial frequency patterns, constructed by sinusoidally modulating the radius of a circle. Five observers participated in the experiment, which consisted of two different tasks. In the delayed discrimination task, the number of recalled stimuli was measured with a 2-interval forced-choice task: the observer was shown two displays in succession, separated by a 1.5 s inter-stimulus interval (ISI). The displays contained 1-6 patterns and differed from each other in the amplitude of one pattern. The participant's task was to report whether the changed pattern had higher amplitude in the first or in the second interval. The amount of amplitude change was controlled with the QUEST procedure, and the 75% discrimination threshold was measured. In the recall task, the precision of the memory representations was measured with a subjective adjustment method: the observer was shown 1-6 patterns and, after a 1.5 s ISI, one location of the previously shown patterns was cued; the observer's task was to adjust the amplitude of a probe pattern to match the amplitude of the pattern held in working memory. In the delayed discrimination task, the performance of all observers declined smoothly as the number of presented patterns was increased. This result supports the resource theory of working memory, as there was no sudden fall in performance. The amplitude threshold for one item was 0.01-0.05, and as the number of items increased from 1 to 6 there was a 4-15-fold linear increase in the amplitude threshold (0.14-0.29). In the recall adjustment task, the precision of four observers' performance declined smoothly as the number of presented patterns was increased, which also supports the resource theory. The standard deviation for one item was 0.03-0.05, and as the number of items increased from 1 to 6 there was a 2-3-fold linear increase (0.06-0.11). These findings show that performance in a visual working memory task is described better by the theory of freely allocated resources than by the traditional slot model. In addition, the allocation of the resources depends on the properties of the individual observer and on the visual working memory task.
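The contrast being tested can be made concrete with toy predictions: a fixed-capacity slot model keeps precision constant up to its capacity, while a shared-resource model degrades smoothly with every added item. The functional forms and numbers below are illustrative textbook versions, not values fitted to the reported data:

```python
# Toy predictions for response variability (SD) as set size N grows.
SD_ONE, CAPACITY = 0.04, 4

def slot_model_sd(n: int) -> float:
    # Constant precision while items fit in slots; an unstored item
    # yields a pure guess (crudely represented as infinite SD).
    return SD_ONE if n <= CAPACITY else float("inf")

def resource_model_sd(n: int) -> float:
    # Precision (1/variance) proportional to the resource share 1/n,
    # so SD grows smoothly as sqrt(n) - one common resource model.
    return SD_ONE * n ** 0.5

for n in range(1, 7):
    print(n, slot_model_sd(n), round(resource_model_sd(n), 3))
```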
Abstract:
Language software applications encounter new words, e.g., acronyms, technical terminology, names, or compounds of such words. In order to add new words to a lexicon, we need to indicate their inflectional paradigm. We present a new, generally applicable method for creating an entry generator, i.e. a paradigm guesser, for finite-state transducer lexicons. As a guesser tends to produce numerous suggestions, it is important that the correct suggestions be among the first few candidates. We prove some formal properties of the method and evaluate it on Finnish, English and Swedish full-scale transducer lexicons. We use the open-source Helsinki Finite-State Technology to create finite-state transducer lexicons from existing lexical resources and automatically derive guessers for unknown words. The method has a recall of 82-87% and a precision of 71-76% for the three test languages. The model needs no external corpus and can therefore serve as a baseline.
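A longest-suffix guesser of the general kind described can be sketched in a few lines. The toy lexicon and paradigm names below are invented, and the actual work operates on full HFST transducer lexicons rather than a Python dictionary:

```python
# suffix -> candidate paradigms seen with it, most frequent first
TRAINED = {
    "ry": ["noun_y_ies", "adj_basic"],
    "y":  ["noun_y_ies", "verb_y_ied"],
    "ss": ["noun_s_es"],
    "":   ["noun_s"],          # fallback paradigm for unseen suffixes
}

def guess_paradigms(word: str) -> list[str]:
    """Rank paradigms by the longest known suffix of the unknown word."""
    for start in range(len(word)):          # longest suffix first
        if word[start:] in TRAINED:
            return TRAINED[word[start:]]
    return TRAINED[""]

print(guess_paradigms("blogistry"))   # -> ['noun_y_ies', 'adj_basic']
```

Ranking candidates so the correct paradigm appears among the first few is exactly the property the evaluation above measures via recall and precision.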