47 results for "Particle tracking detectors"

in Helda - Digital Repository of University of Helsinki


Relevance: 100.00%

Abstract:

We present three measurements of the top-quark mass in the lepton plus jets channel with approximately 1.9 fb⁻¹ of integrated luminosity collected with the CDF II detector, using quantities with minimal dependence on the jet energy scale. One measurement exploits the transverse decay length of b-tagged jets to determine a top-quark mass of 166.9 +9.5/−8.5 (stat) ± 2.9 (syst) GeV/c², and another the transverse momentum of electrons and muons from W-boson decays to determine a top-quark mass of 173.5 +8.8/−8.9 (stat) ± 3.8 (syst) GeV/c². These quantities are combined in a third, simultaneous mass measurement to determine a top-quark mass of 170.7 ± 6.3 (stat) ± 2.6 (syst) GeV/c².
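As a back-of-the-envelope check, the two individual results can be combined with a simple inverse-variance weighted average. This is only a sketch: the actual simultaneous CDF fit accounts for correlations between the two quantities, so its published result differs slightly from the naive average below.

```python
import math

def weighted_average(values, errors):
    """Inverse-variance weighted average of independent measurements."""
    weights = [1.0 / e ** 2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    err = math.sqrt(1.0 / sum(weights))
    return mean, err

# Total uncertainty per measurement: symmetrized stat error combined
# in quadrature with the syst error.
m_lxy, e_lxy = 166.9, math.hypot(0.5 * (9.5 + 8.5), 2.9)  # decay-length method
m_pt,  e_pt  = 173.5, math.hypot(0.5 * (8.8 + 8.9), 3.8)  # lepton-pT method

mass, err = weighted_average([m_lxy, m_pt], [e_lxy, e_pt])
print(f"combined m_t = {mass:.1f} +/- {err:.1f} GeV/c^2")
```

The naive average lands near 170 GeV/c² with an uncertainty of roughly 7 GeV/c², in the same ballpark as the published simultaneous fit.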


Relevance: 80.00%

Abstract:

Powders are essential materials in the pharmaceutical industry, involved in the majority of all drug manufacturing. Powder flow and particle size are central particle properties addressed by means of particle engineering. The aim of the thesis was to gain knowledge of powder processing with restricted liquid addition, with a primary focus on particle coating and early granule growth, and to characterise such processes. A thin coating layer of hydroxypropyl methylcellulose was applied to individual particles of ibuprofen in a fluidised bed top-spray process. The polymeric coating improved the flow properties of the powder. The improvement was strongly related to relative humidity, which can be seen as an indicator of a change in surface hydrophilicity caused by the coating. The ibuprofen used in the present study had a d50 of 40 μm and thus belongs to the Geldart group C powders, which are considered challenging materials in top-spray coating processes. Ibuprofen was similarly coated using a novel ultrasound-assisted coating method. The results were in line with those obtained from powders coated in the fluidised bed process mentioned above. The ultrasound-assisted method was found capable of coating single particles with a simple and robust setup. Granule growth in a fluidised bed process was inhibited by feeding the liquid in pulses. The results showed that the length of the pulsing cycles matters and can be used to adjust granule growth. Moreover, pulsed liquid feed was found to have a greater effect on granule growth at high inlet-air relative humidity. Liquid feed pulsing can thus be used as a tool for particle size targeting in fluidised bed processes and for compensating for changes in the relative humidity of the inlet air.
The nozzle function of a two-fluid external-mixing pneumatic nozzle, typical for small-scale pharmaceutical fluidised bed processes, was studied in situ in an ongoing fluidised bed process with particle tracking velocimetry. The liquid droplets were found to undergo coalescence as they proceed away from the nozzle head. The coalescence was expected to increase droplet speed, which the study confirmed. The spray turbulence was also studied; the results showed turbulence caused both by the atomisation event and by the oppositely directed fluidising air. It was concluded that particle tracking velocimetry is a suitable tool for in situ spray characterisation. The light transmission through dense particulate systems was found to carry information on particle size and packing density, as expected from the theory of light scattering by solids. It was possible to differentiate binary blends consisting of components with differences in optical properties. Light transmission thus showed potential as a rapid, simple and inexpensive characterisation tool for particulate systems, giving information on changes in particle systems that could be utilised in basic process diagnostics.
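The dependence of light transmission on particle size can be illustrated with a Beer-Lambert-type model for a bed of opaque spheres. This is an idealised sketch under stated assumptions (single scattering, opaque spheres, an extinction efficiency of ~2 for particles much larger than the wavelength); real powder beds involve multiple scattering, which this ignores, and all the numbers below are invented for illustration.

```python
import math

def transmission(path_len_m, solids_frac, diameter_m, q_ext=2.0):
    """Beer-Lambert-type transmission through a bed of opaque spheres.

    The extinction coefficient mu = q_ext * 3 * phi / (2 * d) is the
    projected sphere area per unit volume, scaled by the extinction
    efficiency q_ext.
    """
    mu = q_ext * 3.0 * solids_frac / (2.0 * diameter_m)
    return math.exp(-mu * path_len_m)

# Smaller particles present more projected area per unit mass, so at a
# fixed solids fraction a bed of 40 um particles transmits less light
# than a bed of 200 um particles.
t_small = transmission(200e-6, 0.01, 40e-6)
t_large = transmission(200e-6, 0.01, 200e-6)
```

This captures the qualitative trend the abstract reports: transmission carries information on particle size at a given packing density.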

Relevance: 40.00%

Abstract:

Silicon particle detectors are used in several applications, and future large-scale experiments will clearly require better hardness against particle radiation than can be provided today. To achieve this goal, more irradiation studies with defect-generating bombarding particles are needed. Protons can be considered important bombarding species: although neutrons and electrons are perhaps the most widely used particles in such irradiation studies, protons provide unique possibilities, as their defect production rates are clearly higher than those of neutrons and electrons, and their damage creation in silicon is most similar to that of pions. This thesis explores the development and testing of an irradiation facility that provides cooling of the detector and on-line electrical characterisation, such as current-voltage (IV) and capacitance-voltage (CV) measurements. This irradiation facility, which employs a 5-MV tandem accelerator, appears to function well, but some disadvantageous limitations are related to MeV-proton irradiation of silicon particle detectors. Typically, detectors are in non-operational mode during irradiation (i.e., without applied bias voltage). In real experiments, however, the detectors are biased; the ionising proton generates electron-hole pairs, and a rise in the proton flux may cause the detector to break down. This limits the proton flux for the irradiation of biased detectors. In this work, it is shown that, if detectors are irradiated while kept operational, the electric field decreases the introduction rate of negative space charge and current-related damage. The effects of various particles with different energies are scaled to each other by the non-ionising energy loss (NIEL) hypothesis. The type of defects induced by irradiation depends on the energy used, and this thesis also discusses the minimum proton energy at which the NIEL scaling is valid.
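The NIEL scaling mentioned above is applied in practice by multiplying a fluence by a particle- and energy-specific hardness factor to obtain its 1 MeV neutron equivalent. A minimal sketch follows; the hardness factors are commonly quoted values used here as assumptions, not numbers taken from this work.

```python
# Hardness factors kappa relative to 1 MeV neutrons (illustrative,
# commonly quoted values; actual factors depend on the NIEL tables used
# and the exact beam energy).
HARDNESS = {
    "1MeV_neutron": 1.0,
    "24GeV_proton": 0.62,  # high-energy protons damage less per particle
    "10MeV_proton": 3.87,  # low-energy protons damage more per particle
}

def equivalent_fluence(fluence_cm2, particle):
    """Scale a fluence to its 1 MeV neutron equivalent via the NIEL hypothesis."""
    return HARDNESS[particle] * fluence_cm2

# A 1e14 cm^-2 fluence of 10 MeV protons corresponds to a much larger
# 1 MeV neutron equivalent fluence.
phi_eq = equivalent_fluence(1e14, "10MeV_proton")
```

The thesis's question of a minimum valid proton energy amounts to asking down to what energy a single factor of this kind still predicts the observed damage.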

Relevance: 30.00%

Abstract:

By detecting leading protons produced in the central exclusive diffractive process, p + p → p + X + p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes possible precision studies of gluons at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision of the leading-proton momentum measurement and the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is on developing the necessary radiation-hard precision detector technology for coping with the extremely demanding experimental environment of the LHC. This will be achieved by using a 3D silicon detector design, which, in addition to radiation hardness up to 5×10^15 neutrons/cm², offers properties such as a high signal-to-noise ratio, fast signal response to radiation, and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but conserves the necessary properties of the 3D detector design required at the LHC and in other imaging applications.
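The missing-mass measurement rests on a simple kinematic relation: when both leading protons are tagged, M_X ≈ √(ξ₁ξ₂s), where ξᵢ is the fractional momentum loss of each proton and √s the collision energy. A sketch, assuming the nominal 14 TeV LHC design energy:

```python
import math

def missing_mass(xi1, xi2, sqrt_s_gev=14000.0):
    """Missing mass M_X = sqrt(xi1 * xi2 * s) for p + p -> p + X + p,
    where xi_i is the fractional momentum loss of each leading proton."""
    return math.sqrt(xi1 * xi2) * sqrt_s_gev

# Two protons each losing ~0.9% of their momentum tag a central system
# of about 126 GeV, i.e. in the mass range of a light Higgs boson.
m_x = missing_mass(0.009, 0.009)
```

This is why precise leading-proton momentum measurement, the subject of the first analysis in the thesis, translates directly into missing-mass resolution.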

Relevance: 30.00%

Abstract:

This work focuses on the effects of energetic particle precipitation of solar or magnetospheric origin on the polar middle atmosphere. Energetic charged particles have access to the atmosphere in the polar areas, where they are guided by the Earth's magnetic field. The particles penetrate down to 20-100 km altitudes (stratosphere and mesosphere), ionising the ambient air. This ionisation leads to the production of odd nitrogen (NOx) and odd hydrogen species, which take part in catalytic ozone destruction. NOx has a very long chemical lifetime during polar night conditions; therefore NOx produced at high altitudes during polar night can be transported to lower stratospheric altitudes. Particular emphasis in this work is on the combined use of space- and ground-based observations: ozone and NO2 measurements from the GOMOS instrument on board the European Space Agency's Envisat satellite are used together with subionospheric VLF radio wave observations from ground stations. Combining the two observation techniques enabled detection of NOx enhancements throughout the middle atmosphere, including tracking the descent of NOx enhancements of high-altitude origin down to the stratosphere. GOMOS observations of the large solar proton events (SPEs) of October-November 2003 showed the progression of the SPE-initiated NOx enhancements through the polar winter. In the upper stratosphere, nighttime NO2 increased by an order of magnitude, and the effect was observed to last for several weeks after the SPEs. Ozone decreases of up to 60% from pre-SPE values were observed in the upper stratosphere nearly a month after the events. Over several weeks, the GOMOS observations showed the gradual descent of the NOx enhancements to lower altitudes. Measurements from the years 2002-2006 were used to study polar winter NOx increases and their connection to energetic particle precipitation.
NOx enhancements were found to occur in good correlation with both increased high-energy particle precipitation and increased geomagnetic activity. The average wintertime polar NOx was found to have a nearly linear relationship with the average wintertime geomagnetic activity. The results of this thesis work show the importance of energetic particle precipitation from outside the atmosphere as a source of NOx in the middle atmosphere, and thus its importance to the chemical balance of the atmosphere.
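A near-linear relationship of the kind reported between wintertime NOx and geomagnetic activity is what an ordinary least-squares fit quantifies. A sketch with entirely hypothetical numbers (the actual data come from GOMOS winters 2002-2006 and are not reproduced here):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    b = num / den
    a = my - b * mx
    return a, b

# Hypothetical winter averages: geomagnetic activity index vs. a NOx
# column amount in arbitrary units.
ap  = [5.0, 8.0, 12.0, 15.0, 20.0]
nox = [0.9, 1.4, 2.1, 2.5, 3.4]
a, b = linear_fit(ap, nox)
```

A positive slope b is the quantitative statement of the correlation described above.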

Relevance: 30.00%

Abstract:

ALICE (A Large Ion Collider Experiment) is the LHC (Large Hadron Collider) experiment devoted to investigating the strongly interacting matter created in nucleus-nucleus collisions at LHC energies. The ALICE Inner Tracking System (ITS) consists of six cylindrical layers of silicon detectors built with three different technologies; in the outward direction: two layers of pixel detectors, then two layers each of drift and strip detectors. The spatial alignment of the 2198 sensor modules of the ITS involves about 13,000 parameters, with a target alignment precision well below 10 microns in some cases (pixels). The sources of alignment information include survey measurements and reconstructed tracks from cosmic rays and from proton-proton collisions. The main track-based alignment method uses the Millepede global approach; an iterative local method was developed and used as well. We present the results obtained for the ITS alignment using about 10^5 charged tracks from cosmic rays collected during the summer of 2008, with the ALICE solenoidal magnet switched off.
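The iterative local method can be caricatured as follows: each module's mean track residual estimates its remaining misalignment, a damped correction is applied, and the procedure repeats. This is a toy model only; the real method refits tracks between iterations with module correlations, and Millepede instead solves one global least-squares system for all ~13,000 parameters at once.

```python
def local_alignment(true_offsets, n_iter=10, damping=0.8):
    """Toy iterative local alignment.

    In this toy, each module's mean residual equals its remaining
    misalignment (true offset minus accumulated correction); a damped
    fraction of it is applied each iteration, so corrections converge
    geometrically to the true offsets.
    """
    corrections = [0.0] * len(true_offsets)
    for _ in range(n_iter):
        for i, offset in enumerate(true_offsets):
            residual = offset - corrections[i]  # mean residual on module i
            corrections[i] += damping * residual
    return corrections

# Misalignments of three toy modules, in microns.
corr = local_alignment([50.0, -30.0, 12.0])
```

After a handful of iterations the corrections reproduce the injected offsets, which is the convergence behaviour a local method relies on.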

Relevance: 30.00%

Abstract:

The International Large Detector (ILD) is a concept for a detector at the International Linear Collider (ILC). The ILC will collide electrons and positrons at energies of initially 500 GeV, upgradeable to 1 TeV. The ILC has an ambitious physics program, which will extend and complement that of the Large Hadron Collider (LHC). A hallmark of physics at the ILC is precision: the clean initial state and the comparatively benign environment of a lepton collider are ideally suited to high-precision measurements. Taking full advantage of the physics potential of the ILC places great demands on detector performance, and the design of ILD is driven by these requirements. Excellent calorimetry and tracking are combined to obtain the best possible overall event reconstruction, including the capability to reconstruct individual particles within jets for particle flow calorimetry. This requires excellent spatial resolution for all detector systems. A highly granular calorimeter system is combined with a central tracker that stresses redundancy and efficiency. In addition, efficient reconstruction of secondary vertices and excellent momentum resolution for charged particles are essential for an ILC detector. The interaction region of the ILC is designed to host two detectors, which can be moved into the beam position with a push-pull scheme. The mechanical design of ILD and the overall integration of subdetectors take these operational conditions into account.

Relevance: 30.00%

Abstract:

Silicon strip detectors are fast and cost-effective and have excellent spatial resolution; they are widely used in many high-energy physics experiments. Modern high-energy physics experiments, such as those at the LHC, impose harsh operating conditions on the detectors. The high radiation doses eventually cause the detectors to fail as a result of excessive radiation damage. This has led to a need to study radiation tolerance using various techniques and, at the same time, to operate sensors approaching the end of their lifetimes. The goal of this work is to demonstrate that novel detectors can survive the environment foreseen for future high-energy physics experiments. To reach this goal, measurement apparatuses were built and used to measure the properties of irradiated detectors; the measurement data were then analyzed and conclusions drawn. Three measurement apparatuses built as part of this work are described: two telescopes measuring tracks in a particle accelerator beam and one telescope measuring the tracks of cosmic particles. The telescopes comprise layers of reference detectors providing the reference track, slots for the devices under test, the supporting mechanics, electronics, software, and the trigger system. All three devices work; the differences between them are discussed. The reconstruction of the reference tracks and the analysis of the device under test are presented. Traditionally, silicon detectors have produced a very clear response to the particles being measured. For detectors nearing the end of their lifetimes, this is no longer true. A new method that benefits from the reference tracks to form clusters is presented. The method provides less biased results than the traditional analysis, especially when studying the response of heavily irradiated detectors. Means to avoid false results when demonstrating the particle-finding capabilities of a detector are also discussed.
The devices and analysis methods are primarily used to study strip detectors made of Magnetic Czochralski silicon. The detectors studied were irradiated to various fluences prior to measurement. The results show that Magnetic Czochralski silicon has a good radiation tolerance and is suitable for future high-energy physics experiments.
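The reference-track-seeded clustering idea can be sketched as follows: instead of seeding clusters only on strips above a high signal threshold, sum the charge in a small window around the strip the reference track points to. This is a hypothetical minimal version; the function name, window size, and charge values are illustrative, not the thesis implementation.

```python
def seeded_cluster(strip_charges, predicted_strip, half_window=2):
    """Form a cluster around the strip predicted by the reference track.

    Useful when heavily irradiated sensors give small, smeared signals
    that would fail a conventional seed threshold, and less biased than
    seeding on the highest strip.
    """
    lo = max(0, predicted_strip - half_window)
    hi = min(len(strip_charges), predicted_strip + half_window + 1)
    window = strip_charges[lo:hi]
    total = sum(window)
    if total <= 0:
        return None, 0.0
    # Charge-weighted centroid within the window.
    centroid = sum((lo + i) * q for i, q in enumerate(window)) / total
    return centroid, total

# Toy strip charges; the reference track predicts strip 4.
charges = [0.0, 0.2, 0.1, 1.1, 2.4, 0.9, 0.1, 0.0]
pos, charge = seeded_cluster(charges, predicted_strip=4)
```

Because the cluster position comes from a charge-weighted centroid around the prediction rather than from the loudest strip, small signals still contribute, which is the bias reduction described above.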

Relevance: 20.00%

Abstract:

Screening of wastewater effluents from municipal and industrial wastewater treatment plants with biotests showed that the treated effluents possess only minor acute toxic properties towards whole organisms (e.g. bacteria, algae, daphnia), if any. In vitro tests (sub-mitochondrial membranes and fish hepatocytes) were generally more susceptible to the effluents. Most of the effluents indicated the presence of hormonally active compounds, as the production of vitellogenin, an egg yolk precursor protein, was induced in fish hepatocytes exposed to wastewater. In addition, indications of slight genotoxic potential were found in one effluent concentrate with a recombinant bacteria test. Reverse electron transport (RET) of mitochondrial membranes was used as a model test to conduct effluent assessment, followed by toxicant characterisations and identifications. Using a modified U.S. EPA Toxicity Identification Evaluation Phase I scheme and additional case-specific methods, the main compound causing RET inhibition in a pulp and paper mill effluent was characterised as an organic, relatively hydrophilic, high-molecular-weight (HMW) compound. The toxicant was verified as HMW lignin by structural analyses using nuclear magnetic resonance; commercial and in-house-extracted lignin products were used in the confirmation step. The structures possibly related to toxicity were characterised by statistical analysis of the chemical breakdown structures of laboratory-scale pulping and bleaching effluents and the toxicities of these effluents. Finally, the biological degradation of the identified toxicant and other wastewater constituents was evaluated using bioassays in combination with chemical analyses. Biological methods have not been used routinely in establishing effluent discharge limits in Finland; however, the biological effects observed in this study could not have been predicted using only routine physical and chemical effluent monitoring parameters.
Chemical parameters alone can therefore not be considered sufficient for controlling effluent discharges, especially in the case of unknown, possibly bioaccumulative compounds that may be present in small concentrations and may cause chronic effects.

Relevance: 20.00%

Abstract:

ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) whose heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004-2006. With altogether over a million detector strips, this has been the most massive particle detector project in the science history of Finland. One ALICE SSD module consists of a double-sided silicon sensor and two hybrids containing 12 HAL25 front-end readout chips and some passive components, such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested in every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to limits confirmed by the ALICE collaboration. This study concentrates on the test results of framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of the hybrids 96.1%, and of the modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. After the problems appearing during the learning curve of the project were solved, material problems, such as defective chip cables and sensors, appeared to cause most of the assembly rejections. These problems were typically seen in tests as too many individual channel failures; bonding failures, by contrast, rarely caused the rejection of any component. One sensor type among the three sensor manufacturers proved to have lower quality than the others:
the sensors of this manufacturer are very noisy, and their depletion voltages are usually outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.

Relevance: 20.00%

Abstract:

Topic detection and tracking (TDT) is an area of information retrieval research that focuses on news events. The problems TDT deals with relate to segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news items that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems; it has been difficult to distinguish between same and merely similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of the class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural-language temporal expressions occurring in the text and use them to anchor the rest of the terms onto the time-line. When comparing documents for event-based similarity, we look not only at matching terms, but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system.
The news stream reflects changes in the real world, and in order to keep up, the system has to change its behavior based on its contents. We put forward two strategies for rebuilding the topic representations and report experimental results. We ran experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30%, depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
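The class-wise similarity scheme described above can be sketched in a few lines: each document is split into per-class term vectors, each class is compared with its own measure (plain cosine here), and the results are combined with class weights. The weights and example documents below are invented for illustration; the thesis's actual measures may use ontologies per class.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity of two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def class_similarity(doc1, doc2, weights):
    """Weighted combination of class-wise similarities.

    Each doc maps a semantic class name (e.g. LOCATION, PERSON, TERM)
    to a Counter of its terms; each class contributes its own cosine,
    weighted and normalised by the total weight.
    """
    total = sum(weights.values())
    return sum(w * cosine(doc1.get(c, Counter()), doc2.get(c, Counter()))
               for c, w in weights.items()) / total

d1 = {"LOCATION": Counter(["helsinki"]), "TERM": Counter(["flood", "rain"])}
d2 = {"LOCATION": Counter(["helsinki"]), "TERM": Counter(["flood", "damage"])}
sim = class_similarity(d1, d2, {"LOCATION": 2.0, "TERM": 1.0})
```

Weighting LOCATION higher than general terms, as here, expresses the intuition that two stories about the same place are more likely to concern the same event.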

Relevance: 20.00%

Abstract:

Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. Information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available. In particular, projects vary in their growth rate, complexity, modularity and team structure.