599 results for CERN


Relevance:

10.00%

Publisher:

Abstract:

Developing and maintaining a successful institutional repository for research publications requires a considerable investment by the institution. Most of the money is spent on developing the skill-sets of existing staff or hiring new staff with the necessary skills. The return on this investment can be magnified by using this valuable infrastructure to curate collections of other materials such as learning objects, student work, conference proceedings and institutional or local community heritage materials. When Queensland University of Technology (QUT) implemented its repository for research publications (QUT ePrints) over 11 years ago, it was one of the first institutional repositories to be established in Australia. Currently, the repository holds over 29,000 open access research publications, and the cumulative total number of full-text downloads for these documents now exceeds 16 million. The full-text deposit rate for recently published peer-reviewed papers (currently over 74%) shows how well the repository has been embraced by QUT researchers. The success of QUT ePrints has resulted in requests to accommodate a plethora of materials that are ‘out of scope’ for this repository. QUT Library saw this as an opportunity to use its repository infrastructure (software, technical know-how and policies) to develop and implement a metadata repository for its research datasets (QUT Research Data Finder), a repository for research-related software (QUT Software Finder) and to curate a number of digital collections of institutional and local community heritage materials (QUT Digital Collections). This poster describes the repositories and digital collections curated by QUT Library and outlines the value delivered to the institution, and the wider community, by these initiatives.


ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) whose heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004-2006. With altogether over a million detector strips, this has been the most massive particle detector project in the science history of Finland. One ALICE SSD module consists of a double-sided silicon sensor, two hybrids containing 12 HAL25 front-end readout chips, and passive components such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested in every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to limits confirmed by the ALICE collaboration. This study concentrates on the test results of the framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of the hybrids 96.1% and of the modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. After the problems arising during the learning curve of the project were solved, material problems, such as defective chip cables and sensors, caused most of the assembly rejections. These problems typically appeared in the tests as too many individual channel failures; bonding failures, by contrast, rarely caused the rejection of any component. One sensor type among the three sensor manufacturers proved to have lower quality than the others: its sensors are very noisy and their depletion voltages are usually outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.


The TOTEM experiment at the LHC will measure the total proton-proton cross-section with a precision better than 1%, elastic proton scattering over a wide range in momentum transfer, -t = p^2 theta^2, up to 10 GeV^2, and diffractive dissociation, including single, double and central diffraction topologies. The total cross-section will be measured with the luminosity-independent method, which requires the simultaneous measurement of the total inelastic rate and of elastic proton scattering down to four-momentum transfers of a few 10^-3 GeV^2, corresponding to leading protons scattered at angles of microradians from the interaction point. This will be achieved using silicon microstrip detectors, which offer attractive properties such as good spatial resolution (<20 um), fast response (O(10 ns)) to particles and radiation hardness up to 10^14 n/cm^2. This work reports on the development of an innovative structure at the detector edge that reduces the conventional dead width of 0.5-1 mm to 50-60 um, compatible with the requirements of the experiment.
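For orientation, the numbers quoted above can be checked with a short sketch. The small-angle relation -t = p^2 theta^2 gives the scattering angle, and the luminosity-independent method combines the elastic and inelastic rates via the optical theorem. All inputs below are illustrative, not TOTEM data:

```python
import math

def scattering_angle(t_abs_gev2, p_gev):
    """Small-angle relation -t = p^2 * theta^2  =>  theta = sqrt(|t|) / p."""
    return math.sqrt(t_abs_gev2) / p_gev

def sigma_tot_mb(dNel_dt_at_0, n_el, n_inel, rho=0.14):
    """Luminosity-independent total cross-section (optical theorem):
    sigma_tot = 16*pi/(1 + rho^2) * (dN_el/dt at t=0) / (N_el + N_inel),
    converted from GeV^-2 to millibarn."""
    GEV2_TO_MB = 0.389379  # (hbar*c)^2 in mb * GeV^2
    sigma_gev2 = 16.0 * math.pi / (1.0 + rho**2) * dNel_dt_at_0 / (n_el + n_inel)
    return sigma_gev2 * GEV2_TO_MB

# |t| = 2e-3 GeV^2 at the 7 TeV LHC beam momentum:
theta = scattering_angle(2e-3, 7000.0)
```

With these inputs, theta comes out at roughly 6 microradians, matching the "angles of microradians" quoted in the abstract.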


By detecting the leading protons produced in the central exclusive diffractive process, p+p → p+X+p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes precision studies of gluons possible at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision measurement of the leading proton momentum and the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is on developing the radiation-hard precision detector technology necessary for coping with the extremely demanding experimental environment of the LHC. This will be achieved by using a 3D silicon detector design, which, in addition to radiation hardness of up to 5×10^15 neutrons/cm^2, offers properties such as a high signal-to-noise ratio, a fast signal response to radiation and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but conserves the necessary properties of the 3D design required at the LHC and in other imaging applications.
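The missing-mass idea can be illustrated with a minimal sketch, assuming the standard approximation M_X^2 ≈ xi1·xi2·s for the fractional momentum losses xi1, xi2 of the two leading protons. The numbers below are illustrative, not results from the thesis:

```python
import math

def missing_mass_gev(xi1, xi2, sqrt_s_gev=14000.0):
    """Missing mass of the central system X in p+p -> p+X+p, using the
    standard approximation M_X^2 ~ xi1 * xi2 * s, where xi1 and xi2 are
    the fractional momentum losses of the two leading protons."""
    return math.sqrt(xi1 * xi2) * sqrt_s_gev

# Two protons each losing 1% of their momentum at sqrt(s) = 14 TeV
# tag a central system of 140 GeV, in the Higgs-mass ballpark.
m_x = missing_mass_gev(0.01, 0.01)
```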


The Transition Radiation Tracker (TRT) of the ATLAS experiment at the LHC is part of the Inner Detector. It is designed as a robust and powerful gaseous detector that provides tracking through individual drift tubes (straws) as well as particle identification via transition radiation (TR) detection. The straw tubes are operated with Xe-CO2-O2 70/27/3, a gas that combines the advantages of efficient TR absorption, a short electron drift time and minimal ageing effects. The modules of the barrel part of the TRT were built in the United States, while the end-cap wheels are assembled at two Russian institutes. Acceptance tests of barrel modules and end-cap wheels are performed at CERN before assembly and integration with the Semiconductor Tracker (SCT) and the Pixel Detector. This thesis first describes simulations of the TRT straw tube. The argon-based acceptance-test gas mixture as well as two xenon-based operating gases are examined for their properties. Drift velocities and Townsend coefficients are computed with the program Magboltz and used to study electron drift and multiplication in the straw with the software Garfield. The inclusion of Penning transfers in the avalanche process leads to remarkable agreement with experimental data. A high level of cleanliness in the TRT's acceptance-test gas system is indispensable. To monitor gas purity, a small straw tube detector has been constructed and used extensively to study the ageing behaviour of the straw tube in Ar-CO2. A variety of ageing tests are presented and discussed. Acceptance tests for the TRT survey dimensions, wire tension, gas-tightness, high-voltage stability and gas gain uniformity along each individual straw. The thesis gives details on the acceptance criteria and measurement methods in the case of the end-cap wheels. Special focus is put on wire tension and straw straightness. The effect of geometrically deformed straws on gas gain and energy resolution is examined in an experimental setup and compared to simulation studies. An overview of the most important results from the end-cap wheels tested up to this point is presented.
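The link between the Townsend coefficient and the avalanche gain mentioned above can be sketched numerically: the gain is the exponential of the Townsend coefficient integrated over the multiplication region near the wire. The constant alpha used below is made up for illustration and is not a Magboltz result:

```python
import math

def gas_gain(alpha_of_r, r_wire_cm, r_crit_cm, steps=1000):
    """Avalanche gain in a straw: G = exp( integral of the first Townsend
    coefficient alpha(r) from the edge of the multiplication region down
    to the wire surface ), evaluated with the midpoint rule."""
    dr = (r_crit_cm - r_wire_cm) / steps
    integral = sum(alpha_of_r(r_wire_cm + (i + 0.5) * dr) for i in range(steps)) * dr
    return math.exp(integral)

# Made-up constant alpha of 100 /cm over the innermost 185 um of the straw:
gain = gas_gain(lambda r: 100.0, 0.0015, 0.02)
```

In a real straw alpha(r) grows steeply towards the wire because the field is ~1/r, which is why the avalanche is confined to the last few tens of micrometres.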


Currently, we live in an era characterized by the completion and first runs of the LHC accelerator at CERN, which is hoped to provide the first experimental hints of what lies beyond the Standard Model of particle physics. In addition, the last decade has witnessed a new dawn of cosmology, in which it has truly emerged as a precision science. Largely due to the WMAP measurements of the cosmic microwave background, we now believe we have quantitative control of much of the history of our universe. These two experimental windows offer us not only an unprecedented view of the smallest and largest structures of the universe, but also a glimpse of the very first moments in its history. At the same time, they require theorists to focus on the fundamental challenges awaiting at the boundary of high-energy particle physics and cosmology. What were the contents and properties of matter in the early universe? How is one to describe its interactions? What implications do the various models of physics beyond the Standard Model have on the subsequent evolution of the universe? In this thesis, we explore the connection between supersymmetric theories in particular and the evolution of the early universe. First, we provide the reader with a general introduction to modern-day particle cosmology from two angles: on one hand by reviewing our current knowledge of the history of the early universe, and on the other hand by introducing the basics of supersymmetry and its derivatives. Subsequently, with the help of the developed tools, we direct the attention to the specific questions addressed in the three original articles that form the main scientific contents of the thesis. Each of these papers concerns a distinct cosmological problem, ranging from the generation of the matter-antimatter asymmetry to inflation, and finally to the origin or very early stage of the universe.
They nevertheless share a common factor in their use of the machinery of supersymmetric theories to address open questions in the corresponding cosmological models.


This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle. Typically, a Geant4 computer experiment is used to understand test beam measurements. Thus another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, the full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of neural network methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
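As a rough illustration of the kind of NN-based signal/background separation described above (this is not the thesis's ROOT/ANN implementation; the data are toy Gaussians standing in for b-tagging variables):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for b-tagging inputs: two discriminating variables, with
# "signal" and "background" drawn from shifted Gaussians (made-up data).
sig = rng.normal(2.0, 1.0, size=(500, 2))
bkg = rng.normal(0.0, 1.0, size=(500, 2))
X = np.vstack([sig, bkg])
y = np.hstack([np.ones(500), np.zeros(500)])

# A single sigmoid neuron trained by gradient descent on the logistic loss;
# a full ANN adds hidden layers, but the separation idea is the same.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * float(np.mean(p - y))

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
accuracy = float(np.mean(pred == y))
```

On these well-separated toy samples the single neuron already classifies around nine events in ten correctly; real b-tagging variables overlap far more, which is what motivates multi-layer networks.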


Quantum chromodynamics (QCD) is the theory describing the interaction between quarks and gluons. At low temperatures, quarks are confined, forming hadrons, e.g. protons and neutrons. However, at extremely high temperatures the hadrons break apart and the matter transforms into a plasma of individual quarks and gluons. In this thesis the quark-gluon plasma (QGP) phase of QCD is studied using lattice techniques in the framework of the dimensionally reduced effective theories EQCD and MQCD. Two quantities are of particular interest: the pressure (or grand potential) and the quark number susceptibility. At high temperatures the pressure admits a generalised coupling constant expansion, where some of the coefficients are non-perturbative. We determine the first such contribution, of order g^6, by performing lattice simulations in MQCD. This requires high-precision lattice calculations, which we perform with different numbers of colors N_c to obtain the N_c-dependence of the coefficient. The quark number susceptibility is studied by performing lattice simulations in EQCD. We measure both the flavor singlet (diagonal) and non-singlet (off-diagonal) quark number susceptibilities. The finite chemical potential results are obtained using analytic continuation. The diagonal susceptibility approaches the perturbative result above 20 T_c, but below that temperature we observe significant deviations. The results agree well with 4d lattice data down to temperatures of 2 T_c.
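For reference, the quark number susceptibilities mentioned above are conventionally defined as second derivatives of the pressure with respect to the quark chemical potentials (the notation here is the standard one, assumed rather than quoted from the thesis):

```latex
% Quark number susceptibilities at vanishing chemical potential:
\chi_{ij} \;=\; \left.\frac{\partial^2 p}{\partial \mu_i \,\partial \mu_j}\right|_{\mu=0},
\qquad
\begin{cases}
 i = j   & \text{diagonal (flavor singlet) susceptibility},\\
 i \neq j & \text{off-diagonal (non-singlet) susceptibility},
\end{cases}
```

where p is the pressure and mu_i the chemical potential of quark flavor i; analytic continuation from imaginary mu then extends the lattice results to finite chemical potential.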


Discoveries at the LHC will soon set the physics agenda for future colliders. This report of a CERN Theory Institute includes the summaries of Working Groups that reviewed the physics goals and prospects of LHC running with 10 to 300 fb^-1 of integrated luminosity, of the proposed sLHC luminosity upgrade, of the ILC, of CLIC, of the LHeC and of a muon collider. The four Working Groups considered possible scenarios for the first 10 fb^-1 of data at the LHC in which (i) a state with properties that are compatible with a Higgs boson is discovered, (ii) no such state is discovered either because the Higgs properties are such that it is difficult to detect or because no Higgs boson exists, (iii) a missing-energy signal beyond the Standard Model is discovered as in some supersymmetric models, and (iv) some other exotic signature of new physics is discovered. In the contexts of these scenarios, the Working Groups reviewed the capabilities of the future colliders to study in more detail whatever new physics may be discovered by the LHC. Their reports provide the particle physics community with some tools for reviewing the scientific priorities for future colliders after the LHC produces its first harvest of new physics from multi-TeV collisions.


A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN.


The TOTEM collaboration has developed and tested the first prototype of its Roman Pots to be operated in the LHC. TOTEM Roman Pots contain stacks of 10 silicon detectors with strips oriented in two orthogonal directions. To measure proton scattering angles of a few microradians, the detectors will approach the beam centre to a distance of 10 sigma + 0.5 mm (= 1.3 mm). Dead space near the detector edge is minimised by using two novel "edgeless" detector technologies. The silicon detectors are used both for precise track reconstruction and for triggering. The first full-sized prototypes of both detector technologies as well as their read-out electronics have been developed, built and operated. The tests took place first in a fixed-target muon beam at CERN's SPS, and then in the proton beam-line of the SPS accelerator ring. We present the test beam results demonstrating the successful functionality of the system despite slight technical shortcomings to be improved in the near future.
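The quoted approach distance can be checked with one line of arithmetic: the 10 sigma + 0.5 mm formula is from the abstract, and solving it for sigma is our own illustration of the beam width it implies:

```python
def approach_distance_mm(sigma_mm, n_sigma=10, margin_mm=0.5):
    """Closest approach of the detectors to the beam centre: n*sigma + margin."""
    return n_sigma * sigma_mm + margin_mm

# Solving 10*sigma + 0.5 mm = 1.3 mm for the implied transverse beam width:
sigma_mm = (1.3 - 0.5) / 10  # 0.08 mm
```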


Silicon strip detectors are fast, cost-effective and have an excellent spatial resolution. They are widely used in many high-energy physics experiments. Modern high-energy physics experiments, such as those at the LHC, impose harsh operating conditions on the detectors. The high radiation doses eventually cause the detectors to fail as a result of excessive radiation damage. This has led to a need to study radiation tolerance using various techniques. At the same time, a need has arisen to operate sensors approaching the end of their lifetimes. The goal of this work is to demonstrate that novel detectors can survive the environment foreseen for future high-energy physics experiments. To reach this goal, measurement apparatuses are built. The devices are then used to measure the properties of irradiated detectors. The measurement data are analyzed, and conclusions are drawn. Three measurement apparatuses built as part of this work are described: two telescopes measuring the tracks of a particle accelerator beam and one telescope measuring the tracks of cosmic particles. The telescopes comprise layers of reference detectors providing the reference track, slots for the devices under test, the supporting mechanics, electronics, software, and the trigger system. All three devices work; the differences between them are discussed. The reconstruction of the reference tracks and the analysis of the device under test are presented. Traditionally, silicon detectors have produced a very clear response to the particles being measured. For detectors nearing the end of their lifetimes, this is no longer true. A new method that benefits from the reference tracks to form clusters is presented. The method provides less biased results than the traditional analysis, especially when studying the response of heavily irradiated detectors. Means to avoid false results when demonstrating the particle-finding capability of a detector are also discussed.
The devices and analysis methods are primarily used to study strip detectors made of Magnetic Czochralski silicon. The detectors studied were irradiated to various fluences prior to measurement. The results show that Magnetic Czochralski silicon has a good radiation tolerance and is suitable for future high-energy physics experiments.
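A minimal sketch of the reference-track-seeded clustering idea described above, assuming a simple windowed charge sum around the strip predicted by the track (the function and numbers are hypothetical, not the thesis's actual algorithm):

```python
def seeded_cluster(strip_signals, predicted_strip, window=3):
    """Sum the charge in a small window of strips around the position
    predicted by the reference track, rather than thresholding every
    strip independently; return (charge, charge-weighted centroid)."""
    lo = max(0, predicted_strip - window)
    hi = min(len(strip_signals), predicted_strip + window + 1)
    charge = sum(strip_signals[lo:hi])
    if charge == 0:
        return 0, None
    centroid = sum(i * q for i, q in enumerate(strip_signals[lo:hi], start=lo)) / charge
    return charge, centroid

# A small signal around strip 3; the track prediction seeds the cluster.
charge, centroid = seeded_cluster([0, 0, 1, 5, 3, 0, 0], 3, window=1)
```

Seeding on the track prediction avoids the bias of global thresholding when the signal-to-noise ratio of a heavily irradiated sensor is low: weak but genuine hits under the track are kept, while equally weak noise elsewhere is never considered.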


A better understanding of vacuum arcs is desirable in many of today's 'big science' projects, including linear colliders, fusion devices, and satellite systems. For the Compact Linear Collider (CLIC) design, radio-frequency (RF) breakdowns occurring in the accelerating cavities influence efficiency optimisation and cost reduction issues. Studying vacuum arcs both theoretically and experimentally under well-defined and reproducible direct-current (DC) conditions is the first step towards exploring RF breakdowns. In this thesis, we have studied Cu DC vacuum arcs with a combination of experiments, a particle-in-cell (PIC) model of the arc plasma, and molecular dynamics (MD) simulations of the subsequent surface damaging mechanism. We have also developed the 2D Arc-PIC code, and the physics model incorporated in it, especially for the purpose of modelling plasma initiation in vacuum arcs. Assuming the initial presence of a field emitter at the cathode, we have identified the conditions for plasma formation and studied the transition from the field emission stage to a fully developed arc. The 'footing' of the plasma is the cathode spot, which supplies the arc continuously with particles; the high-density core of the plasma is located above this cathode spot. Our results have shown that once an arc plasma is initiated, and as long as energy is available, the arc is self-maintaining due to the plasma sheath, which ensures enhanced field emission and sputtering. The plasma model can already give an estimate of how the time-to-breakdown changes with the neutral evaporation rate, which is yet to be determined by atomistic simulations. Due to the non-linearity of the problem, we have also performed a code-to-code comparison. The reproducibility of the plasma behaviour and of the time-to-breakdown with independent codes increases confidence in the results presented here. Our MD simulations identified high-flux, high-energy ion bombardment as a possible mechanism forming the early-stage surface damage in vacuum arcs. In this mechanism, sputtering occurs mostly in clusters, as a consequence of overlapping heat spikes. Experimental and simulated craters of different sizes were found to be self-similar, with a crater depth-to-width ratio of about 0.23 (simulation) to 0.26 (experiment). Experiments which we carried out to investigate the energy dependence of DC breakdown properties point to an intrinsic connection between the DC and RF scaling laws and suggest the possibility of accumulative effects influencing the field enhancement factor.


A construction of a new family of distributed space-time codes (DSTCs) having full diversity and low Maximum Likelihood (ML) decoding complexity is provided for the two-phase-based cooperative diversity protocols of Jing-Hassibi and for the recently proposed Generalized Non-orthogonal Amplify-and-Forward (GNAF) protocol of Rajan et al. The salient feature of the proposed DSTCs is that they satisfy the extra constraints imposed by the protocols and are also four-group ML decodable, which leads to a significant reduction in ML decoding complexity compared to all existing DSTC constructions. Moreover, these codes have a uniform distribution of power among the relays as well as in time. Simulation results also indicate that these codes perform better than the only known DSTC with the same rate and decoding complexity, namely the Coordinate Interleaved Orthogonal Design (CIOD). Furthermore, they perform very close to DSTCs from field extensions, which have the same rate but higher decoding complexity.
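The complexity advantage of group decodability can be illustrated by counting ML metric evaluations; the sketch below assumes a flat brute-force search over a constellation of size |A| (illustrative only, not the paper's decoder):

```python
def joint_ml_search(constellation_size, n_symbols):
    """Metric evaluations for brute-force joint ML decoding: |A|^K."""
    return constellation_size ** n_symbols

def g_group_ml_search(constellation_size, n_symbols, g=4):
    """With a g-group decodable code, the g symbol groups are searched
    independently: g * |A|^(K/g) evaluations instead of |A|^K."""
    return g * constellation_size ** (n_symbols // g)

# 16-point constellation, 8 symbols: four-group decodability reduces
# ~4.3e9 joint evaluations to just 1024.
joint = joint_ml_search(16, 8)
grouped = g_group_ml_search(16, 8)
```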


In the present talk, we discuss a six-dimensional mass generation mechanism for the neutrinos. The SM neutrinos live on a 3-brane and interact via a brane-localised mass term with a Weyl singlet neutrino residing in all six dimensions. We present the physical neutrino mass spectrum and show that the active neutrino mass and the KK masses have a logarithmic cut-off dependence at tree level. This translates into a renormalisation-group running of the neutrino masses above the KK compactification scale, arising from classical effects without any SM particles in the spectrum. This could have effects in neutrinoless double beta decay experiments.