917 results for LHC, CMS, Big Data
Abstract:
Multidimensional and one-dimensional quantum-statistical (Bose-Einstein) correlations are measured in proton-proton collisions at 0.9, 2.76 and 7 TeV, in proton-lead collisions at 5.02 TeV per nucleon pair, and in peripheral lead-lead collisions at 2.76 TeV per nucleon pair center-of-mass energy with the CMS detector at the LHC. The correlation functions are extracted in terms of different components of the relative momentum of the pair, in order to investigate the extension of the emission source in different directions. The results are presented for different intervals of the transverse pair momentum, k_T, and of the charged-particle multiplicity of the collision, N_tracks, as well as for their integrated values. In addition to inclusive charged particles, charged pions and kaons, identified via their energy loss in the silicon tracker, are also correlated. The extracted source radii increase with increasing multiplicity and decrease with increasing k_T. The results open the possibility of studying the scaling and factorization properties of these radii as a function of multiplicity, k_T, colliding-system size and center-of-mass energy.
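As background for the radius extraction mentioned above, a commonly used one-dimensional parameterization of the two-particle Bose-Einstein correlation function is recalled below. The Gaussian source shape, the normalization N, the intercept λ and the long-range factor (1 + δ q_inv) are generic textbook choices (exponential and Lévy-type shapes are also widely used), not necessarily the exact fit function of this paper.

```latex
% Generic one-dimensional Bose-Einstein correlation fit (Gaussian source shown;
% exponential or Levy-type source shapes are also commonly used):
C_2(q_{\mathrm{inv}}) =
  N \left[ 1 + \lambda \, e^{-\,q_{\mathrm{inv}}^2 R_{\mathrm{inv}}^2} \right]
  \left( 1 + \delta \, q_{\mathrm{inv}} \right)
```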
Abstract:
Searches are presented for heavy scalar (H) and pseudoscalar (A) Higgs bosons posited in two-Higgs-doublet model (2HDM) extensions of the standard model (SM). These searches are based on a data sample of pp collisions collected with the CMS experiment at the LHC at a center-of-mass energy of √s = 8 TeV and corresponding to an integrated luminosity of 19.5 fb⁻¹. The decays H → hh and A → Zh, where h denotes an SM-like Higgs boson, lead to events with three or more isolated charged leptons or with a photon pair accompanied by one or more isolated leptons. The search results are presented in terms of the H and A production cross sections times branching fractions and are further interpreted in terms of 2HDM parameters. We place 95% C.L. upper limits of approximately 7 pb on σB for H → hh and 2 pb for A → Zh. Also presented are the results of a search for the rare decay of the top quark to a charm quark and an SM Higgs boson, t → ch, the existence of which would indicate a nonzero flavor-changing Yukawa coupling of the top quark to the Higgs boson. We place a 95% C.L. upper limit of 0.56% on B(t → ch).
Abstract:
Townsend’s big-eared bat, Corynorhinus townsendii, is distributed broadly across western North America and in two isolated, endangered populations in the central and eastern United States. There are five subspecies of C. townsendii: C. t. pallescens, C. t. australis, C. t. townsendii, C. t. ingens, and C. t. virginianus, with varying degrees of concern over the conservation status of each. The aim of this study was to use mitochondrial and microsatellite DNA data to examine genetic diversity, population differentiation, and dispersal of three C. townsendii subspecies. C. t. virginianus is found in isolated populations in the eastern United States and was listed as endangered under the Endangered Species Act in 1979. Concern also exists about declining populations of two western subspecies, C. t. pallescens and C. t. townsendii. Using a comparative approach, estimates of the genetic diversity within populations of the endangered subspecies, C. t. virginianus, were found to be significantly lower than within populations of the two western subspecies. Further, both classes of molecular markers revealed significant differentiation among regional populations of C. t. virginianus, with most genetic diversity distributed among populations. Genetic diversity was not significantly different between C. t. townsendii and C. t. pallescens. Some populations of C. t. townsendii are not genetically differentiated from populations of C. t. pallescens in areas of sympatry. For the western subspecies, gene flow appears to occur primarily through male dispersal. Finally, geographic regions representing significantly differentiated and genetically unique populations of C. t. virginianus are recognized as distinct evolutionarily significant units.
Abstract:
Hundreds of terabytes of CMS (Compact Muon Solenoid) data are accumulated for storage day by day at the University of Nebraska-Lincoln, which is one of the eight US CMS Tier-2 sites. Managing this data includes retaining useful CMS data sets and clearing storage space for newly arriving data by deleting less useful data sets. This is an important task that is currently done manually and requires a large amount of time. The overall objective of this study was to develop a methodology to help identify the data sets to be deleted when storage space is needed. CMS data is stored using HDFS (Hadoop Distributed File System). HDFS logs give information regarding file access operations. Hadoop MapReduce was used to feed the information in these logs to Support Vector Machines (SVMs), a machine-learning algorithm applicable to classification and regression, which is used in this thesis to develop a classifier. The time taken to classify data sets with this method depends on the size of the input HDFS log file, since the MapReduce algorithms used here have O(n) complexity. The SVM methodology produces a list of data sets for deletion along with their respective sizes. This methodology was also compared with a heuristic called Retention Cost, calculated from the size of a data set and the time since its last access, to help decide how useful the data set is. The accuracies of both were compared by calculating the percentage of data sets predicted for deletion that were accessed again at a later time. Our methodology using SVMs proved to be more accurate than using the Retention Cost heuristic. This methodology could be used to solve similar problems involving other large data sets.
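As a rough illustration of the classification step described above, the sketch below trains a support vector machine on features that HDFS access logs could plausibly provide (data-set size, days since last access, recent access count). The feature names, labels and the scikit-learn-based implementation are assumptions made here for illustration; they are not taken from the thesis.

```python
# Illustrative sketch: classify data sets as deletion candidates with an SVM.
# The features (size, days since last access, recent accesses) and the labels
# are hypothetical; the thesis derives its inputs from HDFS logs via MapReduce.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Each row: [size_in_TB, days_since_last_access, accesses_last_30_days]
X_train = np.array([
    [12.0, 200, 0],
    [ 0.5,   3, 40],
    [ 8.0,  90, 2],
    [ 1.2,   1, 55],
    [20.0, 365, 0],
    [ 0.8,  10, 12],
])
# 1 = candidate for deletion, 0 = keep (hypothetical labels)
y_train = np.array([1, 0, 1, 0, 1, 0])

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)

# Predict for newly summarized data sets and report deletion candidates.
X_new = np.array([[15.0, 120, 1], [0.3, 2, 30]])
for features, label in zip(X_new, model.predict(X_new)):
    print(features, "-> delete" if label == 1 else "-> keep")
```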
Abstract:
We present measurements of Underlying Event observables in pp collisions at √s = 0.9 and 7 TeV. The analysis is performed as a function of the highest charged-particle transverse momentum in the event, p_T,LT. Different regions are defined with respect to the azimuthal direction of the leading (highest transverse momentum) track: Toward, Transverse and Away. The Toward and Away regions collect the fragmentation products of the hardest partonic interaction. The Transverse region is expected to be most sensitive to the Underlying Event activity. The study is performed with charged particles above three different p_T thresholds: 0.15, 0.5 and 1.0 GeV/c. In the Transverse region we observe an increase in the multiplicity by a factor of 2-3 between the lower and higher collision energies, depending on the track p_T threshold considered. Data are compared to PYTHIA 6.4, PYTHIA 8.1 and PHOJET. On average, all models considered underestimate the multiplicity and summed p_T in the Transverse region by about 10-30%.
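The Toward/Transverse/Away regions mentioned above are conventionally defined by the azimuthal separation of each particle from the leading track (|Δφ| < 60°, 60° < |Δφ| < 120°, |Δφ| > 120°). The short sketch below applies that conventional definition to a few track φ values; it is purely illustrative and is not the analysis code of the paper.

```python
# Illustrative sketch of the conventional Underlying Event region assignment:
# each charged track is classified by its azimuthal separation from the leading
# (highest-pT) track, using the usual 60 and 120 degree boundaries.
import math

def delta_phi(phi1, phi2):
    """Azimuthal separation folded into [0, pi]."""
    dphi = abs(phi1 - phi2) % (2 * math.pi)
    return dphi if dphi <= math.pi else 2 * math.pi - dphi

def ue_region(track_phi, leading_phi):
    dphi = math.degrees(delta_phi(track_phi, leading_phi))
    if dphi < 60.0:
        return "Toward"
    elif dphi < 120.0:
        return "Transverse"
    return "Away"

# Toy example: leading track at phi = 0.2 rad.
leading_phi = 0.2
for phi in [0.3, 1.5, 2.9, -1.2]:
    print(f"phi = {phi:+.2f} rad -> {ue_region(phi, leading_phi)}")
```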
Abstract:
In the framework of gauged flavour symmetries, new fermions in parity-symmetric representations of the standard model are generically needed for the compensation of mixed anomalies. The key point is that their masses are also protected by flavour symmetries, and some of them are expected to lie well below the flavour symmetry breaking scale(s), which has to occur many orders of magnitude above the electroweak scale to be compatible with the available data from flavour-changing neutral current and CP-violation experiments. We argue that, actually, some of these fermions would plausibly get masses within the LHC range. If they are taken to be heavy quarks and leptons, in (bi-)fundamental representations of the standard model symmetries, their mixings with the light ones are strongly constrained to be very small by electroweak precision data. The alternative chosen here is to forbid such mixings exactly by breaking the flavour symmetries down to an exact discrete symmetry, the so-called proton hexality, primarily suggested to avoid proton decay. As a consequence of the large value needed for the flavour breaking scale, those heavy particles are long-lived and rather appropriate for the current and future LHC searches for quasi-stable hadrons and leptons. In fact, the LHC experiments have already started to look for them.
Abstract:
Current scientific applications have been producing large amounts of data. The processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. In order to achieve this goal, distributed storage systems have been considering techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Based on these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
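To make the general pipeline concrete, the sketch below illustrates the idea of classifying an access-pattern series by a simple property (here, the lag-1 autocorrelation) and choosing a predictor accordingly. The property, the threshold and the two toy models are placeholders chosen for illustration; they are not the classification scheme actually used in the paper.

```python
# Illustrative sketch: classify an application's access-pattern series and pick
# a predictor for the next value. The lag-1 autocorrelation test and the two
# simple models below stand in for the paper's series classification scheme.
import numpy as np

def lag1_autocorrelation(series):
    s = np.asarray(series, dtype=float)
    s = s - s.mean()
    denom = (s ** 2).sum()
    return float((s[:-1] * s[1:]).sum() / denom) if denom > 0 else 0.0

def predict_next(series, threshold=0.5):
    """Strongly autocorrelated series: persist the last value.
    Weakly autocorrelated series: fall back to the recent mean."""
    if lag1_autocorrelation(series) > threshold:
        return float(series[-1])
    return float(np.mean(series[-5:]))

# Toy example: bytes read per time window by two applications.
smooth = [10, 12, 14, 16, 18, 20, 22]   # trending, highly autocorrelated
noisy  = [5, 30, 7, 28, 6, 31, 5]       # oscillating, weakly autocorrelated
print(predict_next(smooth), predict_next(noisy))
```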
Abstract:
The Time-Of-Flight (TOF) detector of ALICE is designed to identify charged particles produced in Pb-Pb collisions at the LHC, to address the physics of strongly interacting matter and the Quark-Gluon Plasma (QGP). The detector is based on the Multigap Resistive Plate Chamber (MRPC) technology, which guarantees the excellent performance required for a large time-of-flight array. The construction and installation of the apparatus at the experimental site have been completed and the detector is presently fully operational. All the steps leading to the construction of the TOF detector were accompanied by a set of quality-assurance procedures to ensure high and uniform performance, and eventually the detector was commissioned with cosmic rays. This work gives a detailed overview of the ALICE TOF detector, focusing also on the tests performed during the construction phase. The first data-taking experience and the first results obtained with cosmic rays during the commissioning phase are also presented and confirm that the TOF detector is ready for LHC collisions.
Abstract:
ALICE, an experiment at the CERN LHC, is specialized in analyzing lead-ion collisions. ALICE will study the properties of quark-gluon plasma, a state of matter where quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of 6 cylindrical layers, with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point; it also helps align the tracks of the particles detected by the more external detectors. The two middle ITS layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors. In total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data arriving from the front-end electronics over optical links, stores them in a large data FIFO and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: CARLOSrx data, which reads the data coming from the SDD detectors and configures the FEE, and CARLOSrx clock, which sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.
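The data path described above (12 detector channels merged by one readout board into a FIFO and forwarded to the DAQ) can be pictured with the toy software analogy below. This is only an illustration of the data flow; it is unrelated to the actual CARLOSrx firmware and its names and sizes are invented here.

```python
# Purely illustrative software analogy of the described data path: one readout
# board merges event fragments from 12 detector channels into a FIFO and then
# forwards them to a (simulated) DAQ consumer.
from collections import deque

NUM_CHANNELS = 12

class ReadoutBoard:
    def __init__(self, depth=1024):
        self.fifo = deque(maxlen=depth)   # large data FIFO

    def read_channels(self, event_id):
        # Collect one fragment per channel for this event.
        for ch in range(NUM_CHANNELS):
            self.fifo.append((event_id, ch, f"payload-{event_id}-{ch}"))

    def send_to_daq(self, daq):
        while self.fifo:
            daq.append(self.fifo.popleft())

daq_stream = []
board = ReadoutBoard()
for event in range(3):
    board.read_channels(event)
board.send_to_daq(daq_stream)
print(len(daq_stream), "fragments forwarded to DAQ")
```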
Abstract:
In this thesis, the analysis carried out to reconstruct the transverse momentum (p_T) spectra of pions, kaons and protons identified with the TOF detector of the ALICE experiment in pp minimum-bias collisions at √s = 7 TeV is reported.
After a detailed description of all the parameters that influence the TOF PID performance (time resolution, calibration, alignment, matching efficiency, time-zero of the event), the method used to identify the particles, the unfolding procedure, is discussed. With this method, thanks also to the excellent TOF performance, the pion and kaon spectra can be reconstructed in the 0.5
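For context on how a TOF detector identifies pions, kaons and protons, the standard time-of-flight mass relation is recalled below (in units where c = 1), combining the measured flight time t over a track length L with the momentum p from tracking. This is the textbook relation, not necessarily the exact form entering the thesis' unfolding procedure.

```latex
% Standard time-of-flight particle identification relation (c = 1):
% beta from flight time t over track length L, mass from momentum p.
\beta = \frac{L}{t}, \qquad
m^2 = p^2 \left( \frac{1}{\beta^2} - 1 \right)
    = p^2 \left( \frac{t^2}{L^2} - 1 \right)
```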
Abstract:
One of the main targets of the CMS experiment is the search for the Standard Model Higgs boson. The 4-lepton channel (from the Higgs decay h → ZZ → 4l, with l = e, mu) is one of the most promising. The analysis is based on the identification of two opposite-sign, same-flavor lepton pairs: the leptons are required to be isolated and to come from the same primary vertex. The Higgs would be statistically revealed by the presence of a resonance peak in the 4-lepton invariant mass distribution. The 4-lepton analysis at CMS is presented, covering its most important aspects: lepton identification, isolation variables, impact parameter, kinematics, event selection, background control and the statistical analysis of the results. The search leads to evidence for a signal with a statistical significance of more than four standard deviations. The excess of data with respect to the background-only predictions indicates the presence of a new boson with a mass of about 126 GeV/c², decaying into two Z bosons, whose characteristics are compatible with those of the SM Higgs boson.
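To illustrate the core of the selection described above, the sketch below pairs leptons into opposite-sign, same-flavor candidates and computes the four-lepton invariant mass from (pt, eta, phi) kinematics. The input format and the numbers are invented for illustration and do not reproduce the actual CMS selection, which applies many more requirements.

```python
# Illustrative sketch: build opposite-sign, same-flavor lepton pairs and compute
# the four-lepton invariant mass (leptons treated as massless). Input format and
# values are hypothetical.
import math
from itertools import combinations

def four_vector(pt, eta, phi):
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = pt * math.cosh(eta)          # massless approximation
    return (e, px, py, pz)

def invariant_mass(vectors):
    e, px, py, pz = (sum(v[i] for v in vectors) for i in range(4))
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Each lepton: (flavor, charge, pt [GeV], eta, phi)
leptons = [("mu", +1, 45.0, 0.3, 0.1), ("mu", -1, 38.0, -0.5, 2.9),
           ("e", +1, 30.0, 1.1, -1.2), ("e", -1, 25.0, 0.8, 1.7)]

# Opposite-sign, same-flavor pairs (Z candidates).
pairs = [(a, b) for a, b in combinations(leptons, 2)
         if a[0] == b[0] and a[1] * b[1] < 0]

if len(pairs) >= 2:
    m4l = invariant_mass([four_vector(pt, eta, phi)
                          for (_, _, pt, eta, phi) in leptons])
    print(f"m(4l) = {m4l:.1f} GeV")
```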