957 results for LHC CMS


Relevance:

30.00%

Publisher:

Abstract:

The Large Hadron Collider, located at the CERN laboratory in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the latest particle discovered by the ATLAS and CMS experiments. Because of the small Higgs boson production cross section, only a large data sample offers the chance to study the properties of this particle. To perform these searches it is desirable to keep the signal signature free from contamination by the numerous and varied background processes produced in pp collisions at the LHC. This motivates the study of multivariate methods which, compared with a standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top quark pair in the dileptonic final state (ttH channel). The data collected up to 2012 are not sufficient to yield a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the larger data samples to be collected during the next LHC data-taking period.
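
As an illustration of the kind of multivariate selection referred to above, the sketch below trains a boosted-decision-tree-style classifier on toy kinematic features with scikit-learn; the features, sample sizes, and settings are made up for the example and are not those of the thesis.

# Minimal sketch of a multivariate signal/background classifier.
# All inputs are synthetic placeholders (toy "kinematic" features).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Toy features standing in for e.g. lepton pT, jet multiplicity, missing ET, ...
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 4))
background = rng.normal(loc=0.0, scale=1.0, size=(n, 4))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)
print("ROC AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))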

Relevance:

30.00%

Publisher:

Abstract:

The performance of the H → ZZ* → 4l analysis is studied in the context of the High Luminosity upgrade of the LHC collider, with the CMS detector. The high luminosity (up to L = 5 × 10^34 cm^-2 s^-1) of the accelerator poses very challenging experimental conditions. In particular, the number of overlapping events per bunch crossing will increase to 140. To cope with this difficult environment, the CMS detector will be upgraded in two stages: Phase-I and Phase-II. The tools used in the analysis are the CMS Full Simulation and the fast parametrized Delphes simulation. A validation of Delphes against the Full Simulation is performed, using reference Phase-I detector samples. Delphes is then used to simulate the Phase-II detector response. The Phase-II configuration is compared with the Phase-I detector and with the same Phase-I detector affected by aging processes, both modeled with the Full Simulation framework. Conclusions on these three scenarios are drawn: the degradation in performance observed in the "aged" scenario shows that a major upgrade of the detector is mandatory. The specific upgrade configuration studied allows the Phase-I performance to be maintained and, in the four-muon channel, even exceeded.
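
A validation of a fast simulation against a full simulation ultimately comes down to comparing the same observables between the two samples; the sketch below illustrates such a shape comparison on synthetic data (the observable, values, and choice of test are placeholders, not the procedure used in the thesis).

# Minimal sketch of a fast-sim vs full-sim shape comparison on a toy observable.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
m4l_fullsim = rng.normal(loc=125.0, scale=2.0, size=10000)   # GeV, toy values
m4l_delphes = rng.normal(loc=125.0, scale=2.1, size=10000)   # GeV, toy values

stat, p_value = ks_2samp(m4l_fullsim, m4l_delphes)
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.3f}")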

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, data handling and data analysis in High Energy Physics require a vast amount of computational power and storage. In particular, the Worldwide LHC Computing Grid (WLCG), an infrastructure and pool of services developed and deployed by a large community of physicists and computer scientists, proved to be a game changer for the efficiency of data analyses during Run-I at the LHC, playing a crucial role in the Higgs boson discovery. Recently, the Cloud computing paradigm has been emerging and reaching a considerable level of adoption by many scientific organizations, and not only scientific ones. Clouds give access to large computing resources that are not owned by the user and are shared among many scientific communities. Considering the challenging requirements of LHC physics in Run-II and beyond, the LHC computing community is interested in exploring Clouds and seeing whether they can provide a complementary approach, or even a valid alternative, to the existing technological solutions based on the Grid. In the LHC community, several experiments have been adopting Cloud approaches, and in particular the experience of the CMS experiment is of relevance to this thesis. The LHC Run-II has just started, and Cloud-based solutions are already in production for CMS. However, other approaches to Cloud usage are being explored and are at the prototype level, such as the work done in this thesis. This effort is of paramount importance for equipping CMS with the capability to elastically and flexibly access and utilize the computing resources needed to face the challenges of Run-III and Run-IV. The main purpose of this thesis is to present forefront Cloud approaches that allow the CMS experiment to extend to on-demand resources dynamically allocated as needed. Moreover, direct access to Cloud resources is presented as a suitable use case to address the needs of the CMS experiment. Chapter 1 presents an overview of High Energy Physics at the LHC and of the CMS experience in Run-I, as well as the preparation for Run-II. Chapter 2 describes the current CMS Computing Model, and Chapter 3 presents the Cloud approaches pursued and used within the CMS Collaboration. Chapter 4 and Chapter 5 discuss the original and forefront work done in this thesis to develop and test working prototypes of elastic extensions of CMS computing resources on Clouds, and of HEP Computing "as a Service". The impact of this work on benchmark CMS physics use cases is also demonstrated.
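
Purely as an illustration of the "elastic extension" idea, the sketch below shows a generic scaling loop that requests extra Cloud workers when a batch queue grows and releases them when it drains; every function, threshold, and name is a hypothetical placeholder rather than an actual CMS or WLCG interface.

# Illustrative daemon-style scaling loop; all components are hypothetical placeholders.
import time

SCALE_UP_THRESHOLD = 100    # pending jobs that trigger provisioning (toy value)
SCALE_DOWN_THRESHOLD = 10

def pending_jobs() -> int:
    """Placeholder for a query to the experiment's batch/workload system."""
    return 0

def provision_cloud_workers(n: int) -> None:
    """Placeholder for a call to a Cloud provider / IaaS orchestration layer."""
    print(f"requesting {n} extra worker nodes")

def release_idle_workers() -> None:
    """Placeholder for terminating idle Cloud worker nodes."""
    print("releasing idle worker nodes")

def scaling_loop(poll_seconds: int = 300) -> None:
    # Runs indefinitely, polling the queue and adjusting the Cloud resources.
    while True:
        queued = pending_jobs()
        if queued > SCALE_UP_THRESHOLD:
            provision_cloud_workers(queued // SCALE_UP_THRESHOLD)
        elif queued < SCALE_DOWN_THRESHOLD:
            release_idle_workers()
        time.sleep(poll_seconds)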

Relevance:

30.00%

Publisher:

Abstract:

We present here a characterization of the Monte Carlo samples used at CMS in the current LHC run (Run 2, sqrt(s) = 13 TeV), and we compare them with those used in the previous run (Run 1, sqrt(s) = 8 TeV). We then use these samples to reconstruct the top quark mass from the all-hadronic decay products, and we compare the efficiency of the standard reconstruction method when applied to the two different samples. Finally, we find a way to improve the efficiency for the 13 TeV samples by using jets reconstructed with a different algorithm, the Cambridge-Aachen algorithm.
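
For orientation, the sketch below shows the most basic ingredient of an all-hadronic top reconstruction: summing jet four-momenta and computing their invariant mass. The jet four-vectors are invented numbers, and the actual analysis involves jet assignment, calibration, and clustering choices beyond this.

# Minimal sketch of an invariant-mass reconstruction from jet four-momenta.
# The three jet four-vectors (E, px, py, pz in GeV) are made-up numbers.
import numpy as np

jets = np.array([
    [120.0,  80.0,  40.0,  60.0],
    [ 90.0, -50.0,  30.0,  50.0],
    [ 70.0, -20.0, -55.0,  20.0],
])

total = jets.sum(axis=0)                    # sum the four-momenta
E, px, py, pz = total
m2 = E**2 - (px**2 + py**2 + pz**2)         # invariant mass squared
print(f"trijet invariant mass = {np.sqrt(max(m2, 0.0)):.1f} GeV")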

Relevance:

30.00%

Publisher:

Abstract:

PhEDEx, the CMS transfer management system, moved about 150 PB during the first LHC Run and currently moves about 2.5 PB of data per week over the Worldwide LHC Computing Grid (WLCG). It was designed to guarantee the completion of each transfer requested by users, at the expense of the waiting time necessary for its completion. For this reason, after several years of operations, data regarding transfer latencies have been collected and stored in log files containing useful, analyzable information. Starting from the analysis of several typical CMS transfer workflows, a categorization of such latencies has been made, with a focus on the different factors that contribute to the transfer completion time. The analysis presented in this thesis provides the information needed to equip PhEDEx in the future with a set of new tools to proactively identify and fix latency issues, and to minimize their impact on end users.
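
The sketch below illustrates the kind of latency categorization described above, bucketing completed transfers by their duration; the record layout and thresholds are hypothetical and do not reflect the real PhEDEx log format.

# Minimal sketch: bucket completed transfers by latency. Hypothetical record layout.
from collections import Counter

transfers = [
    {"dataset": "/A/B/RAW", "request_time": 0.0,   "done_time": 3600.0},
    {"dataset": "/C/D/AOD", "request_time": 0.0,   "done_time": 90000.0},
    {"dataset": "/E/F/AOD", "request_time": 100.0, "done_time": 700000.0},
]

def latency_bucket(seconds: float) -> str:
    if seconds < 6 * 3600:
        return "fast (< 6 h)"
    if seconds < 24 * 3600:
        return "within a day"
    if seconds < 7 * 24 * 3600:
        return "within a week"
    return "long tail (> 1 week)"

counts = Counter(latency_bucket(t["done_time"] - t["request_time"]) for t in transfers)
for bucket, n in counts.items():
    print(bucket, n)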

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is the measurement of the top-antitop pair production cross section in the all-hadronic channel. The measurement uses the data collected by the CMS experiment in proton-proton collisions at the LHC, at a centre-of-mass energy of 13 TeV. The data sample used corresponds to an integrated luminosity of 2.474 fb^-1. The analysis starts by selecting the events that satisfy given conditions (e.g. trigger, kinematic cuts, six or more jets, at least 2 jets originating from the hadronization of bottom quarks), in order to increase the purity of the signal while rejecting as much background as possible. The top quark mass is then reconstructed using a kinematic fit, and the estimate of the signal and background events is based on the distributions of this mass. Finally, a likelihood fit yields the value of the cross section: σ(tt̄) = 893 ± 57 (stat) ± 104 (syst) pb. This result is in good agreement with the theoretical value of 832 pb and with other CMS measurements performed in different channels.
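
For orientation, the generic counting relation between yield, efficiency, and luminosity behind such a measurement can be sketched as follows; the symbols are generic, and the thesis extracts the cross section from a likelihood fit to the mass distributions rather than from this simple formula:

  \sigma_{t\bar{t}} = \frac{N_{\mathrm{obs}} - N_{\mathrm{bkg}}}{\epsilon \, \mathcal{L}_{\mathrm{int}}}

where N_obs is the number of selected events, N_bkg the estimated background, \epsilon the signal selection efficiency times acceptance, and \mathcal{L}_int the integrated luminosity (here 2.474 fb^-1).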

Relevance:

30.00%

Publisher:

Abstract:

Very recently, the ATLAS and CMS Collaborations reported diboson and dijet excesses above standard model expectations in the invariant mass region of 1.8–2.0 TeV. Interpreting the diboson excess of events in a model-independent fashion suggests that the vector boson pair production searches are best described by WZ or ZZ topologies, because states decaying into W+W− pairs are strongly constrained by semileptonic searches. Under the assumption of a low string scale, we show that both the diboson and dijet excesses can be steered by an anomalous U(1) field with very small coupling to leptons. The Drell–Yan bounds are then readily avoided because of the leptophobic nature of the massive Z′ gauge boson. The non-negligible decay into ZZ required to accommodate the data is a characteristic footprint of intersecting D-brane models, wherein the Landau–Yang theorem can be evaded by anomaly-induced operators involving a longitudinal Z. The model presented herein can be viewed purely field-theoretically, although it is particularly well motivated from string theory. Should the excesses become statistically significant at the LHC13, the associated Zγ topology would become a signature consistent only with a stringy origin.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a study of the Grid data access patterns in distributed analysis in the CMS experiment at the LHC. The study ranges from a deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised Machine Learning classification system to set up machinery able to predict future data access patterns, i.e. the so-called dataset "popularity" of the CMS datasets on the Grid, with a focus on specific data types. All the CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centers (Tiers), and in particular the distributed analysis system sustains hundreds of users and applications submitted every day. These applications (or "jobs") access different data types hosted on disk storage systems at a large set of WLCG Tiers. The detailed study of how these data are accessed, in terms of data types, hosting Tiers, and different time periods, gives precious insight into storage occupancy over time and into the different access patterns, and ultimately allows suggested actions to be extracted from this information (e.g. targeted disk clean-up and/or data replication). In this sense, the application of Machine Learning techniques makes it possible to learn from past data and to gain predictive power for future CMS data access patterns. Chapter 1 provides an introduction to High Energy Physics at the LHC. Chapter 2 describes the CMS Computing Model, with special focus on the data management sector, also discussing the concept of dataset popularity. Chapter 3 describes the study of CMS data access patterns at different levels of depth. Chapter 4 offers a brief introduction to basic machine learning concepts, introduces their application in CMS, and discusses the results obtained by using this approach in the context of this thesis.
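
As an illustration of the supervised classification idea, the sketch below trains a classifier to predict a toy "popular next period" label from invented access features; neither the features nor the labels correspond to the actual CMS popularity data.

# Minimal sketch of a "dataset popularity" classifier on synthetic access features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(2)
n = 2000
accesses_last_week = rng.poisson(5, n)
distinct_users = rng.poisson(2, n)
dataset_age_weeks = rng.integers(1, 200, n)
X = np.column_stack([accesses_last_week, distinct_users, dataset_age_weeks])
# Toy label: recently active datasets are taken to stay "popular"
y = (accesses_last_week + distinct_users > 7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))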

Relevance:

20.00%

Publisher:

Abstract:

The minimal supergravity model predicts the polarization of the tau coming from the stau to bino decay in the co-annihilation region to be +1. This can be exploited to extract this soft tau signal at the LHC and also to measure the tiny mass difference between the stau and the bino lightest superparticle. Moreover, this strategy will be applicable to a wider class of bino lightest-superparticle models, where the lighter stau has a right-handed component at least of similar size to the left-handed one.
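
As a reminder of why a polarization of +1 hardens the visible decay products, the textbook spectrum for the single-pion channel \tau \to \pi \nu_\tau in the collinear limit reads (this relation is quoted for orientation and is not taken from the paper itself), with x = E_\pi / E_\tau and P_\tau the tau polarization:

  \frac{1}{\Gamma} \frac{d\Gamma}{dx} = 1 + P_\tau \, (2x - 1)

so for P_\tau = +1 the pion preferentially carries a large fraction of the tau energy, which helps extract the soft-tau signal.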

Relevance:

20.00%

Publisher:

Abstract:

The TOTEM experiment at the LHC will measure the total proton-proton cross-section with a precision better than 1%, elastic proton scattering over a wide range in momentum transfer -t ≈ p^2 θ^2 up to 10 GeV^2, and diffractive dissociation, including single, double and central diffraction topologies. The total cross-section will be measured with the luminosity-independent method, which requires the simultaneous measurement of the total inelastic rate and of elastic proton scattering down to four-momentum transfers of a few 10^-3 GeV^2, corresponding to leading protons scattered at angles of a few microradians from the interaction point. This will be achieved using silicon microstrip detectors, which offer attractive properties such as good spatial resolution (< 20 μm), fast response (O(10 ns)) to particles, and radiation hardness up to 10^14 n/cm^2. This work reports on the development of an innovative structure at the detector edge which reduces the conventional dead width of 0.5-1 mm to 50-60 μm, compatible with the requirements of the experiment.
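
For reference, the luminosity-independent method combines the optical theorem with the total (elastic plus inelastic) rate; in its usual form, with \rho the ratio of the real to the imaginary part of the forward elastic amplitude:

  \sigma_{\mathrm{tot}} = \frac{16\pi}{1+\rho^{2}} \cdot \frac{\left. dN_{\mathrm{el}}/dt \right|_{t=0}}{N_{\mathrm{el}} + N_{\mathrm{inel}}},
  \qquad
  \mathcal{L} = \frac{1+\rho^{2}}{16\pi} \cdot \frac{(N_{\mathrm{el}} + N_{\mathrm{inel}})^{2}}{\left. dN_{\mathrm{el}}/dt \right|_{t=0}},

which is why the method requires measuring the elastic rate down to very small |t| together with the inelastic rate.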

Relevance:

20.00%

Publisher:

Abstract:

A model for total cross-sections incorporating QCD jet cross-sections and soft gluon resummation is described and compared with present data on pp and p̄p cross-sections. Predictions for the LHC are presented for different parameter sets. It is shown that they differ according to the small-x behaviour of the available parton density functions.
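
A common way to build such models is the eikonalized mini-jet form, sketched here for orientation (not necessarily the exact formulation of this paper):

  \sigma_{\mathrm{tot}}(s) = 2 \int d^{2}b \, \left[ 1 - e^{-\bar{n}(b,s)/2} \right],
  \qquad
  \bar{n}(b,s) = n_{\mathrm{soft}}(b,s) + A(b,s)\,\sigma_{\mathrm{jet}}(s, p_{T}^{\mathrm{min}}),

where \sigma_jet is the perturbative QCD jet cross section, whose rise with energy is driven by the small-x behaviour of the parton densities, and A(b,s) is an impact-parameter overlap function whose shape is controlled by soft-gluon resummation.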

Relevance:

20.00%

Publisher:

Abstract:

Discoveries at the LHC will soon set the physics agenda for future colliders. This report of a CERN Theory Institute includes the summaries of Working Groups that reviewed the physics goals and prospects of LHC running with 10 to 300 fb^-1 of integrated luminosity, of the proposed sLHC luminosity upgrade, of the ILC, of CLIC, of the LHeC and of a muon collider. The four Working Groups considered possible scenarios for the first 10 fb^-1 of data at the LHC in which (i) a state with properties that are compatible with a Higgs boson is discovered, (ii) no such state is discovered either because the Higgs properties are such that it is difficult to detect or because no Higgs boson exists, (iii) a missing-energy signal beyond the Standard Model is discovered as in some supersymmetric models, and (iv) some other exotic signature of new physics is discovered. In the contexts of these scenarios, the Working Groups reviewed the capabilities of the future colliders to study in more detail whatever new physics may be discovered by the LHC. Their reports provide the particle physics community with some tools for reviewing the scientific priorities for future colliders after the LHC produces its first harvest of new physics from multi-TeV collisions.

Relevance:

20.00%

Publisher:

Abstract:

Close to one half of the LHC events are expected to be due to elastic or inelastic diffractive scattering. Still, predictions based on extrapolations of experimental data at lower energies differ by large factors in estimating the relative rates of the diffractive event categories at LHC energies. By identifying diffractive events, detailed studies of the proton structure can be carried out. The combined forward physics objects (rapidity gaps, forward multiplicity and transverse energy flows) can be used to efficiently classify proton-proton collisions. Data samples recorded by the forward detectors, with a simple extension, will allow first estimates of the single diffractive (SD), double diffractive (DD), central diffractive (CD), and non-diffractive (ND) cross sections. The approach, which uses the measurement of inelastic activity in the forward and central detector systems, is complementary to the detection and measurement of leading beam-like protons. In this investigation, three different multivariate analysis approaches are assessed for classifying forward physics processes at the LHC. It is shown that with gene expression programming, neural networks and support vector machines, diffraction can be efficiently identified within a large sample of simulated proton-proton scattering events. The event characteristics are visualized by using the self-organizing map algorithm.
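
As an illustration of one of the named approaches, the sketch below applies a support vector machine to toy forward-physics features (largest rapidity gap, forward multiplicity, forward transverse energy); all inputs are synthetic and do not reproduce the study's simulation.

# Minimal sketch of an SVM classifier on toy diffractive vs non-diffractive features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 2000
# Toy diffractive events: larger rapidity gaps, lower forward activity
diff = np.column_stack([rng.normal(4, 1, n), rng.poisson(3, n), rng.exponential(2, n)])
nondiff = np.column_stack([rng.normal(1, 1, n), rng.poisson(12, n), rng.exponential(8, n)])
X = np.vstack([diff, nondiff])
y = np.concatenate([np.ones(n), np.zeros(n)])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())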