861 results for LHC, CMS, Grid Computing, Cloud Computing, Top Physics
Abstract:
A search for supersymmetry is presented based on events with large missing transverse energy, no isolated electron or muon, and at least three jets with one or more identified as a bottom-quark jet. A simultaneous examination is performed of the numbers of events in exclusive bins of the scalar sum of jet transverse momentum values, missing transverse energy, and bottom-quark jet multiplicity. The sample, corresponding to an integrated luminosity of 19.4 fb⁻¹, consists of proton-proton collision data recorded at a center-of-mass energy of 8 TeV with the CMS detector at the LHC in 2012. The observed numbers of events are found to be consistent with the standard model expectation, which is evaluated with control samples in data. The results are interpreted in the context of two simplified supersymmetric scenarios in which gluino pair production is followed by the decay of each gluino to an undetected lightest supersymmetric particle and either a bottom or top quark-antiquark pair, characteristic of gluino-mediated bottom- or top-squark production. Using the production cross section calculated to next-to-leading-order plus next-to-leading-logarithm accuracy, and in the limit of a massless lightest supersymmetric particle, we exclude gluinos with masses below 1170 GeV and 1020 GeV for the two scenarios, respectively. © 2013 CERN.
Abstract:
The results of searches for new resonances decaying to a pair of massive vector bosons (WW, WZ, ZZ) are presented. All searches are performed using 5.0 fb⁻¹ of proton-proton collisions, at TeV center-of-mass energy, collected by the Compact Muon Solenoid detector at the Large Hadron Collider. No significant excess compared to the standard model background expectation is observed, and upper limits at 95% confidence level are set on the production cross section times the branching fraction of hypothetical particles decaying to a pair of vector bosons. The results are interpreted in the context of several benchmark models, such as Randall-Sundrum gravitons, the Sequential Standard Model W′, and Technicolor. Graviton resonances in the Randall-Sundrum model with masses smaller than 940 GeV/c², for coupling parameter k/M_Pl = 0.05, are excluded. Bulk (ADPS) Randall-Sundrum gravitons with masses smaller than 610 GeV/c² are excluded, for k/M_Pl = 0.05. Sequential Standard Model W′ bosons with masses smaller than 1143 GeV/c² are excluded, as are ρ_TC resonances in the 167-687 GeV/c² mass range, in Low Scale Technicolor models with M(π_TC) = 3/4 M(ρ_TC) − 25 GeV/c². © 2013 IOP Publishing Ltd.
Abstract:
A search for pair production of third-generation scalar leptoquarks and supersymmetric top quark partners, top squarks, in final states involving tau leptons and bottom quarks is presented. The search uses events from a data sample of proton-proton collisions corresponding to an integrated luminosity of 19.7 fb⁻¹, collected with the CMS detector at the LHC at √s = 8 TeV. The number of observed events is found to be in agreement with the expected standard model background. Third-generation scalar leptoquarks with masses below 740 GeV are excluded at 95% confidence level, assuming a 100% branching fraction for the leptoquark decay to a tau lepton and a bottom quark. In addition, this mass limit applies directly to top squarks decaying via an R-parity-violating coupling λ'_333. The search also considers a similar signature from top squarks undergoing a chargino-mediated decay involving the R-parity-violating coupling λ'_3jk. Each top squark decays to a tau lepton, a bottom quark, and two light quarks. Top squarks in this model with masses below 580 GeV are excluded at 95% confidence level. The constraint on the leptoquark mass is the most stringent to date, and this is the first search for top squarks decaying via λ'_3jk. © 2014 The Authors. Published by Elsevier B.V.
Abstract:
The surprising discovery of the X(3872) resonance by the Belle experiment in 2003, and its subsequent confirmation by BaBar, CDF and D0, opened up a new chapter of QCD studies and puzzles. Since then, detailed experimental and theoretical studies have been performed in an attempt to determine and explain the properties of this state. At the end of 2009 the world's largest and highest-energy particle accelerator, the Large Hadron Collider (LHC), started its operations at the CERN laboratories in Geneva. One of the main experiments at the LHC is CMS (Compact Muon Solenoid), a general-purpose detector designed to address a wide range of physics phenomena, in particular the search for the Higgs boson, the only still unconfirmed element of the Standard Model (SM) of particle interactions, and new physics beyond the SM itself. Even though CMS has been designed to study high-energy events, its high-resolution central tracker and superior muon spectrometer make it an optimal tool to study the X(3872) state. This thesis presents the results of a series of studies of the X(3872) state performed with the CMS experiment. Already with the first year's worth of data, a clear peak for the X(3872) has been identified, and the measurement of the cross section ratio with respect to the Psi(2S) has been performed. With the increased statistics collected during 2011 it has been possible to study, in bins of transverse momentum, the cross section ratio between the X(3872) and the Psi(2S) and to separate their prompt and non-prompt components.
Abstract:
The Large Hadron Collider, located at the CERN laboratories in Geneva, is the largest particle accelerator in the world. One of the main research fields at the LHC is the study of the Higgs boson, the latest particle discovered at the ATLAS and CMS experiments. Due to the small production cross section of the Higgs boson, only a substantial amount of data offers the chance to study this particle's properties. In order to perform these searches it is desirable to suppress the contamination of the signal signature by the numerous and varied background processes produced in pp collisions at the LHC. Particular attention is given to the study of multivariate methods which, compared to a standard cut-based analysis, can enhance the selection of a Higgs boson produced in association with a top quark pair in a dileptonic final state (ttH channel). The data collected up to 2012 are not sufficient to supply a significant number of ttH events; however, the methods applied in this thesis will provide a powerful tool for the larger datasets that will be collected during the next LHC data taking.
Abstract:
This thesis project is the development of a distributed system for data acquisition and interactive data visualization. The system is used at CERN (the European Organization for Nuclear Research) to collect data on the operation of the LHC (Large Hadron Collider, the infrastructure hosting most of the experiments carried out at CERN) and to make them available to the public in real time through a user-friendly web dashboard. The infrastructure developed is based on a prototype designed and implemented at CERN in 2013. That prototype arose because, as CERN has become increasingly popular with the general public in recent years, the need was felt to make the data on the experiments and on the status of the LHC available in real time to a growing number of users outside the technical and scientific staff. The challenges involved concern both the data producers, i.e. the LHC devices, and the data consumers, i.e. the clients that want to access the data. On the one hand, the devices whose data we want to expose are critical systems that must not be overloaded with requests, reside in a protected network with restricted access, and use heterogeneous communication protocols and data formats. On the other hand, users must be able to access the data through a web interface (or web dashboard) that is rich and interactive, yet at the same time simple and lightweight, and usable on mobile devices as well. The system we developed brings significant improvements over the solutions previously proposed to address these problems. In particular, it offers a user interface made up of several configurable, reusable widgets that allow the data to be exported both graphically and in a machine-readable format. Another novelty is the architecture of the infrastructure we developed. Being based on Hazelcast, it is a distributed, modular and horizontally scalable infrastructure: agents interfacing with the LHC devices and web servers interfacing with the users can be added or removed in a way that is completely transparent to the system. In addition to these new features and possibilities, our system, as discussed in the thesis, offers many starting points for interesting future developments.
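The abstract above describes the architecture only at a high level: agents publish device data into a Hazelcast cluster that the web servers also join, so either side can be added or removed transparently. The following minimal Java sketch illustrates that publish/subscribe pattern under stated assumptions; the topic name "lhc-dashboard-updates", the DeviceReading payload, and the single-process demo are illustrative choices, not details taken from the thesis, and the classic (3.x-era) com.hazelcast.core API is assumed.

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.ITopic;
import com.hazelcast.core.Message;
import com.hazelcast.core.MessageListener;

import java.io.Serializable;

/**
 * Minimal sketch of the publish/subscribe pattern described in the abstract:
 * an "agent" node publishes device readings into a Hazelcast topic, and a
 * "web server" node subscribes to forward them to dashboard clients.
 * Names such as "lhc-dashboard-updates" and DeviceReading are hypothetical.
 */
public class DashboardSketch {

    /** Simple, serializable payload carried across the cluster (illustrative). */
    static class DeviceReading implements Serializable {
        final String device;
        final double value;
        final long timestampMillis;

        DeviceReading(String device, double value, long timestampMillis) {
            this.device = device;
            this.value = value;
            this.timestampMillis = timestampMillis;
        }

        @Override
        public String toString() {
            return device + "=" + value + " @" + timestampMillis;
        }
    }

    public static void main(String[] args) {
        // Both roles simply join the Hazelcast cluster; adding or removing
        // nodes does not require reconfiguring the others.
        HazelcastInstance node = Hazelcast.newHazelcastInstance();
        ITopic<DeviceReading> updates = node.getTopic("lhc-dashboard-updates");

        // "Web server" role: receive readings and hand them to the dashboard
        // layer (here just printed; a real node would push them to clients).
        updates.addMessageListener(new MessageListener<DeviceReading>() {
            @Override
            public void onMessage(Message<DeviceReading> message) {
                System.out.println("dashboard update: " + message.getMessageObject());
            }
        });

        // "Agent" role: publish a reading obtained from a device, decoupling
        // the protected device network from the public-facing web servers.
        updates.publish(new DeviceReading("beam-energy", 6500.0, System.currentTimeMillis()));
    }
}

In the real system the agent and web-server roles would run on separate hosts, with the agents throttling requests and translating the heterogeneous device protocols so that the protected device network is never exposed to the load of public dashboard traffic.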
Abstract:
In addition to multi-national Grid infrastructures, several countries operate their own national Grid infrastructures to support science and industry within national borders. These infrastructures have the benefit of better satisfying the needs of local, regional and national user communities. Although Switzerland has strong research groups in several fields of distributed computing, only recently has a national Grid effort been kick-started to integrate a truly heterogeneous set of resource providers, middleware pools, and users. In the following article we discuss our efforts to start Grid activities at a national scale to combine several scientific communities and geographical domains. We make a strong case for the need for standards that have to be built on top of existing software systems in order to provide support for a heterogeneous Grid infrastructure.