957 results for LHC CMS


Relevance:

40.00%

Publisher:

Abstract:

This thesis describes methods for the reliable identification of hadronically decaying tau leptons in the search for heavy Higgs bosons of the minimal supersymmetric standard model of particle physics (MSSM). The identification of the hadronic tau lepton decays, i.e. tau-jets, is applied to the gg->bbH, H->tautau and gg->tbH+, H+->taunu processes to be searched for in the CMS experiment at the CERN Large Hadron Collider. Of all the event selections applied in these final states, the tau-jet identification is the single most important event selection criterion to separate the tiny Higgs boson signal from a large number of background events. The tau-jet identification is studied with methods based on a signature of a low charged track multiplicity, the containment of the decay products within a narrow cone, an isolated electromagnetic energy deposition, a non-zero tau lepton flight path, the absence of electrons, muons, and neutral hadrons in the decay signature, and a relatively small tau lepton mass compared to the mass of most hadrons. Furthermore, in the H+->taunu channel, helicity correlations are exploited to separate the signal tau jets from those originating from the W->taunu decays. Since many of these identification methods rely on the reconstruction of charged particle tracks, the systematic uncertainties resulting from the mechanical tolerances of the tracking sensor positions are estimated with care. The tau-jet identification and other standard selection methods are applied to the search for the heavy neutral and charged Higgs bosons in the H->tautau and H+->taunu decay channels. For the H+->taunu channel, the tau-jet identification is redone and optimized with a recent and more detailed event simulation than previously in the CMS experiment. Both decay channels are found to be very promising for the discovery of the heavy MSSM Higgs bosons. 
The Higgs boson(s), whose existence has not yet been experimentally verified, are part of the standard model and its most popular extensions. They are a manifestation of a mechanism which breaks the electroweak symmetry and generates masses for particles. Since the H->tautau and H+->taunu decay channels are important for the discovery of the Higgs bosons in a large region of the permitted parameter space, the analysis described in this thesis serves as a probe of the microcosm of particles and their interactions at energy scales beyond the standard model of particle physics.
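The identification criteria listed above (low charged-track multiplicity, containment in a narrow cone, electromagnetic isolation, small visible mass) amount to a cut-based selection. A minimal Python sketch of such a selection, with purely illustrative names and thresholds rather than the values used in the thesis:

```python
# Hypothetical cut-based hadronic tau-jet identification, in the spirit of
# the criteria listed above. All thresholds are illustrative only.

def is_tau_jet(n_charged_tracks, cone_radius, em_isolation_energy, visible_mass):
    """Return True if the jet candidate passes a simple tau-jet selection."""
    return (
        n_charged_tracks in (1, 3)        # low charged-track multiplicity (1- or 3-prong)
        and cone_radius < 0.15            # decay products contained in a narrow cone
        and em_isolation_energy < 1.0     # little EM energy in the isolation annulus (GeV)
        and visible_mass < 1.8            # visible mass below roughly the tau mass (GeV)
    )

# A 1-prong, collimated, isolated candidate passes:
print(is_tau_jet(1, 0.10, 0.3, 1.2))   # True
# A QCD-like jet with many tracks fails:
print(is_tau_jet(7, 0.30, 4.0, 5.0))   # False
```

In a real analysis each cut would be tuned against simulation; the point of the sketch is only that every criterion in the list above maps to one boolean condition.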

Relevance:

40.00%

Publisher:

Abstract:

A search for dielectron decays of heavy neutral resonances has been performed using proton-proton collision data collected at √s = 7 TeV by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in 2011. The data sample corresponds to an integrated luminosity of 5 fb−1. The dielectron mass distribution is consistent with Standard Model (SM) predictions. An upper limit on the ratio of the cross section times branching fraction of new bosons, normalized to the cross section times branching fraction of the Z boson, is set at the 95 % confidence level. This result is translated into limits on the mass of new neutral particles at the level of 2120 GeV for the Z′ in the Sequential Standard Model, 1810 GeV for the superstring-inspired Z′ψ resonance, and 1940 (1640) GeV for Kaluza-Klein gravitons with the coupling parameter k/MPl of 0.10 (0.05).

Relevance:

40.00%

Publisher:

Abstract:

In this thesis we build a novel analysis framework to perform the direct extraction of all possible effective Higgs boson couplings to the neutral electroweak gauge bosons in the H → ZZ(*) → 4l channel, also referred to as the golden channel. We use analytic expressions of the full decay differential cross sections for the H → VV' → 4l process, and the dominant irreducible standard model qq̄ → 4l background, where 4l = 2e2μ, 4e, 4μ. Detector effects are included through an explicit convolution of these analytic expressions with transfer functions that model the detector responses as well as acceptance and efficiency effects. Using the full set of decay observables, we construct an unbinned 8-dimensional detector-level likelihood function which is continuous in the effective couplings, and includes systematics. All potential anomalous couplings of HVV' where V = Z, γ are considered, allowing for general CP even/odd admixtures and any possible phases. We measure the CP-odd mixing between the tree-level HZZ coupling and higher-order CP-odd couplings to be compatible with zero, and in the range [−0.40, 0.43], and the mixing between the HZZ tree-level coupling and higher-order CP-even couplings to be in the ranges [−0.66, −0.57] ∪ [−0.15, 1.00]; namely, compatible with a standard model Higgs. We discuss the expected precision in determining the various HVV' couplings in future LHC runs. A powerful and at first glance surprising prediction of the analysis is that with 100-400 fb-1, the golden channel will be able to start probing the couplings of the Higgs boson to diphotons in the 4l channel. We discuss the implications and further optimization of the methods for the next LHC runs.
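The idea of an unbinned likelihood that is continuous in a coupling parameter can be illustrated with a one-dimensional toy: a mixing fraction f between a "signal" and a "background" density is extracted by minimizing the negative log-likelihood over individual events. This is only a schematic stand-in for the 8-dimensional fit described above; the densities and names below are invented for illustration:

```python
import math
import random

def pdf(x, f):
    """Mixture density on [0, 1]: f * triangular 'signal' + (1-f) * flat 'background'."""
    signal = 2.0 * x          # rises linearly, integrates to 1 on [0, 1]
    background = 1.0          # flat, integrates to 1 on [0, 1]
    return f * signal + (1.0 - f) * background

def nll(events, f):
    """Unbinned negative log-likelihood: -sum over events of log pdf(x_i; f)."""
    return -sum(math.log(pdf(x, f)) for x in events)

random.seed(1)
# Generate 2000 toy events with true mixing fraction f = 0.7 by accept-reject.
true_f, events = 0.7, []
while len(events) < 2000:
    x, y = random.random(), random.random() * 2.0
    if y < pdf(x, true_f):
        events.append(x)

# Grid scan of the likelihood, the simplest stand-in for a continuous fit.
best_f = min((i / 100 for i in range(101)), key=lambda f: nll(events, f))
print(round(best_f, 2))  # close to the true value 0.7
```

A real coupling fit would minimize over several complex couplings simultaneously and convolve the analytic densities with detector transfer functions, but the likelihood machinery is the same.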

Relevance:

40.00%

Publisher:

Abstract:

This work presents a Monte Carlo simulation study of correlations between kinematic variables in the single-diffraction and double-pomeron-exchange topologies, with the aim of delimiting and studying the phase space of these topologies, in particular as regards inclusive dijet production in the context of the CMS/LHC experiment. An analysis of inclusive dijet production via single diffraction at a centre-of-mass energy of √s = 14 TeV (also by Monte Carlo simulation) is also presented, in which we establish a procedure, to be used with data, for the observation of this type of process. We further analyse the influence of several values of the rapidity-gap survival probability, [|S|], on the results, showing that with 10 pb-1 of accumulated data a simple observation of inclusive diffractive dijet production by the proposed method could exclude very small values of [|S|].

Relevance:

40.00%

Publisher:

Abstract:

In this work we study the characteristics of rapidity-gap distributions in samples of minimum-bias events from pp collisions at √s = 7 TeV at CMS/LHC. Such events consist of diffractive processes in addition to soft QCD processes. We investigate the size and location of the gaps, as well as the correlations between the distributions obtained from the objects reconstructed in the detector and the distributions obtained from the particles generated via Monte Carlo simulation. A good understanding of these distributions may eventually make it possible to characterize diffractive events in the data.
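The size and location of the largest rapidity gap in an event can be found by sorting the pseudorapidities of the final-state objects and scanning the spacings between neighbours, counting the edges of the acceptance as boundaries. A minimal sketch (not the analysis code), assuming an illustrative ±5 acceptance window:

```python
# Hypothetical sketch: find the widest pseudorapidity region containing no
# particle, given the eta values of reconstructed (or generated) objects.

def largest_gap(etas, eta_min=-5.0, eta_max=5.0):
    """Return (gap_size, gap_low_edge, gap_high_edge) for the widest empty
    region in [eta_min, eta_max]; the acceptance edges count as boundaries,
    so a forward gap can start at the edge of the detector."""
    edges = [eta_min] + sorted(etas) + [eta_max]
    gaps = [(hi - lo, lo, hi) for lo, hi in zip(edges, edges[1:])]
    return max(gaps)

# Activity clustered at central rapidities leaves a large forward gap:
size, lo, hi = largest_gap([-2.1, -1.5, 0.2, 0.9, 1.1])
print(round(size, 2), lo, hi)  # 3.9 1.1 5.0
```

Running the same scan on detector-level objects and on generator-level particles gives the two distributions whose correlation the abstract describes.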

Relevance:

40.00%

Publisher:

Abstract:

The scope of this work is the observation of single-diffractive dijets in pp collisions at √s = 7 TeV during the first data-taking periods of the CMS/LHC experiment. The technique used was the measurement of the multiplicity in the HF calorimeter. The data were analysed for different data-taking periods of 2010, with ∫ L dt ≈ 3.2 pb-1. We compare the observed data with Monte Carlo simulated both with and without the pile-up effect.

Relevance:

40.00%

Publisher:

Abstract:

The Compact Muon Solenoid (CMS) is one of the main detectors installed at the LHC, enabling the study of different aspects of physics, from the Standard Model to dark matter. This general-purpose detector was built to measure muons with great precision, and all its subdetectors were built with high granularity, making it possible to identify and characterize the kinematic properties of the final-state particles of the collision. The event-reconstruction algorithm includes jet identification; that is, it is possible to identify the signature of parton production in the collision, and the measurement of cross sections for multi-jet production is one of the methods for exploring the contributions of perturbative Quantum Chromodynamics (QCD), allowing the predictions implemented in event simulations to be evaluated. With a view to characterizing QCD-related processes in proton-proton collisions at a centre-of-mass energy of 7 TeV, the measurement of the inclusive multijet production cross section at CMS is presented. The measurement uses data collected in 2010, when there were few collisions per bunch crossing, with an integrated luminosity of L = 2.869 pb-1, using jets covering nearly the entire accessible phase space in pseudorapidity |η| ≤ 4.8 and transverse momentum pT ≥ 30 GeV/c. Detector effects were unfolded from this result, which was compared with simulated predictions.

Relevance:

40.00%

Publisher:

Abstract:

This work presents a study of exclusive dijet production in pp interactions of the type pp → p "+" dijets "+" p, where the interacting protons remain intact and the "+" symbol indicates a gap in pseudorapidity, a region with no hadronic activity between the scattered protons and the central dijet system; this process is known as central exclusive production. The analysis uses a data sample corresponding to an effective luminosity of 24.48 pb-1 collected by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in 2010, at a centre-of-mass energy of √s = 7 TeV. This channel has a unique experimental signature, characterized by the protons scattered in the forward region, i.e. at small angles, and two large opposite gaps. The central exclusive production process is useful for understanding interactions in the context of QCD.

Relevance:

40.00%

Publisher:

Abstract:

The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Front-end Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required for the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres to reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, and a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done from a distance, as the electronics is not accessible except during some short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is extensively used, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques needed to be used there too, to achieve the required radiation tolerance.
The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in the autumn of 2008.
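The zero suppression mentioned above can be illustrated schematically: rather than shipping every channel value, only (channel, value) pairs for the non-zero hits are transmitted, which pays off when hit patterns are sparse. The encoding below is an invented illustration, not the actual CMS link protocol:

```python
# Hypothetical zero-suppression sketch: encode a sparse channel readout as
# (index, value) pairs, and reconstruct the full readout on the other end.

def zero_suppress(channels):
    """Encode a list of channel values as (index, value) pairs for hits only."""
    return [(i, v) for i, v in enumerate(channels) if v != 0]

def expand(pairs, n_channels):
    """Reconstruct the full channel list from the suppressed encoding."""
    out = [0] * n_channels
    for i, v in pairs:
        out[i] = v
    return out

raw = [0, 0, 1, 0, 0, 0, 1, 0]          # sparse RPC-like hit pattern
packed = zero_suppress(raw)
print(packed)                           # [(2, 1), (6, 1)]
print(expand(packed, len(raw)) == raw)  # True
```

On real hardware the same idea is implemented in logic on the link boards, typically followed by a general-purpose compression stage as the abstract describes.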

Relevance:

40.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m, and a total weight of 12500 t.
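Pseudorapidity is related to the polar angle θ, measured from the beam axis, by η = −ln tan(θ/2), so the quoted |η| ≤ 5 forward coverage corresponds to detecting particles down to roughly 0.8° from the beam line. A small sketch of the conversion:

```python
import math

def pseudorapidity(theta):
    """eta = -ln(tan(theta / 2)), with theta the polar angle in radians."""
    return -math.log(math.tan(theta / 2.0))

# A particle at 90 degrees to the beam is central:
print(round(pseudorapidity(math.pi / 2), 3))   # 0.0

# Inverting the relation: the polar angle corresponding to |eta| = 5.
theta_at_eta5 = 2.0 * math.atan(math.exp(-5.0))
print(round(math.degrees(theta_at_eta5), 2))   # about 0.77 degrees
```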

Relevance:

40.00%

Publisher:

Abstract:

The first LHC pp collisions at centre-of-mass energies of 0.9 and 2.36 TeV were recorded by the CMS detector in December 2009. The trajectories of charged particles produced in the collisions were reconstructed using the all-silicon Tracker and their momenta were measured in the 3.8 T axial magnetic field. Results from the Tracker commissioning are presented including studies of timing, efficiency, signal-to-noise, resolution, and ionization energy. Reconstructed tracks are used to benchmark the performance in terms of track and vertex resolutions, reconstruction of decays, estimation of ionization energy loss, as well as identification of photon conversions, nuclear interactions, and heavy-flavour decays.
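The momentum measurement mentioned above follows from the curvature of a track in the axial field via the standard rule of thumb pT [GeV/c] ≈ 0.3 · B [T] · R [m]. A one-line sketch using the 3.8 T field quoted above:

```python
# Rule of thumb for transverse momentum from track curvature in an axial
# magnetic field: pT [GeV/c] ~ 0.3 * B [T] * R [m].

def pt_from_radius(radius_m, b_tesla=3.8):
    return 0.3 * b_tesla * radius_m

print(round(pt_from_radius(1.0), 2))  # 1.14 GeV/c for a 1 m radius track at 3.8 T
```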

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

The CMS Level-1 trigger was used to select cosmic ray muons and LHC beam events during data-taking runs in 2008, and to estimate the level of detector noise. This paper describes the trigger components used, the algorithms that were executed, and the trigger synchronisation. Using data from extended cosmic ray runs, the muon, electron/photon, and jet triggers have been validated, and their performance evaluated. Efficiencies were found to be high, resolutions were found to be good, and rates as expected. © 2010 IOP Publishing Ltd and SISSA.