991 results for CERN LHC
Abstract:
Measurements of hadron production in p+C interactions at 31 GeV/c are performed using the NA61/SHINE spectrometer at the CERN SPS. The analysis is based on the full set of data collected in 2009 using a graphite target with a thickness of 4% of a nuclear interaction length. Inelastic and production cross sections as well as spectra of π±, K±, p, K0s and Λ are measured with high precision. These measurements are essential for improved calculations of the initial neutrino fluxes in the T2K long-baseline neutrino oscillation experiment in Japan. A comparison of the NA61/SHINE measurements with predictions of several hadroproduction models is presented.
Abstract:
"European Organization for Nuclear Research."
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Searches for the supersymmetric partner of the top quark (stop) are motivated by natural supersymmetry, where the stop has to be light to cancel the large radiative corrections to the Higgs boson mass. This thesis presents three different searches for the stop at √s = 8 TeV and √s = 13 TeV using data from the ATLAS experiment at CERN’s Large Hadron Collider. The thesis also includes a study of the primary vertex reconstruction performance in data and simulation at √s = 7 TeV using tt and Z events. All stop searches presented are carried out in final states with a single lepton, four or more jets and large missing transverse energy. A search for direct stop pair production is conducted with 20.3 fb−1 of data at a center-of-mass energy of √s = 8 TeV. Several stop decay scenarios are considered, including those to a top quark and the lightest neutralino and to a bottom quark and the lightest chargino. The sensitivity of the analysis is also studied in the context of various phenomenological MSSM models in which more complex decay scenarios can be present. Two different analyses are carried out at √s = 13 TeV. The first one is a search for both gluino-mediated and direct stop pair production with 3.2 fb−1 of data while the second one is a search for direct stop pair production with 13.2 fb−1 of data in the decay scenario to a bottom quark and the lightest chargino. The results of the analyses show no significant excess over the Standard Model predictions in the observed data. Consequently, exclusion limits are set at 95% CL on the masses of the stop and the lightest neutralino.
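Exclusion limits of the kind quoted above come from counting experiments. As a deliberately minimal illustration (not the actual ATLAS statistical machinery, which uses a full profile-likelihood fit), the sketch below computes a 95% CL upper limit on a signal yield s for a Poisson counting experiment with known background b: the limit is the smallest s for which the probability of observing at most n events falls below 5%.

```python
import math

def poisson_cdf(n: int, mu: float) -> float:
    """P(X <= n) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs: int, b: float, cl: float = 0.95, step: float = 1e-3) -> float:
    """Smallest signal s such that P(X <= n_obs | s + b) drops below 1 - cl."""
    s = 0.0
    while poisson_cdf(n_obs, s + b) >= 1.0 - cl:
        s += step
    return s

# With zero observed events and zero background the limit solves
# exp(-s) = 0.05, i.e. s = ln(20) ≈ 3.0.
print(round(upper_limit(0, 0.0), 2))  # → 3.0
```

Larger expected backgrounds tighten the limit, since part of the observation budget is already accounted for by b.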
Abstract:
The goal of all my work was to measure the production cross sections of the weak bosons W± and Z in their leptonic decays (e, μ), using the data collected by the ATLAS detector at the LHC at a centre-of-mass energy of √s = 13 TeV during summer 2015. The selected events are the same as those of the recent ATLAS Collaboration paper on the same subject, so that a comparison between the results could also be made. This comparison is in fact necessary, since the results were obtained with two different methodologies: traditional (classical) for the paper, Bayesian in this thesis. The Bayesian approach makes it possible to combine the various channels and to treat systematic effects in an entirely natural way. The results obtained are in excellent agreement with the Standard Model predictions and with those published by ATLAS.
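The Bayesian treatment mentioned above can be illustrated with a deliberately simplified single-channel counting model (the actual analysis combines several channels and systematic effects; all numbers below are invented for illustration): with a flat prior on the cross section σ, the posterior given N observed events, expected background b, efficiency ε and integrated luminosity L is proportional to the Poisson likelihood Pois(N; εLσ + b), and can be summarised by a simple grid scan.

```python
import math

def posterior_mean(n_obs: int, eff: float, lumi: float, bkg: float,
                   sigma_max: float = 5.0, n_grid: int = 20000) -> float:
    """Posterior mean of sigma with a flat prior, via grid integration."""
    grid = [sigma_max * i / n_grid for i in range(n_grid + 1)]

    def log_like(sigma: float) -> float:
        mu = eff * lumi * sigma + bkg
        return n_obs * math.log(mu) - mu  # Poisson log-likelihood up to a constant

    weights = [math.exp(log_like(s)) for s in grid[1:]]  # skip sigma=0 (log(0) if bkg=0)
    norm = sum(weights)
    return sum(s * w for s, w in zip(grid[1:], weights)) / norm

# Toy numbers (invented): eff = 0.5, lumi = 100, bkg = 0, N = 50.
# With a flat prior the posterior is Gamma(N+1, eff*lumi),
# whose mean is (N+1)/(eff*lumi) = 51/50 = 1.02.
print(round(posterior_mean(50, 0.5, 100.0, 0.0), 2))  # → 1.02
```

Combining channels in this framework simply amounts to multiplying the per-channel likelihoods before integrating, which is what makes the Bayesian approach natural for a joint measurement.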
Abstract:
In this dissertation we investigate nuclear effects in quarkonium production processes at the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC). To this end, we consider the Colour Evaporation Model (CEM), based on partonic processes computed in perturbative QCD and on non-perturbative interactions via the exchange of soft gluons for quarkonium formation. Quarkonium suppression is one of the signals of the formation of the so-called Quark-Gluon Plasma (QGP) in ultrarelativistic heavy-ion collisions. However, suppression in nucleus-nucleus (AA) collisions is not caused solely by QGP formation. In fact, quarkonium suppression has also been observed in proton-nucleus (pA) collisions. In order to separate hot-matter effects (due to the QGP) from cold-matter effects (not due to the QGP), one can first look at pA collisions, where only cold-matter effects play a fundamental role, and then apply these effects to AA collisions, since part of the suppression is due to cold matter. In the high-energy regime, quarkonium production depends strongly on the nuclear gluon distribution, which offers a unique opportunity to study the small-x behaviour of gluons inside the nucleus and, consequently, to constrain nuclear effects. We study the nuclear processes using different parametrisations of the nuclear parton distributions. We compute the nuclear ratio for pA and AA processes as a function of rapidity for quarkonium production, which allows the nuclear effects to be estimated. In addition, we present a comparison with RHIC data for J/Ψ meson production in pA collisions, showing that the analysis of this observable remains an open question in the literature.
Additionally, we estimate the production of heavy quarks and quarkonium in the initial stage and during the thermal phase of an ultrarelativistic heavy-ion collision. The aim of this study is to estimate the distinct contributions to the production and some effects of the nuclear medium.
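The nuclear ratio referred to above is conventionally the nuclear modification factor, R_pA(y) = (dσ_pA/dy) / (A · dσ_pp/dy), which equals 1 in the absence of nuclear effects. A minimal sketch (with invented toy cross-section values, not the CEM results of the dissertation):

```python
def nuclear_ratio(dsigma_pA: float, dsigma_pp: float, mass_number: int) -> float:
    """R_pA: per-nucleon pA cross section over the pp cross section.
    Values below 1 indicate suppression, above 1 enhancement."""
    return dsigma_pA / (mass_number * dsigma_pp)

# Toy example: a pAu cross section 20% below the A-scaled pp expectation.
print(round(nuclear_ratio(394.0, 2.5, 197), 3))  # → 0.8
```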
Abstract:
Since it has been found that the MadGraph Monte Carlo generator offers superior flavour-matching capability compared to Alpgen, the suitability of MadGraph for the generation of tt̄bb̄ events is explored, with a view to simulating this background in searches for the Standard Model Higgs production and decay process ttH, H → bb̄. Comparisons are performed between the output of MadGraph and that of Alpgen, showing that satisfactory agreement in their predictions can be obtained with the appropriate generator settings. A search for the Standard Model Higgs boson, produced in association with the top quark and decaying into a bb̄ pair, using 20.3 fb−1 of 8 TeV collision data collected in 2012 by the ATLAS experiment at CERN’s Large Hadron Collider, is presented. The GlaNtp analysis framework, together with the RooFit package and associated software, is used to obtain an expected 95% confidence-level limit of 4.2 +4.1 −2.0 times the Standard Model expectation, and the corresponding observed limit is found to be 5.9; this is within experimental uncertainty of the published result of the analysis performed by the ATLAS collaboration. A search for a heavy charged Higgs boson of mass mH± in the range 200 ≤ mH±/GeV ≤ 600, where the Higgs boson mediates the five-flavour beyond-the-Standard-Model physics process gb → tH± → ttb, with one top quark decaying leptonically and the other decaying hadronically, is presented, using the 20.3 fb−1 8 TeV ATLAS data set. Upper limits on the product of the production cross-section and the branching ratio of the H± boson are computed for six mass points, and these are found to be compatible within experimental uncertainty with those obtained by the corresponding published ATLAS analysis.
Abstract:
Crossing the Franco-Swiss border, the Large Hadron Collider (LHC), designed to collide 7 TeV proton beams, is the world's largest and most powerful particle accelerator, the operation of which was originally intended to commence in 2008. Unfortunately, due to an interconnect discontinuity in one of the main dipole circuit's 13 kA superconducting busbars, a catastrophic quench event occurred during initial magnet training, causing significant physical damage to the system. Furthermore, investigation into the cause found that such discontinuities were present not only in the circuit in question, but throughout the entire LHC. This prevented further magnet training and ultimately resulted in the maximum sustainable beam energy being limited to approximately half of the design nominal, 3.5-4 TeV, for the first three years of operation (Run 1, 2009-2012), and in a major consolidation campaign being scheduled for the first long shutdown (LS 1, 2012-2014). Throughout Run 1, a series of studies attempted to predict the number of post-installation training quenches still required to qualify each circuit to nominal-energy current levels. With predictions in excess of 80 quenches (each having a recovery time of 8-12+ hours) just to achieve 6.5 TeV, and close to 1000 quenches for 7 TeV, it was decided that for Run 2 all systems should be qualified for at least 6.5 TeV operation. However, even with all interconnect discontinuities scheduled to be repaired during LS 1, numerous other concerns regarding circuit stability arose. In particular, erratic behaviour of the magnet bypass diodes and degradation of other potentially weak busbar sections were observed, along with seemingly random millisecond-scale spikes in beam losses, known as unidentified falling object (UFO) events, which, if they persist at 6.5 TeV, may eventually deposit sufficient energy to quench adjacent magnets.
In light of the above, the thesis hypothesis states that, even with the observed issues, the LHC main dipole circuits can safely support and sustain near-nominal proton beam energies of at least 6.5 TeV. Research into minimising the risk of magnet training led to the development and implementation of a new qualification method, capable of providing conclusive evidence that all aspects of all circuits, other than the magnets and their internal joints, can safely withstand a quench event at near-nominal current levels, allowing magnet training to be carried out both systematically and without risk. This method has become known as the Copper Stabiliser Continuity Measurement (CSCM). Results were a success, with all circuits eventually being subjected to a full current decay from 6.5 TeV equivalent current levels, with no measurable damage occurring. Research into UFO events led to the development of a numerical model capable of simulating typical UFO events, reproducing the entire set of Run 1 measured event data and extrapolating to 6.5 TeV to predict the likelihood of UFO-induced magnet quenches. Results provided interesting insights into the phenomena involved and confirmed the possibility of UFO-induced magnet quenches. The model was also capable of predicting whether such events, if left unaccounted for, are likely to become commonplace, resulting in significant long-term issues for 6.5+ TeV operation. Addressing the thesis hypothesis, the following written works detail the development and results of all CSCM qualification tests and subsequent magnet training, as well as the development and simulation results of both 4 TeV and 6.5 TeV UFO event modelling. The thesis concludes, post-LS 1, with the LHC successfully sustaining 6.5 TeV proton beams, but with UFO events, as predicted, causing otherwise uninitiated magnet quenches and remaining at the forefront of system availability issues.
Abstract:
Using a peculiar version of the SU(3)_L ⊗ U(1)_N electroweak model, we investigate the production of doubly charged Higgs bosons at the Large Hadron Collider. Our results include branching-ratio calculations for the doubly charged Higgs boson and for one of the neutral scalar bosons of the model.
Abstract:
The Complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar and a dark matter candidate; and the broken phase, with all three neutral scalars mixing. In the latter phase, decays of one Higgs boson into a pair of two different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only possibility to distinguish the two models at the LHC run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at the LHC run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at the LHC run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and the dark matter constraints are fulfilled. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
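Branching ratios such as those computed by sHDECAY follow the generic definition BR_i = Γ_i / Σ_j Γ_j over all open decay channels. A minimal, model-independent sketch (the channel names and partial widths below are invented toy values, not CxSM outputs):

```python
def branching_ratios(partial_widths: dict[str, float]) -> dict[str, float]:
    """BR_i = Gamma_i / Gamma_total for each open decay channel."""
    total = sum(partial_widths.values())
    return {channel: gamma / total for channel, gamma in partial_widths.items()}

# Invented toy partial widths in GeV for three hypothetical channels.
brs = branching_ratios({"bb": 2.3e-3, "WW": 8.7e-4, "h1h2": 4.0e-4})
print(round(brs["bb"], 3))  # → 0.644
```

By construction the branching ratios sum to one, so opening a new channel (such as a Higgs-to-Higgs decay) necessarily dilutes all the others.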
Abstract:
This thesis focuses on the study and determination of the neutron flux of the nTOF (neutron Time Of Flight) facility at CERN in Geneva during the 2016 experimental campaign. The experiment aims at measuring the cross section of neutron capture on the odd gadolinium isotopes, 155Gd and 157Gd. In particular, the analysis will be carried out so as to obtain experimental data in the energy spectrum from thermal neutrons (10−2 eV) up to 1.0 eV and to improve the existing data for energies up to 1.0 MeV. After recalling the scientific and technological motivations underlying the research project, the characteristics of the nTOF facility are described, and the fundamentals of nuclear reactions and of the time-of-flight, flux-measurement and capture techniques used during the experiment are discussed. The final part of the work presents the experimental data acquired on the neutron flux, whose accurate knowledge is fundamental for measuring the cross sections of neutron-induced reactions. The results obtained were then processed and compared with previous data, in order to validate them and to check for possible discrepancies. The data analysis shows that the precision achieved in the determination of the flux is optimal for the subsequent studies to be carried out on the cross section of the odd gadolinium isotopes.
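In a time-of-flight measurement, the neutron kinetic energy follows from the flight path L and the flight time t: non-relativistically, E = ½ m_n (L/t)². A minimal sketch (the 185 m flight path used in the example is an assumed illustrative value, roughly that of the nTOF first experimental area):

```python
# Non-relativistic neutron energy from time of flight.
M_N = 1.674927e-27   # neutron mass [kg]
EV = 1.602177e-19    # joules per eV

def tof_energy_ev(flight_path_m: float, flight_time_s: float) -> float:
    """Kinetic energy in eV of a neutron covering flight_path_m in flight_time_s."""
    v = flight_path_m / flight_time_s
    return 0.5 * M_N * v * v / EV

# A thermal neutron (0.0253 eV) moves at about 2200 m/s, so over an
# assumed 185 m flight path it arrives after roughly 84 ms.
print(round(tof_energy_ev(185.0, 185.0 / 2200.0), 4))  # → 0.0253
```

Since E scales as 1/t², resolving the thermal region requires timing over tens of milliseconds, while MeV neutrons arrive within microseconds; the relativistic correction only matters at the high-energy end of the spectrum.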
Abstract:
CERN, the European Organization for Nuclear Research, is one of the largest research centres worldwide, responsible for several discoveries in physics as well as in computer science. The CERN Document Server, also known as CDS Invenio, is a software package developed at CERN that aims to provide a set of tools for managing digital libraries. In order to improve the functionality of CDS Invenio, a new module called BibCirculation was developed to manage books (and other items) from the CERN library, working as an integrated library system. This thesis describes the steps taken to achieve the several goals of this project, explaining, among other aspects, the process of integration with the other existing modules as well as the approach found to associate information about books with the CDS Invenio metadata. A detailed account of the entire implementation and testing process is also given. Finally, the conclusions of this project and ideas for future development are presented.
Abstract:
This presentation describes a situation where an open access mandate was developed and implemented at an institutional level, in this case, an Australian University. Some conclusions are drawn about its effect over a five year period of implementation.