621 results for hadron
Abstract:
The time-dependent CP asymmetries of the $B^0\to\pi^+\pi^-$ and $B^0_s\to K^+K^-$ decays and the time-integrated CP asymmetries of the $B^0\to K^+\pi^-$ and $B^0_s\to\pi^+K^-$ decays are measured using the $pp$ collision data collected with the LHCb detector during the full Run 2. The results are compatible with previous LHCb determinations of these quantities, except for the CP-violation parameters of the $B^0_s\to K^+K^-$ decay, which show a discrepancy exceeding 3 standard deviations between different data-taking periods. The investigations being conducted to understand this discrepancy are documented. The measurement of the CKM matrix element $|V_{cb}|$ using $B^0_{s}\to D^{(*)-}_s\mu^+ \nu_\mu$ decays is also reported, using the $pp$ collision data collected with the LHCb detector during the full Run 1. The measurement yields $|V_{cb}| = (41.4\pm0.6\pm0.9\pm1.2)\times 10^{-3}$, where the first uncertainty is statistical, the second is systematic, and the third is due to external inputs. This result is compatible with the world average and constitutes the first measurement of $|V_{cb}|$ at a hadron collider and the first ever performed with decays of the $B^0_s$ meson. The analysis also provides the first measurements of the branching fractions and form-factor parameters of the signal decay modes. The final part of this thesis reports a study of the characteristics governing the response of an electromagnetic calorimeter (ECAL) required to operate profitably in the high-luminosity regime foreseen for the LHCb Upgrade 2. A fast and flexible simulation framework is developed for this purpose. The physics performance of different ECAL configurations is evaluated using samples of fully simulated $B^0\to \pi^+\pi^-\pi^0$ and $B^0\to K^{*0}e^+e^-$ decays. The results are used to guide the development of the future ECAL and are reported in the Framework Technical Design Report of the LHCb Upgrade 2 detector.
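As an aside, the total uncertainty implied by the quoted $|V_{cb}|$ result can be checked by adding the three components in quadrature, under the usual assumption that they are independent (a minimal sketch, not part of the analysis):

```python
import math

# Quoted result: |Vcb| = (41.4 +/- 0.6 +/- 0.9 +/- 1.2) x 10^-3,
# with statistical, systematic, and external-input uncertainties.
value = 41.4
stat, syst, ext = 0.6, 0.9, 1.2

# Independent uncertainties add in quadrature.
total = math.sqrt(stat**2 + syst**2 + ext**2)
print(f"|Vcb| = ({value:.1f} +/- {total:.1f}) x 10^-3")
```

This gives a combined uncertainty of about $1.6\times 10^{-3}$, i.e. a roughly 4% precision dominated by the external inputs.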
Abstract:
Ionizing radiation is an important tool used every day in modern society; in medicine, for example, it is routinely employed for diagnostics and therapy. The large variety of applications leads to the need for novel, more efficient, low-cost ionizing-radiation detectors with new functionalities. Personal dosimetry would benefit from wearable detectors able to conform to body surfaces. Traditional semiconductors used for direct ionizing-radiation detection offer high performance, but they are intrinsically stiff and brittle and require high operating voltages. Hybrid lead-halide perovskites have recently emerged as a novel class of materials for ionizing-radiation detection. They combine a high absorption coefficient, solution processability, and high charge-transport capability, enabling efficient and low-cost detection. Deposition from solution allows the fabrication of thin-film flexible devices. In this thesis, I studied the detection properties of different types of hybrid perovskites, deposited from solution in thin-film form and tested under X-rays, gamma rays, and proton beams. I developed the first ultraflexible X-ray detector with exceptional conformability. The effect of coupling organic layers with perovskites was studied at the nanoscale, giving a direct demonstration of the trap-passivation effect at the grain boundaries. Different perovskite formulations were deposited and tested to improve film stability. I report the longest aging study on perovskite X-ray detectors to date, showing that the addition of starch to the precursor solution can improve stability over time, with only a 7% decrease in sensitivity after 630 days of storage in ambient conditions. 2D perovskites were also explored as direct detectors for X-rays and gamma rays. Detection of 511 keV photons by a thin-film device is demonstrated here and was validated for monitoring a radiotracer injection.
Finally, a new approach was used: a mixed 2D/3D perovskite thin film was demonstrated to reliably detect 5 MeV protons, paving the way for wearable dose monitoring during proton/hadron-therapy treatments.
Abstract:
In this thesis, a search for same-sign top-quark pairs produced according to the Standard Model Effective Field Theory (SMEFT) is presented. The analysis is carried out within the ATLAS Collaboration using collision data at a center-of-mass energy of $\sqrt{s} = 13$ TeV, collected by the ATLAS detector during Run 2 of the Large Hadron Collider and corresponding to an integrated luminosity of $140$ fb$^{-1}$. Three SMEFT operators are considered in the analysis, namely $\mathcal{O}_{RR}$, $\mathcal{O}_{LR}^{(1)}$, and $\mathcal{O}_{LR}^{(8)}$. The signal associated with same-sign top pairs is searched for in the dilepton channel, with the top quarks decaying via $t \to W^+ b \to \ell^+ \nu b$, leading to a final-state signature composed of a pair of high-transverse-momentum same-sign leptons and $b$-jets. Deep neural networks are employed in the analysis to enhance the sensitivity to the different SMEFT operators and to perform signal-background discrimination. This is the first ATLAS Collaboration result on the search for same-sign top-quark pair production in proton-proton collision data at $\sqrt{s} = 13$ TeV in the framework of the SMEFT.
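To illustrate the signal-background discrimination step in the abstract above, the following is a hedged toy sketch (not the analysis code: the two event features, network size, and training settings are all invented for the example). A small neural network with one hidden layer is trained to separate Gaussian "signal" from "background" events:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the analysis: "signal" and "background" events
# described by two invented features (e.g. lepton pT and b-jet pT).
sig = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(500, 2))
bkg = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(500, 2))
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(500), np.zeros(500)])

# One ReLU hidden layer and a sigmoid output: a minimal neural classifier.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)            # hidden activations
    z = np.clip(h @ W2 + b2, -30.0, 30.0)       # clipped logits
    return h, (1.0 / (1.0 + np.exp(-z))).ravel()  # P(signal | features)

lr = 0.5
for _ in range(1000):  # plain batch gradient descent on cross-entropy
    h, p = forward(X)
    grad_out = (p - y)[:, None] / len(y)
    grad_h = (grad_out @ W2.T) * (h > 0)        # backprop through ReLU
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

_, p = forward(X)
accuracy = ((p > 0.5) == y).mean()
```

The network output `p` plays the role of a discriminant: events with `p` close to 1 are signal-like, and a cut (or a fit to the full distribution) separates the two classes.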
Abstract:
In high-energy hadron collisions, the parton-level production of heavy-flavour quarks (charm and bottom) is described by perturbative Quantum Chromodynamics (pQCD) calculations, given the hard scale set by the quark masses. However, in hadron-hadron collisions, predicting the heavy-flavour hadrons eventually produced requires knowledge of the parton distribution functions as well as an accurate description of the hadronisation process. The latter is taken into account via fragmentation functions measured at e$^+$e$^-$ colliders or in ep collisions, but several observations in LHC Run 1 and Run 2 data have challenged this picture. In this dissertation, I studied charm hadronisation in proton-proton collisions at $\sqrt{s}$ = 13 TeV with the ALICE experiment at the LHC, making use of a large data sample collected during LHC Run 2. The production of heavy flavour in this collision system is discussed, along with the various hadronisation models implemented in commonly used event generators, which attempt to reproduce the experimental data while accounting for the unexpected LHC results on the enhanced production of charmed baryons. The role of multiple parton interactions (MPI), and how they affect the total charm production as a function of multiplicity, is also presented. The ALICE apparatus is described before moving to the experimental results, which concern the measurement of the relative production rates of the charm hadrons $\Sigma_c^{0,++}$ and $\Lambda_c^+$; these allow us to study the hadronisation mechanisms of charm quarks and to constrain different hadronisation models. Furthermore, the analysis of D mesons ($D^{0}$, $D^{+}$ and $D^{*+}$) as a function of charged-particle multiplicity and spherocity is shown, investigating the role of multi-parton interactions.
This research is relevant both in its own right and for the mission of the ALICE experiment at the LHC, which is devoted to the study of the Quark-Gluon Plasma.
Abstract:
In the upcoming years, various upgrades and improvements are planned for the CERN Large Hadron Collider (LHC); these constitute the mandate of the High-Luminosity LHC (HL-LHC) project. The upgrade will allow for a total stored beam energy of about 700 MJ, which will require, among other things, an extremely efficient collimation system. This will be achieved with the addition of a hollow electron lens (HEL) system to help control beam-halo depletion and mitigate the effects of fast beam losses. In this master's thesis, we present a diffusion model of the HEL for the HL-LHC. In particular, we explore several scenarios for using such a device, focusing on the halo-depletion efficiency obtained under different noise regimes.
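The diffusion picture described above can be sketched with a toy random-walk model (purely illustrative, not the thesis model: all amplitudes, kick sizes, and turn counts are invented). Halo particles diffuse in transverse amplitude, the HEL adds extra diffusion above a chosen radius, and the collimator acts as an absorbing boundary:

```python
import numpy as np

rng = np.random.default_rng(42)

# Halo particles with transverse amplitudes in units of beam sigma
# (illustrative numbers throughout).
n_particles = 20000
amplitude = rng.uniform(3.0, 6.0, n_particles)
collimator = 6.5   # absorbing aperture: particles beyond it are lost
hel_inner = 4.0    # the HEL acts only above this amplitude
base_kick, hel_kick = 0.002, 0.05  # per-turn diffusion step sizes

alive = np.ones(n_particles, dtype=bool)
for _ in range(2000):  # simulated machine turns
    # Stronger stochastic kicks inside the HEL's radial range.
    step = np.where(amplitude > hel_inner, hel_kick, base_kick)
    amplitude[alive] += rng.normal(0.0, 1.0, alive.sum()) * step[alive]
    amplitude = np.abs(amplitude)          # reflect at the beam core
    alive &= amplitude < collimator        # absorb at the collimator

depletion = 1.0 - alive.mean()  # fraction of halo removed
```

In this cartoon, the halo above the HEL inner radius is cleaned efficiently while the beam core is left essentially untouched, which is the qualitative behaviour the HEL is meant to provide; the thesis quantifies this efficiency for realistic noise regimes.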
Abstract:
The ATLAS experiment, like the other experiments operating at the Large Hadron Collider, produces petabytes of data every year, which must then be stored and processed. Moreover, the experiments have committed to making these data accessible worldwide. The Worldwide LHC Computing Grid (WLCG) was designed to meet these needs, combining the computing power and storage capacity of more than 170 sites spread across the world. At most WLCG sites, storage-management technologies have been developed that also handle user requests and data transfers. These systems record their activity in log files, which are rich in information that operators use to pinpoint a problem when the system malfunctions. In view of the larger data flow expected in the coming years, work is under way to make these sites even more reliable, and one possible way to do so is to develop a system able to analyse log files autonomously and detect the anomalies that precede a malfunction. To build such a system, the most suitable method for log-file analysis must first be identified. This thesis studies an approach that uses artificial intelligence to analyse log files; more specifically, it studies an approach based on the K-means clustering algorithm.
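As a toy illustration of the K-means approach described above (not the thesis code: the log-line features and cluster parameters are invented for the sketch), the following implements standard Lloyd's-algorithm K-means and uses it to separate "normal" from "anomalous" log activity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each log line is reduced to an invented feature vector, e.g.
# (message length, error-keyword count); normal activity and anomalous
# bursts then form separable clusters in feature space.
normal = rng.normal([50.0, 0.5], [5.0, 0.5], size=(200, 2))
anomalous = rng.normal([120.0, 6.0], [10.0, 1.0], size=(20, 2))
X = np.vstack([normal, anomalous])

def kmeans(X, k, iters=50):
    # Lloyd's algorithm: assign each point to its nearest centroid,
    # then move each centroid to the mean of its assigned points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(X, k=2)
# Heuristic: the smaller cluster is flagged as the anomaly candidate.
anomaly_cluster = np.bincount(labels, minlength=2).argmin()
```

A real log-analysis pipeline would first turn raw log lines into such feature vectors (or template counts) before clustering; the point of the sketch is only the unsupervised separation of rare, atypical activity from the bulk.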