10 results for Speech Production Measurement
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This thesis is about three major aspects of the identification of top quarks. First comes the understanding of their production mechanism, their decay channels and how to translate theoretical formulae into programs that can simulate such physical processes using Monte Carlo techniques. In particular, the author has been involved in the introduction of the POWHEG generator in the framework of the ATLAS experiment. POWHEG is now fully used as the benchmark program for the simulation of ttbar pair production and decay, along with MC@NLO and AcerMC: this is shown in chapter one. The second chapter illustrates the ATLAS detector and its sub-systems, such as the calorimeters and the muon chambers. It is very important to evaluate their efficiency in order to fully understand what happens during the passage of radiation through the detector and to use this knowledge in the calculation of final quantities such as the ttbar production cross section. The last part of this thesis concerns the evaluation of this quantity employing the so-called "golden channel" of ttbar decays, yielding one energetic charged lepton, four particle jets and a significant amount of missing transverse energy due to the neutrino. The most important systematic errors arising from the various parts of the calculation are studied in detail. Jet energy scale, trigger efficiency, Monte Carlo models, reconstruction algorithms and luminosity measurement are examples of what can contribute to the uncertainty on the cross section.
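For orientation, the cross-section extraction in a measurement of this kind usually reduces to a counting formula of the form (a generic sketch, with symbols defined here for illustration rather than the exact expression used in the thesis):
\begin{equation}
\sigma_{t\bar{t}} = \frac{N_\mathrm{obs} - N_\mathrm{bkg}}{\epsilon \, \mathcal{L}},\nonumber
\end{equation}
where $N_\mathrm{obs}$ is the number of selected events, $N_\mathrm{bkg}$ the estimated background, $\epsilon$ the overall signal efficiency (trigger, reconstruction and selection) and $\mathcal{L}$ the integrated luminosity; the systematic sources listed above enter through $N_\mathrm{bkg}$, $\epsilon$ and $\mathcal{L}$.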
Abstract:
In this thesis we present a study of the D0 meson (through one of its two-body decay channels, D0 → Kπ) using data collected by the CDF II experiment at the Tevatron $p\bar{p}$ collider at Fermilab. In particular, we measured the differential production cross section as a function of the transverse momentum down to $p_\mathrm{T} = 1.5$ GeV/c.
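In each transverse-momentum bin $i$ of width $\Delta p_{\mathrm{T},i}$, the differential cross section is typically obtained as (a standard expression, shown here only for orientation):
\begin{equation}
\left.\frac{d\sigma}{dp_\mathrm{T}}\right|_i = \frac{N_i}{\epsilon_i \, \mathcal{B} \, \mathcal{L} \, \Delta p_{\mathrm{T},i}},\nonumber
\end{equation}
where $N_i$ is the background-subtracted signal yield in the bin, $\epsilon_i$ the acceptance times efficiency, $\mathcal{B}$ the $D^0 \to K\pi$ branching fraction and $\mathcal{L}$ the integrated luminosity.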
Abstract:
The main work of this thesis concerns the measurement of the ZZ production cross section using LHC 2011 data collected at a center-of-mass energy of 7 TeV by the ATLAS detector, corresponding to a total integrated luminosity of 4.6 fb$^{-1}$. The ZZ total cross section is then compared with the NLO prediction calculated with modern Monte Carlo generators. In addition, three differential distributions ($\Delta\phi(l,l)$, the Z $p_\mathrm{T}$ and $M_{4l}$) are unfolded back to the underlying distributions using a Bayesian iterative algorithm. Finally, the transverse momentum of the leading Z is used to set limits on anomalous triple gauge couplings forbidden in the Standard Model.
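The Bayesian iterative unfolding mentioned above follows the general D'Agostini scheme; the short Python sketch below illustrates the iteration on a binned response matrix (the array names and the flat starting prior are illustrative assumptions, not the thesis' actual implementation):

    import numpy as np

    def bayes_unfold(data, response, prior=None, n_iter=4):
        """Iterative Bayesian (D'Agostini-style) unfolding sketch.

        data     -- measured counts per reco bin, shape (n_reco,)
        response -- response[j, i] = P(reco bin j | true bin i), shape (n_reco, n_true)
        prior    -- starting guess for the true spectrum (flat if None)
        """
        n_reco, n_true = response.shape
        eff = response.sum(axis=0)                  # efficiency per true bin
        t = np.ones(n_true) if prior is None else np.asarray(prior, float)
        for _ in range(n_iter):
            folded = response @ t                   # expected reco-level yields
            post = response * t / folded[:, None]   # P(true i | reco j) from Bayes' theorem
            t = (post * data[:, None]).sum(axis=0) / eff
        return t

In practice only a few iterations are used, since further iterations amplify statistical fluctuations of the measured distribution.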
Abstract:
The analysis of K*(892)0 resonance production in Pb–Pb collisions at √sNN = 2.76 TeV with the ALICE detector at the LHC is presented. The analysis is motivated by the interest in measuring short-lived resonance production, which can provide insights into the properties of the medium produced in heavy-ion collisions both during its partonic (Quark-Gluon Plasma) phase and during its hadronic phase. This particular analysis exploits the particle identification capabilities of the ALICE Time-Of-Flight detector. The ALICE experiment is presented, with focus on the performance of the Time-Of-Flight system. Calibration and data quality control are discussed in detail, illustrating the excellent and very stable performance of the system in different collision environments at the LHC. A full analysis of K*0 resonance production is presented, from the resonance reconstruction to the determination of the efficiency and of the systematic uncertainty. The results show that the analysis strategy discussed here is a valid tool to measure the K*0 up to intermediate momenta. Preliminary results on K*0 resonance production at the LHC are presented, confirming resonance measurements to be a powerful tool for studying the physics of ultra-relativistic heavy-ion collisions.
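The resonance reconstruction starts from the invariant mass of candidate $K\pi$ pairs; schematically (standard kinematics, quoted here only for orientation):
\begin{equation}
M_{K\pi} = \sqrt{(E_K + E_\pi)^2 - |\vec{p}_K + \vec{p}_\pi|^2},\nonumber
\end{equation}
with the combinatorial background typically estimated with event-mixing or like-sign techniques and the $K^{*0}$ yield extracted from a fit to the resulting $M_{K\pi}$ distribution.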
Abstract:
The production rates of $b$ and $\bar{b}$ hadrons in $pp$ collisions are not expected to be strictly identical, due to the imbalance between quarks and anti-quarks in the initial state. This phenomenon can be naively related to the fact that the $\bar{b}$ quark produced in the hard scattering might combine with a $u$ or $d$ valence quark from the colliding protons, whereas the same cannot happen for a $b$ quark. This thesis presents the analysis performed to determine the production asymmetries of $B^0$ and $B^0_s$ mesons. The analysis relies on data samples collected by the LHCb detector at the Large Hadron Collider (LHC) during the 2011 and 2012 data-taking periods at two different values of the centre-of-mass energy, $\sqrt{s}=7$ TeV and $\sqrt{s}=8$ TeV, corresponding respectively to integrated luminosities of 1 fb$^{-1}$ and 2 fb$^{-1}$. The production asymmetry is one of the key ingredients for measurements of $CP$ violation in $b$-hadron decays at the LHC, since $CP$ asymmetries must be disentangled from other sources of asymmetry. The measurements of the production asymmetries are performed in bins of $p_\mathrm{T}$ and $\eta$ of the $B$ meson. The values of the production asymmetries, integrated over the ranges $4 < p_\mathrm{T} < 30$ GeV/c and $2.5<\eta<4.5$, are determined to be: \begin{equation} A_\mathrm{P}(B^0)= (-1.00\pm0.48\pm0.29)\%,\nonumber \end{equation} \begin{equation} A_\mathrm{P}(B^0_s)= (\phantom{-}1.09\pm2.61\pm0.61)\%,\nonumber \end{equation} where the first uncertainty is statistical and the second is systematic. The measurement of $A_\mathrm{P}(B^0)$ uses the full statistics collected by LHCb so far, corresponding to an integrated luminosity of 3 fb$^{-1}$, while the measurement of $A_\mathrm{P}(B^0_s)$ uses only the first 1 fb$^{-1}$, leaving room for improvement. No clear evidence of a dependence on $p_\mathrm{T}$ or $\eta$ is observed. The results presented in this thesis are the most precise measurements of these quantities to date.
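For reference, the production asymmetry is commonly defined as
\begin{equation}
A_\mathrm{P}(B) = \frac{\sigma(\bar{B}) - \sigma(B)}{\sigma(\bar{B}) + \sigma(B)},\nonumber
\end{equation}
where $\sigma$ denotes the production cross section of the corresponding meson; this is a standard convention quoted for orientation rather than the thesis' exact definition.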
Abstract:
Intangible resources have raised the interest of scholars from different research areas due to their importance as crucial factors for firm performance; yet, contributions to this field still lack a theoretical framework. This research analyses the state of the art reached in the literature on intangibles: their main features, evaluation problems and evaluation models. In search of a possible theoretical framework, the research develops an indirect analysis of intangibles through the theories of the firm, their critiques and developments. The heterodox approaches of evolutionary theory and the resource-based view are indicated as possible frameworks. Based on this theoretical analysis, organization capital (OC) is identified, on account of its features, as the most important intangible for firm performance. Empirical studies on the relationship between intangibles and firm performance have been sporadic and have failed to reach firm conclusions with respect to OC; in an attempt to fill this gap, the effect of OC is tested on a large sample of European firms using the Compustat Global database. OC is proxied by capitalizing an income-statement item (Selling, General and Administrative expenses) that includes expenses linked to information technology, business process design, reputation enhancement and employee training. This measure of OC is employed in a cross-sectional estimation of a firm-level production function, modelled with different functional specifications (Cobb-Douglas and Translog), that measures the contribution of OC to firm output and profitability. Results are robust and confirm the importance of OC for firm performance.
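As an illustration of the estimation strategy, a Cobb-Douglas specification augmented with OC can be estimated cross-sectionally by ordinary least squares in logs. The Python sketch below uses synthetic data and illustrative variable names, not the Compustat variables or the thesis' actual specification:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000                                     # synthetic cross-section of firms
    log_l  = rng.normal(4.0, 1.0, n)             # log labour input
    log_k  = rng.normal(6.0, 1.2, n)             # log physical capital
    log_oc = rng.normal(3.0, 0.8, n)             # log capitalized SG&A (OC proxy)
    log_y  = 1.0 + 0.55 * log_l + 0.30 * log_k + 0.10 * log_oc + rng.normal(0.0, 0.3, n)

    # Cobb-Douglas in logs: log Y = b0 + bL log L + bK log K + bO log OC + error.
    X = np.column_stack([np.ones(n), log_l, log_k, log_oc])
    beta, *_ = np.linalg.lstsq(X, log_y, rcond=None)
    print(dict(zip(["const", "labour", "capital", "OC"], beta.round(3))))

A Translog variant would simply add squared and cross terms of the log inputs to the regressor matrix.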
Abstract:
This thesis follows a substantial contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the path from Monte Carlo production and the development of analysis tools to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of fully hadronic decays of ttbar events in the CMS experiment. A multi-jet trigger has been designed to provide a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection is then applied to improve the S/B ratio, and b-tagging provides a further S/B improvement. The selection is applied to the background sample and to the samples generated at different top quark masses. The top quark mass candidate is reconstructed for all these samples using a kinematic fitter. The resulting distributions are used to build p.d.f.s, interpolating them with a continuous arbitrary curve. These curves are used to perform the top mass measurement through a likelihood comparison.
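The final step, comparing the reconstructed-mass distribution with p.d.f.s built at different generated top-quark masses, can be illustrated with a minimal binned template scan. The sketch below uses toy Gaussian templates and a parabolic interpolation of the likelihood values; it is only a schematic stand-in for the thesis' procedure, which interpolates the p.d.f.s themselves with a continuous curve:

    import numpy as np

    def nll(data_hist, template_hist):
        """Binned Poisson negative log-likelihood (up to a constant)."""
        mu = np.clip(template_hist, 1e-9, None)
        return np.sum(mu - data_hist * np.log(mu))

    # Toy templates: reconstructed-mass distributions generated at several
    # top-quark masses, modelled here as Gaussians purely for illustration.
    bins = np.linspace(100.0, 250.0, 61)
    centres = 0.5 * (bins[:-1] + bins[1:])
    masses = np.array([165.0, 170.0, 172.5, 175.0, 180.0])
    n_events = 5000

    def template(m_top):
        shape = np.exp(-0.5 * ((centres - m_top) / 15.0) ** 2)
        return n_events * shape / shape.sum()

    rng = np.random.default_rng(1)
    data = rng.poisson(template(172.5))          # pseudo-data at 172.5 GeV

    # Scan -lnL over the template masses and interpolate near the minimum.
    scan = np.array([nll(data, template(m)) for m in masses])
    a, b, _ = np.polyfit(masses, scan, 2)
    print(f"fitted top mass: {-b / (2 * a):.1f} GeV")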
Abstract:
In this thesis we describe in detail the Monte Carlo simulation (LVDG4) built to interpret the experimental data collected by LVD and to measure the muon-induced neutron yield in iron and in liquid scintillator. A full Monte Carlo simulation, based on the Geant4 (v9.3) toolkit, has been developed and validation tests have been performed. We used LVDG4 to determine the active vetoing and the shielding power of LVD. The idea was to evaluate the feasibility of hosting a dark matter detector in the innermost part of LVD, called the Core Facility (LVD-CF). The first conclusion is that LVD is a good moderator, but the iron supporting structure produces a large number of neutrons near the core. The second conclusion is that, if LVD is used as an active veto for muons, the neutron flux in the LVD-CF is reduced by a factor of 50, down to the same order of magnitude as the neutron flux in the deepest laboratory in the world, Sudbury. Finally, the muon-induced neutron yield has been measured. In liquid scintillator we found $(3.2 \pm 0.2) \times 10^{-4}$ n/g/cm$^2$, in agreement with previous measurements performed at different depths and with the general trend predicted by theoretical calculations and Monte Carlo simulations. Moreover, we present the first measurement, to our knowledge, of the neutron yield in iron: $(1.9 \pm 0.1) \times 10^{-3}$ n/g/cm$^2$. This measurement provides an important check of the Monte Carlo description of neutron production in heavy materials, which are often used as shielding in low-background experiments.
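For reference, the muon-induced neutron yield is commonly defined as (a standard convention, quoted here for orientation rather than the exact normalization used in the thesis):
\begin{equation}
Y_n = \frac{N_n}{N_\mu \, \langle \ell \rangle \, \rho},\nonumber
\end{equation}
where $N_n$ is the number of neutrons produced by $N_\mu$ muons, $\langle \ell \rangle$ is the average muon track length in the material and $\rho$ its density, so that $\langle \ell \rangle \rho$ is the average traversed column depth in g/cm$^2$.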
Abstract:
The atmospheric muon charge ratio, defined as the number of positive over the number of negative muons, is an interesting quantity for the study of high-energy hadronic interactions in the atmosphere and of the nature of the primary cosmic rays. The measurement of the charge ratio in the TeV muon energy range allows the study of hadronic interactions in kinematic regions not yet explored at accelerators. The OPERA experiment is a hybrid electronic-detector/emulsion apparatus, located in the underground Gran Sasso Laboratory at an average depth of 3800 meters water equivalent (m.w.e.). OPERA is the first large magnetized detector that can measure the muon charge ratio at the LNGS depth, with a wide acceptance for cosmic ray muons coming from above. In this thesis, the muon charge ratio is measured in the highest energy region using the spectrometers of the OPERA detector. The charge ratio is computed separately for single and for multiple muon events, in order to select primary cosmic ray samples that differ in energy and composition. The measurement as a function of the surface muon energy is used to infer parameters characterizing particle production in the atmosphere, which will be used to constrain Monte Carlo predictions. Finally, the experimental results are interpreted in terms of cosmic ray and particle physics models.
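Explicitly, the measured quantity is
\begin{equation}
R_\mu = \frac{N_{\mu^+}}{N_{\mu^-}},\nonumber
\end{equation}
and, if a spectrometer assigns the wrong charge with probability $\eta$, the observed ratio $R_\mathrm{obs}$ relates to the true one as $R_\mu = \left(R_\mathrm{obs}(1-\eta) - \eta\right)/\left((1-\eta) - \eta\, R_\mathrm{obs}\right)$; this generic charge-confusion correction is quoted only for orientation and is not necessarily the treatment adopted in the thesis.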