5 results for multi-method study
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
Since its discovery, the top quark has been one of the most investigated fields in particle physics. The aim of this thesis is the reconstruction of hadronic top quarks with high transverse momentum (boosted) using the Template Overlap Method (TOM). Because of the high energy, the decay products of boosted tops overlap partially or totally and are therefore contained in a single large-radius jet (fat-jet). TOM compares the internal energy distribution of the candidate fat-jet to a sample of tops obtained from a MC simulation (templates). The algorithm is based on the definition of an overlap function, which quantifies the level of agreement between the fat-jet and the template, allowing an efficient discrimination of the signal from the background contributions. A working point has been chosen to obtain a signal efficiency close to 90% and a corresponding background rejection of 70%. TOM performance has been tested on MC samples in the muon channel and compared with the previous methods available in the literature. All the methods will be merged into a multivariate analysis to give a global top tagging, which will be included in the ttbar production differential cross-section measurement performed on the data acquired in 2012 at sqrt(s)=8 TeV in the high phase-space region, where new physics processes could appear. Since its performance improves with increasing pT, the Template Overlap Method will play a crucial role in the next data taking at sqrt(s)=13 TeV, where almost all tops will be produced at high energy, making the standard reconstruction methods inefficient.
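The overlap function itself is not spelled out in the abstract; purely as an illustration, the Python sketch below implements the standard Gaussian form used in Template Overlap analyses, where the energy of the fat-jet constituents falling in a cone around each template parton is compared to that parton's energy. The cone size and resolution parameters are placeholder values, not those of the thesis analysis.

```python
import numpy as np

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance in the (eta, phi) plane."""
    dphi = np.mod(phi1 - phi2 + np.pi, 2 * np.pi) - np.pi
    return np.hypot(eta1 - eta2, dphi)

def overlap(fatjet_constituents, template, cone_r=0.2, sigma_frac=1.0 / 3.0):
    """Gaussian overlap between one fat-jet and one template.

    fatjet_constituents: list of (E, eta, phi) tuples for the jet constituents.
    template: list of (E, eta, phi) tuples for the template partons, e.g. the
              three quarks of a hadronic top decay taken from MC.
    """
    chi2 = 0.0
    for E_a, eta_a, phi_a in template:
        # Energy of the jet constituents inside a cone around template parton a
        e_in_cone = sum(E for E, eta, phi in fatjet_constituents
                        if delta_r(eta, phi, eta_a, phi_a) < cone_r)
        sigma_a = sigma_frac * E_a          # per-parton resolution parameter
        chi2 += (E_a - e_in_cone) ** 2 / (2.0 * sigma_a ** 2)
    return np.exp(-chi2)

def template_overlap(fatjet_constituents, templates):
    """Ov = maximum overlap over the whole template library."""
    return max(overlap(fatjet_constituents, t) for t in templates)
```

A fat-jet would then be tagged as a top candidate when the maximum overlap exceeds the working-point threshold, i.e. the value chosen to give the signal efficiency close to 90% quoted above.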
Abstract:
Today we know that ordinary matter accounts for only a small fraction of the total mass content of the Universe. The hypothesis of Dark Matter, a new type of matter that interacts only gravitationally and, possibly, through the weak force, is supported by numerous observations on both galactic and cosmological scales. Efforts devoted to the search for the so-called WIMPs (Weakly Interacting Massive Particles), the generic name given to Dark Matter particles, have multiplied over recent years. The XENON1T experiment, currently under construction at the Laboratori Nazionali del Gran Sasso (LNGS) and expected to start taking data by the end of 2015, will mark a significant step forward in the direct search for Dark Matter, which relies on the detection of elastic collisions on target nuclei. XENON1T is the current phase of the XENON project, which has already built the XENON10 (2005) and XENON100 (2008, still running) experiments and which also foresees a further stage called XENONnT. The XENON1T detector uses about 3 tonnes of liquid xenon (LXe) and is based on a dual-phase Time Projection Chamber (TPC). Detailed Monte Carlo simulations of the detector geometry, together with dedicated measurements of the material radioactivity and estimates of the purity of the xenon used, have allowed an accurate prediction of the expected background. In this thesis we present the study of the expected sensitivity of XENON1T, carried out with the statistical method known as the Profile Likelihood (PL) Ratio, which within a frequentist approach allows a proper treatment of systematic uncertainties. As a first step, the sensitivity was estimated with the simplified Likelihood Ratio method, which does not account for any systematics. This made it possible to assess the impact of the main systematic uncertainty for XENON1T, namely the one on the scintillation light yield of xenon for low-energy nuclear recoils. The final results obtained with the PL method indicate that XENON1T will be able to significantly improve the current WIMP exclusion limits; the best sensitivity reaches a cross section σ = 1.2 × 10^-47 cm^2 for a WIMP mass of 50 GeV/c^2 and a nominal exposure of 2 tonne·years. These results are in line with the ambitious goal of XENON1T of lowering the current limits on the WIMP cross section, σ, by two orders of magnitude. With such performance, and considering 1 tonne of LXe as fiducial mass, XENON1T will be able to surpass the current limits (LUX experiment, 2013) after only 5 days of data taking.
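For readers unfamiliar with the technique, the Python sketch below shows, on a toy single-bin counting experiment, how a profile likelihood ratio treats a systematic uncertainty as a Gaussian-constrained nuisance parameter. It is only an illustration of the statistical machinery: the actual XENON1T sensitivity study is based on the full energy spectrum, the detector response, and the scintillation light-yield uncertainty mentioned above.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm, poisson

def nll(n_obs, s, b, b0, sigma_b):
    """Negative log-likelihood: Poisson count with a Gaussian-constrained background."""
    return -(poisson.logpmf(n_obs, s + b) + norm.logpdf(b, b0, sigma_b))

def profile_q(n_obs, s_test, b0, sigma_b):
    """Profile likelihood ratio test statistic q(s_test) for an upper limit."""
    # Conditional fit: profile the nuisance parameter b at the tested signal
    nll_cond = minimize_scalar(lambda b: nll(n_obs, s_test, b, b0, sigma_b),
                               bounds=(1e-6, 10 * b0), method="bounded").fun
    # Global fit: minimise over (s, b); a coarse grid in s is enough for a sketch
    s_grid = np.linspace(0.0, max(5.0, 3.0 * s_test), 200)
    nll_glob = min(minimize_scalar(lambda b: nll(n_obs, s, b, b0, sigma_b),
                                   bounds=(1e-6, 10 * b0), method="bounded").fun
                   for s in s_grid)
    # Clamp at zero, as is done for one-sided (upper-limit) test statistics
    return max(0.0, 2.0 * (nll_cond - nll_glob))

# Hypothetical numbers: 3 observed events, 2.0 +/- 0.5 expected background events
print(profile_q(n_obs=3, s_test=5.0, b0=2.0, sigma_b=0.5))
```

In the asymptotic approximation q(s) follows a one-sided chi-square distribution with one degree of freedom, so a 90% CL exclusion corresponds to the signal strength at which q(s) ≈ 1.64; repeating the test on background-only pseudo-data and converting the excluded signal into a cross section for each WIMP mass gives an expected sensitivity curve.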
Abstract:
This dissertation presents a calibration procedure for a pressure-velocity probe. The dissertation is divided into four main chapters. The first chapter is divided into six main sections. In the first two, the wave equation in fluids and the velocity of sound in gases are derived; the third section contains a general solution of the wave equation in the case of plane acoustic waves. Sections four and five report the definitions of the acoustic impedance and admittance, and the practical units in which the sound level is measured, i.e. the decibel scale. Finally, the last section of the chapter covers the theory behind the frequency analysis of a sound wave and includes the analysis of sound in bands and the discrete Fourier analysis, with the definition of some important functions. The second chapter describes the different reference-field calibration procedures used to calibrate P-V probes, among them the progressive plane wave method, which is the one used in this work. Finally, the last section of the chapter contains a description of the working principles of the two transducers that have been used, with a focus on the velocity one. The third chapter of the dissertation is devoted to the calibration setup and the instruments used for data acquisition and analysis. Since software routines were extremely important, this chapter includes a dedicated section on them, and the proprietary routines most used are thoroughly explained. Finally, there is a description of the work that has been done, organized into three different phases, where the data acquired and the results obtained are presented. All the graphs and data reported were obtained through the Matlab® routines. The last chapter briefly summarizes all the work that has been done, together with an excursus on a new probe and on the way the procedure implemented in this dissertation could be applied to a general field.
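As an illustration of the progressive plane wave principle (not of the Matlab® routines used in the dissertation), the Python sketch below estimates the frequency response of the velocity channel against a calibrated reference pressure microphone, exploiting the fact that in a progressive plane wave the specific acoustic impedance is p/u = ρc. Block length, window, and air properties are assumed values.

```python
import numpy as np

RHO_AIR = 1.21   # assumed air density [kg/m^3]
C_AIR = 343.0    # assumed speed of sound [m/s]

def velocity_calibration(p_ref, v_raw, fs, nfft=4096):
    """Frequency response of the velocity channel in a progressive plane wave,
    where the true particle velocity is u = p / (rho * c).

    p_ref : calibrated reference pressure signal [Pa]
    v_raw : uncalibrated velocity-channel signal [arbitrary units]
    fs    : sampling rate [Hz]
    """
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    win = np.hanning(nfft)
    n_blocks = min(len(p_ref), len(v_raw)) // nfft
    S_pv = np.zeros(len(freqs), dtype=complex)   # averaged cross-spectrum
    S_pp = np.zeros(len(freqs))                  # averaged pressure auto-spectrum
    for k in range(n_blocks):
        P = np.fft.rfft(win * p_ref[k * nfft:(k + 1) * nfft])
        V = np.fft.rfft(win * v_raw[k * nfft:(k + 1) * nfft])
        S_pv += np.conj(P) * V
        S_pp += np.abs(P) ** 2
    # H1 estimate of the transfer function from pressure to raw velocity output
    H_pv = S_pv / np.maximum(S_pp, 1e-30)
    # Since u = p / (rho * c), the velocity-channel sensitivity (output per m/s)
    # is the measured transfer function rescaled by rho * c.
    return freqs, H_pv * RHO_AIR * C_AIR
```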
Abstract:
The first goal of this study is to analyse a real-world multiproduct onshore pipeline system in order to verify its hydraulic configuration and operational feasibility by constructing a simulation model, step by step from its elementary building blocks, that reproduces the operation of the real system as precisely as possible. The second goal is to develop this simulation model into a user-friendly tool that can be used to find an “optimal” or “best” product batch schedule for a one-year time period. Such a batch schedule could change dynamically as perturbations occur during operation that influence the behaviour of the entire system. The result of the simulation, the ‘best’ batch schedule, is the one that minimizes the operational costs of the system. The costs involved in the simulation are inventory costs, interface costs, pumping costs, and penalty costs assigned to any unforeseen situations. The key factor determining the performance of the simulation model is the way time is represented. In our model an event-based discrete-time representation is selected as the most appropriate for our purposes. This means that the time horizon is divided into intervals of unequal length, based on events that change the state of the system. These events are the arrivals/departures of the tanker ships, the openings and closures of the loading/unloading valves of the storage tanks at both terminals, and the arrivals/departures of trains/trucks at the Delivery Terminal. In the feasibility study we analyse the system’s operational performance with different Head Terminal storage capacity configurations. For these alternative configurations we evaluate the effect of different tanker-ship delay magnitudes on the number of critical events and product interfaces generated, on the duration of pipeline stoppages, on the satisfaction of the product demand, and on the operating costs. Based on the results and the bottlenecks identified, we propose modifications to the original setup.
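The event-based time representation described above can be pictured as a minimal event-queue loop. The Python sketch below (with hypothetical event times and actions) shows how the simulation clock jumps from one state-changing event to the next instead of advancing in fixed steps.

```python
import heapq
from dataclasses import dataclass, field
from typing import Callable

@dataclass(order=True)
class Event:
    time: float                              # hours from the start of the horizon
    action: Callable = field(compare=False)  # state change, e.g. "tanker ship arrives"

class EventSimulator:
    """Minimal event-driven clock: time advances only when an event occurs."""

    def __init__(self):
        self.clock = 0.0
        self._queue = []

    def schedule(self, time, action):
        heapq.heappush(self._queue, Event(time, action))

    def run(self, horizon):
        while self._queue and self._queue[0].time <= horizon:
            event = heapq.heappop(self._queue)
            self.clock = event.time   # jump to the next state-changing event
            event.action(self)        # e.g. open a tank valve, stop the pipeline

# Hypothetical usage: a tanker arriving at t = 36 h triggers a valve opening 2 h later.
sim = EventSimulator()
sim.schedule(36.0, lambda s: s.schedule(s.clock + 2.0, lambda s2: None))
sim.run(horizon=8760.0)   # one-year horizon in hours
```

In a full model, each real event (ship arrivals/departures, valve openings and closures, train/truck arrivals) would update the tank inventories and accumulate the corresponding inventory, interface, pumping, or penalty costs.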
Abstract:
Microplastics have become ubiquitous pollutants in the marine environment. Ingestion of microplastics by a wide range of marine organisms has been recorded in both laboratory and field studies. Despite growing concern about microplastics, few studies have evaluated their concentrations and distribution in wild populations. Furthermore, there is a need to identify cost-effective, standardized methodologies for microplastic extraction and analysis in organisms. In this thesis I present: (i) the results of a multi-scale field sampling to quantify and characterize microplastic occurrence and distribution in 4 benthic marine invertebrates from saltmarshes along the North Adriatic Italian coastal lagoons; (ii) a comparison of the effects and cost-effectiveness of two extraction protocols for microplastic isolation on microfibers and on wild-collected organisms; (iii) the development of a novel field-based technique to quantify and characterize the microplastic uptake rates of wild and farmed populations of mussels (Mytilus galloprovincialis) through the analysis of their biodeposits. I found very low and patchy amounts of microplastics in the gastrointestinal tracts of the sampled organisms. The omnivorous crab Carcinus aestuarii was the species with the highest amounts of microplastics, but there was notable variation among individuals. There were no substantial differences between the enzymatic and alkaline extraction methods; however, the alkaline extraction was quicker and cheaper. Biodeposit traps proved to be an effective method to estimate mussel ingestion rates, although their performance differed significantly among sites, suggesting that the method, as currently designed, is sensitive to local environmental conditions. There were no differences in the ingestion rates of microplastics between farmed and wild mussels. The estimates of microplastic ingestion and the validated extraction procedures provide a strong basis for future work on microplastic pollution.