962 results for SPLASHING EXPERIMENT
Abstract:
The Carr-Purcell pulse sequence, with low refocusing flip angle, produces echoes midway between refocusing pulses that decay to a minimum value dependent on T2*. When the refocusing flip angle was pi/2 (CP90) and tau > T2*, the signal increased after the minimum value to reach a steady-state free precession (SSFP) regime, composed of a free induction decay signal after each pulse and an echo before the next pulse. When tau < T2*, the signal increased from the minimum value to the steady-state regime with a time constant T* = 2*T1*T2/(T1 + T2), identical to the time constant observed in the SSFP sequence known as continuous wave free precession (CWFP). The steady-state amplitude, M_CP90 = M0*T2/(T1 + T2), was also identical to that of CWFP. Therefore, this sequence was named CP-CWFP, because it is a Carr-Purcell sequence that produces results similar to CWFP. However, CP-CWFP is the better sequence for measuring the longitudinal and transverse relaxation times in a single scan when the sample exhibits T1 similar to T2. This sequence can therefore be a useful method in time-domain NMR and could be widely used in the agriculture, food and petrochemical industries, because those samples tend to have similar relaxation times in low magnetic fields. (C) 2011 Elsevier Inc. All rights reserved.
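The two relations quoted in the abstract, T* = 2*T1*T2/(T1 + T2) and M_ss = M0*T2/(T1 + T2), determine T1 and T2 separately from a single scan: writing r = M_ss/M0, a little algebra gives T1 = T*/(2r) and T2 = T*/(2(1 - r)). A minimal sketch of this inversion (function and variable names are illustrative, not from the paper):

```python
def cpcwfp_observables(t1, t2, m0=1.0):
    """Forward relations from the abstract:
    T* = 2*T1*T2/(T1+T2) and M_ss = M0*T2/(T1+T2)."""
    t_star = 2.0 * t1 * t2 / (t1 + t2)
    m_ss = m0 * t2 / (t1 + t2)
    return t_star, m_ss

def relaxation_times(t_star, m_ss, m0=1.0):
    """Invert the two relations: with r = M_ss/M0,
    T1 = T*/(2r) and T2 = T*/(2(1-r))."""
    r = m_ss / m0
    return t_star / (2.0 * r), t_star / (2.0 * (1.0 - r))

# Round trip with T1 = 1.0 s, T2 = 0.8 s
t_star, m_ss = cpcwfp_observables(1.0, 0.8)
t1, t2 = relaxation_times(t_star, m_ss)
```

The inversion is well conditioned precisely in the regime the abstract highlights (T1 similar to T2), where r stays away from 0 and 1.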
Abstract:
The Amazon basin is a region of constant scientific interest due to its environmental importance and the global significance of its biodiversity and climate. The seasonal variation in water volume is one example of the topics studied nowadays. In general, variations in river levels depend primarily on the climate and on the physical characteristics of the corresponding basins. The main factor influencing the water level in the Amazon basin is the intense rainfall over the region, a consequence of the humid tropical climate. Unfortunately, the Amazon basin is an area lacking water-level information, owing to the difficulty of access for local operations. The purpose of this study is to compare and evaluate the Equivalent Water Height (Ewh) from the GRACE (Gravity Recovery And Climate Experiment) mission, in order to study the connection between water loading and vertical variations of the crust due to hydrological loading. To achieve this goal, the Ewh is compared with in-situ limnimeter records. For the analysis, the correlation coefficients, phase and amplitude of the GRACE Ewh solutions and of the in-situ data were computed, as well as the timing of drought periods in different parts of the basin. The results indicate that vertical variations of the lithosphere due to water mass loading can reach 5 to 7 cm per year in the sedimentary and flooded areas of the region, where water-level variations can reach 8 to 10 m.
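The annual amplitude and phase mentioned above can be estimated by least-squares fitting a one-cycle-per-year sinusoid to each monthly series, and the agreement between GRACE Ewh and gauge records summarized by a correlation coefficient. A minimal sketch on synthetic data (the series and numbers are illustrative, not the study's):

```python
import numpy as np

def annual_fit(t_years, y):
    """Least-squares fit y ~ a*cos(2*pi*t) + b*sin(2*pi*t) + c;
    returns the annual amplitude and phase (radians)."""
    w = 2.0 * np.pi * np.asarray(t_years)
    A = np.column_stack([np.cos(w), np.sin(w), np.ones_like(w)])
    a, b, c = np.linalg.lstsq(A, y, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a)

t = np.arange(0, 4, 1 / 12.0)                 # four years of monthly samples
ewh = 0.25 * np.cos(2 * np.pi * t - 0.3)      # synthetic GRACE Ewh, metres
gauge = 8.0 * np.cos(2 * np.pi * t - 0.3)     # synthetic limnimeter level, metres

amp, phase = annual_fit(t, ewh)               # recovers 0.25 m and 0.3 rad
corr = np.corrcoef(ewh, gauge)[0, 1]          # in-phase series correlate strongly
```

A phase difference between the two fitted sinusoids would quantify the lag between water storage and river level in each sub-basin.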
Abstract:
Cirrus clouds are an interesting subject in atmospheric research due to their behavior and their effect on the Earth's radiation budget: they reflect incoming solar radiation and absorb outgoing terrestrial radiation. This cloud type is also involved in the dehydration of the upper troposphere and lower stratosphere. It is therefore worthwhile to increase the number of ground-based measurements of these clouds. During November and December 2012, as part of the CHUVA-SUL campaign, lidar measurements were conducted in Santa Maria, Rio Grande do Sul. The system installed at the Santa Maria site (29.8 °S; 53.7 °W, 100 m asl) was a single elastic-backscatter lidar operating at a wavelength of 532 nm. Cirrus clouds were detected on several days of lidar measurements, and four of those days are presented in this study. These days, 7, 8, 19 and 28 November 2012, were selected due to the persistence of cirrus clouds over many hours. The raw lidar signals and the inverted backscatter coefficient profiles were analyzed for the selected days. Cloud base and top heights were obtained by analysis of the raw signal and of the backscatter coefficient. Extinction coefficient profiles were obtained by assuming a lidar ratio. Cirrus cloud optical depth (COD) values were calculated by integrating the extinction coefficient between the base and top altitudes of the cirrus clouds.
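The last step above is a vertical integral: with an assumed lidar ratio S, the extinction profile is alpha(z) = S * beta(z), and COD is its integral from cloud base to top. A minimal sketch with an invented, constant-extinction layer (the lidar ratio value is an assumption, in the range commonly used for cirrus):

```python
def optical_depth(z_m, alpha_m, z_base, z_top):
    """Trapezoidal integral of the extinction profile alpha(z) [1/m]
    over the layer between z_base and z_top [m]."""
    cod = 0.0
    for i in range(len(z_m) - 1):
        z0, z1 = z_m[i], z_m[i + 1]
        if z0 >= z_base and z1 <= z_top:
            cod += 0.5 * (alpha_m[i] + alpha_m[i + 1]) * (z1 - z0)
    return cod

lidar_ratio = 25.0                            # assumed, sr
beta = [4e-6] * 21                            # backscatter, 1/(m sr)
alpha = [lidar_ratio * b for b in beta]       # extinction, 1e-4 1/m
z = [9000.0 + 100.0 * i for i in range(21)]   # 9-11 km, 100 m bins

# constant 1e-4 1/m extinction over a 2 km layer gives COD = 0.2
cod = optical_depth(z, alpha, 9000.0, 11000.0)
```

In practice the base and top altitudes come from the cloud-boundary detection on the raw signal, and beta from the inversion.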
Abstract:
Biomass burning represents one of the largest sources of particulate matter to the atmosphere, which results in a significant perturbation to the Earth's radiative balance coupled with serious negative impacts on public health. Globally, biomass burning aerosols are thought to exert a small warming effect of 0.03 W m-2, although the uncertainty is four times greater than the central estimate. On regional scales, the impact is substantially greater, particularly in areas such as the Amazon Basin, where large, intense and frequent burning occurs on an annual basis for several months (usually from August to October). Furthermore, a growing number of people live within the Amazon region, which means that they are subject to deleterious health effects from exposure to substantial volumes of polluted air. Initial results from the South American Biomass Burning Analysis (SAMBBA) field experiment, which took place during September and October 2012 over Brazil, are presented here. A suite of instrumentation was flown on board the UK Facility for Airborne Atmospheric Measurement (FAAM) BAe-146 research aircraft and was supported by ground-based measurements, with extensive measurements made in Porto Velho, Rondonia. The aircraft sampled a range of conditions, including fresh biomass burning plumes, regional haze and elevated biomass burning layers within the free troposphere. The physical, chemical and optical properties of the aerosols across the region will be characterized in order to establish the impact of biomass burning on regional air quality, weather and climate.
Abstract:
This thesis follows a substantial contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the road from Monte Carlo production and analysis-tool development to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of fully hadronic decays of tt events in the CMS experiment. A multi-jet trigger has been devised to set a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection has been devised to improve the S/B ratio, and b-tagging is applied for a further S/B improvement. The selection is applied to the background sample and to samples generated at different top quark masses. The top quark mass candidate is reconstructed for all these samples using a kinematic fitter. The resulting distributions are used to build p.d.f.'s, interpolating them with a continuous arbitrary curve. These curves are used to perform the top mass measurement through a likelihood comparison.
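A template-likelihood comparison of the kind described above can be sketched in a few lines: each mass hypothesis supplies a p.d.f. for the reconstructed mass, and the hypothesis minimizing the negative log-likelihood of the data is the measurement. Gaussian templates and toy numbers stand in here for the fitted p.d.f.'s of the thesis:

```python
import math
import random

def nll(data, mean, sigma):
    """Negative log-likelihood of the data under a Gaussian p.d.f."""
    norm = math.log(sigma * math.sqrt(2.0 * math.pi))
    return sum(0.5 * ((x - mean) / sigma) ** 2 + norm for x in data)

# Toy "reconstructed top mass" sample drawn at m_top = 172.5 GeV
rng = random.Random(42)
data = [rng.gauss(172.5, 10.0) for _ in range(5000)]

# Templates at the generated mass points; the hypothesis with the
# smallest NLL is taken as the measured mass
masses = [165.0, 167.5, 170.0, 172.5, 175.0, 177.5, 180.0]
best = min(masses, key=lambda m: nll(data, m, 10.0))
```

In the real analysis the discrete scan is replaced by interpolating the NLL with a continuous curve and reading off its minimum.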
Abstract:
In this thesis the performance of the Drift Tubes Local Trigger System of the CMS detector is studied. CMS is one of the general-purpose experiments that will operate at the Large Hadron Collider at CERN. Results are presented from data collected during the Cosmic Run At Four Tesla (CRAFT) commissioning exercise, a globally coordinated run period in which the full experiment was involved and configured to detect cosmic rays crossing the CMS cavern. These include analyses of the precision and accuracy of the trigger reconstruction mechanism and a measurement of the trigger efficiency. A method to perform system synchronization is also described, together with a comparison of the outputs of the trigger electronics and of its software emulator code.
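A trigger efficiency of the kind measured above is typically estimated from an unbiased reference sample as the fraction of reconstructed tracks that also fired the trigger, with a binomial uncertainty. A minimal sketch (the counts are invented):

```python
import math

def trigger_efficiency(n_fired, n_ref):
    """Efficiency and its simple binomial uncertainty,
    sqrt(eps*(1-eps)/N); adequate away from eps ~ 0 or 1."""
    eps = n_fired / n_ref
    err = math.sqrt(eps * (1.0 - eps) / n_ref)
    return eps, err

# 9000 of 10000 reference cosmic-ray tracks also have a local trigger
eff, err = trigger_efficiency(9000, 10000)
```

Near eps = 0 or 1 an interval method (e.g. Clopper-Pearson) is preferable to the simple binomial error.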
Abstract:
ALICE, an experiment at CERN's LHC, specializes in the analysis of lead-ion collisions. ALICE will study the properties of the quark-gluon plasma, a state of matter in which quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of six cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point, and it also matches the tracks of particles detected by the more external detectors. The two middle ITS layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors; in total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data arriving from the front-end electronics through optical-link connections, stores them in a large data FIFO, and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: CARLOSrx data, which reads the data coming from the SDD detectors and configures the FEE, and CARLOSrx clock, which sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.
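The packing stage described above, merging per-detector streams into one FIFO tagged with the source module before shipping to the DAQ, can be sketched in software; the framing here is invented for illustration, since the real format is defined by the CARLOSrx firmware:

```python
from collections import deque

N_CHANNELS = 12  # one CARLOSrx data board serves 12 SDD detectors

def pack_into_fifo(channel_buffers):
    """Drain the per-detector buffers round-robin into a single
    FIFO, tagging each data word with its source channel."""
    fifo = deque()
    while any(channel_buffers):
        for ch in range(N_CHANNELS):
            if channel_buffers[ch]:
                fifo.append((ch, channel_buffers[ch].pop(0)))
    return fifo

# Two data words pending on channel 0, one on channel 7
buffers = [[] for _ in range(N_CHANNELS)]
buffers[0] = [0xCAFE, 0xBEEF]
buffers[7] = [0x1234]
fifo = pack_into_fifo(buffers)  # interleaved, channel-tagged stream
```

The tag is what lets the DAQ side demultiplex the single stream back into per-module event fragments.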
Abstract:
The OPERA experiment aims at the direct observation of ν_mu -> ν_tau oscillations in the CNGS (CERN Neutrinos to Gran Sasso) neutrino beam produced at CERN; since the ν_e contamination in the CNGS beam is low, OPERA will also be able to study the sub-dominant oscillation channel ν_mu -> ν_e. OPERA is a large-scale hybrid apparatus divided into two supermodules, each equipped with electronic detectors, an iron spectrometer and a highly segmented ~0.7 kton target section made of Emulsion Cloud Chamber (ECC) units. During my research work in the Bologna laboratory I took part in the set-up of the automatic scanning microscopes, studying and tuning the scanning system's performance and efficiency with emulsions exposed to a test beam at CERN in 2007. Once the triggered bricks were distributed to the collaboration laboratories, my work centered on the procedure used for the localization and reconstruction of neutrino events.
Abstract:
The experimental study of the relation established independently in 1966 by Gerasimov, Drell and Hearn, known as the GDH sum rule, requires the measurement of total photoabsorption cross sections of circularly polarized photons on longitudinally polarized nucleons over a wide energy range. The measurement carried out at the Mainz Microtron in the summer of 1998 was the first such experiment with real photons to measure the GDH integral on the proton. The use of a frozen-spin butanol target, employed to reach the highest possible degree of proton polarization, entails the additional experimental difficulty that the carbon nuclei contained in the butanol target also yield reaction products, which are detected together with those produced on the proton. The goal of this work was the determination of cross sections on the free proton from measurements on a complex target (CH2), as present in the polarized target. The pilot experiments carried out for this purpose served both to develop methods for reaction identification and to calibrate the detector system. By reproducing the already known and measured unpolarized differential and total single-pion cross sections on the proton (gamma p -> p pi0 and gamma p -> n pi+), which make up the main contribution to the GDH integral up to a photon energy of about 400 MeV, it could be shown that a separation of hydrogen events from carbon events is possible. The techniques required for this were developed within this work into a generally usable tool. It could furthermore be shown that the part of the reactions originating from carbon has no helicity dependence. Under this assumption, the determination of the helicity-dependent cross-section difference reduces to a simple subtraction.
From the results of this intensive analysis of data taken with an unpolarized target, first results could thus be obtained quickly for the measurements taken with the polarized frozen-spin target. These first results for polarized differential and total (gamma N) cross sections in the Delta region are in good agreement with theoretical analyses.
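The simple subtraction works because the carbon contribution, being helicity-independent, appears identically in both helicity states and cancels in their difference, leaving only the hydrogen part. A worked toy example (all counts are invented):

```python
def helicity_difference(y_para, y_anti):
    """Helicity-dependent yield difference from a CH2 target.
    The carbon yield, identical in both helicity states,
    cancels in the subtraction."""
    return y_para - y_anti

# Invented counts: hydrogen contributes 300 (3/2) vs 180 (1/2);
# carbon adds the same 500 to both helicity states
carbon = 500
y_32 = 300 + carbon   # total yield, helicity 3/2
y_12 = 180 + carbon   # total yield, helicity 1/2
delta = helicity_difference(y_32, y_12)  # hydrogen-only difference
```

Note that while the carbon term cancels in the central value, it still contributes to the statistical uncertainty of the difference.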
Abstract:
This thesis is about three major aspects of the identification of top quarks. First comes the understanding of their production mechanism, their decay channels and how to translate theoretical formulae into programs that can simulate such physical processes using Monte Carlo techniques. In particular, the author has been involved in the introduction of the POWHEG generator in the framework of the ATLAS experiment. POWHEG is now fully used as the benchmark program for the simulation of ttbar pair production and decay, along with MC@NLO and AcerMC: this is shown in chapter one. The second chapter illustrates the ATLAS detector and its sub-systems, such as the calorimeters and muon chambers. It is very important to evaluate their efficiency in order to fully understand what happens during the passage of radiation through the detector and to use this knowledge in the calculation of final quantities such as the ttbar production cross section. The last part of this thesis concerns the evaluation of this quantity using the so-called "golden channel" of ttbar decays, yielding one energetic charged lepton, four particle jets and a significant amount of missing transverse energy due to the neutrino. The most important systematic errors arising from the various parts of the calculation are studied in detail. The jet energy scale, trigger efficiency, Monte Carlo models, reconstruction algorithms and luminosity measurement are examples of what can contribute to the uncertainty on the cross-section.
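A counting-based cross-section extraction of this kind ultimately reduces to sigma = (N_obs - N_bkg) / (eps * L), with the systematic sources listed above entering through the background estimate, the efficiency and the luminosity. A minimal sketch with invented numbers (not the thesis's results):

```python
import math

def cross_section(n_obs, n_bkg, eff, lumi):
    """sigma = (N_obs - N_bkg) / (eps * L); lumi in pb^-1 -> sigma in pb."""
    return (n_obs - n_bkg) / (eff * lumi)

def stat_error(n_obs, eff, lumi):
    """Statistical error from Poisson fluctuations of N_obs only."""
    return math.sqrt(n_obs) / (eff * lumi)

# Invented example: 1000 selected events, 200 expected background,
# 10% overall efficiency (selection x branching), 50 pb^-1
sigma = cross_section(1000, 200, 0.10, 50.0)   # -> 160 pb
err = stat_error(1000, 0.10, 50.0)
```

Systematic uncertainties are typically propagated by re-evaluating eps and N_bkg under each shifted assumption (jet energy scale up/down, alternative generators, etc.) and taking the spread in sigma.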
Abstract:
The ALICE experiment at the LHC has been designed to cope with the experimental conditions and observables of a quark-gluon plasma reaction. One of the main assets of the ALICE experiment with respect to the other LHC experiments is its particle identification. The large Time-Of-Flight (TOF) detector is the main particle-identification detector of the ALICE experiment. The overall time resolution, better than 80 ps, allows particle identification over a large momentum range (up to 2.5 GeV/c for pi/K and 4 GeV/c for K/p). The TOF makes use of the Multi-gap Resistive Plate Chamber (MRPC), a detector with high efficiency, fast response and an intrinsic time resolution better than 40 ps. The TOF detector embeds a highly segmented trigger system that exploits the fast rise time and the relatively low noise of the MRPC strips in order to identify several event topologies. This work aims to provide a detailed description of the TOF trigger system. The results achieved in the 2009 cosmic-ray run at CERN are presented to show the performance and readiness of the TOF trigger system. The proposed trigger configurations for proton-proton and Pb-Pb beams are detailed as well, with estimates of the efficiencies and of the purity of the selected samples.
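Time-of-flight identification rests on m^2 = p^2 (1/beta^2 - 1) with beta = L/(c*t): at fixed momentum a kaon arrives measurably later than a pion, and the time resolution sets the momentum reach. A round-trip sketch (numbers are illustrative; the flight path of roughly 3.7 m is an assumption for the ALICE TOF radius):

```python
import math

C = 0.299792458       # speed of light, m/ns
L_PATH = 3.7          # assumed flight path, m

def flight_time(p, m):
    """Time of flight in ns for momentum p and mass m (GeV units)."""
    beta = p / math.sqrt(p * p + m * m)
    return L_PATH / (beta * C)

def mass_from_tof(p, t):
    """Invert m^2 = p^2 * (1/beta^2 - 1), with beta = L/(c*t)."""
    beta = L_PATH / (C * t)
    return p * math.sqrt(1.0 / (beta * beta) - 1.0)

# A 1 GeV/c pion (m = 0.1396 GeV) vs kaon (m = 0.4937 GeV)
t_pi = flight_time(1.0, 0.1396)
t_k = flight_time(1.0, 0.4937)
m_rec = mass_from_tof(1.0, t_pi)   # recovers the pion mass
dt_ps = (t_k - t_pi) * 1000.0      # pi/K arrival-time separation, ps
```

At 1 GeV/c the pi/K separation is well over a nanosecond; it shrinks with momentum, which is why an 80 ps resolution limits pi/K identification to about 2.5 GeV/c.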
Abstract:
The luminosity measurement is an important goal for all Standard Model physics and for the discovery of new physics, since luminosity is related to the cross section (sigma) and to the production rate (R) of a given process by the relation L = R/sigma. In the ATLAS experiment at the LHC, a dedicated luminosity monitor called LUCID (Luminosity measurement Using Cherenkov Integrating Detector) is installed. Thanks to the data acquired during 2010, the offline evaluation of LUCID's performance and the implementation of online checks on the quality of the collected data were possible. Real data were compared with Monte Carlo data, and the simulations were tuned to optimize the agreement between the two. The calibration of the relative luminosity, which allows an evaluation of the absolute luminosity, was made possible by the so-called Van der Meer scans, through which a precision of 11% was obtained. The analysis of Z-decay physics is still in progress, in order to obtain, through the rate of that process, a normalization of the luminosity with a precision better than 5%.
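The relation above can be used in either direction: a measured rate for a process of known cross section yields the luminosity, which is the idea behind the Z-rate normalization. A toy numerical sketch (all values invented):

```python
def luminosity(rate_hz, sigma_cm2):
    """L = R / sigma: instantaneous luminosity in cm^-2 s^-1."""
    return rate_hz / sigma_cm2

MB = 1e-27  # 1 millibarn in cm^2

# Invented example: a reference process with a 50 mb cross section
# observed at a rate of 1e5 Hz
lumi = luminosity(1e5, 50 * MB)   # -> 2e30 cm^-2 s^-1
```

The precision of the luminosity obtained this way is limited by how well the reference cross section and the selection efficiency are known.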
Abstract:
This dissertation comprises applications of quantum chemistry and methodological developments in coupled-cluster theory on the following topics: 1.) The determination of geometry parameters in hydrogen-bonded complexes with picometer accuracy, by coupling NMR experiments with quantum-chemical calculations, is demonstrated on two examples. 2.) The differences between theory and experiment arising therein are discussed. To this end, the vibrational averaging of the dipolar coupling tensor was implemented in order to account for zero-point effects. 3.) A further aspect of the work concerns the structure elucidation of discotic liquid crystals. The quantum-chemical model building and its interplay with experimental methods, above all solid-state NMR, is presented. 4.) Within this work, the parallelization of the quantum-chemistry package ACESII was begun. The basic strategy and first results are presented. 5.) To reduce the scaling of the CCSD(T) method by factorization, various decompositions of the energy denominator were tested. A resulting scheme for computing the CCSD(T) energy was implemented. 6.) The elucidation of the reaction forming HSOH from di-tert-butyl sulfoxide is presented. For this purpose, the thermodynamics of the reaction steps were computed with quantum-chemical methods.
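One standard decomposition of the orbital-energy denominator (an assumption here; the abstract does not specify which decompositions were tested) is the Laplace transform 1/Delta = integral_0^inf exp(-Delta*t) dt, approximated by a short sum of exponentials, which factorizes the denominator over the orbital indices. A numerical sketch of the quadrature accuracy:

```python
import numpy as np

def laplace_denominator(delta, n_points=30):
    """Approximate 1/delta by a sum of exponentials, via
    Gauss-Laguerre quadrature of int_0^inf exp(-delta*t) dt.
    Absorbing the Laguerre weight exp(-t) gives
    1/delta ~= sum_k w_k * exp(-(delta - 1) * t_k)."""
    t, w = np.polynomial.laguerre.laggauss(n_points)
    return float(np.sum(w * np.exp(-(delta - 1.0) * t)))

# Typical magnitudes of orbital-energy denominators (hartree):
# the sum of exponentials reproduces 1/delta to high accuracy
approx_2 = laplace_denominator(2.0)   # vs exact 0.5
approx_3 = laplace_denominator(3.0)   # vs exact 1/3
```

Because each exponential factorizes as exp(-(e_a + e_b - e_i - e_j) t_k) = exp(-e_a t_k) * exp(-e_b t_k) * exp(e_i t_k) * exp(e_j t_k), the coupled orbital indices separate, which is what enables the scaling reduction.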