Abstract:
Identical neutral kaon pair correlations are measured in √s = 7 TeV pp collisions in the ALICE experiment. One-dimensional K0sK0s correlation functions in terms of the invariant momentum difference of kaon pairs are formed in two multiplicity and two transverse momentum ranges. The femtoscopic parameters for the radius and correlation strength of the kaon source are extracted. The fit includes quantum statistics and final-state interactions of the a0/f0 resonance. K0sK0s correlations show an increase in radius for increasing multiplicity and a slight decrease in radius for increasing transverse mass, mT, as seen in pi pi correlations in pp collisions and in heavy-ion collisions. Transverse mass scaling is observed between the K0sK0s and pi pi radii. Also, the first observation is made of the decay of the f2'(1525) meson into the K0sK0s channel in pp collisions. (C) 2012 CERN. Published by Elsevier B.V. All rights reserved.
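The quantum-statistics part of such a femtoscopic fit can be sketched as a Gaussian-source correlation function. This is a simplified toy that omits the a0/f0 final-state-interaction term of the full ALICE fit; the data points and parameter values below are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

HBARC = 0.19733  # GeV*fm, converts fm * GeV/c into a dimensionless argument

def c2_gaussian(q, norm, lam, radius):
    """Quantum-statistics-only 1D correlation function for identical bosons
    with a Gaussian source: C(q) = N * (1 + lambda * exp(-(R q / hbar c)^2)).
    The full analysis adds an a0/f0 final-state-interaction term on top."""
    return norm * (1.0 + lam * np.exp(-(radius * q / HBARC) ** 2))

# Hypothetical measurement: invariant momentum difference q (GeV/c) and C(q)
q = np.linspace(0.05, 1.0, 20)
rng = np.random.default_rng(0)
c_meas = c2_gaussian(q, 1.0, 0.6, 1.2) + rng.normal(0.0, 0.01, q.size)

# Extract the femtoscopic radius (fm) and correlation strength lambda
popt, pcov = curve_fit(c2_gaussian, q, c_meas, p0=[1.0, 0.5, 1.0])
norm, lam, radius = popt
```

With the fit converged, `radius` plays the role of the source radius in fm and `lam` the correlation strength quoted in the abstract.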
Abstract:
Considerable effort has been made in recent years to optimize materials properties for magnetic hyperthermia applications. However, due to the complexity of the problem, several aspects pertaining to the combined influence of the different parameters involved still remain unclear. In this paper, we discuss in detail the role of the magnetic anisotropy on the specific absorption rate of cobalt-ferrite nanoparticles with diameters ranging from 3 to 14 nm. The structural characterization was carried out using x-ray diffraction and Rietveld analysis, and all relevant magnetic parameters were extracted from vibrating sample magnetometry. Hyperthermia investigations were performed at 500 kHz with a sinusoidal magnetic field amplitude of up to 68 Oe. The specific absorption rate was investigated as a function of the coercive field, saturation magnetization, particle size, and magnetic anisotropy. The experimental results were also compared with theoretical predictions from the linear response theory and dynamic hysteresis simulations, and exceptional agreement was found in both cases. Our results show that the specific absorption rate has a narrow and pronounced maximum for intermediate anisotropy values. This not only highlights the importance of this parameter but also shows that, in order to obtain optimum efficiency in hyperthermia applications, it is necessary to carefully tailor the materials properties during the synthesis process. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4729271]
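The anisotropy dependence predicted by linear response theory can be sketched as follows. This is an illustrative Néel-relaxation-only calculation (no Brownian relaxation, no size distribution), and all material parameters here are assumed round numbers, not the paper's measured values; only the frequency (500 kHz) and field amplitude (68 Oe) come from the abstract.

```python
import numpy as np

# Physical constants and assumed parameters (illustrative, not the paper's)
MU0 = 4e-7 * np.pi            # vacuum permeability (T*m/A)
KB = 1.380649e-23             # Boltzmann constant (J/K)
TAU0 = 1e-9                   # Neel attempt time (s), assumed
T = 300.0                     # temperature (K)
F = 500e3                     # field frequency (Hz), as in the experiment
H0 = 68.0 * 1e3 / (4 * np.pi) # 68 Oe converted to A/m (1 Oe = 1000/(4*pi) A/m)
MS = 3e5                      # saturation magnetization (A/m), assumed
D = 10e-9                     # particle diameter (m), assumed
RHO = 5.3e3                   # particle density (kg/m^3), assumed

V = np.pi * D**3 / 6.0        # particle volume (m^3)
OMEGA = 2 * np.pi * F

def sar(K):
    """Specific absorption rate (W/kg) vs. anisotropy constant K (J/m^3),
    from linear response theory with Neel relaxation only."""
    tau = TAU0 * np.exp(K * V / (KB * T))          # Neel relaxation time
    chi0 = MU0 * MS**2 * V / (3 * KB * T)          # equilibrium susceptibility
    chi_im = chi0 * OMEGA * tau / (1 + (OMEGA * tau) ** 2)  # out-of-phase part
    return MU0 * np.pi * F * chi_im * H0**2 / RHO

# The SAR peaks sharply where omega*tau ~ 1, i.e. at an intermediate K
anisotropies = np.linspace(1e3, 2e5, 400)
rates = sar(anisotropies)
k_optimal = anisotropies[np.argmax(rates)]
```

The sharp maximum at `k_optimal` reproduces qualitatively the narrow optimum in anisotropy reported in the abstract.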
Abstract:
The concept of metacontingency was taught to undergraduate students of Psychology by using a "game" simulation proposed originally by Vichi, Andery and Glenn (2009). Twenty-five students, distributed into three groups, were exposed to six experimental sessions in which they had to make bets and divide the amounts gained. The three groups competed against each other for photocopy quotas. Two contingencies alternated over the sessions. Under Contingency B, the group would win points only if in the previous round each member had received the same amount of points, and under Contingency A, winning was contingent on an unequal distribution of the points. We observed that proportional divisions predominated independently of the contingency in effect. The manipulation of cultural consequences (winning or losing points) produced consistent modifications in two response categories: 1) choices of the amount bet in each round, and 2) divisions of the points among group members. Controlling relations between cultural consequences and the behavior of dividing were statistically significant in one of the groups, whereas in the other two groups controlling relations were observed only under Contingency B. A review of the reinforcement criteria used in the original experiment is suggested.
Abstract:
The continued growth of large cities is producing increasing volumes of urban sewage sludge. Disposing of this waste without damaging the environment requires careful management. The application of large quantities of biosolids (treated sewage sludge) to agricultural lands for many years may result in the excessive accumulation of nutrients like phosphorus (P) and thereby raise risks of eutrophication in nearby water bodies. We evaluated the fractionation of P in samples of an Oxisol collected as part of a field experiment in which biosolids were added at three rates to a maize (Zea mays L.) plantation over four consecutive years. The biosolids treatments were equivalent to one, two and four times the recommended N rate for maize crops. In a fourth treatment, mineral fertilizer was applied at the rate recommended for maize. Inorganic P forms were extracted with ammonium chloride to remove soluble and loosely bound P; P bound to aluminum oxide (P-Al) was extracted with ammonium fluoride; P bound to iron oxide (P-Fe) was extracted with sodium hydroxide; and P bound to calcium (P-Ca) was extracted with sulfuric acid. Organic P was calculated as the difference between total P and inorganic P. The predominant fraction of P was P-Fe, followed by P-Al and P-Ca. P fractions were positively correlated with the amounts of P applied, except for P-Ca. The low values of P-Ca were due to the advanced weathering processes to which the Oxisol has been subjected, under which forms of P-Ca are converted to P-Fe and P-Al. Fertilization with P via biosolids increased P availability for maize plants even when a large portion of P was converted to more stable forms. Phosphorus content in maize leaves and grains was positively correlated with P fractions in soils.
From these results it can be concluded that the application of biosolids in highly weathered tropical clayey soils for many years, even above the recommended rate based on N requirements for maize, tends to be less potentially hazardous to the environment than in less weathered sandy soils, because the non-readily available P fractions are predominant after the addition of biosolids. (C) 2012 Elsevier B.V. All rights reserved.
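The bookkeeping behind the sequential fractionation scheme above can be sketched in a few lines. All concentrations here are hypothetical placeholders, not the study's measurements; only the extractant-to-fraction mapping follows the abstract.

```python
# Sequential P fractionation: each extractant targets one inorganic P pool.
# Concentrations (mg/kg) are hypothetical illustration values.
fractions_mg_kg = {
    "NH4Cl": 12.0,   # soluble and loosely bound P
    "NH4F": 180.0,   # P bound to aluminum oxide (P-Al)
    "NaOH": 310.0,   # P bound to iron oxide (P-Fe)
    "H2SO4": 25.0,   # P bound to calcium (P-Ca)
}
total_p = 620.0      # total P (mg/kg), hypothetical

# Organic P is obtained by difference, as in the study
inorganic_p = sum(fractions_mg_kg.values())
organic_p = total_p - inorganic_p

# The dominant inorganic pool in these illustrative numbers is P-Fe (NaOH)
dominant_extractant = max(fractions_mg_kg, key=fractions_mg_kg.get)
```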
Abstract:
The Carr-Purcell pulse sequence, with low refocusing flip angle, produces echoes midway between refocusing pulses that decay to a minimum value dependent on T2*. When the refocusing flip angle was pi/2 (CP90) and tau > T2*, the signal after the minimum value increased to reach a steady-state free precession (SSFP) regime, composed of a free induction decay signal after each pulse and an echo before the next pulse. When tau < T2*, the signal increased from the minimum value to the steady-state regime with a time constant T* = 2T1T2/(T1 + T2), identical to the time constant observed in the SSFP sequence known as continuous wave free precession (CWFP). The steady-state amplitude, M_CP90 = M0T2/(T1 + T2), was identical to that of CWFP. Therefore, this sequence was named CP-CWFP, because it is a Carr-Purcell sequence that produces results similar to CWFP. However, CP-CWFP is a better sequence for measuring the longitudinal and transverse relaxation times in a single scan when the sample exhibits T1 similar to T2. Therefore, this sequence can be a useful method in time-domain NMR and can be widely used in the agriculture, food and petrochemical industries, because those samples tend to have similar relaxation times in low magnetic fields. (C) 2011 Elsevier Inc. All rights reserved.
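The two relations quoted above can be checked numerically. This is a direct transcription of the formulas, with illustrative relaxation times:

```python
def cwfp_time_constant(t1, t2):
    """Apparent time constant of the approach to steady state,
    T* = 2*T1*T2/(T1 + T2); same units as T1 and T2."""
    return 2.0 * t1 * t2 / (t1 + t2)

def cwfp_steady_state(m0, t1, t2):
    """Steady-state magnetization amplitude, M = M0*T2/(T1 + T2)."""
    return m0 * t2 / (t1 + t2)

# In the regime where CP-CWFP is most useful, T1 ~ T2, so
# T* -> T1 and the steady-state amplitude -> M0/2.
t_star = cwfp_time_constant(0.1, 0.1)      # s, illustrative T1 = T2 = 100 ms
m_ss = cwfp_steady_state(1.0, 0.1, 0.1)    # normalized M0 = 1
```

Measuring both `t_star` and `m_ss` in a single scan is what lets CP-CWFP recover T1 and T2 simultaneously: the two equations can be inverted for the two unknowns.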
Abstract:
The Amazon basin is a region of constant scientific interest due to its environmental importance and its biodiversity and climate on a global scale. The seasonal variations in water volume are one example of the topics studied nowadays. In general, variations in river levels depend primarily on the climatic and physical characteristics of the corresponding basins. The main factor influencing the water level in the Amazon basin is the intensive rainfall over this region, a consequence of the humidity of the tropical climate. Unfortunately, the Amazon basin is an area lacking water-level information due to difficulties of access for local operations. The purpose of this study is to compare and evaluate the Equivalent Water Height (Ewh) from the GRACE (Gravity Recovery And Climate Experiment) mission, in order to study the connection between water loading and vertical variations of the crust due to hydrologic loading. To achieve this goal, the Ewh is compared with in-situ information from limnimeters. For the analysis, the correlation coefficients, phase and amplitude of the GRACE Ewh solutions and the in-situ data were computed, as well as the timing of periods of drought in different parts of the basin. The results indicated that vertical variations of the lithosphere due to water mass loading could reach 5 to 7 cm per year in the sedimentary and flooded areas of the region, where water-level variations can reach 8 to 10 m.
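The comparison steps described above (correlation, and annual amplitude and phase) can be sketched with a least-squares harmonic fit. The time series below are synthetic stand-ins for GRACE Ewh and limnimeter data; the amplitudes are chosen only to mimic the order of magnitude quoted in the abstract.

```python
import numpy as np

# Synthetic monthly series over 10 years: an annual cycle plus noise.
t = np.arange(0.0, 10.0, 1.0 / 12.0)   # time in years
rng = np.random.default_rng(1)
ewh = 0.5 * np.sin(2 * np.pi * t + 0.3) + rng.normal(0.0, 0.05, t.size)    # m
level = 8.0 * np.sin(2 * np.pi * t + 0.35) + rng.normal(0.0, 0.4, t.size)  # m

# Correlation coefficient between the two series
corr = np.corrcoef(ewh, level)[0, 1]

def annual_harmonic(y, t):
    """Least-squares fit of y ~ a*sin(2*pi*t) + b*cos(2*pi*t);
    returns (amplitude, phase) of the annual cycle."""
    design = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    (a, b), *_ = np.linalg.lstsq(design, y, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

amp_ewh, phase_ewh = annual_harmonic(ewh, t)
amp_level, phase_level = annual_harmonic(level, t)
```

The phase difference `phase_level - phase_ewh` then quantifies any lag between the hydrologic signal seen by GRACE and the gauge readings.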
Abstract:
Cirrus clouds are an interesting topic in atmospheric research due to their behavior and their effect on the Earth's radiation budget. They can affect the atmospheric radiation budget by reflecting incoming solar radiation and absorbing outgoing terrestrial radiation. This cloud type is also involved in the dehydration of the upper troposphere and lower stratosphere. It is therefore of interest to increase ground-based measurements of this type of cloud. During November and December 2012, as part of the CHUVA-SUL campaign, lidar measurements were conducted in Santa Maria, Rio Grande do Sul. The system installed at the Santa Maria site (29.8 °S, 53.7 °W, 100 m asl) was a single elastic-backscatter lidar operating at a wavelength of 532 nm. Cirrus clouds were detected on several measurement days; four days with cirrus clouds are shown in the present study. These days, 7, 8, 19 and 28 November 2012, were selected due to the persistence of cirrus clouds over many hours. The raw lidar signals and the inverted backscatter coefficient profiles were analyzed for the selected days. Cloud base and top heights were obtained from analysis of the raw signal and the backscatter coefficient. Extinction coefficient profiles were obtained by assuming a lidar ratio. Cirrus cloud optical depth (COD) values were then calculated by integrating the extinction coefficient between the base and top altitudes of the cirrus clouds.
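The final retrieval step above (extinction from backscatter via an assumed lidar ratio, then COD by vertical integration) can be sketched as follows. The backscatter profile, the lidar-ratio value and the cloud boundaries here are all hypothetical; only the 532 nm retrieval logic follows the abstract.

```python
import numpy as np

LIDAR_RATIO = 25.0  # sr, an assumed extinction-to-backscatter ratio for cirrus

# Altitude grid (m) and a hypothetical backscatter profile (m^-1 sr^-1):
# a Gaussian-shaped cloud layer centered at 10 km.
dz = 30.0
z = np.arange(8000.0, 12000.0, dz)
backscatter = 2e-7 * np.exp(-0.5 * ((z - 10000.0) / 500.0) ** 2)

# Extinction coefficient from the assumed lidar ratio
extinction = LIDAR_RATIO * backscatter  # m^-1

# Cloud base and top as determined from the raw-signal analysis (assumed here)
base, top = 9000.0, 11000.0
in_cloud = (z >= base) & (z <= top)

# Cirrus optical depth: integral of extinction between base and top
cod = float(np.sum(extinction[in_cloud]) * dz)  # dimensionless
```

The resulting `cod` of a few thousandths corresponds to a very thin (subvisible) cirrus layer; thicker layers simply scale the backscatter profile up.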
Abstract:
Biomass burning represents one of the largest sources of particulate matter to the atmosphere, which results in a significant perturbation to the Earth’s radiative balance coupled with serious negative impacts on public health. Globally, biomass burning aerosols are thought to exert a small warming effect of 0.03 W m-2; however, the uncertainty is four times greater than the central estimate. On regional scales, the impact is substantially greater, particularly in areas such as the Amazon Basin, where large, intense and frequent burning occurs on an annual basis for several months (usually from August to October). Furthermore, a growing number of people live within the Amazon region, which means that they are subject to the deleterious effects on their health from exposure to substantial volumes of polluted air. Initial results from the South American Biomass Burning Analysis (SAMBBA) field experiment, which took place during September and October 2012 over Brazil, are presented here. A suite of instrumentation was flown on board the UK Facility for Airborne Atmospheric Measurement (FAAM) BAe-146 research aircraft and was supported by ground-based measurements, with extensive measurements made in Porto Velho, Rondonia. The aircraft sampled a range of conditions, including fresh biomass burning plumes, regional haze and elevated biomass burning layers within the free troposphere. The physical, chemical and optical properties of the aerosols across the region will be characterized in order to establish the impact of biomass burning on regional air quality, weather and climate.
Abstract:
This thesis follows a substantial contribution to the realization of the CMS computing system, which can be seen as a relevant part of the experiment itself. A physics analysis completes the road from Monte Carlo production and analysis-tool development to the final physics study, which is the actual goal of the experiment. The physics topic of this thesis is the study of ttbar events in the fully hadronic decay channel in the CMS experiment. A multi-jet trigger has been designed to fix a reasonable starting point, reducing the multi-jet sample to the nominal trigger rate. An offline selection has been devised to improve the S/B ratio, and b-tagging is applied to improve it further. The selection is applied to the background sample and to samples generated at different top quark masses. The top quark mass candidate is reconstructed for all these samples using a kinematic fitter. The resulting distributions are used to build p.d.f.s, interpolating them with a continuous arbitrary curve. These curves are used to perform the top mass measurement through a likelihood comparison.
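The template-likelihood comparison described above can be sketched with a toy model: reconstructed-mass distributions for several generated top masses are treated as p.d.f.s, and the preferred mass is the template that maximizes the likelihood of the data. Gaussian templates stand in here for the kinematic-fit output, and every number is illustrative.

```python
import numpy as np

# Pseudo-data: reconstructed top-mass values (GeV) from a toy "true" mass
rng = np.random.default_rng(2)
true_mass, resolution = 172.5, 12.0
data = rng.normal(true_mass, resolution, 2000)

def log_likelihood(sample, template_mean, template_sigma=resolution):
    """Unbinned Gaussian log-likelihood of the sample under one template
    p.d.f. (additive constants dropped)."""
    return (-0.5 * np.sum(((sample - template_mean) / template_sigma) ** 2)
            - sample.size * np.log(template_sigma))

# Templates generated at different top-quark masses, as in the analysis
template_masses = np.arange(160.0, 185.0, 2.5)
lls = np.array([log_likelihood(data, m) for m in template_masses])
best_mass = template_masses[np.argmax(lls)]
```

In the real analysis the templates come from full simulation and a continuous interpolation between them replaces the discrete grid scan used here.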
Abstract:
In this thesis the performance of the Drift Tubes Local Trigger System of the CMS detector is studied. CMS is one of the general-purpose experiments that will operate at the Large Hadron Collider at CERN. Results from data collected during the Cosmic Run At Four Tesla (CRAFT) commissioning exercise, a globally coordinated run period in which the full experiment was involved and configured to detect cosmic rays crossing the CMS cavern, are presented. These include analyses of the precision and accuracy of the trigger reconstruction mechanism and a measurement of the trigger efficiency. A method to perform system synchronization is also described, together with a comparison of the outcomes of the trigger electronics and its software emulator code.
Abstract:
ALICE, an experiment at the CERN LHC, is specialized in analyzing lead-ion collisions. ALICE will study the properties of quark-gluon plasma, a state of matter in which quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of six cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point, and it also aligns the tracks of the particles detected by the more external detectors. The two middle ITS layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors. In total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data coming from the front-end electronics through optical link connections, stores them in a large data FIFO and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: one, called CARLOSrx data, reads the data coming from the SDD detectors and configures the FEE; the other, called CARLOSrx clock, sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.
Abstract:
The OPERA experiment aims at the direct observation of ν_mu -> ν_tau oscillations in the CNGS (CERN Neutrinos to Gran Sasso) neutrino beam produced at CERN; since the ν_e contamination in the CNGS beam is low, OPERA will also be able to study the sub-dominant oscillation channel ν_mu -> ν_e. OPERA is a large-scale hybrid apparatus divided into two supermodules, each equipped with electronic detectors, an iron spectrometer and a highly segmented ~0.7 kton target section made of Emulsion Cloud Chamber (ECC) units. During my research work in the Bologna laboratory I took part in the set-up of the automatic scanning microscopes, studying and tuning the performance and efficiency of the scanning system with emulsions exposed to a test beam at CERN in 2007. Once the triggered bricks were distributed to the collaboration laboratories, my work centered on the procedure used for the localization and reconstruction of neutrino events.
Abstract:
The experimental study of the relation established independently by Gerasimov, Drell and Hearn in 1966, known as the GDH sum rule, requires the measurement of total photoabsorption cross sections of circularly polarized photons on longitudinally polarized nucleons over a wide energy range. The measurement carried out at the Mainz Microtron in the summer of 1998 was the first such experiment with real photons to measure the GDH integral on the proton. The use of a frozen-spin butanol target, employed to achieve the highest possible proton polarization, entails the additional experimental difficulty that the carbon nuclei contained in the butanol target also produce reaction products, which are detected together with those produced on the proton. The goal of this work was the determination of cross sections on the free proton from measurements on a complex target (CH2) such as the polarized target. The pilot experiments carried out for this purpose served both to develop methods for reaction identification and to calibrate the detector system. By reproducing the already known and measured unpolarized differential and total single-pion cross sections on the proton (gamma p -> p pi0 and gamma p -> n pi+), which constitute the main contribution to the GDH integral up to a photon energy of about 400 MeV, it could be shown that a separation of hydrogen from carbon events is possible. The techniques required for this were developed in the course of this work into a generally usable tool. Furthermore, it could be shown that the fraction of the reactions originating from carbon has no helicity dependence. Under this assumption, the determination of the helicity-dependent cross-section difference reduces to a simple subtraction.
From the results of the intensive analysis of data obtained with an unpolarized target, first results could thus quickly be delivered for the measurements taken with the polarized frozen-spin target. These first results for polarized differential and total (gamma N) cross sections in the Delta region are in good agreement with theoretical analyses.
Abstract:
This thesis is about three major aspects of the identification of top quarks. First comes the understanding of their production mechanism, their decay channels and how to translate theoretical formulae into programs that can simulate such physical processes using Monte Carlo techniques. In particular, the author has been involved in the introduction of the POWHEG generator in the framework of the ATLAS experiment. POWHEG is now fully used as the benchmark program for the simulation of ttbar pair production and decay, along with MC@NLO and AcerMC; this is shown in chapter one. The second chapter illustrates the ATLAS detector and its sub-systems, such as the calorimeters and muon chambers. It is very important to evaluate their efficiency in order to fully understand what happens during the passage of radiation through the detector and to use this knowledge in the calculation of final quantities such as the ttbar production cross section. The last part of this thesis concerns the evaluation of this quantity using the so-called "golden channel" of ttbar decays, which yields one energetic charged lepton, four particle jets and a significant amount of missing transverse energy due to the neutrino. The most important systematic errors arising from the various parts of the calculation are studied in detail. Jet energy scale, trigger efficiency, Monte Carlo models, reconstruction algorithms and luminosity measurement are examples of what can contribute to the uncertainty on the cross section.