347 results for Showers (Plumbing fixtures)
Abstract:
The Cherenkov Telescope Array (CTA) is a new observatory for very high-energy (VHE) gamma rays. CTA has ambitious science goals, which require full-sky coverage, a sensitivity improved by about an order of magnitude over existing VHE gamma-ray observatories, coverage of about four decades in energy, from a few tens of GeV to above 100 TeV, and enhanced angular and energy resolution. An international collaboration has formed with more than 1000 members from 27 countries in Europe, Asia, Africa and North and South America. In 2010 the CTA Consortium completed a Design Study and started a three-year Preparatory Phase, which will lead to production readiness of CTA in 2014. In this paper we introduce the science goals and the concept of CTA, and provide an overview of the project.
Abstract:
The 1883 eruption of Krakatau is one of the best known volcanic events in the world, although it was neither the largest nor the deadliest of known eruptions. However, the eruption happened at a critical moment (just after the first global telegraph network was established) and in a strategic place (the Sunda Straits were a naval traffic hot spot at that time). The lecture will explore these events in some detail before presenting an outline of ongoing multidisciplinary efforts to unravel the past and present-day plumbing systems of the 1883 eruption and of the active Anak Krakatau cone. A mid- and a lower-crustal magma storage level exist beneath the volcano, placing significant emphasis on magma-crust interaction in the uppermost, sediment-rich crust. This final aspect shares similarities with the 2011/2012 El Hierro eruption, highlighting the relevance of the interaction between ascending magmas and the marine deposits through which oceanic magmas must pass. At Krakatau, shallow-level crustal contamination offers a possible explanation for the explosive nature of the 1883 eruption and of those of the presently active Anak Krakatau edifice, and helps constrain the location, style and processes of subvolcanic magma storage.
Abstract:
IceCube, a neutrino telescope currently under construction at the South Pole and expected to be completed in 2011, can detect galactic core-collapse supernovae with high significance and unmatched statistical accuracy of the neutrino light curve. Such supernovae are accompanied by a massive burst of low-energy neutrinos of all flavours. Passing through the detector medium, ice, these neutrinos produce positrons and electrons, which in turn generate local Cherenkov light showers that in sum illuminate the entire ice. Despite IceCube's optimization for high-energy particle tracks, a detection is therefore possible via a collective increase in the noise rates of all optical modules. The dominant reaction is the inverse beta decay of electron antineutrinos, which accounts for more than 90% of the total signal.

This thesis describes the implementation and operation of the supernova data-acquisition software and of the real-time analysis with which the above detection method has been realized since August 2007. The data of the first two years were analyzed and demonstrate extremely stable behavior of the detector as a whole and of almost all light sensors, which show an average failure rate of only 0.3%. A simulation of the detector response for two different supernova models yields a detection range for IceCube that in the best case extends to the Large Magellanic Cloud at a distance of 51 kpc. Unfortunately, the detector is not able to resolve the deleptonization peak, since neutrino flavour oscillations within the star modify the neutrino spectra unfavourably. However, from the earliest rise of the signal, the inverted mass hierarchy and $\sin^2 2\theta_{13} > 10^{-3}$ can be established in a model-independent way if the distance to the supernova is $\leq 6$ kpc.
The same can be achieved by evaluating a possible influence of Earth matter on the neutrino oscillations with the help of the measurement of a second neutrino detector.
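The detection principle described above, a collective noise-rate increase summed over all optical modules, can be sketched as a simple counting test. The module count, noise rate, bin width, and injected burst size below are illustrative assumptions, not IceCube's actual values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: 5000 optical modules with ~300 Hz noise each,
# with all hits summed in 0.5 s bins.
n_modules = 5000
noise_rate_hz = 300.0
bin_s = 0.5
mu_per_bin = n_modules * noise_rate_hz * bin_s  # expected summed counts per bin

# Background mean and spread estimated from an off-source window.
background = rng.poisson(mu_per_bin, size=600)
mu_hat = background.mean()
sigma_hat = background.std(ddof=1)

# A supernova burst adds correlated hits on top of the noise;
# here we inject a 1% collective excess into one bin.
observed = rng.poisson(mu_per_bin * 1.01)

significance = (observed - mu_hat) / sigma_hat
print(f"collective excess significance: {significance:.1f} sigma")
```

Because the summed rate is so large, even a percent-level correlated excess stands out far above the Poisson spread of the background.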
Abstract:
The Standard Model of elementary particle physics was developed to describe the fundamental particles which constitute matter and the interactions between them. The Large Hadron Collider (LHC) at CERN in Geneva was built to solve some of the remaining open questions in the Standard Model and to explore physics beyond it, by colliding two proton beams at world-record centre-of-mass energies. The ATLAS experiment is designed to reconstruct particles and their decay products originating from these collisions. The precise reconstruction of particle trajectories plays an important role in the identification of particle jets which originate from bottom quarks (b-tagging). This thesis describes the step-wise commissioning of the ATLAS track reconstruction and b-tagging software and one of the first measurements of the b-jet production cross section in pp collisions at sqrt(s)=7 TeV with the ATLAS detector. The performance of the track reconstruction software was studied in great detail, first using data from cosmic ray showers and then collisions at sqrt(s)=900 GeV and 7 TeV. The good understanding of the track reconstruction software allowed a very early deployment of the b-tagging algorithms. First studies of these algorithms and the measurement of the b-tagging efficiency in the data are presented. They agree well with predictions from Monte Carlo simulations. The b-jet production cross section was measured with the 2010 dataset recorded by the ATLAS detector, employing muons in jets to estimate the fraction of b-jets. The measurement is in good agreement with the Standard Model predictions.
Abstract:
This thesis presents a search for Supersymmetry with the ATLAS detector at the LHC. The final state with one lepton, several coloured particles and large missing transverse energy was chosen. Particular emphasis was placed on the optimization of the requirements for lepton identification; this optimization proved particularly useful when combined with multi-lepton selections. The systematic uncertainty associated with higher-order QCD diagrams in Monte Carlo production is given particular focus. Methods to verify and correct the energy measurement of hadronic showers are developed. Methods for the identification and removal of detector-induced mismeasurements in the single-muon and four-jet environment are applied. A new detector simulation system is shown to provide good prospects for future fast Monte Carlo production. The analysis was performed on $35\,pb^{-1}$ of data, and no significant deviation from the Standard Model is seen. Exclusion limits are set in this subchannel for minimal Supergravity, extending previous limits set by the Tevatron and LEP.
Abstract:
Aerosol particles are important actors in the Earth’s atmosphere and climate system. They scatter and absorb sunlight, serve as nuclei for water droplets and ice crystals in clouds and precipitation, and are a subject of concern for public health. Atmospheric aerosols originate from both natural and anthropogenic sources, and emissions resulting from human activities have the potential to influence the hydrological cycle and climate. An assessment of the extent and impacts of this human forcing requires a sound understanding of the natural aerosol background. This dissertation addresses the composition, properties, and atmospheric cycling of biogenic aerosol particles, which represent a major fraction of the natural aerosol burden. The main focal points are: (i) studies of the autofluorescence of primary biological aerosol particles (PBAP) and its application in ambient measurements, and (ii) X-ray microscopic and spectroscopic investigations of biogenic secondary organic aerosols (SOA) from the Amazonian rainforest.

Autofluorescence of biological material has received increasing attention in atmospheric science because it allows real-time monitoring of PBAP in ambient air; however, it is associated with high uncertainty. This work aims at reducing the uncertainty through a comprehensive characterization of the autofluorescence properties of relevant biological materials. Fluorescence spectroscopy and microscopy were applied to analyze the fluorescence signatures of pure biological fluorophores, potential non-biological interferences, and various types of reference PBAP. Characteristic features and fingerprint patterns were found and provide support for the operation, interpretation, and further development of PBAP autofluorescence measurements.
Online fluorescence detection and offline fluorescence microscopy were jointly applied in a comprehensive bioaerosol field measurement campaign that provided unprecedented insights into PBAP-linked biosphere-atmosphere interactions in a North-American semi-arid forest environment. Rain showers were found to trigger massive bursts of PBAP, including high concentrations of biological ice nucleators that may promote further precipitation and can be regarded as part of a bioprecipitation feedback cycle in the climate system.

In the pristine tropical rainforest air of the Amazon, most cloud and fog droplets form on biogenic SOA particles, but the composition, morphology, mixing state and origin of these particles are hardly known. X-ray microscopy and spectroscopy (STXM-NEXAFS) revealed distinctly different types of secondary organic matter (carboxyl- vs. hydroxy-rich) with internal structures that indicate a strong influence of phase segregation, cloud and fog processing on SOA formation, and aging. In addition, nanometer-sized potassium-rich particles emitted by microorganisms and vegetation were found to act as seeds for the condensation of SOA. Thus, the influence of forest biota on the atmospheric abundance of cloud condensation nuclei appears to be more direct than previously assumed. Overall, the results of this dissertation suggest that biogenic aerosols, clouds and precipitation are indeed tightly coupled through a bioprecipitation cycle, and that advanced microscopic and spectroscopic techniques can provide detailed insights into these mechanisms.
Abstract:
At the Mainz Microtron (MAMI), Lambda hypernuclei can be produced in (e,e'K^+) reactions. Detecting the produced kaon in the KAOS spectrometer tags reactions in which a hyperon was created. Spectroscopy of charged pions originating from two-body weak decays of light hypernuclei allows the binding energy of the hyperon in the nucleus to be determined with high precision. Besides the direct production of hypernuclei, production through the fragmentation of a highly excited continuum state is also possible, so that different hypernuclei can be studied in a single experiment. High-resolution magnetic spectrometers are available for the spectroscopy of the decay pions. To calculate the ground-state mass of the hypernuclei from the pion momentum, the hyperfragment must be stopped in the target before it decays. Based on the known cross section of elementary kaon photoproduction, the expected event rate was calculated. A Monte Carlo simulation was developed that includes the fragmentation process and the stopping of the hyperfragments in the target; it uses a statistical break-up model to describe the fragmentation. This approach allows a prediction of the expected decay-pion count rate for hydrogen-4-Lambda hypernuclei. In a pilot experiment in 2011, the detection of hadrons with the KAOS spectrometer at a scattering angle of 0° was demonstrated for the first time at MAMI, with pions detected in coincidence. It turned out that, owing to the high background rates of positrons in KAOS, an unambiguous identification of hypernuclei was not possible in this configuration. Based on these findings, the KAOS spectrometer was modified to act as a dedicated kaon tagger.
For this purpose, a lead absorber was mounted in the spectrometer, in which positrons are stopped through shower formation. The effect of such an absorber was studied in a beam test. A simulation based on Geant4 was developed, with which the arrangement of absorber and detectors was optimized and which enabled predictions of the effect on data quality. In addition, the simulation was used to generate individual back-tracking matrices for kaons, pions and protons that include the interaction of the particles with the lead wall, thus allowing these effects to be corrected. With the improved setup, a production beam time was carried out in 2012, in which kaons at a 0° scattering angle were successfully detected in coincidence with pions from weak decays. In the momentum spectrum of the decay pions, an excess with a significance corresponding to a p-value of 2.5 x 10^-4 was observed. Based on their momentum, these events can be attributed to decays of hydrogen-4-Lambda hypernuclei, and the number of detected pions is consistent with the calculated yield.
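The expected-rate calculation mentioned above has the generic counting-experiment form N = L · σ · ε · t. A minimal sketch of that arithmetic follows; every number below is a placeholder for illustration, none is taken from the MAMI experiment:

```python
# Generic yield estimate N = L * sigma * epsilon * t.
# All values are hypothetical placeholders, not the experiment's numbers.
luminosity_per_cm2_s = 1e35      # instantaneous luminosity (assumed)
cross_section_cm2 = 1e-33        # ~1 nb elementary production cross section (assumed)
efficiency = 1e-3                # stopping probability x branching x acceptance (assumed)
beam_time_s = 14 * 24 * 3600     # two weeks of beam (assumed)

expected_events = (luminosity_per_cm2_s * cross_section_cm2
                   * efficiency * beam_time_s)
print(f"expected decay-pion events: {expected_events:.0f}")
```

The point of such an estimate is feasibility: it tells you before the beam time whether the decay-pion yield will be large enough to produce a statistically significant peak.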
Abstract:
In this thesis, we develop high precision tools for the simulation of slepton pair production processes at hadron colliders and apply them to phenomenological studies at the LHC. Our approach is based on the POWHEG method for the matching of next-to-leading order results in perturbation theory to parton showers. We calculate matrix elements for slepton pair production and for the production of a slepton pair in association with a jet perturbatively at next-to-leading order in supersymmetric quantum chromodynamics. Both processes are subsequently implemented in the POWHEG BOX, a publicly available software tool that contains general parts of the POWHEG matching scheme. We investigate phenomenological consequences of our calculations in several setups that respect experimental exclusion limits for supersymmetric particles and provide precise predictions for slepton signatures at the LHC. The inclusion of QCD emissions in the partonic matrix elements allows for an accurate description of hard jets. Interfacing our codes to the multi-purpose Monte-Carlo event generator PYTHIA, we simulate parton showers and slepton decays in fully exclusive events. Advanced kinematical variables and specific search strategies are examined as means for slepton discovery in experimentally challenging setups.
Abstract:
Bacteria, yeasts, and viruses are rapidly killed on metallic copper surfaces, and the term "contact killing" has been coined for this process. While the phenomenon was already known in ancient times, it is currently receiving renewed attention. This is due to the potential use of copper as an antibacterial material in health care settings. Contact killing was observed to take place at a rate of at least 7 to 8 logs per hour, and no live microorganisms were generally recovered from copper surfaces after prolonged incubation. The antimicrobial activity of copper and copper alloys is now well established, and copper has recently been registered with the U.S. Environmental Protection Agency as the first solid antimicrobial material. In several clinical studies, copper has been evaluated for use on touch surfaces, such as door handles, bathroom fixtures, or bed rails, in attempts to curb nosocomial infections. In connection with these new applications of copper, it is important to understand the mechanism of contact killing since it may bear on central issues, such as the possibility of the emergence and spread of resistant organisms, cleaning procedures, and questions of material and object engineering. Recent work has shed light on mechanistic aspects of contact killing. These findings will be reviewed here and juxtaposed with the toxicity mechanisms of ionic copper. The merit of copper as a hygienic material in hospitals and related settings will also be discussed.
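For readers unfamiliar with "log" kill rates: a reduction of n logs means the viable count drops by a factor of 10^n. A minimal sketch of the arithmetic (the extrapolation to shorter times assumes first-order kinetics purely for illustration):

```python
# A "log reduction" of n means the viable count drops by a factor 10**n.
def surviving_fraction(log_reduction: float) -> float:
    return 10.0 ** (-log_reduction)

# A 7-log kill in one hour leaves one cell in ten million alive.
print(surviving_fraction(7))

# Assuming (for illustration only) first-order kinetics, a 7-log/hour
# rate corresponds to this surviving fraction after one minute:
per_minute_logs = 7 / 60
print(f"{surviving_fraction(per_minute_logs):.3f} surviving after 1 min")
```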
Abstract:
Experimental modal analysis techniques are applied to characterize the planar dynamic behavior of two spur planetary gears. Rotational and translational vibrations of the sun gear, carrier, and planet gears are measured. Experimentally obtained natural frequencies, mode shapes, and dynamic response are compared to the results from lumped-parameter and finite element models. Two qualitatively different classes of mode shapes in distinct frequency ranges are observed in the experiments and confirmed by the lumped-parameter model, which considers the accessory shafts and fixtures in the system to capture all of the natural frequencies and modes. The finite element model estimates the high-frequency modes that have significant tooth mesh deflection without considering the shafts and fixtures. The lumped-parameter and finite element models accurately predict the natural frequencies and modal properties established by experimentation. Rotational, translational, and planet mode types presented in published mathematical studies are confirmed experimentally. The number and types of modes in the low-frequency and high-frequency bands depend on the degrees of freedom in the central members and planet gears, respectively. The accuracy of natural frequency prediction is improved when the planet bearings have differing stiffnesses in the tangential and radial directions, consistent with the bearing load direction. (C) 2012 Elsevier Ltd. All rights reserved.
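The lumped-parameter approach described above boils down to a generalized eigenvalue problem K v = ω² M v, whose eigenvalues give the natural frequencies and whose eigenvectors give the mode shapes. A minimal sketch with an illustrative 2-DOF mass and stiffness matrix (not the planetary-gear model itself):

```python
import numpy as np

# Illustrative 2-DOF system: masses in kg, stiffnesses in N/m.
# These matrices are placeholders, not the gear model of the paper.
M = np.diag([1.0, 2.0])
K = np.array([[3000.0, -1000.0],
              [-1000.0, 2000.0]])

# Natural frequencies solve K v = w^2 M v; equivalently they are the
# eigenvalues of M^-1 K for a lumped-parameter model.
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(M, K))
omega = np.sort(np.sqrt(eigvals.real))   # rad/s, ascending
freqs_hz = omega / (2 * np.pi)
print("natural frequencies [Hz]:", np.round(freqs_hz, 2))
```

For larger models the same structure holds; only the size of M and K grows with the degrees of freedom of the central members and planets.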
Abstract:
The High-Altitude Water Cherenkov (HAWC) Experiment is a gamma-ray observatory that utilizes water silos as Cherenkov detectors to measure the electromagnetic air showers created by gamma rays. The experiment consists of an array of closely packed water Cherenkov detectors (WCDs), each with four photomultiplier tubes (PMTs). The direction of the gamma ray will be reconstructed using the times when the electromagnetic shower front triggers PMTs in each WCD. To achieve an angular resolution as low as 0.1 degrees, a laser calibration system will be used to measure relative PMT response times. The system will direct 300 ps laser pulses into two fiber-optic networks. Each network will use optical fan-outs and switches to direct light to specific WCDs. The first network is used to measure the light transit time out to each pair of detectors, and the second network sends light to each detector, calibrating the response times of the four PMTs within each detector. As the relative PMT response times are dependent on the number of photons in the light pulse, neutral density filters will be used to control the light intensity across five orders of magnitude. The system will run both continuously in a low-rate mode and in a high-rate mode with many intensity levels. In this thesis, the design of the calibration system and systematic studies verifying its performance are presented.
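The bookkeeping behind such a relative-timing calibration can be sketched as follows: subtract the measured fiber transit time to each detector from each PMT's recorded laser arrival time, then reference all offsets to one PMT. Detector names and all times below are hypothetical, not HAWC values:

```python
# Hypothetical fiber transit times to each detector (ns), measured once.
fiber_transit_ns = {"WCD01": 152.4, "WCD02": 163.1}

# Hypothetical laser-pulse arrival times recorded by each (WCD, PMT).
measured_arrival_ns = {
    ("WCD01", 0): 170.9, ("WCD01", 1): 171.6,
    ("WCD02", 0): 181.2, ("WCD02", 1): 182.0,
}

# Subtracting the fiber transit time leaves the per-PMT response offset.
pmt_offset_ns = {
    (wcd, pmt): t - fiber_transit_ns[wcd]
    for (wcd, pmt), t in measured_arrival_ns.items()
}

# Offsets relative to one reference PMT make the whole array consistent.
ref = pmt_offset_ns[("WCD01", 0)]
relative_offset_ns = {key: v - ref for key, v in pmt_offset_ns.items()}
print(relative_offset_ns)
```

In reconstruction, these relative offsets are subtracted from the shower-front hit times before fitting the arrival direction.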
Abstract:
The exsolution of volatiles from magma exerts an important control on volcanic eruption styles. The nucleation, growth, and connectivity of bubbles during magma ascent provide the driving force behind eruptions, and the rate, volume, and ease of gas exsolution can affect eruptive activity. Volcanic plumes are the observable consequence of this magmatic degassing, and remote sensing techniques allow us to quantify changes in gas exsolution. However, until recently the methods used to measure volcanic plumes did not have the capability of detecting rapid changes in degassing on the scale of standard geophysical observations. The advent of the UV camera now makes high sample rate gas measurements possible. This type of dataset can then be compared to other volcanic observations to provide an in-depth picture of degassing mechanisms in the shallow conduit. The goals of this research are to develop a robust methodology for UV camera field measurements of volcanic plumes, and to utilize these data in conjunction with seismoacoustic records to illuminate degassing processes. Field and laboratory experiments were conducted to determine the effects of imaging conditions, vignetting, exposure time, calibration technique, and filter usage on the UV camera sulfur dioxide measurements. Using the best practices determined from these studies, a field campaign was undertaken at Volcán de Pacaya, Guatemala. Coincident plume sulfur dioxide measurements, acoustic recordings, and seismic observations were collected and analyzed jointly. The results provide insight into the small explosive features, variations in degassing rate, and plumbing system of this complex volcanic system. This research provides useful information for determining volcanic hazard at Pacaya, and demonstrates the potential of the UV camera in multiparameter studies.
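One of the imaging effects listed above, vignetting, is typically removed with a flat-field correction: divide each raw frame by a normalized image of a uniformly illuminated target. A minimal sketch on a synthetic image (the frame size, brightness, and vignette profile are illustrative assumptions):

```python
import numpy as np

# Synthetic 64x64 frame with a radial brightness fall-off (vignetting).
ny, nx = 64, 64
yy, xx = np.mgrid[0:ny, 0:nx]
r2 = ((xx - nx / 2) ** 2 + (yy - ny / 2) ** 2) / (nx / 2) ** 2

true_scene = np.full((ny, nx), 1000.0)   # uniform "clear sky" frame
vignette = 1.0 - 0.3 * r2                # brightness falls off toward edges
raw = true_scene * vignette

# Flat field: the vignette profile measured on a uniform target,
# normalized to its mean. Dividing by it removes the fall-off.
flat = vignette / vignette.mean()
corrected = raw / flat

print(f"raw edge/centre ratio:       {raw[0, 0] / raw[ny//2, nx//2]:.2f}")
print(f"corrected edge/centre ratio: {corrected[0, 0] / corrected[ny//2, nx//2]:.2f}")
```

Without this step, apparent SO2 column densities would be biased low at the image edges.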
Abstract:
Data collected with the surface detector array of the Pierre Auger Observatory during the period from January 1, 2004 to March 17, 2015 were examined for evidence of the production of long-lived weakly interacting particles in interactions of ultra-high-energy cosmic rays in the atmosphere. The search was performed using extensive air showers with primary energies above 10 EeV and arrival directions in the range of 57.5° to 77.5°. No evidence of significant production of such particles was found. An upper limit on the fraction of extensive air showers in which such particles are produced was set.
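With zero candidate events observed, the counting-experiment logic behind such an upper limit is simple: the Poisson upper limit on the expected number of signal events at confidence level C is -ln(1 - C), and the limit on the fraction is that number divided by the total shower count. A minimal sketch with a hypothetical sample size (not the Auger dataset):

```python
import math

# For zero observed events, P(0 | mu) = exp(-mu), so the upper limit
# on mu at confidence level C solves exp(-mu) = 1 - C.
confidence = 0.90
upper_events = -math.log(1.0 - confidence)   # ~2.30 events at 90% CL

n_showers = 100_000                          # hypothetical shower count
fraction_limit = upper_events / n_showers
print(f"90% CL upper limit on fraction: {fraction_limit:.2e}")
```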
Abstract:
In this issue...Crane Plumbing Company, Butte Chamber of Commerce, Eugene Bullock, chemistry, Theta Tau fraternity, YMCA, President Van Pelt
Abstract:
Measurements of the production of jets of particles in association with a Z boson in pp collisions at √s = 7 TeV are presented, using data corresponding to an integrated luminosity of 4.6 fb⁻¹ collected by the ATLAS experiment at the Large Hadron Collider. Inclusive and differential jet cross sections in Z events, with Z decaying into electron or muon pairs, are measured for jets with transverse momentum pT > 30 GeV and rapidity |y| < 4.4. The results are compared to next-to-leading-order perturbative QCD calculations, and to predictions from different Monte Carlo generators based on leading-order and next-to-leading-order matrix elements supplemented by parton showers.