4 results for measurement of time interval

in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany


Relevance:

100.00%

Publisher:

Abstract:

The subject of this thesis is the accurate measurement of time dilation, aiming at a quantitative test of special relativity. By means of laser spectroscopy, the relativistic Doppler shifts of a clock transition in the metastable triplet spectrum of ^7Li^+ are measured simultaneously with and against the direction of motion of the ions. By employing saturation or optical double-resonance spectroscopy, the Doppler broadening caused by the ions' velocity distribution is eliminated. From these shifts both the time dilation and the ion velocity can be extracted with high accuracy, allowing a test of the predictions of special relativity. A diode laser and a frequency-doubled titanium-sapphire laser were set up for antiparallel and parallel excitation of the ions, respectively. To achieve robust control of the laser frequencies required for the beam times, a redundant system of frequency standards consisting of a rubidium spectrometer, an iodine spectrometer, and a frequency comb was developed. At the experimental section of the ESR, an automated laser-beam guiding system for exact control of polarisation, beam profile, and overlap with the ion beam, as well as a fluorescence detection system, were set up. During the first experiments, the production, acceleration, and lifetime of the metastable ions at the GSI heavy-ion facility were investigated for the first time. The characterisation of the ion beam made it possible, for the first time, to measure its velocity directly via the Doppler effect, which resulted in a new, improved calibration of the electron cooler. In the following step, the first sub-Doppler spectroscopy signals from an ion beam at 33.8 % c could be recorded. The unprecedented accuracy of these experiments allowed a new upper bound to be derived for possible higher-order deviations from special relativity.
Moreover, future measurements with the experimental setup developed in this thesis have the potential to improve the sensitivity to low-order deviations by at least one order of magnitude compared to previous experiments, and will thus further contribute to tests of the Standard Model.
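The simultaneous parallel/antiparallel measurement exploits the fact that, in special relativity, the product of the two resonant laboratory laser frequencies equals the squared rest-frame transition frequency, independent of the ion velocity. A minimal numerical sketch of that relation (the rest-frame frequency below is an illustrative placeholder, not the actual ^7Li^+ transition frequency):

```python
import math

def resonant_lab_frequencies(nu0, beta):
    """Lab-frame laser frequencies resonant with a transition of
    rest-frame frequency nu0 for an ion moving with velocity beta = v/c.
    The antiparallel (head-on) beam is blue-shifted in the ion frame,
    the parallel (co-propagating) beam is red-shifted."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    nu_antiparallel = nu0 / (gamma * (1.0 + beta))
    nu_parallel = nu0 / (gamma * (1.0 - beta))
    return nu_parallel, nu_antiparallel

beta = 0.338          # ion velocity from the thesis (33.8 % c)
nu0 = 5.0e14          # illustrative rest-frame frequency in Hz
nu_p, nu_a = resonant_lab_frequencies(nu0, beta)

# In special relativity the product is exactly nu0**2; any deviation
# would signal a modified time-dilation factor.
ratio = (nu_p * nu_a) / nu0**2
print(ratio)          # 1.0 up to floating-point rounding
```

The velocity-independence of this product is what makes the two-beam scheme a clean test: the ion velocity drops out and only the time-dilation factor is probed.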

Relevance:

100.00%

Publisher:

Abstract:

Volatile organic compounds (VOCs) play a critical role in ozone formation and, together with OH radicals, drive the chemistry of the atmosphere. The simplest VOC, methane, is a climatologically important greenhouse gas that plays a key role in regulating water vapour in the stratosphere and hydroxyl radicals in the troposphere. The OH radical is the most important atmospheric oxidant, and knowledge of the atmospheric OH sink, together with the OH sources and ambient OH concentrations, is essential for understanding the oxidative capacity of the atmosphere. Oceanic emission and/or uptake of methanol, acetone, acetaldehyde, isoprene, and dimethyl sulphide (DMS) was characterised as a function of photosynthetically active radiation (PAR) and a suite of biological parameters in a mesocosm experiment conducted in a Norwegian fjord. High-frequency (ca. one measurement per minute) methane measurements were performed using a gas chromatograph with flame ionization detector (GC-FID) in the boreal forests of Finland and the tropical forests of Suriname. A new on-line method, the Comparative Reactivity Method (CRM), was developed to directly measure the total OH reactivity (sink) of ambient air. It was observed that under conditions of high biological activity and a PAR of ~450 μmol photons m⁻² s⁻¹, the ocean acted as a net source of acetone; if either of these criteria was not fulfilled, the ocean acted as a net sink of acetone. This new insight into the biogeochemical cycling of acetone at the ocean-air interface helps to resolve discrepancies between earlier works such as Jacob et al. (2002), who reported the ocean to be a net acetone source (27 Tg yr⁻¹), and Marandino et al. (2005), who reported the ocean to be a net sink of acetone (−48 Tg yr⁻¹). The ocean acted as a net source of isoprene, DMS, and acetaldehyde but a net sink of methanol.
Based on these findings, it is recommended that compound-specific PAR and biological dependencies be used for estimating the influence of the global ocean on atmospheric VOC budgets. Methane was observed to accumulate within the nocturnal boundary layer, clearly indicating emissions from the forest ecosystems. There was a remarkable similarity between the time series of the boreal and tropical forest ecosystems. The average median mixing ratios over a typical diel cycle were 1.83 μmol mol⁻¹ and 1.74 μmol mol⁻¹ for the boreal and tropical forest ecosystems, respectively. A flux value of (3.62 ± 0.87) × 10¹¹ molecules cm⁻² s⁻¹ (or 45.5 ± 11 Tg CH₄ yr⁻¹ for the global boreal forest area) was derived, which highlights the importance of the boreal forest ecosystem for the global methane budget (~600 Tg yr⁻¹). The newly developed CRM technique has a dynamic range of ~4 s⁻¹ to 300 s⁻¹ and an accuracy of ±25 %. The system was tested and calibrated with several single and mixed hydrocarbon standards, showing excellent linearity and agreement with the reactivity of the standards. Field tests at an urban and a forest site illustrate the promise of the new method. The results from this study have improved current understanding of VOC emissions and uptake by ocean and forest ecosystems. Moreover, a new technique for directly measuring the total OH reactivity of ambient air has been developed and validated, which will be a valuable addition to the existing suite of atmospheric measurement techniques.
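The total OH reactivity that the CRM measures is, by definition, the sum over all reactive species of the OH rate constant times the species' number density. A minimal sketch of that bookkeeping, with a hypothetical air sample and approximate room-temperature rate constants (the species list and values are illustrative, not taken from the thesis):

```python
# Total OH reactivity: R_OH = sum_i k_(OH+X_i) * [X_i], in units of s^-1.

N_AIR = 2.46e19  # molecules cm^-3 of air at ~298 K and 1 atm

# Approximate OH rate constants in cm^3 molecule^-1 s^-1 (illustrative values)
K_OH = {
    "CH4":      6.4e-15,
    "CO":       2.0e-13,
    "isoprene": 1.0e-10,
}

def total_oh_reactivity(mixing_ratios_ppb):
    """Sum k * [X] over all species; mixing ratios are given in ppb."""
    r = 0.0
    for species, ppb in mixing_ratios_ppb.items():
        number_density = ppb * 1e-9 * N_AIR  # molecules cm^-3
        r += K_OH[species] * number_density
    return r

# Hypothetical forest-like air sample (illustrative mixing ratios)
air = {"CH4": 1830.0, "CO": 120.0, "isoprene": 1.0}
print(f"{total_oh_reactivity(air):.2f} s^-1")
```

Even this toy sample shows why isoprene dominates forest OH reactivity despite its tiny mixing ratio: its rate constant is four to five orders of magnitude larger than that of methane.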

Relevance:

100.00%

Publisher:

Abstract:

Precision measurements of observables in neutron beta decay address important open questions of particle physics and cosmology. In this thesis, a measurement of the proton recoil spectrum with the spectrometer aSPECT is described. From this spectrum the antineutrino-electron angular correlation coefficient a can be derived. In our first beam time at the FRM II in Munich, background instabilities prevented us from presenting a new value for a. In the latest beam time at the ILL in Grenoble, the background was reduced sufficiently. During the data analysis, we identified and fixed a problem in the detector electronics which caused a significant systematic error. The aim of the latest beam time was a new value for a with an error well below the 4% uncertainty of the present literature value. A statistical accuracy of about 1.4% was reached, but we could only set upper limits on the correction for the detector-electronics problem, which are too large to permit a meaningful result. This thesis therefore focused on the investigation of the different systematic effects. With the knowledge of the systematics gained in this thesis, we will be able to improve aSPECT and perform a 1% measurement of a in a further beam time.
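The coefficient a enters the unpolarised neutron decay rate through the standard Jackson-Treiman-Wyld parametrisation; aSPECT accesses it via the shape of the proton recoil spectrum rather than by detecting the antineutrino directly:

```latex
% Leading-order differential decay rate for unpolarised neutrons;
% a is the antineutrino-electron angular correlation coefficient.
\frac{\mathrm{d}\Gamma}{\mathrm{d}E_e\,\mathrm{d}\Omega_e\,\mathrm{d}\Omega_\nu}
  \;\propto\; 1 + a\,\frac{\vec{p}_e \cdot \vec{p}_\nu}{E_e E_\nu}
  \;=\; 1 + a\,\beta \cos\theta_{e\nu}
```

Because momentum conservation ties the proton recoil to the electron-antineutrino opening angle, the value of a shifts the shape of the measured proton spectrum.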

Relevance:

100.00%

Publisher:

Abstract:

Time series are ubiquitous. The acquisition and processing of continuously measured data is found in all areas of the natural sciences, medicine, and finance. The enormous growth of recorded data volumes, whether from automated monitoring systems or integrated sensors, calls for exceptionally fast algorithms in theory and practice. Consequently, this thesis deals with the efficient computation of subsequence alignments. Complex algorithms such as anomaly detection, motif queries, or the unsupervised extraction of prototypical building blocks in time series make heavy use of these alignments, hence the need for fast implementations. This thesis comprises three approaches that address this challenge: four alignment algorithms and their parallelisation on CUDA-capable hardware, an algorithm for the segmentation of data streams, and a unified treatment of Lie-group-valued time series.

The first contribution is a complete CUDA port of the UCR Suite, the world-leading implementation of subsequence alignment. It includes a new computation scheme for determining local alignment scores using the z-normalised Euclidean distance, which can be deployed on any parallel hardware with support for fast Fourier transforms. Furthermore, we give a SIMT-compatible implementation of the UCR Suite's lower-bound cascade for the efficient computation of local alignment scores under dynamic time warping (DTW). Both CUDA implementations enable computations one to two orders of magnitude faster than established methods.

Second, we investigate two linear-time approximations for the elastic alignment of subsequences. On the one hand, we treat a SIMT-compatible relaxation scheme for greedy DTW and its efficient CUDA parallelisation. On the other hand, we introduce a new local distance measure, the Gliding Elastic Match (GEM), which can be computed with the same asymptotic time complexity as greedy DTW but offers a complete relaxation of the penalty matrix. Further improvements include invariance to trends on the measurement axis and to uniform scaling on the time axis. In addition, an extension of GEM to multi-shape segmentation is discussed and evaluated on motion data. Both CUDA parallelisations achieve runtime improvements of up to two orders of magnitude.

In the literature, the treatment of time series is usually restricted to real-valued measurement data. The third contribution is a unified method for handling Lie-group-valued time series. Building on this, distance measures on the rotation group SO(3) and on the Euclidean group SE(3) are treated. Furthermore, memory-efficient representations and group-compatible extensions of elastic measures are discussed.
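The local alignment score at the core of the UCR Suite is the Euclidean distance between the z-normalised query and each z-normalised window of the series. A brute-force reference sketch of that quantity (without the FFT acceleration and lower-bound pruning that the thesis ports to CUDA):

```python
import math

def znorm(x):
    """Z-normalise a window: zero mean, unit standard deviation."""
    n = len(x)
    mu = sum(x) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in x) / n)
    if sigma == 0.0:            # constant window: define as all zeros
        return [0.0] * n
    return [(v - mu) / sigma for v in x]

def znorm_subsequence_distances(series, query):
    """Z-normalised Euclidean distance between the query and every
    window of the series, i.e. the local alignment scores."""
    m = len(query)
    q = znorm(query)
    dists = []
    for i in range(len(series) - m + 1):
        w = znorm(series[i:i + m])
        dists.append(math.sqrt(sum((a - b) ** 2 for a, b in zip(q, w))))
    return dists

series = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 1.0, 2.0]
query = [10.0, 11.0, 12.0]      # same shape as [0, 1, 2], shifted and offset
d = znorm_subsequence_distances(series, query)
print(d.index(min(d)))          # first best match starts at index 0
```

The per-window normalisation is what makes the match invariant to offset and amplitude, and it is also what makes the naive version expensive: the FFT-based scheme of the UCR Suite computes all windowed means, variances, and dot products in O(n log n) instead.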
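The elastic measures in the second contribution build on dynamic time warping. As a point of reference, the classical quadratic-time dynamic-programming DTW, which the greedy variant discussed in the thesis approximates in linear time, can be sketched as:

```python
import math

def dtw_distance(x, y):
    """Classical O(len(x) * len(y)) dynamic-programming DTW with
    squared point-wise cost; returns the warped Euclidean distance."""
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # extend the cheapest of the three admissible warping steps
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return math.sqrt(D[n][m])

# A time-shifted copy of a signal has DTW distance 0, whereas the
# point-wise Euclidean distance between the two would be large.
a = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]
b = [0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
print(dtw_distance(a, b))       # 0.0
```

The sequential dependency of each cell on its three neighbours is exactly what makes DTW awkward for SIMT hardware and motivates the relaxation schemes developed in the thesis.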