5 results for nächstes Heft

in ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany


Relevance:

10.00%

Publisher:

Abstract:

Proxy data are essential for investigating climate variability on time scales longer than the period of historical meteorological observations. The potential value of a proxy depends on our ability to understand and quantify the physical processes that relate the corresponding climate parameter to the signal in the proxy archive. These processes can be explored under present-day conditions. In this thesis, both statistical and physical models are applied to their analysis, focusing on two specific types of proxies: lake sediment data and stable water isotopes.

In the first part of this work, the basis is established for statistically calibrating new proxies from lake sediments in western Germany. A comprehensive meteorological and hydrological data set is compiled and statistically analyzed. In this way, meteorological time series are identified that can be used for the calibration of various climate proxies. A particular focus is placed on the investigation of extreme weather events, which have rarely been the objective of paleoclimate reconstructions so far. Subsequently, a concrete example of a proxy calibration is presented: maxima in the quartz grain concentration of a lake sediment core are compared to recent windstorms. The latter are identified from the meteorological data with the help of a newly developed windstorm index that combines local measurements and reanalysis data. The statistical significance of the correlation between extreme windstorms and signals in the sediment is verified using a Monte Carlo method. This correlation is fundamental for employing lake sediment data as a new proxy for reconstructing windstorm records of the geological past.

The second part of this thesis deals with the analysis and simulation of stable water isotopes in atmospheric vapor on daily time scales. This yields a better understanding of the physical processes that determine these isotope ratios, an important prerequisite for the interpretation of isotope data from ice cores and the reconstruction of past temperature. In particular, the focus is on the deuterium excess and its relation to the environmental conditions during evaporation of water from the ocean. As a basis for the diagnostic analysis and for evaluating the simulations, isotope measurements from Rehovot (Israel), provided by the Weizmann Institute of Science, are used. First, a Lagrangian moisture source diagnostic is employed to establish quantitative linkages between the measurements and the evaporation conditions of the vapor (and thus to calibrate the isotope signal). A strong negative correlation between relative humidity in the source regions and measured deuterium excess is found. By contrast, sea surface temperature in the evaporation regions does not correlate well with deuterium excess. Although it requires confirmation by isotope data from different regions and on longer time scales, this weak correlation might be of major importance for the reconstruction of moisture source temperatures from ice core data. Second, the Lagrangian source diagnostic is combined with a Craig-Gordon fractionation parameterization for the identified evaporation events in order to simulate the isotope ratios at Rehovot. In this way, the Craig-Gordon model can be evaluated directly against atmospheric isotope data, and better constraints on uncertain model parameters can be obtained. A comparison of the simulated deuterium excess with the measurements reveals that a much better agreement is achieved with a wind-speed-independent formulation of the non-equilibrium fractionation factor than with the classical parameterization introduced by Merlivat and Jouzel, which is widely applied in isotope GCMs. Finally, the first steps of the implementation of water isotope physics in the limited-area COSMO model are described, and an approach is outlined that allows simulated isotope ratios to be compared to measurements in an event-based manner using a water tagging technique. The good agreement between model results from several case studies and the measurements at Rehovot demonstrates the applicability of the approach. Because the model can be run at high, potentially cloud-resolving spatial resolution, and because it contains sophisticated parameterizations of many atmospheric processes, a complete implementation of the isotope physics will allow detailed, process-oriented studies of the complex variability of stable isotopes in atmospheric waters in future research.
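As an illustration of the significance test described above, the following minimal Monte Carlo sketch (in Python) checks whether coincidences between sediment signals and windstorms could arise by chance. The example arrays, the matching tolerance, and the observation period are hypothetical placeholders, not the data or the exact procedure used in the thesis.

import numpy as np

# Hypothetical example data: years of quartz-grain maxima in the sediment
# core and years flagged as extreme windstorms by a windstorm index.
quartz_peak_years = np.array([1962, 1972, 1984, 1990, 1999, 2007])
windstorm_years = np.array([1962, 1967, 1972, 1976, 1984, 1990, 1999, 2007])
observation_period = np.arange(1958, 2010)  # e.g. the reanalysis era

def n_matches(peaks, storms, tolerance=1):
    # Count sediment peaks that lie within +/- tolerance years of a storm.
    return sum(np.any(np.abs(storms - p) <= tolerance) for p in peaks)

observed = n_matches(quartz_peak_years, windstorm_years)

# Monte Carlo test: place the same number of storm years at random within
# the observation period and count how often chance alone matches or
# exceeds the observed number of coincidences.
rng = np.random.default_rng(seed=1)
n_trials = 10_000
as_extreme = 0
for _ in range(n_trials):
    random_storms = rng.choice(observation_period,
                               size=windstorm_years.size, replace=False)
    if n_matches(quartz_peak_years, random_storms) >= observed:
        as_extreme += 1

p_value = as_extreme / n_trials
print(f"observed matches: {observed}, Monte Carlo p-value: {p_value:.4f}")

A small p-value indicates that the observed correspondence between quartz maxima and windstorms is unlikely to be a chance coincidence, which is the logic underlying the calibration described above.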

Relevance:

10.00%

Publisher:

Abstract:

Quantum Chromodynamics (QCD) is the theory of the strong interaction, one of the four fundamental forces in our Universe. It describes the interaction of gluons and quarks, which build up hadrons such as protons and neutrons. Most of the visible matter in our Universe is made of protons and neutrons. Hence, we are interested in their fundamental properties, such as their masses, their charge distributions, and their shape.

The only known theoretical, non-perturbative, ab initio method for investigating hadron properties at low energies is lattice Quantum Chromodynamics (lattice QCD). However, current simulations, especially for baryonic quantities, do not achieve the accuracy of experiments; in fact, they do not even reproduce the experimental values of the form factors. The question arises whether these deviations can be explained by systematic effects in lattice QCD simulations.

This thesis is about the computation of nucleon form factors and other hadronic quantities from lattice QCD. So-called Wilson fermions are used, and the u- and d-quarks are treated fully dynamically. The simulations were performed on gauge ensembles covering a range of lattice spacings, volumes, and pion masses.

First, the lattice spacing was set in order to make contact between the lattice results and their experimental counterparts and to enable a continuum extrapolation. The light quark mass was computed and found to be $m_{ud}^{\overline{\text{MS}}}(2\text{ GeV}) = 3.03(17)(38)\text{ MeV}$. This value is in good agreement with experimental determinations and with other lattice results.

Electromagnetic and axial form factors of the nucleon were calculated. From these form factors, the nucleon radii and coupling constants were computed. The different ensembles enabled a systematic investigation of the dependence of these quantities on the volume, the lattice spacing, and the pion mass. Finally, continuum and chiral extrapolations to the physical point were performed.

In addition, so-called excited-state contributions to these observables were investigated. The summation method was employed, which reduces these effects significantly, and a much better agreement with experimental data was achieved. On the lattice, the Dirac radius and the axial charge are usually found to be much smaller than the experimental values. However, owing to the careful treatment of all the aforementioned systematic effects, we obtain $\langle r_1^2\rangle_{u-d}=0.627(54)\text{ fm}^2$ and $g_A=1.218(92)$, in agreement with the experimental values within errors.

The first three chapters introduce the theoretical background of nucleon form factors and of lattice QCD in general. In chapter four, the lattice spacing is determined. The computation of the nucleon form factors is described in chapter five, where the systematic effects are investigated. All results are presented in chapter six. The thesis ends with a summary of the results and identifies options to complement and extend the calculations presented.
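For orientation, the Dirac radius and the axial charge quoted above follow from the nucleon form factors via the standard definitions below, and the summation method suppresses excited-state contamination by extracting the matrix element from the slope of a summed ratio rather than from a plateau fit. These are generic textbook relations, sketched here in LaTeX for context; they are not formulas copied from the thesis.

% Dirac radius from the slope of the Dirac form factor F_1 at Q^2 = 0,
% and the axial charge as the forward-limit value of the axial form factor G_A:
\langle r_1^2 \rangle = -6 \left. \frac{\mathrm{d} F_1(Q^2)}{\mathrm{d} Q^2} \right|_{Q^2 = 0},
\qquad
g_A = G_A(0).

% Summation method (schematic): the summed ratio of three- to two-point
% functions grows linearly in the source-sink separation t_s, with the
% ground-state matrix element M as its slope; excited-state contributions
% are suppressed as exp(-\Delta t_s) instead of exp(-\Delta t).
S(t_s) \equiv \sum_{t} R(t, t_s) \longrightarrow c + t_s \left[ M + \mathcal{O}\!\left(e^{-\Delta t_s}\right) \right].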

Relevance:

10.00%

Publisher:

Abstract:

In my dissertation, I investigated the influence of behavioral variation between and within ant colonies on group performance. In particular, I analyzed how evolution shapes behavior in response to ecological conditions, and whether within-group diversity improves productivity, as suggested by theory. Our field and laboratory experiments showed that behaviorally diverse groups are more productive. Variation in aggression levels within colonies was beneficial under competitive field conditions, whereas diversity in brood care and exploratory behavior was favored in non-competitive laboratory conditions. We then examined whether population density and the presence of social parasites shape aggression through phenotypic plasticity and/or natural selection. The importance of selection was indicated by the absence of density or parasite effects on aggression in a field manipulation. Indeed, more aggressive colonies fared better under high density and during parasite attack. An analysis of the proximate causes of individual behavioral variation showed that ovarian development is linked to division of labor and aggressiveness. Finally, our studies show that differences in collective behavior can be linked to immune defense and productivity. My dissertation demonstrates that behavioral variation should be studied on multiple scales and, where possible, combined with physiological analyses in order to better understand the evolution of animal personalities in social groups.

Relevance:

10.00%

Publisher:

Abstract:

Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication is backup storage, where these approaches can reduce the storage requirements to a small fraction of the logical backup data size.

This thesis introduces several new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design that allows a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important but often overlooked data structure in data deduplication systems, the so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. Since file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, this compression enables significant savings.

A technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced next. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk lookup disk bottleneck of data deduplication systems, a bottleneck that limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches. Furthermore, it is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20% and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
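To make the fingerprinting idea concrete, the Python sketch below implements a minimal chunk store: incoming data is split into chunks, each chunk is identified by its SHA-1 fingerprint, only previously unseen chunks are stored, and every stream is represented by a recipe, i.e. its list of fingerprints. The fixed-size chunking, the class name, and the parameters are illustrative assumptions; production systems like those described in the thesis typically use content-defined chunking and keep the fingerprint index and the recipes on disk, which is exactly why recipe compression and locality-aware caching matter at scale.

import hashlib

class ChunkStore:
    """Minimal fingerprint-indexed chunk store (fixed-size chunking, in memory)."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}        # fingerprint -> chunk data
        self.logical_bytes = 0  # bytes written by clients
        self.stored_bytes = 0   # bytes actually kept after deduplication

    def write(self, data: bytes) -> list[str]:
        """Store a byte stream and return its recipe: a list of chunk fingerprints."""
        recipe = []
        for offset in range(0, len(data), self.chunk_size):
            chunk = data[offset:offset + self.chunk_size]
            fp = hashlib.sha1(chunk).hexdigest()
            if fp not in self.chunks:          # chunk lookup (index query)
                self.chunks[fp] = chunk
                self.stored_bytes += len(chunk)
            self.logical_bytes += len(chunk)
            recipe.append(fp)
        return recipe

    def read(self, recipe: list[str]) -> bytes:
        """Reassemble a byte stream from its recipe."""
        return b"".join(self.chunks[fp] for fp in recipe)


store = ChunkStore()
backup_1 = b"A" * 8192 + b"B" * 4096
backup_2 = b"A" * 8192 + b"C" * 4096   # mostly identical follow-up backup
r1, r2 = store.write(backup_1), store.write(backup_2)
assert store.read(r1) == backup_1 and store.read(r2) == backup_2
print(f"logical: {store.logical_bytes} B, stored: {store.stored_bytes} B")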

Relevance:

10.00%

Publisher:

Abstract:

Our growing understanding of the human mind and cognition, together with the development of neurotechnology, has triggered debate about cognitive enhancement in neuroethics. The dissertation examines the normative issues raised by memory enhancement and focuses on two questions: (1) how to distinguish memory treatment from memory enhancement; and (2) how the issue of authenticity bears on memory interventions, including memory treatments and enhancements.

The first part consists of a conceptual analysis of the notions required for the normative considerations. First, the representational nature and the function of memory are discussed. Memory is regarded as a special form of self-representation resulting from constructive processes. Next, the concepts of selfhood, personhood, and identity are examined, and a conceptual tool, the autobiographical self-model (ASM), is introduced. An ASM is a collection of mental representations of the system’s relations with its past and potential future states. Third, the debate between objectivist and constructivist views of health is considered. I argue for a phenomenological account of health, based on the primacy of illness and on negative utilitarianism.

The second part presents a synthesis of the relevant normative issues based on the conceptual tools developed. I argue that memory enhancement can be distinguished from memory treatment by a demarcation based on the existence of memory-related suffering. That is, memory enhancements are interventions that aim to manipulate memory function in the self-interest of the individual, under standard circumstances and without any unwanted suffering, or potential suffering, resulting from the alteration of memory function. I then consider the issue of authenticity, namely whether memory intervention or enhancement endangers “one’s true self”. By analyzing two conceptions of authenticity, authenticity as self-discovery and authenticity as self-creation, I propose that authenticity should be understood in terms of the satisfaction of the functional constraints of an ASM: synchronic coherence, diachronic coherence, and global veridicality. This framework provides clearer criteria for considering the relevant concerns and allows us to examine the moral value of authenticity.