943 results for Vienna. Bürgerliches Zeughaus.
Abstract:
Seventy-nine interstitial water samples from six sites (Ocean Drilling Program Sites 1119-1124) in the southwestern Pacific Ocean have been analyzed for the stable isotope composition of dissolved sulfate (δ34S), along with major and minor ions. Sulfate from the interstitial fluids (δ34S values between +20.7 and +57.5 per mil vs. the Vienna-Canyon Diablo Troilite standard) was enriched in 34S with respect to modern seawater (δ34S +20.6 per mil), indicating that differing amounts of microbial sulfate reduction took place at all investigated sites, with the intensity depending on the availability of organic matter, which is controlled by paleosedimentation conditions (e.g., sedimentation rate and presence of turbidites). In addition, total reduced inorganic sulfur (essentially pyrite), a product of microbial sulfate reduction, was quantified in selected sediments from Site 1119.
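The δ34S values above follow standard delta notation: the deviation, in per mil, of a sample's 34S/32S ratio from that of the VCDT reference. A minimal sketch of the conversion (the VCDT reference ratio below is the commonly cited literature value, an assumption rather than part of this dataset):

```python
# Hedged sketch: converting a raw 34S/32S isotope ratio to delta notation
# relative to VCDT. R_VCDT is the commonly cited reference ratio,
# included here as an assumption for illustration.

R_VCDT = 0.0441626  # 34S/32S of the Vienna-Canyon Diablo Troilite standard

def delta34S(r_sample: float) -> float:
    """Return delta-34S in per mil relative to VCDT."""
    return (r_sample / R_VCDT - 1.0) * 1000.0

# A hypothetical measured ratio enriched by 5.75% over the standard:
r = R_VCDT * 1.0575
print(round(delta34S(r), 1))  # 57.5, the maximum value reported here
```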
Abstract:
Continental climate evolution of Central Europe has been reconstructed quantitatively for the last 45 million years, providing inferred data on mean annual temperature and precipitation, and on winter and summer temperatures. Although some regional effects occur, the European Cenozoic continental climate record correlates well with the global oxygen isotope record from marine environments. During the last 45 million years, continental cooling is especially pronounced for inferred winter temperatures but hardly observable in summer temperatures. Correspondingly, Cenozoic cooling in Central Europe is directly associated with an increase in seasonality. In contrast, inferred Cenozoic mean annual precipitation remained relatively stable, indicating the importance of latent heat transport throughout the Cenozoic. Moreover, our data support the concept that changes in atmospheric CO2 concentrations, although linked to climate changes, were not the major driving force of Cenozoic cooling.
Abstract:
This dataset provides scaling information applicable to satellite-derived coarse-resolution surface soil moisture datasets, following the approach of Wagner et al. (2008). It is based on ENVISAT ASAR data and can be used to apply the Metop ASCAT dataset (25 km) to local studies, as well as to assess the representativeness of in-situ measurement sites and thus their potential for upscaling. The approach, based on temporal stability (Wagner et al. 2008), consists of assessing the validity of the coarse-resolution datasets at medium resolution (1 km; the resulting product is the so-called 'scaling layer').
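The temporal-stability idea can be sketched as follows: a 1-km pixel is considered representative of the coarse product when its soil-moisture time series correlates strongly with the coarse-resolution series over the same period. The toy data, array shapes, and the correlation-only criterion below are illustrative assumptions, not the actual processing chain behind this dataset:

```python
# Hedged sketch of the temporal-stability assessment (after Wagner et al.
# 2008): correlate each 1-km pixel time series with the overlying
# coarse-resolution time series. Data and shapes are invented.
import numpy as np

def representativeness(fine_stack: np.ndarray, coarse_series: np.ndarray) -> np.ndarray:
    """Pearson correlation of each 1-km pixel series (t, y, x)
    with the coarse-resolution series (t,)."""
    t, y, x = fine_stack.shape
    fa = fine_stack.reshape(t, -1)
    fa = fa - fa.mean(axis=0)
    ca = coarse_series - coarse_series.mean()
    r = (fa * ca[:, None]).sum(0) / (
        np.sqrt((fa ** 2).sum(0)) * np.sqrt((ca ** 2).sum()))
    return r.reshape(y, x)  # analogue of the 'scaling layer'

rng = np.random.default_rng(0)
coarse = rng.random(50)                                  # 50 time steps
fine = coarse[:, None, None] + 0.1 * rng.standard_normal((50, 4, 4))
layer = representativeness(fine, coarse)
print(layer.shape)  # (4, 4)
```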
Abstract:
This dataset contains the results of granulometric and bulk geochemical analyses of Van Veen surface samples obtained by the Alfred Wegener Institute (AWI) in the course of the 2012 and 2013 summer field seasons. The sampling was performed along transects at depths generally <13 m, to distances of less than about 5 km off Herschel Island. In 2012, 75 samples were obtained in Pauline Cove and in the vicinity of Simpson Point. Sample collection was expanded in 2013, on transects established the previous year, with additional locations in Tetris Bay and Workboat Passage. Samples consisted of approximately 100 g of the top 3-6 cm of sediment, and were frozen in the field and freeze-dried at the AWI before undergoing analytical procedures. Sample locations were recorded with the onboard global positioning system (GPS) unit. Grain size distributions in our study were obtained using laser diffractometry at the AWI (Beckman Coulter LS200) on the <1 mm fraction of samples oxidized with 30% H2O2 until effervescence ceased, to remove organics. Some samples were also sieved using a sieve stack with 1 phi intervals. GRADISTAT (Blott and Pye, 2001) was used to calculate graphical grain size statistics (Folk and Ward, 1957). Grain diameters were logarithmically transformed to phi values, calculated as phi = -log2(d), where d is the grain diameter in millimeters (Blott and Pye, 2001; Krumbein, 1934). Freeze-dried samples were ground and analyzed using an Elementar Vario EL III carbon-nitrogen-sulphur analyzer at the AWI to measure total carbon (TC) and total nitrogen (TN). Tungsten oxide was added to the samples as a catalyst for the pyrolysis. Following this analysis, total organic carbon (TOC) was determined using an Elementar VarioMax. Stable carbon isotope ratios (13C/12C) of 118 samples were determined on a DELTAplusXL mass spectrometer (ThermoFisher Scientific, Bremen) at the German Research Centre for Geosciences (GFZ) in Potsdam, Germany.
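The phi transform used for the grain size statistics is a one-line computation; a minimal sketch with two illustrative diameters (the example values are standard Wentworth-scale boundaries, not samples from this dataset):

```python
import math

# Minimal sketch of the logarithmic grain-size transform described above:
# phi = -log2(d), with d the grain diameter in millimeters (Krumbein, 1934).

def phi(d_mm: float) -> float:
    return -math.log2(d_mm)

print(phi(1.0))     # 0.0 (1 mm: coarse/very coarse sand boundary)
print(phi(0.0625))  # 4.0 (62.5 um: sand/silt boundary)
```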
An additional analysis of 69 samples was carried out at the University of Hamburg with an isotope ratio mass spectrometer (Delta V, Thermo Scientific, Germany) coupled to an elemental analyzer (Flash 2000, Thermo Scientific, Germany). Prior to analysis, soil samples were treated with phosphoric acid (43%) to remove inorganic carbon. Values are expressed relative to Vienna Pee Dee Belemnite (VPDB), using the external standards USGS40 (-26.4 per mil VPDB) and IVA soil 33802153 (-27.5 per mil VPDB).
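Expressing values on the VPDB scale via external standards is commonly done with a two-point linear normalization through the standards' accepted values. A hedged sketch, using the accepted values named above; the raw instrument readings are invented for illustration:

```python
# Hedged sketch of two-point normalization to the VPDB scale using the
# external standards named in the text (USGS40 at -26.4 per mil and
# IVA soil 33802153 at -27.5 per mil). Raw readings are hypothetical.

ACCEPTED = {"USGS40": -26.4, "IVA_33802153": -27.5}
measured = {"USGS40": -26.1, "IVA_33802153": -27.3}  # invented raw values

# Fit delta_true = m * delta_raw + b through the two standards.
(x1, y1), (x2, y2) = ((measured[k], ACCEPTED[k]) for k in ACCEPTED)
m = (y2 - y1) / (x2 - x1)
b = y1 - m * x1

def to_vpdb(delta_raw: float) -> float:
    return m * delta_raw + b

print(round(to_vpdb(-26.1), 1))  # -26.4 (a standard maps back to itself)
```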
Abstract:
The aim is to study the propagation of uncertainty from basic data, across different scales and physics phenomena, through complex coupled multi-physics and multi-scale simulations.
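One generic way to propagate basic-data uncertainty through a coupled simulation chain is Monte Carlo sampling: draw the inputs from their uncertainty distributions, run the model chain for each draw, and examine the spread of the output. The toy two-stage "model" below stands in for the real multi-physics simulation and is purely illustrative:

```python
# Hedged Monte-Carlo sketch of uncertainty propagation through a coupled
# model chain. The physics functions and the 5% input uncertainty are
# invented placeholders for the real multi-physics/multi-scale codes.
import random
import statistics

def physics_a(x):        # first scale/phenomenon
    return 2.0 * x + 1.0

def physics_b(y):        # coupled downstream phenomenon
    return y ** 2

random.seed(42)
outputs = []
for _ in range(10_000):
    x = random.gauss(mu=1.0, sigma=0.05)   # basic datum with 5% uncertainty
    outputs.append(physics_b(physics_a(x)))

print(statistics.mean(outputs))   # propagated mean, close to 9
print(statistics.stdev(outputs))  # propagated spread
```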
Abstract:
This work aims to present the main differences in nuclear data uncertainties among three nuclear data libraries (EAF-2007, EAF-2010, and SCALE-6.0) under different neutron spectra: LWR, ADS, and DEMO (fusion).
Abstract:
Abstraction-Carrying Code (ACC) is a framework for mobile code safety in which the code supplier provides a program together with an abstraction (or abstract model of the program) whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of a safety certificate, and its generation is carried out automatically by a fixed-point analyzer. The advantage of providing a (fixed-point) abstraction to the code consumer is that its validity is checked in a single pass (i.e., one iteration) of an abstract interpretation-based checker. A main challenge in making ACC useful in practice is to reduce the size of certificates as much as possible while not increasing checking time. Intuitively, we only include in the certificate the information which the checker is unable to reproduce without iterating. We introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and reconstruct) the full certificate in a single pass. Based on this notion, we show how to instrument a generic analysis algorithm with the necessary extensions in order to identify the information relevant to the checker.
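The asymmetry between the supplier's fixed-point analyzer and the consumer's single-pass checker can be illustrated on a toy abstract domain. Everything below (the sign-like lattice, the transfer function) is an invented minimal example of the general idea, not the paper's actual algorithm or domain:

```python
# Hedged sketch of the core ACC idea on a toy lattice bot <= pos <= top.
# Supplier: iterate the transfer function to a fixed point (the
# certificate). Consumer: apply the transfer function ONCE and accept if
# the supplied abstraction is stable (post-fixed point) and covers entry.

ORDER = {"bot": 0, "pos": 1, "top": 2}

def transfer(a: str) -> str:
    """Abstract effect of one loop iteration: x = x + 1 keeps x positive."""
    return "pos" if a in ("bot", "pos") else "top"

def analyze(start: str) -> str:
    a = start
    while True:                      # supplier: iterate to a fixed point
        nxt = transfer(a)
        if nxt == a:
            return a                 # this is the certificate
        a = nxt

def check(cert: str, start: str) -> bool:
    # consumer: single pass, no iteration
    return (ORDER[transfer(cert)] <= ORDER[cert]
            and ORDER[start] <= ORDER[cert])

cert = analyze("pos")
print(cert, check(cert, "pos"))  # pos True
```

The checker never loops: it only verifies that one application of the transfer function does not escape the supplied abstraction, which is what makes single-pass validation possible.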
Abstract:
We study a cognitive radio scenario in which the network of secondary users wishes to identify which primary user, if any, is transmitting. To achieve this, the nodes rely on some form of location information. In our previous work we proposed two fully distributed algorithms for this task, with and without a pre-detection step, using propagation parameters as the only source of location information. In a real distributed deployment, each node must estimate its own position and/or propagation parameters. Hence, in this work we study the effect of uncertainty, or error, in these estimates on the proposed distributed identification algorithms. We show that the pre-detection step significantly increases robustness against uncertainty in the nodes' locations.
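The sensitivity being studied can be illustrated with a toy centralized version of the identification problem: pick the candidate primary user whose predicted received powers, under a log-distance path-loss model, best match the observations, then perturb the nodes' assumed positions. All geometry and parameters below are invented; the paper's actual algorithms are distributed and are not reproduced here:

```python
# Hedged toy illustration: identify the transmitting primary user from
# received powers under a log-distance path-loss model, then check whether
# the decision survives small errors in the nodes' assumed positions.
import math
import random

def rss(tx, node, p0=30.0, n=3.0):
    """Received power (dB) under a simple log-distance path-loss model."""
    d = math.hypot(tx[0] - node[0], tx[1] - node[1])
    return p0 - 10 * n * math.log10(max(d, 1e-3))

primaries = {"PU1": (0.0, 0.0), "PU2": (100.0, 0.0)}   # candidate sources
nodes = [(20.0, 10.0), (40.0, -15.0), (60.0, 5.0)]     # secondary users
obs = [rss(primaries["PU1"], nd) for nd in nodes]      # PU1 transmits

def identify(assumed_nodes):
    def err(pu):
        return sum((o - rss(primaries[pu], nd)) ** 2
                   for o, nd in zip(obs, assumed_nodes))
    return min(primaries, key=err)

random.seed(1)
noisy = [(x + random.gauss(0, 2), y + random.gauss(0, 2)) for x, y in nodes]
print(identify(nodes), identify(noisy))  # PU1 PU1: robust to small errors
```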
Abstract:
Reducing energy consumption is one of the main challenges in most countries. For example, European Member States agreed to reduce greenhouse gas (GHG) emissions by 20% in 2020 compared to 1990 levels (EC 2008). Considering each sector separately, ICTs nowadays account for 2% of total carbon emissions. This percentage will increase as the demand for communication services and applications steps up. At the same time, the expected evolution of ICT-based developments - smart buildings, smart grids and smart transportation systems among others - could result in the creation of energy-saving opportunities leading to global emission reductions (Labouze et al. 2008), although the amount of these savings is under debate (Falch 2010). The main development required in telecommunication networks - one of the three major blocks of energy consumption in ICTs, together with data centers and consumer equipment (Sutherland 2009) - is the evolution of existing infrastructures into ultra-broadband networks, the so-called Next Generation Networks (NGN). Fourth generation (4G) mobile communications are the technology of choice to complete - or supplement - the ubiquitous deployment of NGN. The risks and opportunities involved in NGN roll-out are currently at the forefront of the economic and policy debate. However, the question of the role of energy consumption in 4G networks seems absent, despite the fact that the economic impact of energy consumption arises as a key element in the cost analysis of this type of network. Precisely, the aim of this research is to provide deeper insight into the energy consumption involved in the usage of a 4G network, its relationship with the network's main design features, and the general economic impact this would have on the capital and operational expenditures related to network deployment and usage.