962 results for 0804 Data Format


Relevance:

30.00%

Publisher:

Abstract:

Hominid evolution in the late Miocene has long been hypothesized to be linked to the retreat of the tropical rainforest in Africa. Uplift of Africa is often considered a cause of this climatic and vegetation change, but uplift of the Himalaya and the Tibetan Plateau has also been suggested to have affected rainfall distribution over Africa. Recent proxy data suggest that open grassland habitats were available in East Africa to the common ancestors of hominins and apes long before their divergence, and find no evidence for a closed rainforest in the late Miocene. We used the coupled global general circulation model CCSM3, including an interactively coupled dynamic vegetation module, to investigate the impact of topography on African hydro-climate and vegetation. We performed sensitivity experiments altering the elevations of the Himalaya and the Tibetan Plateau as well as of East and Southern Africa. The simulations confirm the dominant impact of African topography on the climate and vegetation development of the African tropics; only a weak influence of prescribed Asian uplift on African climate could be detected. The model simulations show that rainforest coverage of Central Africa is strongly determined by the presence of elevated African topography. In East Africa, despite wetter conditions with lowered African topography, conditions were not favorable enough to maintain a closed rainforest. A discussion of the results with respect to other model studies indicates a minor importance of vegetation-atmosphere or ocean-atmosphere feedbacks and a strong dependence of the simulated vegetation response on the land-surface/vegetation model.

In this study we present the first results of a new model development, ECHAM5-JSBACH-wiso, in which we have incorporated the stable water isotopes H218O and HDO as tracers in the hydrological cycle of the coupled atmosphere-land surface model ECHAM5-JSBACH. The ECHAM5-JSBACH-wiso model was run under present-day climate conditions at two different resolutions (T31L19, T63L31). A comparison between ECHAM5-JSBACH-wiso and ECHAM5-wiso shows that the coupling has a strong impact on the simulated temperature and soil wetness. Driven by these changes in temperature and the hydrological cycle, δ18O in precipitation also varies, from -4‰ up to +4‰. One of the strongest anomalies is found over northeast Asia, where δ18O in precipitation increases along with temperature. In order to analyze the sensitivity of the fractionation processes over land, we compare a set of simulations with various implementations of these processes over the land surface. The simulations allow us to distinguish between no fractionation, fractionation included in the evaporation flux (from bare soil), and fractionation included in both the evaporation and transpiration (water transport through plants) fluxes. While the δ18O of the soil water may change by up to +8‰, the simulated δ18O in precipitation shows only slight differences on the order of ±1‰. The simulated isotopic composition of precipitation agrees well with the available observations from the GNIP (Global Network of Isotopes in Precipitation) database.

The data set consists of maps of total velocity of surface currents in the Ibiza Channel, derived from HF radar measurements.

Topographic variation, the spatial variation in elevation and terrain features, underpins a myriad of patterns and processes in geography and ecology and is key to understanding the variation of life on the planet. The characterization of this variation is scale-dependent, i.e. it varies with the distance over which features are assessed and with the spatial grain (grid cell resolution) of the analysis. A fully standardized, global, multivariate product of different terrain features has the potential to support many large-scale basic research and analytical applications; to date, however, no such product has been available. Here we used the digital elevation model products of the global 250 m GMTED and the near-global 90 m SRTM to derive a suite of topographic variables: elevation, slope, aspect, eastness, northness, roughness, terrain ruggedness index, topographic position index, vector ruggedness measure, profile and tangential curvature, and 10 geomorphological landform classes. We aggregated each variable to 1, 5, 10, 50 and 100 km spatial grains using several aggregation approaches (median, average, minimum, maximum, standard deviation, percent cover, count, majority, Shannon index, entropy, uniformity). While a global cross-correlation underlines the high similarity of many variables, a more detailed view of four mountain regions reveals local differences, as well as scale variations in the aggregated variables at different spatial grains. All newly developed variables are available for download at http://www.earthenv.org and can serve as a basis for standardized hydrological, environmental and biodiversity modeling at a global extent.
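As an illustration of how such terrain variables can be derived from a gridded DEM, the sketch below computes slope (a Horn-style finite-difference estimate) and roughness (elevation range) in a 3x3 window. The tiny DEM and the 250 m cell size are purely illustrative and are not data from the product described above.

```python
import math

def slope_deg(dem, i, j, cell=250.0):
    # Horn-style slope from a 3x3 window, in degrees.
    # dem: 2-D list of elevations (m); cell: grid spacing (m).
    z = [[dem[i + di][j + dj] for dj in (-1, 0, 1)] for di in (-1, 0, 1)]
    dzdx = ((z[0][2] + 2*z[1][2] + z[2][2]) - (z[0][0] + 2*z[1][0] + z[2][0])) / (8*cell)
    dzdy = ((z[2][0] + 2*z[2][1] + z[2][2]) - (z[0][0] + 2*z[0][1] + z[0][2])) / (8*cell)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

def roughness(dem, i, j):
    # Roughness: elevation range (max - min) in the 3x3 window.
    win = [dem[i + di][j + dj] for di in (-1, 0, 1) for dj in (-1, 0, 1)]
    return max(win) - min(win)

dem = [[100, 110, 120],
       [100, 115, 130],
       [100, 120, 140]]
print(round(slope_deg(dem, 1, 1), 2))  # slope of the centre cell
print(roughness(dem, 1, 1))            # elevation range: 40
```

Aggregating such a per-cell variable to a coarser spatial grain then amounts to applying a statistic (median, average, maximum, etc.) over all fine-grain cells inside each coarse cell.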

This research paper presents work on feature recognition, tool path data generation and integration with STEP-NC (AP-238 format) for features having Free form / Irregular Contoured Surface(s) (FICS). Initially, the FICS features are modelled or imported in the UG CAD package and a closeness index is generated. This is done by comparing the FICS features with basic B-Spline / Bezier curves and surfaces. Blending functions are then calculated by adopting the convolution theorem. Based on the blending functions, contour offset tool paths are generated and simulated for a 5-axis milling environment. Finally, the tool path (CL) data is integrated with the STEP-NC (AP-238) format. The tool path algorithm and STEP-NC data are tested on various industrial parts through an automated UFUNC plugin.

This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. This study outlines the application of compositional data analysis (CoDa) to the calibration of geochemical data and to multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences, both terrestrial and marine, has required the use of core-scanning X-ray fluorescence (XRF) spectrometry. Initial XRF data are generally unusable in raw format, requiring data processing to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as by matrix effects. Log-ratio based calibration schemes have been developed and applied to clastic sedimentary sequences, focusing mainly on energy-dispersive XRF (ED-XRF) core-scanning. This study applied high-resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a subset of core-scan and conventional ED-XRF data to quantify elemental composition, providing a robust calibration scheme using reduced major axis regression of log-ratio transformed geochemical data. Through partial least squares (PLS) modelling of geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
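To illustrate the log-ratio idea that CoDa methods build on, here is a minimal sketch of the centred log-ratio (clr) transform, which maps compositional (closed) data into real space where standard multivariate statistics apply. The three-part composition is invented for illustration and is not data from this study, nor the study's LRCE calibration itself.

```python
import math

def clr(parts):
    # Centred log-ratio transform: log of each part over the
    # geometric mean, mapping the composition into real space.
    g = math.exp(sum(math.log(p) for p in parts) / len(parts))
    return [math.log(p / g) for p in parts]

sample = [60.0, 25.0, 15.0]  # illustrative 3-part composition (e.g. wt%)
z = clr(sample)
print([round(v, 3) for v in z])
print(round(sum(z), 10))  # clr scores always sum to zero
```

Regression on such log-ratio scores (rather than on raw percentages) avoids the spurious correlations induced by the constant-sum constraint of compositional data.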

Tide gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water level variation from tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the full 20th century, so the Belfast (UK) Harbour tide gauge would constitute a strategic long-term (110-year) record if the full paper-based records (marigrams) were digitally restructured to allow consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide-gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly relative to manual x–y digitisation of the signal. Restructuring the variable-length tidal data sets into consistent daily, monthly and annual file formats was undertaken with project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling at both 60 min (the past standard) and 10 min intervals, the latter enhancing surge measurement. The Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
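The restructuring of sub-hourly records into a consistent sampling format can be sketched as follows (this is a minimal illustration, not the project's Merge&Convert/MergeHYD code): 10-min water-level samples are binned into hourly means, the "past standard" interval mentioned above. The timestamps and levels are invented.

```python
from datetime import datetime, timedelta
from statistics import mean

def hourly_means(samples):
    # Aggregate (timestamp, level) pairs to hourly mean water levels.
    buckets = {}
    for t, level in samples:
        hour = t.replace(minute=0, second=0, microsecond=0)
        buckets.setdefault(hour, []).append(level)
    return {h: mean(v) for h, v in sorted(buckets.items())}

start = datetime(1901, 11, 1)
ten_min = [(start + timedelta(minutes=10 * k), 2.0 + 0.01 * k)
           for k in range(12)]  # two hours of invented 10-min levels
for hour, level in hourly_means(ten_min).items():
    print(hour.isoformat(), round(level, 3))
```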

Transient simulations are widely used in studying past climate, as they allow direct comparison with existing proxy data. However, multi-millennial transient simulations using coupled climate models are usually computationally very expensive, and as a result several acceleration techniques are employed when using numerical simulations to recreate past climate. In this study, we compare the results from transient simulations of the present and the last interglacial, with and without acceleration of the orbital forcing, using the comprehensive coupled climate model CCSM3 (Community Climate System Model 3). Our study shows that in low-latitude regions, the simulation of long-term variations in interglacial surface climate is not significantly affected by the use of the acceleration technique (with an acceleration factor of 10) and hence large-scale model-data comparison of surface variables is not hampered. However, in high-latitude regions where the surface climate has a direct connection to the deep ocean, e.g. in the Southern Ocean or the Nordic Seas, acceleration-induced biases in the sea-surface temperature evolution may occur, with potential influence on the dynamics of the overlying atmosphere. The data provided here are decadal mean values from both the accelerated and the non-accelerated runs.

This document specifies the NetCDF file format of EGO gliders that is used to distribute glider data, metadata and technical data. It documents the standards used therein, including naming conventions and metadata content. It was initiated in October 2012, based on the OceanSITES, Argo and ANFOG user manuals. Everyone's Gliding Observatories (EGO) is dedicated to the promotion of glider technology and its applications. The EGO group promotes glider applications through coordination, training, liaison between providers and users, advocacy, and provision of expert advice. We intend to foster oceanographic experiments and the operational monitoring of the oceans with gliders through scientific and international collaboration. We provide news, support, information about glider projects and glider data management, as well as resources related to gliders. All EGO data are publicly available. More information about the project is available at http://www.ego-network.org
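As a rough illustration of what such a NetCDF time-series file might look like in CDL notation, the sketch below uses variable names (TIME, LATITUDE, LONGITUDE, PRES, TEMP, PSAL) assumed by analogy with the Argo and OceanSITES conventions the manual builds on; the authoritative names, dimensions and attributes are those defined by the EGO format document itself.

```
netcdf ego_glider_sketch {
dimensions:
        TIME = UNLIMITED ;
variables:
        double TIME(TIME) ;
                TIME:units = "days since 1950-01-01T00:00:00Z" ;
        double LATITUDE(TIME) ;
        double LONGITUDE(TIME) ;
        float PRES(TIME) ;   // pressure (dbar)
        float TEMP(TIME) ;   // in situ temperature
        float PSAL(TIME) ;   // practical salinity

// global attributes (illustrative):
        :data_type = "EGO glider time-series data" ;
}
```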

Background: Understanding transcriptional regulation through genome-wide microarray studies can help unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. Existing software systems for microarray data analysis implement these standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support, and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays and for the most common synthesized oligo arrays, such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach to automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web services is advantageous in a distributed client-server environment, as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extension towards future transcriptomics methods based on high-throughput sequencing approaches, which have much higher computational requirements than microarrays.

A compiled set of in situ data is important for evaluating the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives accessed via open internet services and from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products, available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
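A minimal sketch of the "averaging of observations close in time and space" step might look as follows; the binning tolerances, tuple layout and sample values are assumptions for illustration, not the actual OC-CCI merging procedure.

```python
from statistics import mean

def average_close(obs, dlat=0.01, dlon=0.01, dhours=1.0):
    # Average observations falling in the same space-time bin.
    # obs: (lat, lon, hours_since_epoch, value) tuples.
    bins = {}
    for lat, lon, t, value in obs:
        key = (round(lat / dlat), round(lon / dlon), round(t / dhours))
        bins.setdefault(key, []).append(value)
    return [mean(v) for v in bins.values()]

obs = [(45.001, -30.002, 10.2, 0.40),   # these two fall in the
       (45.004, -30.001, 10.4, 0.44),   # same bin -> averaged
       (48.000, -25.000, 50.0, 1.10)]   # distinct -> kept as-is
print(average_close(obs))  # two values: ~0.42 and 1.10
```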

At national and European levels, various projects develop data products to provide end-users and stakeholders with homogeneously qualified observation compilations or analyses. Ifremer has developed a spatial data infrastructure for the marine environment, called Sextant, to manage, share and retrieve these products for its partners and the general public. Thanks to OGC and ISO standards and INSPIRE compliance, the infrastructure provides a unique framework to federate homogeneous descriptions of, and access to, marine data products processed in various contexts, at national level or at European level for DG Research (SeaDataNet), DG Mare (EMODnet) and DG Growth (Copernicus MEMS). The discovery service of Sextant is based on its metadata catalogue. The data description is normalized according to the ISO 191XX series standards and INSPIRE recommendations. Access to the catalogue is provided by the standard OGC Catalogue Service for the Web (CSW 2.0.2). Data visualization and downloading are available through the standard OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces. Several OGC services are provided within Sextant, according to marine themes, regions and projects. Depending on the file format, WMTS services are used for large images, such as hyperspectral images, and NcWMS services for gridded data, such as climatology models. New functions are being developed to improve visualization, analysis and access to data, e.g. data filtering, online spatial processing with WPS services, and access to sensor data with SOS services.
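As an example of how a client assembles a standard OGC request against such services, the sketch below builds a WMS 1.3.0 GetMap URL. The endpoint and layer name are placeholders, not a real Sextant service; the parameter names follow the WMS specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(800, 600), crs="EPSG:4326"):
    # Build an OGC WMS 1.3.0 GetMap URL; parameter names follow the spec.
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",                 # mandatory; empty = default style
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "bathymetry",
                     (40.0, -10.0, 52.0, 5.0))
print(url)
```

Analogous key-value requests (GetCapabilities, GetFeature, GetRecords) drive the other OGC services mentioned above.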