889 results for Open Data, Dati Aperti, Open Government Data
Abstract:
Wind-generated waves in the Kara, Laptev, and East Siberian Seas are investigated using altimeter data from Envisat RA-2 and SARAL-AltiKa. Only isolated ice-free zones were selected for analysis, so that wind seas can be treated as pure wind-generated waves without any contamination by ambient swell. Such zones were identified using ice-concentration data from microwave radiometers. Altimeter measurements of both significant wave height (SWH) and wind speed for these areas were then obtained for the period 2002-2012 from Envisat RA-2 and for 2013 from SARAL-AltiKa. Dependencies of dimensionless SWH and wavelength on the dimensionless spatial scale of wave generation are compared with known empirical dependencies for fetch-limited wind-wave development. We further check the sensitivity of the Ka and Ku bands and discuss the new possibilities that AltiKa's higher resolution can open.
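The fetch-limited framework referred to above relates dimensionless SWH to dimensionless fetch through empirical power laws. A minimal sketch in Python, assuming illustrative JONSWAP-type coefficients (a = 1.6e-3, exponent 0.5), not the values fitted in the study:

```python
def dimensionless_growth(u10, fetch_m, g=9.81, a=1.6e-3, p=0.5):
    """Fetch-limited wind-wave growth of JONSWAP type.

    u10     : 10-m wind speed (m/s), e.g. from the altimeter
    fetch_m : wave-generation spatial scale (m)
    Returns (dimensionless fetch, dimensionless SWH, dimensional SWH in m).
    Coefficients a and p are illustrative, not the study's fitted values.
    """
    x_nd = g * fetch_m / u10 ** 2    # dimensionless fetch  X~ = gX/U^2
    h_nd = a * x_nd ** p             # dimensionless SWH    H~ = a * X~^p
    swh = h_nd * u10 ** 2 / g        # back to metres
    return x_nd, h_nd, swh
```

For a 10 m/s wind over a 100 km ice-free zone this gives an SWH of roughly 1.6 m, illustrating how ice-free-zone size limits wave growth.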
Abstract:
Due to the growth of design size and complexity, design verification is an important aspect of the logic-circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification; this is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques, in which models replicate the behaviour of the actual system. In this thesis, a software/data-structure architecture without explicit locks is proposed to accelerate logic-gate circuit simulation. We call this system ZSIM. The ZSIM software-architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aims of these experiments are to:
• Verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (section 5.3.1).
• Verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (section 5.3.2).
• Show that a simulator using this approach outperforms an existing commercial simulator on a standard workstation (section 5.3.3).
• Show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (section 5.3.5).
To evaluate ZSIM, two types of test circuits were used:
1. Circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators.
2. Circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than those for which open-source files could be obtained.
The experimental results show that with SIMD acceleration and multicore parallelism combined, ZSIM achieved a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, this software-architecture simulator running on a SIMD machine was shown to be much faster than, and able to handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance of ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. The ZSIM simulator running on a Xeon Phi machine was shown to give simulation performance comparable to the IBM Blue Gene supercomputer at very much lower cost. The experimental results also show that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, its automatic cache management handles the on-chip local store without any explicit treatment of the local store in the simulator's own architecture; when targeting GPUs, by contrast, explicit cache management in the program increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron. To conclude, the two main achievements are restated as follows: the primary achievement of this work was demonstrating that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms.
The secondary achievement was the development of a synthetic testing suite that went beyond the scale range previously publicly available, based on prior work showing that the synthesis technique is valid.
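The abstract does not expose ZSIM's actual data layout, but the general idea behind lock-free, data-parallel gate simulation can be sketched: pack many independent test vectors into one machine word so that a single bitwise operation evaluates a gate for all of them at once. A minimal illustrative sketch (names and netlist format are assumptions, not ZSIM's real interface):

```python
# Bit-parallel two-input gate evaluation: each 64-bit word packs 64
# independent test vectors, so one bitwise op simulates 64 runs at once.
# Illustrates the general data-parallel idea, not ZSIM's actual layout.

MASK = (1 << 64) - 1  # keep Python ints clamped to 64 bits

GATE_OPS = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "XOR":  lambda a, b: a ^ b,
    "NAND": lambda a, b: ~(a & b) & MASK,
}

def simulate(netlist, values):
    """netlist: list of (gate_type, out, in1, in2) in topological order.
    values: dict signal -> 64-bit packed word. Mutates and returns values."""
    for gate, out, a, b in netlist:
        values[out] = GATE_OPS[gate](values[a], values[b])
    return values
```

Because each gate evaluation touches only its own output word, gates at the same topological level can be evaluated by different cores without locks, which is the property the ZSIM architecture exploits.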
Abstract:
Open access to publicly funded scholarly publications is, under the banner of "openness", part of an increasingly significant global development with structural consequences for science, research, and education. How, with what reach, and with what acceptance the open-access paradigm materialises in practice depends strongly on the disciplinary cultures and the constellations of economic interests involved. The present work pursues this question using the example of the inter- and pluridisciplinary field of educational science/educational research. First, the disciplinary and sociocultural conditions of publishing in the field, the market constellations of the publishing industry, and the information-infrastructure conditions of the discipline are analysed and combined into a differentiated overall picture. Then, drawing on an online survey of the educational science/educational research community, further insights are obtained into existing open-access experience in the field and into obstacles to, and requirements for, the new publication model from the perspective of the researchers themselves, as well as, exploratively, from the perspective of students and educational practitioners. Key factors in assessing the potential and effects of open access in the field are academic status and function, interdisciplinarity and disciplinary provenance, and the relationship between educational practice and the academic sector. (DIPF/Orig.)
Abstract:
Observing system experiments (OSEs) are carried out over a 1-year period to quantify the impact of Argo observations on the Mercator Ocean 0.25° global ocean analysis and forecasting system. The reference simulation assimilates sea surface temperature (SST), SSALTO/DUACS (Segment Sol multi-missions dALTimetrie, d'orbitographie et de localisation précise/Data unification and Altimeter combination system) altimeter data, and Argo and other in situ observations from the Coriolis data center. Two other simulations are carried out in which all Argo data and half of the Argo data, respectively, are withheld. Assimilating Argo observations has a significant impact on analyzed and forecast temperature and salinity fields at different depths. Without Argo data assimilation, large errors occur in analyzed fields, as estimated from the differences with respect to in situ observations. For example, in the 0–300 m layer, RMS (root mean square) differences between analyzed fields and observations reach 0.25 psu and 1.25 °C in the western boundary currents and 0.1 psu and 0.75 °C in the open ocean. The impact of the Argo data in reducing observation–model forecast differences is also significant from the surface down to a depth of 2000 m: differences between in situ observations and forecast fields are reduced by 20 % in the upper layers and by up to 40 % at a depth of 2000 m when Argo data are assimilated. At depth, the most affected regions in the global ocean are the Mediterranean outflow, the Gulf Stream region, and the Labrador Sea. A significant degradation is observed when only half of the data are assimilated. Argo observations therefore matter for constraining the model solution, even in an eddy-permitting model configuration. The impact of assimilating the Argo floats' data on other model variables is briefly assessed: the improvement of the fit to Argo profiles does not lead globally to unphysical corrections to sea surface temperature and sea surface height.
The main conclusion is that the performance of the Mercator Ocean 0.25° global data assimilation system is heavily dependent on the availability of Argo data.
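The misfit statistics quoted above come down to two standard computations: an RMS difference between collocated model and observed values, and the percentage by which assimilation reduces it. A minimal sketch (function names are illustrative):

```python
import math

def rms_difference(model, obs):
    """RMS misfit between collocated model and observed values."""
    assert len(model) == len(obs)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def percent_reduction(rms_without, rms_with):
    """Error reduction (%) attributable to assimilating an observation set."""
    return 100.0 * (rms_without - rms_with) / rms_without
```

With these definitions, the reported "up to 40 % at 2000 m" corresponds to the forecast RMS misfit with Argo assimilated being as little as 0.6 times the misfit without it.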
Abstract:
Background: Understanding transcriptional regulation by genome-wide microarray studies can contribute to unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. The existing software systems for microarray data analysis implement the mentioned standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays, and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web-services is advantageous in a distributed client-server environment as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches which have much higher computational requirements than microarrays.
Abstract:
The European Multidisciplinary Seafloor and water-column Observatory (EMSO) European Research Infrastructure Consortium (ERIC) provides power, communications, sensors, and data infrastructure for continuous, high-resolution, (near-)real-time, interactive ocean observations across a multidisciplinary and interdisciplinary range of research areas including biology, geology, chemistry, physics, engineering, and computer science, from polar to subtropical environments, through the water column down to the abyss. Eleven deep-sea and four shallow nodes span from the Arctic through the Atlantic and Mediterranean to the Black Sea. Coordination among the consortium nodes is being strengthened through the EMSOdev project (H2020), which will produce the EMSO Generic Instrument Module (EGIM). Early installations are now being upgraded, for example, at the Ligurian, Ionian, Azores, and Porcupine Abyssal Plain (PAP) nodes. Significant findings have accumulated over the years; for example, high-frequency surface and subsurface water-column measurements at the PAP node show an increase in seawater pCO2 (from 339 μatm in 2003 to 353 μatm in 2011) with little variability in the mean air–sea CO2 flux. In the Central Eastern Atlantic, the Oceanic Platform of the Canary Islands open-ocean node (also known as the ESTOC station) has a long-standing time series on water-column physical, biogeochemical, and acidification processes that has contributed to the assessment efforts of the Intergovernmental Panel on Climate Change (IPCC). EMSO not only brings together countries and disciplines but also allows the pooling of resources and coordination to assemble harmonized data into a comprehensive regional ocean picture, which will then be made available to researchers and stakeholders worldwide on an open and interoperable access basis.
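The quoted PAP pCO2 endpoints imply a mean rise of (353 − 339)/(2011 − 2003) = 1.75 μatm per year. A least-squares slope gives the same value for two points and generalises to the full high-frequency series; a minimal sketch (the function name is illustrative):

```python
def trend_per_year(years, values):
    """Ordinary least-squares slope of `values` against `years`
    (units of `values` per year)."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# With only the two quoted endpoints this reduces to the mean rate:
# (353 - 339) / (2011 - 2003) = 1.75 uatm per year.
```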
Abstract:
A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi: 10.1594/PANGAEA.854832 (Valente et al., 2015).
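The merging step described above (averaging observations close in time and space, without otherwise altering the original data) can be sketched as a simple binning pass. The bin sizes below are illustrative assumptions, not the OC-CCI protocol's actual thresholds:

```python
from collections import defaultdict

def merge_close_observations(records, time_tol_h=1.0, space_tol_deg=0.05):
    """Average observations that fall in the same time/space bin.

    records: iterable of (time_h, lat_deg, lon_deg, value).
    Tolerances are illustrative, not the OC-CCI protocol's thresholds.
    Returns one averaged value per occupied bin.
    """
    bins = defaultdict(list)
    for t, lat, lon, v in records:
        key = (round(t / time_tol_h), round(lat / space_tol_deg),
               round(lon / space_tol_deg))
        bins[key].append(v)
    return [sum(vs) / len(vs) for vs in bins.values()]
```

In a real matchup pipeline each bin would also carry its metadata (source, cruise, principal investigator) through to the merged table, as the compilation described above does.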
Abstract:
Master's dissertation — Universidade de Brasília, Faculdade de Ciências Médicas, Programa de Pós-Graduação em Ciências Médicas, 2012.
Abstract:
66 p.
Abstract:
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License which permits unrestricted non-commercial use, distribution, and reproduction in any medium provided the original work is properly cited.
Abstract:
Coastal lagoons represent habitats with widely heterogeneous environmental conditions, particularly as regards salinity and temperature, which fluctuate in both space and time. These characteristics suggest that physical and ecological factors could contribute to genetic divergence between populations occurring in coastal-lagoon and open-coast environments. This study investigates the genetic structure of Holothuria polii at a micro-geographic scale across the Mar Menor coastal lagoon and nearby marine areas, estimating mitochondrial DNA variation in two gene fragments, cytochrome oxidase I (COI) and 16S rRNA (16S). The dataset of mitochondrial sequences was also used to test the influence of environmental differences between coastal-lagoon and marine waters on population genetic structure. All sampled locations exhibited high levels of haplotype diversity and low values of nucleotide diversity. The two genes showed contrasting signals of genetic differentiation (non-significant differences using COI and slight differences using 16S), which could be due to different mutation rates or to differing numbers of exclusive haplotypes. We detected an excess of recent mutations and exclusive haplotypes, which can be generated as a result of population growth. However, selective processes may also be acting on the gene markers used; highly significant generalized additive models were obtained using genetic data from the 16S gene and independent variables such as temperature and salinity.
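The two diversity statistics contrasted above are standard population-genetics quantities: Nei's haplotype diversity and nucleotide diversity (pi). A minimal sketch of both (the tiny example sequences are illustrative, not data from the study):

```python
from collections import Counter
from itertools import combinations

def haplotype_diversity(haplotypes):
    """Nei's haplotype diversity: h = n/(n-1) * (1 - sum p_i^2),
    where p_i are haplotype frequencies in a sample of size n."""
    n = len(haplotypes)
    freqs = [c / n for c in Counter(haplotypes).values()]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))

def nucleotide_diversity(seqs):
    """Nucleotide diversity (pi): average pairwise proportion of
    differing sites among aligned sequences of equal length."""
    pairs = list(combinations(seqs, 2))
    diffs = [sum(a != b for a, b in zip(s1, s2)) / len(s1)
             for s1, s2 in pairs]
    return sum(diffs) / len(pairs)
```

"High haplotype diversity with low nucleotide diversity", the pattern reported above, means many distinct haplotypes that differ from each other at only a few sites, the classic signature of recent population growth.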
A method for the estimation of potential evapotranspiration and/or open pan evaporation over Brazil.
Abstract:
This paper presents a simple regression model to estimate potential evapotranspiration and/or open-pan evaporation for a wide network of stations in Brazil. The model uses readily available data, namely geocoordinates (latitude) and precipitation, as inputs. Potential evapotranspiration correlates strongly with precipitation during the summer months and with latitude during the winter months. It also shows some association with longitude and elevation, though the magnitude of that variation is very small. The model gave an R² varying from 0.460 to 0.902 for different months. The model was also extended to weekly periods of individual years and tested against the open-pan evaporation data of Bebedouro and Mandacaru. The agreement between observed and predicted values is good.
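A regression of this kind can be sketched as an ordinary least-squares fit of PET on latitude and precipitation. The function names and the linear form with a single set of coefficients are illustrative assumptions; the published model is fitted month by month:

```python
import numpy as np

def fit_pet_model(latitude, precipitation, pet):
    """Least-squares fit of PET = b0 + b1*latitude + b2*precipitation.
    A generic sketch of the regression idea; the published model's
    coefficients differ month by month."""
    X = np.column_stack([np.ones_like(latitude), latitude, precipitation])
    coef, *_ = np.linalg.lstsq(X, pet, rcond=None)
    return coef

def predict_pet(coef, latitude, precipitation):
    """Evaluate the fitted linear model at a new station."""
    return coef[0] + coef[1] * latitude + coef[2] * precipitation
```

Fitting one such model per calendar month, as the paper does, lets the precipitation term dominate in summer and the latitude term dominate in winter.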
Abstract:
Our proposal aims to present the analysis techniques and methodologies, as well as the most relevant results expected, within the framework of the Exhibitium project (http://www.exhibitium.com). Funded by the BBVA Foundation, the Exhibitium project is being developed by an international consortium of several research groups. Its main purpose is to build a comprehensive and structured data repository about temporary art exhibitions, captured from the web, and to make them useful and reusable in various domains through open and interoperable data systems.
Abstract:
A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes, and the depth to the target. However, the wide-angle reflections present in crosswell imaging produce amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections at angles near and beyond critical for many of the interfaces; some of these reflections are visible only over a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extend to much greater depths than the reef, and imaging was conducted from both above and beneath the reef. Combining the images obtained from above with those from beneath provides additional information: first, by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second, by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, give poor results where phase changes occur beyond the critical angle, so the simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield only a single (best-fit) model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology, and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion.
The difference in detail between the two types of inversion becomes increasingly pronounced for thinner reservoirs, particularly those below the vertical resolution of the seismic data. For any interface imaged from both above and beneath, the resulting AVA characteristics must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both datasets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
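The "pre-critical" restriction discussed above follows from Snell's law: a critical angle exists only where velocity increases across the interface, and beyond it reflections acquire phase changes that the standard AVA inversion does not model. A minimal sketch of the computation (the function name is illustrative):

```python
import math

def critical_angle_deg(v1, v2):
    """Critical incidence angle (degrees) from Snell's law,
    theta_c = asin(v1 / v2), for a wave passing from a layer with
    velocity v1 into a faster layer with velocity v2.
    Returns None if v2 <= v1 (no critical angle exists)."""
    if v2 <= v1:
        return None
    return math.degrees(math.asin(v1 / v2))
```

For example, a velocity doubling across an interface gives a critical angle of 30°, so a partial angle stack for that interface would be limited to incidence angles below 30° before simultaneous inversion.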