889 results for Open Data, Dati Aperti, Open Government Data
Abstract:
The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m long schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present dataset contains navigation and meteorological data measured during one campaign of the Tara Oceans Expedition. Latitude and longitude were obtained from TSG data.
(Pages 133-183) Bottom photographs of manganese nodule fields and crusts observed in the open ocean
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the development of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
The Tara Oceans Expedition (2009-2013) was a global survey of ocean ecosystems aboard the Sailing Vessel Tara. It carried out extensive measurements of environmental conditions and collected plankton (viruses, bacteria, protists and metazoans) for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data publication provides permanent links to original and updated versions of validated data files containing navigation and meteorological data collected during the entire Tara Oceans Expedition (2009-2013).
Abstract:
The book is a compilation of all data available at the time of publication (1965) on the subject of marine minerals, together with the author's original ideas regarding their exploitation. It is one of the most significant publications on ocean resources. It focuses in particular on manganese deposits: their description, sedimentary setting, formation and geochemistry.
Abstract:
The Internet has revolutionized the way individuals communicate. We are witnessing the birth and development of an era characterized by the availability of free information accessible to everyone. In recent years, thanks to the spread of smartphones, tablets and other kinds of connected devices, the focus of innovation has shifted from people to objects. This is how the concept of the Internet of Things was born, a term used to describe the communication network created among the various devices that are connected to the Internet and capable of interacting autonomously. The application domains of the Internet of Things range from home automation to healthcare, from environmental monitoring to smart cities, and so on. The main goal of the discipline is to improve people's lives through systems that can interact without the need for human intervention. Precisely because of the heterogeneous nature of the discipline and its varied application domains, the Internet of Things can run into problems arising from the presence of different technologies or of heterogeneous ways of storing data. In this regard, the concept of the collaborative Internet of Things is introduced, a term denoting the goal of building applications that can guarantee interoperability across the different ecosystems and the different sources the Internet of Things draws on, exploiting the availability of Open Data publication platforms. The goal of this thesis was to create a system for aggregating data from two platforms, ThingSpeak and Sparkfun, in order to unify them in a single database and to extract meaningful information from the data using two Data Mining techniques: Dictionary Learning and Affinity Propagation. The two methodologies, which belong to the families of classification and clustering techniques respectively, are illustrated.
Abstract:
A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the years 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came either from multi-project archives, acquired via open internet services, or from individual projects, acquired directly from the data providers. Methodologies were implemented for the homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for the validation of satellite-derived ocean-colour products, available in text format. Metadata for each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and are available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. Because the metadata are available, each set of data can also be analysed separately.
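The merging step described above, averaging observations close in time and space while preserving per-measurement metadata, can be sketched with pandas. The column names, bin sizes and values below are illustrative assumptions, not the actual OC-CCI table format:

```python
import pandas as pd

# Hypothetical in situ observations; columns and values are invented
# for illustration, not taken from the OC-CCI compilation.
obs = pd.DataFrame({
    "source": ["MOBY", "MOBY", "BOUSSOLE", "BOUSSOLE"],
    "time":   pd.to_datetime(["2005-06-01 10:00", "2005-06-01 10:20",
                              "2005-06-01 10:05", "2005-06-02 09:00"]),
    "lat":    [20.80, 20.80, 43.37, 43.37],
    "lon":    [-157.19, -157.19, 7.90, 7.90],
    "chl_a":  [0.08, 0.10, 0.31, 0.29],
})

# Bin observations into 1-hour / 0.01-degree cells and average within each
# cell, keeping the source metadata for later per-dataset analysis.
obs["t_bin"] = obs["time"].dt.floor("1h")
obs["lat_bin"] = obs["lat"].round(2)
obs["lon_bin"] = obs["lon"].round(2)
merged = (obs.groupby(["source", "t_bin", "lat_bin", "lon_bin"], as_index=False)
             .agg(chl_a=("chl_a", "mean"), n_obs=("chl_a", "size")))
print(merged[["source", "chl_a", "n_obs"]])
```

The two MOBY rows fall in the same cell and collapse into one averaged record, while the two BOUSSOLE rows, a day apart, stay separate; carrying the `source` column through the groupby is what preserves the per-measurement provenance.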
Abstract:
Here we present a case study of three cold-water coral mounds in a juvenile growth stage on top of the Pen Duick Escarpment in the Gulf of Cadiz: the Alpha, Beta and Gamma mounds. Although cold-water corals are a common feature on the adjacent cliffs, mud volcanoes and open slope, no living cold-water coral has actually been observed. This multidisciplinary, integrated study comprises geophysical, sedimentological and (bio)geochemical data and aims to present a holistic view of the interaction of environmental and geological drivers in cold-water coral mound development in the Gulf of Cadiz. Coring data point to (past or present) methane seepage near the Pen Duick Escarpment. Several sources and pathways are proposed, among them stratigraphic migration through uplifted Miocene series underneath the escarpment. The dominant morphology of the escarpment has influenced the local hydrodynamics over the course of the Pliocene, as documented by the emplacement of a sediment drift. Favourable conditions for mound growth were present predominantly during post-Middle Pleistocene glacial episodes. An additional advantage for mound formation near the top of the Pen Duick Escarpment is offered by seepage-related carbonate crusts, which may have provided a suitable substrate for coral settlement. The spatially and temporally variable character and burial stage of the observed open reef frameworks, formed by cold-water coral rubble, provide a possible model for the transition from cold-water coral reef patches towards juvenile mounds. These rubble "graveyards" act not only as sediment traps but also as micro-habitats for a wide range of organisms. The presence of a fluctuating sulphate-methane transition zone has an important effect on early diagenetic processes, affecting both geochemical and physical characteristics and transforming the buried reef into a solid mound. Nevertheless, the responsible seepage fluxes appear to be locally variable. As such, the origin and evolution of the cold-water coral mounds on top of the Pen Duick Escarpment lie, probably more than those of any other NE Atlantic cold-water coral mound province, at the crossroads of environmental (hydrodynamic) and geological (seepage) pathways.
Abstract:
Software bug analysis is one of the most important activities in software quality. Rapid and correct implementation of the necessary fix matters both to developers, who must deliver fully functioning software, and to users, who need to perform their daily tasks. In this context, incorrect classification of bugs can lead to unwanted situations. One of the main attributes assigned to a bug in its initial report is severity, which reflects the urgency of correcting the problem. In this scenario, we identified, in data sets extracted from five open source systems (Apache, Eclipse, Kernel, Mozilla and Open Office), an irregular distribution of bugs across the existing severities, which is an early sign of misclassification. In the data set analyzed, about 85% of bugs are classified with normal severity. This classification rate can negatively influence software development: a misclassified bug may be allocated to a developer with too little experience to solve it, so its correction may take longer or even result in an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or non-severe. This work investigated that portion of the data, with the purpose of identifying whether the normal severity reflects the real impact and urgency, whether there are bugs initially classified as normal that should have been assigned another severity, and whether there are impacts for developers in this context. To this end, an automatic classifier was developed, based on three algorithms (Naïve Bayes, MaxEnt and Winnow), to assess whether normal severity is correct for the bugs initially categorized with it. The algorithms achieved accuracy of about 80% and showed that between 21% and 36% of the bugs should have been classified differently (depending on the algorithm), which represents somewhere between 70,000 and 130,000 bugs in the data set.
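One of the three algorithms the study names, Naïve Bayes, is a standard text-classification baseline. A minimal sketch with scikit-learn, trained on invented bug-report summaries (the texts and labels are toy data, not the study's corpus or its actual feature set):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy bug-report summaries with invented labels; the study's real data
# come from the Apache, Eclipse, Kernel, Mozilla and Open Office trackers.
reports = [
    "crash on startup with segmentation fault",
    "application freezes and data is lost",
    "typo in the preferences dialog label",
    "minor misalignment of toolbar icons",
]
labels = ["severe", "severe", "non-severe", "non-severe"]

# Bag-of-words features + Naive Bayes: the same family of classifier the
# study applies to re-check bugs filed with 'normal' severity.
clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
clf.fit(reports, labels)
print(clf.predict(["random crash with segmentation fault on exit"]))
```

In the study's setting, such a model would be trained on reports whose severity is trusted and then applied to the 'normal'-severity reports to flag likely misclassifications.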
Abstract:
Acknowledgements: The authors would like to thank Jonathan Dick, Josie Geris, Jason Lessels and Claire Tunaley for data collection, and Audrey Innes for lab sample preparation. We also thank Christian Birkel for discussions about the model structure and comments on an earlier draft of the paper. Climatic data were provided by Iain Malcolm and Marine Scotland Fisheries at the Freshwater Lab, Pitlochry. Additional precipitation data were provided by the UK Meteorological Office and the British Atmospheric Data Centre (BADC). We thank the European Research Council ERC (project GA 335910 VEWA) for funding the VeWa project.