359 results for Argo


Relevance:

20.00%

Publisher:

Abstract:

This work took as its theme a study of the perception held by the librarians of the Library System (SIB) of the Universidade Federal do Rio Grande (FURG) regarding the ARGO Library Administration System. It is worth noting that, today, discussing computerized systems means seeking ways to better support information and to broaden access for users. The general objective of this work was thus to investigate the librarians' perception of the ARGO library administration system. The specific objectives were to identify the librarians who work with the system; to learn the respondents' opinions of the system in use; to reflect on the research results in light of the literature; and to present the results to the SIB management, so as to contribute to the System. The methodology was exploratory, quali-quantitative research combined with a case study. Among the main findings, it stands out that the system does not have the attributes required for proper operation. The concluding remarks observe that for a time ARGO met the institution's needs, but, owing to the university's substantial growth and the increased volume of work, it fell short.

Relevance:

20.00%

Publisher:

Abstract:

This document does NOT address the issue of particle backscattering quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with backscattering sensors document the data and metadata related to these floats properly. We produced this document in response to action item 9 from the first Bio-Argo Data Management meeting in Hyderabad (November 12-13, 2012). If the recommendations contained herein are followed, we will end up with a more uniform set of particle backscattering data within the Bio-Argo data system, allowing users to begin analyzing not only their own particle backscattering data, but also those of others, in the true spirit of Argo data sharing.

Relevance:

20.00%

Publisher:

Abstract:

This document does NOT address the issue of chlorophyll-a quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with chlorophyll-a sensors document the data and metadata related to these floats properly. We produced this document in response to action item 3 from the first Bio-Argo Data Management meeting in Hyderabad (November 12-13, 2012). If the recommendations contained herein are followed, we will end up with a more uniform set of chlorophyll-a data within the Bio-Argo data system, allowing users to begin analyzing not only their own chlorophyll-a data, but also those of others, in the true spirit of Argo data sharing.

Relevance:

20.00%

Publisher:

Abstract:

This document does NOT address the issue of oxygen data quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with oxygen sensors document the data and metadata related to these floats properly. We produced this document in response to action item 14 from the AST-10 meeting in Hangzhou (March 22-23, 2009). Action item 14: Denis Gilbert to work with Taiyo Kobayashi and Virginie Thierry to ensure DACs are processing oxygen data according to recommendations. If the recommendations contained herein are followed, we will end up with a more uniform set of oxygen data within the Argo data system, allowing users to begin analysing not only their own oxygen data, but also those of others, in the true spirit of Argo data sharing. The guidance provided in this document is valid as of the date of writing. It is very likely that changes in sensors, calibrations and conversion equations will occur in the future. Please contact V. Thierry (vthierry@ifremer.fr) about any inconsistencies or missing information. A dedicated webpage on the Argo Data Management website (www) contains all information regarding Argo oxygen data management: current and previous versions of this cookbook, oxygen sensor manuals, calibration sheet examples, examples of Matlab code to process oxygen data, test data, etc.
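The cookbook's conversion equations ultimately express oxygen in the Argo DOXY unit of µmol/kg. As a minimal sketch of one such step (not the cookbook's official code), the conversion from a per-volume concentration to µmol/kg divides by seawater density; the gsw (TEOS-10) Python package and the variable names below are assumptions for illustration, and the exact density reference to use is specified in the cookbook itself.

```python
# Illustrative sketch: convert O2 from umol/L to the Argo DOXY unit of
# umol/kg using TEOS-10 seawater density (gsw package assumed).
import gsw

def o2_per_volume_to_per_mass(o2_umol_per_l, SP, t, p, lon, lat):
    """SP: practical salinity, t: in-situ temperature (degC),
    p: pressure (dbar), lon/lat: position (degrees)."""
    SA = gsw.SA_from_SP(SP, p, lon, lat)   # absolute salinity
    CT = gsw.CT_from_t(SA, t, p)           # conservative temperature
    rho = gsw.rho(SA, CT, p)               # in-situ density, kg/m^3
    return o2_umol_per_l * 1000.0 / rho    # umol/L -> umol/kg

print(o2_per_volume_to_per_mass(250.0, 35.0, 10.0, 100.0, -30.0, 45.0))
```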

Relevance:

20.00%

Publisher:

Abstract:

The international Argo program, consisting of a global array of more than 3000 free-drifting profiling floats, has now been monitoring the upper 2000 meters of the ocean for several years. One of its main proposed evolutions is the ability to reach the deeper ocean, in order to better observe and understand the key role of the deep ocean in the climate system. For this purpose, Ifremer has designed the new "Deep-Arvor" profiling float: it extends the current operational depth down to 4000 meters and measures temperature and salinity for up to 150 cycles with the CTD pumping continuously, or 200 cycles in spot-sampling mode. High-resolution profiles (up to 2000 points) can be transmitted, and data are delivered in near real time according to Argo requirements. Deep-Arvor can be deployed anywhere at sea without any pre-ballasting operation, and its light weight (~26 kg) makes it easy to launch. It was designed as a cost-effective solution. Provision has been made for an optional oxygen sensor and a connector for an extra sensor. Extensive laboratory tests were successful, and the results of the first at-sea experiments showed that the operational prototypes reached the expected performance (i.e., up to 150 cycles). Meanwhile, the industrialization phase was completed in order to manufacture the Deep-Arvor float for the pilot experiment in 2015. In this paper, we detail all the steps of the development work and present the results of the at-sea experiments.

Relevance:

20.00%

Publisher:

Abstract:

Observing system experiments (OSEs) are carried out over a 1-year period to quantify the impact of Argo observations on the Mercator Ocean 0.25° global ocean analysis and forecasting system. The reference simulation assimilates sea surface temperature (SST), SSALTO/DUACS (Segment Sol multi-missions dALTimetrie, d'orbitographie et de localisation précise/Data unification and Altimeter combination system) altimeter data, and Argo and other in situ observations from the Coriolis data center. Two other simulations are carried out in which all Argo data and half of the Argo data, respectively, are withheld. Assimilating Argo observations has a significant impact on analyzed and forecast temperature and salinity fields at different depths. Without Argo data assimilation, large errors occur in the analyzed fields, as estimated from differences with in situ observations. For example, in the 0–300 m layer, RMS (root mean square) differences between analyzed fields and observations reach 0.25 psu and 1.25 °C in the western boundary currents and 0.1 psu and 0.75 °C in the open ocean. The impact of the Argo data in reducing observation–forecast differences is also significant from the surface down to a depth of 2000 m: differences between in situ observations and forecast fields are reduced by 20 % in the upper layers and by up to 40 % at a depth of 2000 m when Argo data are assimilated. At depth, the most affected regions in the global ocean are the Mediterranean outflow, the Gulf Stream region and the Labrador Sea. A significant degradation can be observed when only half of the data are assimilated. Argo observations therefore matter for constraining the model solution, even in an eddy-permitting model configuration. The impact of assimilating the Argo floats' data on other model variables is briefly assessed: the improved fit to Argo profiles does not introduce globally unphysical corrections to the sea surface temperature and sea surface height. The main conclusion is that the performance of the Mercator Ocean 0.25° global data assimilation system is heavily dependent on the availability of Argo data.
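The skill metric quoted above (RMS differences between model fields and in situ observations) is simple to reproduce; the following sketch, with illustrative arrays, assumes the model has already been interpolated to the observation points.

```python
# RMS difference between model values at observation points and the
# observations themselves, ignoring missing data (NaN).
import numpy as np

def rms_difference(model_at_obs, obs):
    d = np.asarray(model_at_obs) - np.asarray(obs)
    return np.sqrt(np.nanmean(d ** 2))

# e.g. forecast minus Argo temperature in the 0-300 m layer (toy values)
obs = np.array([14.2, 13.8, np.nan, 12.9])
fc  = np.array([14.6, 13.5, 13.1, 13.4])
print(rms_difference(fc, obs))   # ~0.41 degC for this toy example
```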

Relevance:

20.00%

Publisher:

Abstract:

This document describes the general principles of Digital Object Identifiers (DOI). It provides examples of DOI implementation useful for the AtlantOS H2020 project networks. A DOI is a unique identifier. Although generally used to identify scientific publications, a DOI can be assigned to any physical, digital or abstract resource.
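In practice a DOI resolves through the doi.org proxy, which redirects an HTTP request to the current landing page of the resource. A minimal sketch (the DOI below is a placeholder, not a real AtlantOS identifier; the requests package is assumed):

```python
import requests

doi = "10.12345/example-doi"   # placeholder DOI, for illustration only
resp = requests.get(f"https://doi.org/{doi}",
                    allow_redirects=False, timeout=10)
print(resp.status_code)                 # 302 if the DOI is registered
print(resp.headers.get("Location"))     # current landing-page URL
```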

Relevance:

20.00%

Publisher:

Abstract:

Recommendation for Oxygen Measurements from Argo Floats: Implementation of In-Air-Measurement Routine to Assure Highest Long-term Accuracy

As Argo has entered its second decade and chemical/biological sensor technology is improving constantly, the marine biogeochemistry community is starting to embrace the successful Argo float program. An augmentation of the global float observatory, however, has to follow rather stringent constraints regarding sensor characteristics as well as data processing and quality control routines. Owing to the fairly advanced state of oxygen sensor technology and the high scientific value of oceanic oxygen measurements (Gruber et al., 2010), an expansion of the Argo core mission to routine oxygen measurements is perhaps the most mature and promising candidate (Freeland et al., 2010). In this context, SCOR Working Group 142, "Quality Control Procedures for Oxygen and Other Biogeochemical Sensors on Floats and Gliders" (www.scor-int.org/SCOR_WGs_WG142.htm), set out in 2014 to assess the current status of biogeochemical sensor technology with particular emphasis on float-readiness, to develop pre- and post-deployment quality control metrics and procedures for oxygen sensors, and to disseminate these procedures widely to ensure rapid adoption in the community.

Relevance:

20.00%

Publisher:

Abstract:

Every Argo data file submitted by a DAC for distribution on the GDAC has its format and data consistency checked by the Argo FileChecker. Two types of checks are applied:
1. Format checks, which ensure the file formats match the Argo standards precisely.
2. Data consistency checks, which are performed on a file after it passes the format checks.
The consistency checks do not duplicate any of the quality control checks performed elsewhere; they can be thought of as "sanity checks" that ensure the data are consistent with each other. They enforce data standards and ensure that certain data values are reasonable and/or consistent with other information in the files. Examples of the "data standard" checks are the "mandatory parameters" defined for meta-data files and the technical parameter names in technical data files. Files with format or consistency errors are rejected by the GDAC and are not distributed. Less serious problems generate warnings, and the file is still distributed on the GDAC. Reference tables and data standards: many of the consistency checks involve comparing the data to the published reference tables and data standards. These tables are documented in the User's Manual. (The FileChecker implements "text versions" of these tables.)
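As an illustration of what such a "sanity check" might look like (this is not the FileChecker's code, and the mandatory-parameter list below is a small illustrative subset, not the official reference table from the User's Manual):

```python
# Hypothetical sketch of a mandatory-parameter check on an Argo meta file.
from netCDF4 import Dataset

MANDATORY_META = ["PLATFORM_NUMBER", "PTT", "TRANS_SYSTEM", "POSITIONING_SYSTEM"]

def check_mandatory_meta(path):
    errors = []
    with Dataset(path) as nc:
        for name in MANDATORY_META:
            if name not in nc.variables:
                errors.append(f"missing mandatory parameter: {name}")
            elif nc.variables[name][:].size == 0:
                errors.append(f"empty mandatory parameter: {name}")
    return errors   # a non-empty list would mean rejection by the GDAC

for e in check_mandatory_meta("6901234_meta.nc"):   # hypothetical file name
    print("ERROR:", e)
```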

Relevance:

20.00%

Publisher:

Abstract:

The only method used to date to measure dissolved nitrate concentration (NITRATE) with sensors mounted on profiling floats is based on the absorption of light at ultraviolet wavelengths by the nitrate ion (Johnson and Coletti, 2002; Johnson et al., 2010; 2013; D'Ortenzio et al., 2012). Nitrate has a modest UV absorption band with a peak near 210 nm, which overlaps with the stronger absorption band of bromide, which peaks near 200 nm. In addition, there is a much weaker absorption due to dissolved organic matter and light scattering by particles (Ogura and Hanya, 1966). The UV spectrum thus consists of three components: bromide, nitrate and a background due to organics and particles. The background also includes thermal effects on the instrument and slow drift. All of these latter effects (organics, particles, thermal effects and drift) tend to be smooth spectra that combine to form an absorption spectrum that is linear in wavelength over relatively short wavelength spans. If the light absorption spectrum is measured in the wavelength range of about 217 to 240 nm (the exact range is at the operator's discretion), then the nitrate concentration can be determined. Two different instruments based on the same optical principles are in use for this purpose. The In Situ Ultraviolet Spectrophotometer (ISUS), built at MBARI or at Satlantic, has been mounted inside the pressure hull of Teledyne/Webb Research APEX and NKE Provor profiling floats, with the optics penetrating through the upper end cap into the water. The Satlantic Submersible Ultraviolet Nitrate Analyzer (SUNA) is placed outside APEX, Provor, and Navis profiling floats in its own pressure housing and is connected to the float through an underwater cable that provides power and communications. Power, communications between the float controller and the sensor, and data processing requirements are essentially the same for both ISUS and SUNA. Several algorithms can be used to deconvolve nitrate concentration from the observed UV absorption spectrum (Johnson and Coletti, 2002; Arai et al., 2008; Sakamoto et al., 2009; Zielinski et al., 2011). In addition, the default algorithm available in Satlantic sensors is a proprietary approach, but it is not generally used on profiling floats. There are tradeoffs in every approach. To date, almost all nitrate sensors on profiling floats have used the Temperature Compensated Salinity Subtracted (TCSS) algorithm developed by Sakamoto et al. (2009), and this document focuses on that method. Further algorithm development is likely, so it is necessary that the data systems clearly identify the algorithm that is used. It is also desirable that the data system allow prior data sets to be recalculated with new algorithms. To accomplish this, the float must report not just the computed nitrate but also the observed light intensity. The rule to obtain a single NITRATE parameter is therefore: if the spectrum is present, NITRATE should be recalculated from the spectrum; this computation can also generate useful diagnostics of data quality.
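The linear structure described above (nitrate plus bromide plus a background that is linear in wavelength) is what makes the deconvolution tractable. The sketch below, with synthetic numbers rather than real extinction spectra, shows the least-squares step once the temperature-compensated bromide (seawater) absorption has been subtracted; it is an illustration of the shared principle, not an implementation of the TCSS algorithm.

```python
# Least-squares retrieval of nitrate from a residual UV spectrum:
# residual ~ eps_NO3(lambda) * [NO3] + a + b * lambda  over ~217-240 nm.
import numpy as np

wl = np.linspace(217.0, 240.0, 24)        # wavelengths, nm
eps_no3 = np.exp(-(wl - 210.0) / 8.0)     # synthetic NO3- extinction spectrum

# Synthetic "observed" residual: 15 umol/L nitrate + linear background + noise
rng = np.random.default_rng(0)
obs = 15.0 * eps_no3 + 0.02 - 1e-4 * wl + rng.normal(0.0, 1e-4, wl.size)

# Design matrix: [NO3 extinction, constant baseline, linear baseline slope]
A = np.column_stack([eps_no3, np.ones_like(wl), wl])
no3, a, b = np.linalg.lstsq(A, obs, rcond=None)[0]
print(f"retrieved NITRATE ~ {no3:.2f} umol/L (truth: 15)")
```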

Relevance:

20.00%

Publisher:

Abstract:

This report covers the activity of the Coriolis data centre over a one-year period, from September 1st 2015 to August 31st 2016.

Relevance:

20.00%

Publisher:

Abstract:

This document is the Argo GDAC cookbook. It describes the detailed implementation of the GDAC services and ensures that both GDACs provide the same services. The GDAC is Argo's "Global Data Assembly Centre", which aggregates Argo metadata, profile, trajectory and technical files.
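One of the aggregated GDAC products is a set of index files listing the holdings. As a hedged sketch (the URL and the exact CSV layout of the global profile index should be checked against the cookbook; comment lines are assumed to start with '#'):

```python
# Read the first few entries of the GDAC global profile index.
import csv
import urllib.request

URL = "https://data-argo.ifremer.fr/ar_index_global_prof.txt"  # assumed GDAC root

with urllib.request.urlopen(URL, timeout=30) as f:
    lines = (line.decode("utf-8") for line in f if not line.startswith(b"#"))
    reader = csv.DictReader(lines)
    for _, row in zip(range(3), reader):
        print(row["file"], row["date_update"])   # e.g. dac/coriolis/.../*.nc
```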

Relevance:

20.00%

Publisher:

Abstract:

This document is the Argo quality control manual for dissolved oxygen concentration. It describes two levels of quality control:
• The first level is the real-time system, which performs a set of agreed automatic checks. Adjustments can also be applied in real time, and the real-time system can assign quality flags to adjusted fields.
• The second level is the delayed-mode quality control system.
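As an example of the kind of automatic check the real-time system performs, a global range test might look like the sketch below; the [-5, 600] µmol/kg bounds and the flag values (1 = good, 4 = bad) follow common Argo conventions but should be taken from the manual itself.

```python
# Illustrative real-time global range test for DOXY.
import numpy as np

def doxy_global_range_test(doxy):
    """Flag 1 (good) where DOXY is within [-5, 600] umol/kg, else 4 (bad)."""
    doxy = np.asarray(doxy, dtype=float)
    flags = np.ones(doxy.shape, dtype=int)
    flags[(doxy < -5.0) | (doxy > 600.0)] = 4
    return flags

print(doxy_global_range_test([250.0, 612.3, -8.1, 180.4]))   # [1 4 4 1]
```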

Relevance:

20.00%

Publisher:

Abstract:

Biogeochemical-Argo is the extension of the Argo array of profiling floats to include floats that are equipped with biogeochemical sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance. Argo is a highly regarded, international program that measures the changing ocean temperature (heat content) and salinity with profiling floats distributed throughout the ocean. Newly developed sensors now allow profiling floats to also observe biogeochemical properties with sufficient accuracy for climate studies. This extension of Argo will enable an observing system that can determine the seasonal to decadal-scale variability in biological productivity, the supply of essential plant nutrients from deep waters to the sunlit surface layer, ocean acidification, hypoxia, and ocean uptake of CO2. Biogeochemical-Argo will drive a transformative shift in our ability to observe and predict the effects of climate change on ocean metabolism, carbon uptake, and living marine resource management.

Presently, vast areas of the open ocean are sampled only once per decade or less, with sampling occurring mainly in summer. Our ability to detect changes in biogeochemical processes that may occur due to the warming and acidification driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by this undersampling. In close synergy with satellite systems (which are effective at detecting global patterns for a few biogeochemical parameters, but only very close to the sea surface and in the absence of clouds), a global array of biogeochemical sensors would revolutionize our understanding of ocean carbon uptake, productivity, and deoxygenation. The array would reveal the biological, chemical, and physical events that control these processes. Such a system would enable a new generation of global ocean prediction systems in support of carbon cycling, acidification, hypoxia and harmful algal bloom studies, as well as the management of living marine resources.

In order to prepare for a global Biogeochemical-Argo array, several prototype profiling float arrays have been developed at the regional scale by various countries and are now operating. Examples include regional arrays in the Southern Ocean (SOCCOM), the North Atlantic Subpolar Gyre (remOcean), the Mediterranean Sea (NAOS), the Kuroshio region of the North Pacific (INBOX), and the Indian Ocean (IOBioArgo). For example, the SOCCOM program is deploying 200 profiling floats with biogeochemical sensors throughout the Southern Ocean, including areas covered seasonally with ice. The resulting data, which are publicly available in real time, are being linked with computer models to better understand the role of the Southern Ocean in influencing CO2 uptake, biological productivity, and nutrient supply to distant regions of the world ocean. The success of these regional projects has motivated a planning meeting to discuss the requirements for and applications of a global-scale Biogeochemical-Argo program. The meeting was held 11-13 January 2016 in Villefranche-sur-Mer, France, with attendees from the eight nations now deploying Argo floats with biogeochemical sensors. In preparation, computer simulations and a variety of analyses were conducted to assess the resources required for the transition to a global-scale array.
Based on these analyses and simulations, it was concluded that an array of about 1000 biogeochemical profiling floats would provide the needed resolution to greatly improve our understanding of biogeochemical processes and to enable significant improvement in ecosystem models. With an endurance of four years for a Biogeochemical-Argo float, this system would require the procurement and deployment of 250 new floats per year to maintain a 1000-float array. The lifetime cost for a Biogeochemical-Argo float, including capital expense, calibration, data management, and data transmission, is about $100,000. A global Biogeochemical-Argo system would thus cost about $25,000,000 annually. In the present Argo paradigm, the US provides half of the profiling floats in the array, while the EU, Austral/Asia, and Canada share most of the remaining half. If this approach is adopted, the US cost for the Biogeochemical-Argo system would be ~$12,500,000 annually, and ~$6,250,000 each for the EU and for Austral/Asia and Canada. This includes no direct costs for ship time and presumes that float deployments can be carried out from future research cruises of opportunity, including, for example, the international GO-SHIP program (http://www.go-ship.org). The full-scale implementation of a global Biogeochemical-Argo system with 1000 floats is feasible within a decade. The successful, ongoing pilot projects have provided the foundation and start for such a system.
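The budget arithmetic above can be restated explicitly (figures taken directly from the text):

```python
# Annual cost of a 1000-float Biogeochemical-Argo array, as stated above.
floats_in_array = 1000
endurance_years = 4
lifetime_cost_per_float = 100_000                 # USD per float, all-inclusive

floats_per_year = floats_in_array / endurance_years           # 250 floats/year
annual_cost = floats_per_year * lifetime_cost_per_float       # $25,000,000/year
us_share = annual_cost / 2                                    # ~$12,500,000
other_share = annual_cost / 4                                 # ~$6,250,000 each
print(floats_per_year, annual_cost, us_share, other_share)
```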