884 results for Industrial efficiency -- Sri Lanka -- Measurement -- Data processing.


Relevance:

100.00%

Publisher:

Abstract:

This paper presents a solution to part of the problem of making robotic or semi-robotic digging equipment less dependent on human supervision. A method is described for identifying rocks of a certain size that may affect digging efficiency or require special handling. The process involves three main steps. First, using range and intensity data from a time-of-flight (TOF) camera, a feature descriptor is used to rank points and separate the regions surrounding high-scoring points. This allows a wide range of rocks to be recognized, because features can represent a whole rock or just part of one. Second, these points are filtered to extract only those thought to belong to the large object. Finally, a check is carried out to verify that the resulting point cloud actually represents a rock. Results are presented from field testing on piles of fragmented rock.

Note to Practitioners—This paper presents an algorithm to identify large boulders in a pile of broken rock as a step towards an autonomous mining dig planner. In mining, piles of broken rock can contain large fragments that may need special handling. To assess rock piles for excavation, we use a TOF camera, which does not rely on external lighting, to generate a point cloud of the rock pile. We then segment large boulders from its surface using a novel feature descriptor and distinguish between real and false boulder candidates. Preliminary field experiments show promising results, with the algorithm performing nearly as well as human test subjects.
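As a rough illustration of the three-step process described above, the following Python sketch ranks points with a descriptor, grows regions around high-scoring seeds, and applies a simple size check. The descriptor, thresholds and verification rule are placeholders, not the authors' implementation.

```python
# Hypothetical sketch of the three-step boulder-detection pipeline; all
# parameters and the verification rule are illustrative assumptions.
import numpy as np

def detect_boulders(points, intensities, descriptor, score_thresh=0.8,
                    region_radius=0.3, min_extent=0.5):
    """points: (N, 3) TOF point cloud; intensities: (N,) returns."""
    # Step 1: score every point with a feature descriptor and keep seeds.
    scores = descriptor(points, intensities)            # (N,)
    seeds = np.where(scores > score_thresh)[0]

    # Step 2: gather the region surrounding each high-scoring seed point.
    candidate = np.zeros(len(points), dtype=bool)
    for s in seeds:
        dist = np.linalg.norm(points - points[s], axis=1)
        candidate |= dist < region_radius

    cloud = points[candidate]
    if cloud.size == 0:
        return None

    # Step 3: verify the candidate cloud is plausibly a large rock,
    # here by a crude bounding-box extent check.
    extent = cloud.max(axis=0) - cloud.min(axis=0)
    return cloud if extent.max() >= min_extent else None
```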

Relevance:

100.00%

Publisher:

Abstract:

A major weakness of the loading models proposed in recent years for pedestrians walking on flexible structures is the various uncorroborated assumptions made in their development. This applies to the spatio-temporal characteristics of pedestrian loading and to the nature of multi-object interactions. To alleviate this problem, a framework for determining localised pedestrian forces on full-scale structures is presented, using a wireless attitude and heading reference system (AHRS). An AHRS comprises a triad of tri-axial accelerometers, gyroscopes and magnetometers managed by a dedicated data processing unit, allowing motion in three-dimensional space to be reconstructed. A pedestrian loading model based on a single-point inertial measurement from an AHRS is derived and shown to perform well against benchmark data collected on an instrumented treadmill. Unlike other models, the current model neither takes any predefined form nor requires any extrapolation as to the timing and amplitude of pedestrian loading. To correctly assess the influence of a moving pedestrian on the behaviour of a structure, an algorithm for tracking the point of application of the pedestrian force is developed, based on data from a single AHRS attached to a foot. A set of controlled walking tests with a single pedestrian is conducted on a real footbridge for validation purposes. A remarkably good match between the measured and simulated bridge responses is found, confirming the applicability of the proposed framework.
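The loading model itself is derived in the paper; as a minimal illustration of the underlying idea, a single-point inertial measurement gives the vertical ground reaction force as F(t) = m(g + a_v(t)). The sketch below assumes a known body mass and a gravity-removed vertical acceleration trace, a simplification of the framework described above.

```python
# Minimal sketch of a single-point inertial pedestrian loading model:
# F(t) = m * (g + a_v(t)). Constant body mass and a single sensor near the
# centre of mass are simplifying assumptions, not the paper's full model.
import numpy as np

G = 9.81  # m/s^2

def pedestrian_force(accel_vertical, mass_kg):
    """accel_vertical: (N,) vertical body acceleration (m/s^2), gravity
    removed; returns the vertical force time series in newtons."""
    return mass_kg * (G + np.asarray(accel_vertical))

# Example: a 75 kg pedestrian at ~2 Hz pacing frequency (synthetic trace).
t = np.linspace(0, 10, 1000)
a_v = 1.5 * np.sin(2 * np.pi * 2.0 * t)
force = pedestrian_force(a_v, 75.0)
```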

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Introduction and objective: Private Social Solidarity Institutions (Instituições Particulares de Solidariedade Social) currently face social, economic and legislative changes that have affected their operation and funding. Their boards are therefore required to respond to social needs with greater responsibility and efficiency in a context of growing resource scarcity. Accordingly, this study aims to understand how these institutions make funding decisions in pursuit of efficient operation. Methodology: A case-study design was chosen, with a sample of four Private Social Solidarity Institutions. Data were collected through semi-structured interviews and document analysis, and processed through content analysis supported by QSR NVivo version 10. Results: The main findings indicate that: a) social needs influence decisions to expand or reduce the institutions' response capacity; b) the legal system favours the perpetuation of institutional forms of intervention; c) the economic climate puts pressure on the price of family co-payments and increases competition between institutions; d) resource scarcity is a common denominator across institutions, shaping investment decisions that treat public funding as a given; e) leadership and management practices of boards that include members with financial expertise are more likely to take on risk and increase the institutions' operational complexity; f) practices that engage internal and external stakeholders help secure support for the pursuit of institutional goals. Conclusion: Decision-making is similar across the institutions with social security agreements, in that standardised responses developed under those agreements prevail. Even so, the results highlight the importance of leadership and management practices involving members with financial expertise for developing standardised responses that are economically viable. They also show that engaging internal and external stakeholders, on a basis of accountability and transparency, helps secure the support needed to sustain institutional activities, most markedly in the institution without social security agreements, although such engagement does not in itself ensure economic efficiency.

Relevance:

100.00%

Publisher:

Abstract:

Business Process Management (BPM) can organize and frame a company, focusing on improving or assuring performance in order to gain competitive advantage. Although it is believed that BPM improves various aspects of organizational performance, empirical evidence for this has been lacking. The present study aims to develop a model showing the impact of business process management on organizational performance. To accomplish this, the theoretical basis needed to identify the elements that constitute BPM, and the measures that can evaluate its effect on organizational performance, is built through a systematic literature review (SLR). A research model is then proposed according to the SLR results. Empirical data will be collected from a survey of large and mid-sized industrial and service companies headquartered in Brazil. A quantitative analysis will be performed using structural equation modeling (SEM) to test whether the direct effects of BPM on organizational performance are statistically significant. Finally, these results and their managerial and scientific implications are discussed.

Keywords: Business process management (BPM). Organizational performance. Firm performance. Business models. Structural equation modeling. Systematic literature review.
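As a sketch of the planned SEM analysis, the snippet below fits a two-construct model with the semopy package. The indicator names (bpm1..perf2) and the synthetic data are placeholders for the eventual survey items, not the study's actual instrument.

```python
# Hedged SEM sketch with semopy: a latent BPM construct predicting a latent
# Performance construct. Indicators and data are synthetic placeholders.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(0)
n = 300
bpm = rng.normal(size=n)                           # latent BPM adoption
perf = 0.6 * bpm + rng.normal(scale=0.8, size=n)   # latent performance
data = pd.DataFrame({
    "bpm1": bpm + rng.normal(scale=0.5, size=n),
    "bpm2": bpm + rng.normal(scale=0.5, size=n),
    "bpm3": bpm + rng.normal(scale=0.5, size=n),
    "perf1": perf + rng.normal(scale=0.5, size=n),
    "perf2": perf + rng.normal(scale=0.5, size=n),
})

desc = """
BPM =~ bpm1 + bpm2 + bpm3
Performance =~ perf1 + perf2
Performance ~ BPM
"""
model = Model(desc)
model.fit(data)
print(model.inspect())   # path estimate BPM -> Performance and its p-value
```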

Relevance:

100.00%

Publisher:

Abstract:

Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general-purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to reverse-engineer software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications for reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.
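The paper's own workflow is not reproduced here, but Nvidia's binary utilities are the natural starting point for black-box analysis. The sketch below drives cuobjdump from Python to extract the PTX and SASS embedded in a hypothetical application binary.

```python
# Sketch of a black-box disassembly step using Nvidia's binary utilities;
# the binary name is hypothetical. cuobjdump extracts embedded PTX and SASS
# from a fat binary (nvdisasm similarly disassembles a standalone .cubin).
import subprocess

BINARY = "./example_cuda_app"   # hypothetical CUDA application

# Dump human-readable PTX (often present unless stripped at compile time).
ptx = subprocess.run(["cuobjdump", "--dump-ptx", BINARY],
                     capture_output=True, text=True).stdout

# Dump the SASS machine-code disassembly of the embedded GPU kernels.
sass = subprocess.run(["cuobjdump", "--dump-sass", BINARY],
                      capture_output=True, text=True).stdout

print(ptx[:500])
print(sass[:500])
```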

Relevance:

100.00%

Publisher:

Abstract:

Growing global urbanisation has led to rising levels of atmospheric pollutants and a corresponding deterioration of air quality. Controlling air pollution and monitoring air quality are fundamental steps towards implementing reduction strategies and fostering citizens' environmental awareness. Several techniques and technologies can be used for this purpose, and microsensors have emerged as an innovative air-quality monitoring tool. Although their performance enables a new monitoring strategy, with fast response, low operating costs and high efficiency that conventional approaches alone cannot achieve, further work is needed before these technologies can be integrated, particularly on verifying sensor performance against reference methods in field campaigns. This dissertation, carried out as an internship at the Instituto do Ambiente e Desenvolvimento, aimed to evaluate the performance of low-cost sensors against reference methods, based on an air-quality monitoring campaign conducted in the centre of Aveiro over two weeks in October 2014. More specifically, it sought to determine to what extent low-cost sensors can meet the requirements specified in legislation and the specifics of the standards, thereby establishing a microsensor evaluation protocol. The work also included characterising air quality in the centre of Aveiro during the campaign period. Operating electrochemical, metal-oxide (MOS) and optical particle counter (OPC) microsensors alongside reference equipment in this field study made it possible to assess the reliability and uncertainty of these new monitoring technologies. The study found that electrochemical microsensors are more accurate than metal-oxide ones, showing strong correlations with the reference methods for several pollutants. The results obtained with the optical particle counters were satisfactory, but could be improved both in the sampling setup and in the data-processing method applied. Ideally, microsensors should show strong correlations with the reference method and high data-capture efficiency. However, some data-capture problems were identified, possibly related to high relative humidity and temperatures during the campaign, intermittent communication failures, and instability and reactivity caused by interfering gases. Once the limitations of sensor technologies are overcome and suitable quality assurance and control procedures can be met, low-cost sensors have great potential to enable air-quality monitoring with high spatial coverage, which would be especially beneficial in urban areas.
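A minimal example of the sensor-versus-reference comparison underlying such an evaluation protocol: Pearson correlation, RMSE, and data-capture efficiency over a co-location period. The CSV layout and column names are assumptions for the sketch.

```python
# Illustrative check of a low-cost sensor against a reference analyser.
# File name and column names are hypothetical placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("colocation.csv")          # hypothetical campaign file
sensor, reference = df["sensor_no2"], df["reference_no2"]

mask = sensor.notna() & reference.notna()   # data-capture efficiency matters
r = np.corrcoef(sensor[mask], reference[mask])[0, 1]
rmse = np.sqrt(np.mean((sensor[mask] - reference[mask]) ** 2))
capture = mask.mean() * 100

print(f"r={r:.2f}  RMSE={rmse:.2f}  data capture={capture:.0f}%")
```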

Relevance:

100.00%

Publisher:

Abstract:

This work presents a low-cost architecture for the development of synchronized phasor measurement units (PMUs). The device is intended to be connected to the low-voltage grid, which allows the monitoring of transmission and distribution networks. The project comprises a complete PMU, with an instrumentation module for use in the low-voltage network, a GPS module to provide the sync signal and time stamp for the measurements, a processing unit with the acquisition system, phasor estimation and data formatting according to the standard, and finally a communication module for data transmission. To develop and evaluate the performance of this PMU, a set of applications was created in the LabVIEW environment with specific features for analysing the behaviour of the measurements and identifying the PMU's sources of error, as well as for applying all the tests proposed by the standard. The first application, useful during development of the instrumentation, is a function generator integrated with an oscilloscope, which allows signals to be generated and acquired synchronously, in addition to the handling of samples. The second and main application is the test platform, capable of generating all tests specified by the synchronized phasor measurement standard IEEE C37.118.1, allowing data to be stored or the measurements analysed in real time. Finally, a third application was developed to evaluate the test results and generate calibration curves to adjust the PMU. The results include all the tests proposed by the synchrophasor standard and an additional test that evaluates the impact of noise. Moreover, through two prototypes connected to consumers' electrical installations on the same distribution circuit, monitoring records were obtained that allowed load identification at the consumer and power quality analysis, as well as event detection at the distribution and transmission levels.
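As an illustration of one building block of such a processing unit, the sketch below estimates the fundamental phasor with a single-bin DFT over one nominal cycle. Estimators compliant with IEEE C37.118.1 add filtering and frequency tracking; this is a simplified stand-in.

```python
# Minimal phasor estimation: correlate one cycle of samples with a complex
# exponential at the nominal frequency (single-bin DFT). Illustrative only.
import numpy as np

def estimate_phasor(samples, fs, f0=60.0):
    """samples: one window of waveform samples; fs: sampling rate (Hz).
    Returns (rms_magnitude, phase_rad) of the f0 component."""
    n = np.arange(len(samples))
    phasor = 2.0 / len(samples) * np.sum(
        samples * np.exp(-1j * 2 * np.pi * f0 * n / fs))
    return abs(phasor) / np.sqrt(2), np.angle(phasor)

fs = 15360.0                                   # 256 samples/cycle at 60 Hz
t = np.arange(256) / fs
wave = 100 * np.cos(2 * np.pi * 60 * t + 0.5)  # synthetic test signal
print(estimate_phasor(wave, fs))               # ~ (70.7, 0.5)
```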

Relevance:

100.00%

Publisher:

Abstract:

Three types of forecasts of the total Australian production of macadamia nuts (t nut-in-shell) have been produced early each year since 2001. The first is a long-term forecast, based on the expected production from the tree census data held by the Australian Macadamia Society, suitably scaled up for missing data and assumed new plantings each year. These long-term forecasts range out to 10 years in the future, and form a basis for industry and market planning. Secondly, a statistical adjustment (termed the climate-adjusted forecast) is made annually for the coming crop. As the name suggests, climatic influences are the dominant factors in this adjustment process, however, other terms such as bienniality of bearing, prices and orchard aging are also incorporated. Thirdly, industry personnel are surveyed early each year, with their estimates integrated into a growers and pest-scouts forecast. Initially conducted on a 'whole-country' basis, these models are now constructed separately for the six main production regions of Australia, with these being combined for national totals. Ensembles or suites of step-forward regression models using biologically-relevant variables have been the major statistical method adopted, however, developing methodologies such as nearest-neighbour techniques, general additive models and random forests are continually being evaluated in parallel. The overall error rates average 14% for the climate forecasts, and 12% for the growers' forecasts. These compare with 7.8% for USDA almond forecasts (based on extensive early-crop sampling) and 6.8% for coconut forecasts in Sri Lanka. However, our somewhatdisappointing results were mainly due to a series of poor crops attributed to human reasons, which have now been factored into the models. Notably, the 2012 and 2013 forecasts averaged 7.8 and 4.9% errors, respectively. Future models should also show continuing improvement, as more data-years become available.
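A toy version of one member of such an ensemble, using scikit-learn's forward stepwise selection over synthetic climate predictors; the variables and crop series are invented for illustration only.

```python
# Sketch of a single step-forward regression from an ensemble; the climate
# predictors and crop series here are synthetic placeholders.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 6))       # 20 seasons x 6 climate variables
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.5, size=20)  # crop (kt)

sfs = SequentialFeatureSelector(LinearRegression(), direction="forward",
                                n_features_to_select=2)
sfs.fit(X, y)
model = LinearRegression().fit(X[:, sfs.get_support()], y)
print("selected predictors:", np.flatnonzero(sfs.get_support()))
print("forecast for next season:", model.predict(X[:1, sfs.get_support()]))
```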

Relevance:

100.00%

Publisher:

Abstract:

The Solar Intensity X-ray and particle Spectrometer (SIXS) on board BepiColombo's Mercury Planetary Orbiter (MPO) will study solar energetic particles moving towards Mercury and solar X-rays on the dayside of Mercury. The SIXS instrument consists of two detector sub-systems: the X-ray detector SIXS-X and the particle detector SIXS-P. The SIXS-P sub-detector will detect solar energetic electrons and protons over a broad energy range using a particle-telescope approach, with five outer Si detectors around a central CsI(Tl) scintillator. The measurements made by the SIXS instrument are also needed by other instruments on board the spacecraft. SIXS data will be used to study the solar X-ray corona, solar flares, solar energetic particles, the Hermean magnetosphere, and solar eruptions. The SIXS-P detector was calibrated by comparing experimental measurement data from the instrument with Geant4 simulation data. Calibration curves were produced for the side detectors and the core scintillator, for electrons and protons respectively. The side-detector energy response was found to be linear for both electrons and protons, whereas the core scintillator's energy response to protons was non-linear. The core scintillator calibration for electrons was omitted due to insufficient experimental data. The electron and proton acceptance of the SIXS-P detector was determined with Geant4 simulations. The electron and proton energy channels are clean in the instrument's main energy range; at higher energies, protons and electrons produce non-ideal responses in the energy channels. Due to the limited bandwidth of the spacecraft's telemetry, the particle measurements made by SIXS-P have to be pre-processed in the data processing unit of the SIXS instrument. A lookup table for this pre-processing was created with Geant4 simulations, and the ability of the lookup table to provide spectral information from a simulated electron event was analysed. The lookup table produces clean electron and proton channels and is able to separate protons from electrons. Based on a simulated solar energetic electron event, however, the incident electron spectrum cannot be determined from the channel particle counts with a standard analysis method.
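As a loose illustration of how an on-board lookup table can work, the sketch below quantises a pair of pulse heights (Si side detector, CsI core) and maps them to a pre-computed channel id. The bin edges and table contents are invented placeholders, not the SIXS-P table.

```python
# Illustrative on-board lookup-table classification: quantise a coincidence
# event's two pulse heights and map them to a counter channel. The table
# would be pre-computed from Geant4 simulations; here it is a placeholder.
import numpy as np

si_edges = np.logspace(1, 3, 33)     # keV bins for the Si detector signal
csi_edges = np.logspace(1, 4, 33)    # keV bins for the CsI scintillator
lookup = np.random.default_rng(1).integers(0, 16, size=(32, 32))  # channel ids

def classify(si_kev, csi_kev):
    """Map one coincidence event to its output counter channel."""
    i = np.clip(np.searchsorted(si_edges, si_kev) - 1, 0, 31)
    j = np.clip(np.searchsorted(csi_edges, csi_kev) - 1, 0, 31)
    return lookup[i, j]

print(classify(150.0, 2000.0))
```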

Relevance:

100.00%

Publisher:

Abstract:

Dataloggers are important measurement devices used to record measurement data over long periods. They can be used, for example, to monitor actuators that form part of an industrial process, or a household energy system. Industrial-grade dataloggers typically cost hundreds or thousands of euros. This work sought a cheap and easy-to-use alternative to industrial-grade devices that is nevertheless sufficiently capable and reliable. A datalogger was designed and implemented on the Raspberry Pi platform and tested under conditions corresponding to a real industrial environment. Similar hardware and software solutions were surveyed in the literature and in online articles and used as the basis of the logging system. A simple data-logging program was written in Python for the Raspberry Pi, using the Modbus communication protocol. Based on the tests, the implemented datalogger works and is capable of measurements on a par with commercial dataloggers, at least at low sampling rates, while being considerably cheaper than commercial devices. From the ease-of-use perspective some shortcomings were identified, which are discussed as ideas for further development.
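A minimal sketch of such a logger, assuming pymodbus 3.x and a Modbus TCP device; the host address, register map and sampling interval are placeholders, not the thesis implementation.

```python
# Minimal Raspberry Pi datalogger sketch: poll holding registers over
# Modbus TCP and append timestamped rows to a CSV file. Assumes pymodbus 3.x;
# host, register map and interval are hypothetical.
import csv
import time
from datetime import datetime
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.1.50")    # hypothetical device address
client.connect()

with open("log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        rr = client.read_holding_registers(address=0, count=4, slave=1)
        if not rr.isError():
            writer.writerow([datetime.now().isoformat()] + rr.registers)
            f.flush()
        time.sleep(1.0)                     # 1 Hz sampling, well within reach
```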

Relevance:

100.00%

Publisher:

Abstract:

Performance appraisal and its implementation are, in the current socioeconomic context, increasingly necessary as a means of improving effectiveness and efficiency within organisations. In professional practice, the implementation of performance appraisal shows gaps that can compromise it. Given the importance of the professionals involved to the success of such implementation, it is essential to analyse their perceptions of the model and its implementation. The research problem of this study concerns the perceptions of the radiology technicians evaluated under the performance appraisal model implemented at Hospital Curry Cabral, with the management-level aim of minimising obstacles to implementation and maximising strengths. After a literature review of the main concepts, the objective of the empirical investigation was defined: to compare the perceptions of the radiology technicians subjected to the "Pró-Activo" performance appraisal model. Using an exploratory and descriptive methodology, we studied the impact of these tools within the hospital unit under study. Information was collected from the professionals covered by the study through questionnaires with both closed and open questions, following a quantitative approach, and analysed with the support of software. The main conclusions are: there is a lack of training for all involved; the process lacks impartiality, neutrality and rigour; and the professionals share a general attitude towards performance appraisal shaped by these shortcomings.

Relevance:

100.00%

Publisher:

Abstract:

The only method used to date to measure dissolved nitrate concentration (NITRATE) with sensors mounted on profiling floats is based on the absorption of light at ultraviolet wavelengths by the nitrate ion (Johnson and Coletti, 2002; Johnson et al., 2010; 2013; D'Ortenzio et al., 2012). Nitrate has a modest UV absorption band with a peak near 210 nm, which overlaps with the stronger absorption band of bromide, peaking near 200 nm. In addition, there is a much weaker absorption due to dissolved organic matter and light scattering by particles (Ogura and Hanya, 1966). The UV spectrum thus consists of three components: bromide, nitrate and a background due to organics and particles. The background also includes thermal effects on the instrument and slow drift. All of these latter effects (organics, particles, thermal effects and drift) tend to be smooth spectra that combine to form an absorption spectrum that is linear in wavelength over relatively short wavelength spans. If the light absorption spectrum is measured in the wavelength range of about 217 to 240 nm (the exact range is partly at the operator's discretion), the nitrate concentration can be determined.

Two different instruments based on the same optical principles are in use for this purpose. The In Situ Ultraviolet Spectrophotometer (ISUS), built at MBARI or at Satlantic, has been mounted inside the pressure hull of Teledyne/Webb Research APEX and NKE Provor profiling floats, with the optics penetrating through the upper end cap into the water. The Satlantic Submersible Ultraviolet Nitrate Analyzer (SUNA) is placed on the outside of APEX, Provor, and Navis profiling floats in its own pressure housing and is connected to the float through an underwater cable that provides power and communications. Power, communications between the float controller and the sensor, and data processing requirements are essentially the same for both ISUS and SUNA.

There are several possible algorithms that can be used to deconvolve nitrate concentration from the observed UV absorption spectrum (Johnson and Coletti, 2002; Arai et al., 2008; Sakamoto et al., 2009; Zielinski et al., 2011). In addition, the default algorithm available in Satlantic sensors is a proprietary approach, but this is not generally used on profiling floats. There are trade-offs in every approach. To date, almost all nitrate sensors on profiling floats have used the Temperature Compensated Salinity Subtracted (TCSS) algorithm developed by Sakamoto et al. (2009), and this document focuses on that method. Further algorithm development is likely, so the data system must clearly identify the algorithm that is used, and it should allow prior data sets to be recalculated using new algorithms. To accomplish this, the float must report not just the computed nitrate but also the observed light intensities. The rule for obtaining a single NITRATE parameter is therefore: if the spectrum is present, NITRATE should be recalculated from the spectrum; this recalculation can also generate useful diagnostics of data quality.
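To make the deconvolution concrete, the sketch below shows the generic fit common to these retrievals: subtract the modelled bromide (seawater) absorbance, then least-squares fit the residual over roughly 217-240 nm to the nitrate extinction spectrum plus a linear baseline. The extinction inputs would come from the sensor's calibration file; this is an illustrative stand-in, not the exact TCSS code of Sakamoto et al. (2009).

```python
# Schematic spectral fit for UV nitrate retrievals. The seawater/bromide term
# is assumed already corrected for in-situ temperature and salinity.
import numpy as np

def fit_nitrate(wl, absorbance, swa_absorbance, e_no3):
    """wl: wavelengths (nm); absorbance: measured spectrum; swa_absorbance:
    modelled bromide/seawater absorbance; e_no3: nitrate extinction per
    umol/kg. Returns nitrate (umol/kg) and the linear baseline coefficients."""
    resid = absorbance - swa_absorbance
    # Design matrix: nitrate extinction + linear baseline (intercept, slope).
    A = np.column_stack([e_no3, np.ones_like(wl), wl])
    (no3, b0, b1), *_ = np.linalg.lstsq(A, resid, rcond=None)
    return no3, (b0, b1)
```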

Relevance:

100.00%

Publisher:

Abstract:

The CATARINA Leg1 cruise was carried out from 22 June to 24 July 2012 on board the B/O Sarmiento de Gamboa, under the scientific supervision of Aida Rios (CSIC-IIM). It included the reoccupation of the OVIDE hydrological section, previously performed in June 2002, 2004, 2006, 2008 and 2010 as part of the CLIVAR program (section name A25), under the supervision of Herlé Mercier (CNRS-LPO). This section begins near Lisbon (Portugal), runs through the West European Basin and the Iceland Basin, crosses the Reykjanes Ridge (300 miles north of the Charlie-Gibbs Fracture Zone), and ends at Cape Hoppe (southeast tip of Greenland). The objective of this repeated hydrological section is to monitor the variability of water mass properties and the main current transports in the basin, complementing the international observation array relevant for climate studies. In addition, the Labrador Sea was partly sampled (stations 101-108) between Greenland and Newfoundland, but heavy weather prevented completion of the section south of 53°40'N. The quality of the CTD data is essential to the first objective of the CATARINA project, i.e. to quantify the Meridional Overturning Circulation and water-mass ventilation changes and their effect on changes in the anthropogenic carbon ocean uptake and storage capacity. The CATARINA project was mainly funded by the Spanish Ministry of Science and Innovation and co-funded by the Fondo Europeo de Desarrollo Regional. The hydrological OVIDE section includes 95 surface-to-bottom stations from coast to coast, collecting profiles of temperature, salinity, oxygen and currents, spaced by 2 to 25 nautical miles depending on the steepness of the topography. The positions of the stations closely follow those of OVIDE 2002. In addition, 8 stations were carried out in the Labrador Sea. From the 24 bottles closed at various depths at each station, samples of sea water are used for salinity and oxygen calibration, and for measurements of biogeochemical components that are not reported here. The data were acquired with a Seabird CTD (SBE911+) and an SBE43 for dissolved oxygen, belonging to the Spanish UTM group. The SBE data processing software was used after decoding and cleaning the raw data. The LPO Matlab toolbox was then used to calibrate and bin the data, as was done for the previous OVIDE cruises, using on the one hand the pre- and post-cruise calibrations of the pressure and temperature sensors (done at Ifremer), and on the other hand the water samples from the 24 bottles of the rosette at each station for the salinity and dissolved oxygen data. A final accuracy of 0.002°C, 0.002 psu and 0.04 ml/l (2.3 µmol/kg) was obtained for the final profiles of temperature, salinity and dissolved oxygen, compatible with international requirements issued from the WOCE program.
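As a toy stand-in for the final calibrate-and-bin step (performed on OVIDE with the LPO Matlab toolbox), the snippet below averages a calibrated profile into regular pressure bins; the 1 dbar bin width is an illustrative choice.

```python
# Toy pressure-binning of a calibrated CTD profile; a numpy stand-in for the
# LPO Matlab toolbox step described above, illustrative only.
import numpy as np

def bin_profile(pressure, variable, bin_width=1.0):
    """Average `variable` into regular pressure bins; returns bin centres
    and binned values (NaN where a bin holds no samples)."""
    edges = np.arange(0.0, pressure.max() + bin_width, bin_width)
    idx = np.digitize(pressure, edges) - 1
    binned = np.array([variable[idx == k].mean() if np.any(idx == k)
                       else np.nan for k in range(len(edges) - 1)])
    centres = edges[:-1] + bin_width / 2
    return centres, binned
```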