966 results for DATA ACQUISITION


Relevance:

60.00%

Publisher:

Abstract:

It has long been known that cholera outbreaks can be initiated when Vibrio cholerae, the bacterium that causes cholera, is present in drinking water in sufficient numbers to constitute an infective dose, if ingested by humans. Outbreaks associated with drinking or bathing in unpurified river or brackish water may directly or indirectly depend on such conditions as water temperature, nutrient concentration, and plankton production that may be favorable for growth and reproduction of the bacterium. Although these environmental parameters have routinely been measured by using water samples collected aboard research ships, the available data sets are sparse and infrequent. Furthermore, shipboard data acquisition is both expensive and time-consuming. Interpolation to regional scales can also be problematic. Although the bacterium, V. cholerae, cannot be sensed directly, remotely sensed data can be used to infer its presence. In the study reported here, satellite data were used to monitor the timing and spread of cholera. Public domain remote sensing data for the Bay of Bengal were compared directly with cholera case data collected in Bangladesh from 1992–1995. The remote sensing data included sea surface temperature and sea surface height. It was discovered that sea surface temperature shows an annual cycle similar to the cholera case data. Sea surface height may be an indicator of incursion of plankton-laden water inland, e.g., tidal rivers, because it was also found to be correlated with cholera outbreaks. The extensive studies accomplished during the past 25 years, confirming the hypothesis that V. cholerae is autochthonous to the aquatic environment and is a commensal of zooplankton, i.e., copepods, when combined with the findings of the satellite data analyses, provide strong evidence that cholera epidemics are climate-linked.

Relevance:

60.00%

Publisher:

Abstract:

This dissertation presents the development of an autonomous inertial platform with three degrees of freedom for sensor stabilization - for example, stationary and vehicle-mounted gravimetric sensors - which can also be used for camera stabilization. The system consists of an Inertial Measurement Unit (IMU), built around a micro-electromechanical (MEMS) sensor containing an accelerometer, a gyroscope and magnetometers on the three orientation axes, and a microcontroller for acquiring, processing and sending the data to the control and data acquisition system. To control the tilt and orientation angles of the platform, a digital PID controller was implemented on a microcontroller. It receives the IMU data and provides the control signals through PWM outputs that drive the motors, which control the platform position. To monitor the platform, a real-time data acquisition program was developed in Matlab, through which the IMU signals, the tilt angles and the angular velocity can be viewed and recorded. A radio-frequency data transmission system between the IMU and the data acquisition and control system was tested to evaluate the possibility of eliminating slip rings or wires between the rotation axis and the platform frames. However, the transmission proved unfeasible because of the low transmission rate and the noise picked up by the radio-frequency receiver during platform movements. Therefore, two twisted pairs of wires were used to connect the inertial sensor to the acquisition and processing system.
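
The control law mentioned above (a digital PID computed on the microcontroller and written to the PWM outputs) can be illustrated with a minimal sketch. The gains, sample period and 8-bit PWM range below are assumptions for illustration, not values taken from the dissertation:

```python
# Minimal sketch of a discrete PID update driving a PWM duty cycle.
# Gains, sample period and the 0-255 PWM range are illustrative assumptions.

def make_pid(kp, ki, kd, dt, out_min=0, out_max=255):
    """Return a stateful PID step function: error -> PWM duty (clamped)."""
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(error):
        state["integral"] += error * dt
        derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        out = kp * error + ki * state["integral"] + kd * derivative
        return max(out_min, min(out_max, out))  # clamp to the PWM duty range

    return step

# Example: hold the platform tilt at 0 degrees given a simulated IMU reading.
pid = make_pid(kp=4.0, ki=0.5, kd=0.1, dt=0.01)
duty = pid(0.0 - 2.3)  # setpoint minus measured tilt angle (degrees)
```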

Relevance:

60.00%

Publisher:

Abstract:

This work addresses the application of the data reconciliation technique to the natural gas balance of an unprocessed gas pipeline network, and also develops a fast method for calculating the inventory of a pipeline. Volumetric reconciliation at standard measurement conditions and mass reconciliation were applied separately, the results were compared against the original balance, and the resulting energy balance in terms of higher heating value was checked. Two sets of weights were applied: one assigned according to prior knowledge of the quality of the measurement system at each point, the other based on the inverse of the variance of the daily volumes recorded during the period. Both gave good results and the second was considered the more appropriate. Using a thermodynamic approach, the potential impact on the balance of the condensation of part of the gas phase along the flow, and of the injection of unstabilized natural gas condensate by one of the sources, was evaluated. Both tend to affect the balance, the expected result being a lower gas-phase volume, mass and energy at the outlet. Other factors with considerable impact on data quality and on the final reconciliation result are the quality of the outlet measurement of the system and the representativeness of the gas composition at that point. The inventory is calculated from a regression based on steady-state flow, which may show larger deviations when strong transients occur on the last day of the month; however, the inventory variation over the month has little impact on the balance. It was concluded that volumetric reconciliation is the most appropriate for this system, since the reconciled data bring the mass and energy (heating-value) balances, both for the gas phase, within the expected behaviour profile. Although a null volumetric balance of the gas phase alone is not, by itself, the expected behaviour when the described effects are considered, developing a more robust balance requires accounting for the liquid fractions present in the system, which adds difficulty to data acquisition and data quality.
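
The second weighting scheme described above (weights taken as the inverse of the variance of the daily volumes) corresponds to a standard weighted least-squares reconciliation subject to a linear balance constraint. A minimal sketch follows; the three-meter network and the measured values are hypothetical:

```python
# Minimal sketch of linear data reconciliation with inverse-variance weights.
# The network (one inlet, two outlets) and the measured volumes are illustrative.
import numpy as np

def reconcile(y, variances, A):
    """Adjust measurements y so that A @ x = 0, minimising the
    variance-weighted sum of squared adjustments."""
    V = np.diag(variances)                      # weights are 1 / variance
    x = y - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)
    return x

# Balance: inlet - outlet1 - outlet2 = 0 (standard-condition volumes).
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([1000.0, 620.0, 350.0])            # measured daily volumes
var = np.array([4.0, 9.0, 25.0])                # variance of each meter
print(reconcile(y, var, A))                     # reconciled volumes close the balance
```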

Relevance:

60.00%

Publisher:

Abstract:

Power grids must have high reliability indices in order to maintain the agility and maintenance needed for optimal operation. On the other hand, unexpected load growth, equipment failures and inadequate parameterization of protection functions make the analysis of protection events more complex and time-consuming. In addition, the amount of information that can be obtained from modern digital relays has grown steadily. To enable quick decision-making and maintenance, this research project aimed at implementing a complete diagnostic system that is activated automatically whenever a protection event occurs. The information to be analysed is obtained from a database and from protection relays, via the IEC 61850 communication protocol and oscillography files. The work covers the complete Smart Grid system, including: data acquisition from the relays, detailing the communication system developed through a piece of software with an IEC 61850 client and an OPC server, and a piece of software with an OPC client that is activated by events configured to trigger it (for example, protection trips); the data pre-processing system, where the data coming from relays and protection equipment are filtered, pre-processed and formatted; and the diagnostic system. A central database keeps the data from all these stages up to date. The diagnostic system uses conventional algorithms and artificial intelligence techniques, in particular an expert system. The expert system was developed to handle different sets of input data and possible missing data, always guaranteeing the delivery of a diagnosis. Tests and simulations were carried out for short circuits (three-phase, phase-to-phase, phase-to-phase-to-ground and phase-to-ground) on feeders, transformers and busbars of a substation. These tests included different states of the protection system (correct and improper operation). The system proved fully effective both with complete and with partial availability of information, always providing a diagnosis of the short circuit and analysing the operation of the substation's protection functions. This enables much more efficient maintenance by power utilities, particularly regarding the prevention of equipment defects, quick response to problems and the need to reparameterize protection functions. The system was successfully installed in a distribution substation of Companhia Paulista de Força e Luz.
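
As a rough illustration of how an expert system can still deliver a diagnosis when some input data are missing, the sketch below applies a few hand-written rules to whatever relay information is available. The signal names and rules are hypothetical and are not taken from the system described above:

```python
# Hypothetical, minimal rule-based diagnosis triggered by a protection event.
# It illustrates producing a diagnosis even with partial input data.

def diagnose(event):
    phases = event.get("picked_up_phases")       # e.g. {"A", "B"}, or None if missing
    ground = event.get("ground_overcurrent")     # True / False / None if missing
    findings = []

    if phases is None:
        findings.append("phase information unavailable; diagnosis based on ground element only")
    elif len(phases) == 3:
        findings.append("three-phase short circuit")
    elif len(phases) == 2:
        findings.append("phase-to-phase-to-ground fault" if ground else "phase-to-phase fault")
    else:
        findings.append("phase-to-ground fault" if ground else "single-phase pickup, check relay settings")

    if event.get("breaker_opened") is False:
        findings.append("breaker failed to open: possible improper protection operation")

    return findings

print(diagnose({"picked_up_phases": {"A", "B"}, "ground_overcurrent": True, "breaker_opened": True}))
```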

Relevance:

60.00%

Publisher:

Abstract:

Rock mass characterization requires a deep geometric understanding of the discontinuity sets affecting rock exposures. Recent advances in Light Detection and Ranging (LiDAR) instrumentation allow quick and accurate 3D data acquisition, leading to the development of new methodologies for the automatic characterization of rock mass discontinuities. This paper presents a methodology for the identification and analysis of flat surfaces outcropping in a rocky slope using the 3D data obtained with LiDAR. The method identifies and defines the algebraic equations of the different planes of the rock slope surface by applying an analysis based on a neighbouring-points coplanarity test, finding principal orientations by Kernel Density Estimation and identifying clusters with the Density-Based Scan Algorithm with Noise (DBSCAN). Different sources of information (synthetic and 3D scanned data) were employed, and a complete sensitivity analysis of the parameters was performed in order to identify the optimal values of the variables of the proposed method. In addition, the raw source files and the obtained results are freely provided to allow a more straightforward comparison of methods and more reproducible research.
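
A minimal sketch of the two central steps (the neighbouring-points coplanarity test, here implemented as a PCA-based flatness check, and the clustering with DBSCAN) is given below. The neighbourhood size, flatness threshold and DBSCAN parameters are illustrative assumptions, not the optimal values identified in the paper:

```python
# Minimal sketch: per-point coplanarity test via local PCA, then DBSCAN clustering.
# Neighbourhood size, flatness threshold and clustering parameters are illustrative.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import DBSCAN

def local_normals(points, k=30, flatness=0.05):
    """Return indices and unit normals of points whose k-neighbourhood is nearly coplanar."""
    nn = NearestNeighbors(n_neighbors=k).fit(points)
    _, idx = nn.kneighbors(points)
    keep, normals = [], []
    for i, nbrs in enumerate(idx):
        patch = points[nbrs] - points[nbrs].mean(axis=0)
        # eigenvector of the smallest eigenvalue of the covariance = surface normal
        w, v = np.linalg.eigh(patch.T @ patch / len(nbrs))
        if w[0] / w.sum() < flatness:            # coplanarity test
            keep.append(i)
            normals.append(v[:, 0])
    return np.array(keep), np.array(normals)

# Synthetic flat patch as a placeholder for a scanned rock face.
points = np.column_stack([np.random.rand(2000, 2), np.zeros(2000)])
keep, normals = local_normals(points)
labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(points[keep])  # one label per plane
```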

Relevance:

60.00%

Publisher:

Abstract:

In this project, we propose the implementation of a 3D object recognition system which will be optimized to operate under demanding time constraints. The system must be robust so that objects can be recognized properly in poor lighting conditions and in cluttered scenes with significant levels of occlusion. An important requirement must be met: the system must exhibit reasonable performance running on a low-power mobile GPU computing platform (NVIDIA Jetson TK1) so that it can be integrated in mobile robotics systems, ambient intelligence or ambient assisted living applications. The acquisition system is based on the color and depth (RGB-D) data streams provided by low-cost 3D sensors such as Microsoft Kinect or PrimeSense Carmine. The range of algorithms and applications to be implemented and integrated is quite broad, from acquisition, outlier removal and filtering of the input data, through segmentation and characterization of regions of interest in the scene, to the object recognition and pose estimation itself. Furthermore, in order to validate the proposed system, we will create a 3D object dataset. It will be composed of a set of 3D models, reconstructed from common household objects, as well as a handful of test scenes in which those objects appear. The scenes will be characterized by different levels of occlusion, diverse distances from the elements to the sensor and variations in the pose of the target objects. The creation of this dataset implies the additional development of 3D data acquisition and 3D object reconstruction applications. The resulting system has many possible applications, ranging from mobile robot navigation and semantic scene labeling to human-computer interaction (HCI) systems based on visual information.
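
As an illustration of one stage of the proposed pipeline, the sketch below performs a simple statistical outlier removal on a point cloud. The neighbourhood size and threshold are assumptions for illustration only and do not reflect the project's actual implementation:

```python
# Minimal sketch of statistical outlier removal on an RGB-D point cloud.
# Neighbourhood size and threshold are illustrative assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def remove_outliers(points, k=16, std_ratio=2.0):
    """Drop points whose mean distance to their k neighbours is unusually large."""
    dists, _ = NearestNeighbors(n_neighbors=k + 1).fit(points).kneighbors(points)
    mean_d = dists[:, 1:].mean(axis=1)           # skip the point itself
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

cloud = np.random.rand(5000, 3)                  # placeholder for a depth-sensor frame
print(remove_outliers(cloud).shape)
```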

Relevance:

60.00%

Publisher:

Abstract:

Irrigated agriculture is usually practised in semi-arid regions despite the scarcity of water resources. Optimal irrigation management based on soil monitoring is therefore essential, and assessing soil hydraulic properties and water flow dynamics is a first step. For this purpose, volumetric water content, θ, and pressure head, h, must be monitored. This study adopted two types of monitoring strategies in the same experimental plot to control θ and h in the vadose zone: i) non-automatic and more time-consuming; ii) automatic, connected to a datalogger. Water flux was modelled with Hydrus-1D using the data collected from each acquisition strategy independently (3820 daily values for the automatic strategy; fewer than 1000 for the non-automatic one). Goodness-of-fit results showed a better adjustment for the automatic sensors. Both model outputs adequately predicted the general trend of θ and h, but with slight differences in computed annual drainage (711 mm and 774 mm). Soil hydraulic properties were inversely estimated from both data acquisition systems. The major differences were obtained in the saturated volumetric water content, θs, and in the n and α van Genuchten shape parameters. Saturated hydraulic conductivity, Ks, showed lower variability, with a coefficient of variation ranging from 0.13 to 0.24 for the soil layers defined. Soil hydraulic properties were better assessed through automatic data acquisition, as data variability was lower and accuracy was higher.
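
The van Genuchten parameters mentioned above (θs, α and n) enter the water retention function θ(h) = θr + (θs − θr)(1 + |αh|^n)^(1/n − 1). A minimal sketch of this function follows; the parameter values are illustrative, not the ones estimated in the study:

```python
# Minimal sketch of the van Genuchten retention function; parameter values are illustrative.
import numpy as np

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta as a function of pressure head h (negative, in cm)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + np.abs(alpha * h) ** n) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r) * se

h = np.array([-10.0, -100.0, -1000.0])            # pressure heads
print(van_genuchten(h, theta_r=0.05, theta_s=0.40, alpha=0.02, n=1.4))
```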

Relevance:

60.00%

Publisher:

Abstract:

The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low-cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal first goes through an instrumentation amplifier, INA155, which is suitable for low-amplitude signals such as seismic noise, and then through an anti-aliasing filter based on the MAX7404 switched-capacitor filter. The amplified and filtered signal is digitized and processed by an Arduino Due and stored on an SD memory card. Geophonino is configured for continuous recording, with the sampling frequency, the amplitude gain and the recording time defined by the user. The complete prototype is an open-source and open-hardware system. It has been tested by comparing the recorded signals with those obtained with different commercial data recording systems and different kinds of geophones. The results show good correlation between the tested measurements, presenting Geophonino as a low-cost alternative for seismic data recording.
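
For illustration, the sketch below shows how raw samples recorded by such a logger could be converted back into a voltage time series. The 12-bit range, 3.3 V reference, mid-scale bias, gain and sampling frequency are assumptions about a plausible configuration, not the documented Geophonino settings:

```python
# Minimal sketch: convert raw ADC counts from the SD card into a voltage record.
# The 12-bit range, 3.3 V reference, mid-scale bias, gain and sampling rate are assumptions.
import numpy as np

def counts_to_volts(counts, vref=3.3, bits=12, gain=100.0):
    """Convert ADC counts to the voltage at the geophone terminals
    (assumes the analogue front end biases the signal to mid-scale)."""
    volts_at_adc = counts * vref / (2 ** bits - 1)
    return (volts_at_adc - vref / 2) / gain

fs = 250.0                                        # samples per second (user-defined)
counts = np.random.randint(0, 4096, size=1000)    # placeholder for samples read from the SD card
signal = counts_to_volts(counts)
t = np.arange(len(signal)) / fs                   # time axis of the record
```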

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a study on the fire resistance behaviour of tabique walls. The work is based on the experimental analysis of real-scale tabique panels. The walls were made of pine wood with an earth-based mortar finishing. In order to assess the effect of the earth-based mortar thickness on the fire resistance of the wall, three specimens were tested with mortar thicknesses of 15 mm, 10 mm and 5 mm. The earth-based mortar was previously analysed in the laboratory. The wooden structures were built using the traditional tabique technique. The experimental models were tested in a fire-resistance furnace, according to the ISO 834 standard fire. Temperatures were recorded using two data acquisition systems (spot measuring and field measuring). The fire resistance of the test elements is expressed as the time during which the appropriate criteria are satisfied, so that the time before collapse can be predicted, increasing both people and property safety. The results obtained are of great importance as they improve the knowledge of the behaviour of tabique walls subjected to fire conditions. Two performance criteria were verified: the integrity criterion and the insulation criterion.
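
The ISO 834 standard fire referred to above prescribes the nominal furnace temperature curve T(t) = 20 + 345·log10(8t + 1), with t in minutes and T in °C. The short sketch below evaluates it at a few exposure times; the sample times are arbitrary:

```python
# ISO 834 nominal furnace curve: T(t) = 20 + 345*log10(8t + 1), t in minutes, T in deg C.
import numpy as np

def iso834_temperature(t_min):
    """Standard fire gas temperature (deg C) after t_min minutes of exposure."""
    return 20.0 + 345.0 * np.log10(8.0 * t_min + 1.0)

for t in (15, 30, 60, 90):                        # illustrative exposure times
    print(t, "min:", round(iso834_temperature(t), 1), "degC")
```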

Relevance:

60.00%

Publisher:

Abstract:

The aim of analogue model experiments in geology is to simulate structures in nature under specific imposed boundary conditions using materials whose rheological properties are similar to those of rocks in nature. In the late 1980s, X-ray computed tomography (CT) was first applied to the analysis of such models. In early studies only a limited number of cross-sectional slices could be recorded because of the time involved in CT data acquisition, the long cooling periods for the X-ray source and computational capacity. Technological improvements presently allow an almost unlimited number of closely spaced serial cross-sections to be acquired and calculated. Computer visualization software allows a full 3D analysis of every recorded stage. Such analyses are especially valuable when trying to understand complex geological structures, commonly with lateral changes in 3D geometry. Periodic acquisition of volumetric data sets in the course of the experiment makes it possible to carry out a 4D analysis of the model, i.e. 3D analysis through time. Examples are shown of 4D analysis of analogue models that tested the influence of lateral rheological changes on the structures obtained in contractional and extensional settings.

Relevance:

60.00%

Publisher:

Abstract:

Species distribution patterns in planktonic foraminiferal assemblages are fundamental to the understanding of the determinants of their ecology. Until now, data used to identify such distribution patterns was mainly acquired using the standard >150 µm sieve size. However, given that assemblage shell size-range in planktonic foraminifera is not constant, this data acquisition practice could introduce artefacts in the distributional data. Here, we investigated the link between assemblage shell size-range and diversity in Recent planktonic foraminifera by analysing multiple sieve-size fractions in 12 samples spanning all bioprovinces of the Atlantic Ocean. Using five diversity indices covering various aspects of community structure, we found that counts from the >63 µm fraction in polar oceans and the >125 µm elsewhere sufficiently approximate maximum diversity in all Recent assemblages. Diversity values based on counts from the >150 µm fraction significantly underestimate maximum diversity in the polar and surprisingly also in the tropical provinces. Although the new methodology changes the shape of the diversity/sea-surface temperature (SST) relationship, its strength appears unaffected. Our analysis reveals that increasing diversity in planktonic foraminiferal assemblages is coupled with a progressive addition of larger species that have distinct, offset shell-size distributions. Thus, the previously documented increase in overall assemblage shell size-range towards lower latitudes is linked to an expanding shell-size disparity between species from the same locality. This observation supports the idea that diversity and shell size-range disparity in foraminiferal assemblages are the result of niche separation. Increasing SST leads to enhanced surface water stratification and results in vertical niche separation, which permits ecological specialisation. Specific deviations from the overall diversity and shell-size disparity latitudinal pattern are seen in regions of surface-water instability, indicating that coupled shell-size and diversity measurements could be used to reconstruct water column structures of past oceans.
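
As an illustration of how diversity can be computed per sieve fraction, the sketch below evaluates the Shannon index from species counts. The counts are hypothetical, and the paper uses five indices rather than this one alone:

```python
# Minimal sketch of one common diversity index (Shannon H) from species counts.
# The counts per sieve fraction below are hypothetical.
import numpy as np

def shannon(counts):
    """Shannon diversity H = -sum(p * ln p) over species with non-zero counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

counts_63um = [120, 80, 45, 30, 12, 8, 3, 1]      # assemblage counted in the >63 um fraction
counts_150um = [95, 60, 20, 5]                    # same sample, >150 um fraction only
print(shannon(counts_63um), shannon(counts_150um))
```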

Relevance:

60.00%

Publisher:

Abstract:

Modeling natural phenomena from 3D information enhances our understanding of the environment. Dense 3D point clouds are increasingly used as highly detailed input datasets. In addition to the capturing techniques of point clouds with LiDAR, low-cost sensors have been released in the last few years, providing access to new research fields and facilitating 3D data acquisition for a broader range of applications. This letter presents an analysis of different speleothem features using 3D point clouds acquired with the gaming device Microsoft® Kinect. We compare the Kinect sensor with terrestrial LiDAR reference measurements using the KinFu pipeline for capturing complete 3D objects (< 4 m³). The results demonstrate the suitability of the Kinect to capture flowstone walls and to derive morphometric parameters of cave features. Although the chosen capturing strategy (KinFu) reveals a high correlation (R² = 0.92) of stalagmite morphometry along the vertical object axis, a systematic overestimation (22% for radii and 44% for volume) is found. The comparison of flowstone wall datasets predominantly shows low differences (mean of 1 mm with 7 mm standard deviation) of the order of the Kinect depth precision. For both objects the major differences occur at strongly varying and curved surface structures (e.g. with fine concave parts).

Relevance:

60.00%

Publisher:

Abstract:

Drill cores are essential for the study of deep-sea sediments and on-land sites because often no suitable outcrop is available or accessible. These cores form the backbone of stratigraphical studies using and combining various dating techniques. Cyclostratigraphy is usually based on fast and inexpensive measurements of physical sediment properties. One indirect but highly valuable proxy for reconstructing the sediment composition and variability is sediment color. However, cracks and other disturbances in sediment cores may dramatically influence the quality of color data retrieved either directly from photospectrometry or derived from core image analysis. Here we present simple but powerful algorithms to extract color data from core images, and focus on routines to exclude cracks from these images. Results are discussed using the example of an ODP core from the Ceara Rise in the Central Atlantic. The crack correction approach presented highly improves the quality of color data and allows the easy incorporation of cracked cores into studies based on core images. This facilitates the quick and inexpensive generation of large color datasets directly from quantified core images, for cyclostratigraphy and other purposes.
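
The core idea of excluding cracks before extracting colour can be sketched as a simple brightness mask. The threshold below is an arbitrary assumption, and the actual routines in the paper may differ:

```python
# Minimal sketch: mask dark crack pixels before averaging colour down-core.
# The brightness threshold is an illustrative assumption.
import numpy as np

def downcore_color(image, dark_threshold=60):
    """Mean RGB per image row, ignoring pixels darker than the threshold (cracks/voids)."""
    brightness = image.mean(axis=2)               # image shape: (rows, cols, 3), rows = depth
    mask = brightness > dark_threshold            # True where sediment, False where crack
    masked = np.where(mask[..., None], image, np.nan)
    return np.nanmean(masked, axis=1)             # one RGB triplet per depth increment

core = np.random.randint(0, 256, size=(500, 80, 3)).astype(float)  # placeholder core image
profile = downcore_color(core)                    # (500, 3) colour record versus depth
```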

Relevance:

60.00%

Publisher:

Abstract:

Narcolepsy with cataplexy is a rare disease with an estimated prevalence of 0.02% in European populations. Narcolepsy shares many features of rare disorders, in particular the lack of awareness of the disease, with serious consequences for healthcare supply. As with other rare diseases, only a few European countries have registered narcolepsy cases in databases of the International Classification of Diseases or in registries of the European health authorities. A promising approach to identifying disease-specific adverse health effects and needs in healthcare delivery in the field of rare diseases is to establish a distributed expert network. A first and important step is to create a database that allows the collection, storage and dissemination of data on narcolepsy in a comprehensive and systematic way. Here, the first prospective web-based European narcolepsy database, hosted by the European Narcolepsy Network, is introduced. The database structure, the standardization of data acquisition and the quality control procedures are described, and an overview is provided of the first 1079 patients from 18 European specialized centres. Owing to its standardization, this continuously growing data pool is highly promising for providing better insight into many unsolved aspects of narcolepsy and related disorders, including a clear phenotype characterization of narcolepsy subtypes, more precise epidemiological data and knowledge of the natural history of narcolepsy, expectations about treatment effects, and identification of post-marketing medication side-effects; it will also contribute to improving clinical trial designs and provide facilities to further develop phase III trials.

Relevance:

60.00%

Publisher:

Abstract:

Federal Highway Administration, Office of Policy Planning, Office of International Program, Washington, D.C.