935 results for Electronic data processing - Distributed processing


Relevance:

100.00%

Publisher:

Abstract:

Access to instrumentation and electronic equipment is vital in analogue circuit design: it makes it possible to carry out the tests and studies needed to ensure that these circuits work properly. The equipment can be divided into excitation instruments, which supply signals to the circuit, and measuring instruments, which measure the signals the circuit generates. This equipment is of great help, but its high price often puts it out of reach. This main drawback makes it necessary to build a low-cost device that can stand in for the real equipment. For a measuring instrument, the low-cost system can be implemented with a hardware unit in charge of acquiring the data and an application running on a computer that analyses the data and presents it on screen. For an excitation instrument, the only task of the hardware is to generate the signals whose configuration the computer has sent. In a real instrument, the device itself must perform all of these actions: acquisition, processing, and presentation of the data. Moreover, modifying or extending the functionality of a traditional instrument is clearly harder than doing so for an application running on a computer. Because a traditional instrument is a closed system, whereas in the proposed setup configuration and data processing are handled by an application, many modifications can be made simply by changing the control software, so their cost is lower. This project aims to implement a hardware system that has the features and performs the functions of the real equipment found in an electronics laboratory, together with an application in charge of controlling the hardware and analysing the acquired signals, whose graphical interface resembles that of the real equipment in order to make it easy to use.
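As an illustration of the split described above, a minimal sketch of the PC-side application follows: the hardware only streams samples, while processing and display run on the computer. The port name, baud rate, and the one-ASCII-value-per-line protocol are assumptions for illustration, not the project's actual design (requires pyserial and matplotlib).

```python
# Minimal sketch: acquire samples from a hypothetical acquisition board
# over a serial link, then process and display them on the PC.
import serial                      # pip install pyserial
import matplotlib.pyplot as plt

PORT, BAUD, N_SAMPLES = "/dev/ttyUSB0", 115200, 1000   # hypothetical settings

with serial.Serial(PORT, BAUD, timeout=1) as link:
    samples = []
    while len(samples) < N_SAMPLES:
        line = link.readline().decode("ascii", "ignore").strip()
        if line:
            samples.append(float(line))   # assumed: one ASCII value per line

# processing and presentation happen on the PC, not on the instrument
plt.plot(samples)
plt.xlabel("sample")
plt.ylabel("voltage (V)")
plt.title("Acquired signal")
plt.show()
```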

Relevance:

100.00%

Publisher:

Abstract:

The Internet of Things is a new communication paradigm that extends the virtual world (the Internet) into the real world through interfaces to, and interaction among, objects. It will comprise a very large number of interconnected heterogeneous devices and is expected to generate a large volume of data. One of the important challenges for its development is storing and processing this large volume of data within acceptable time bounds. This research addresses that challenge by introducing analysis and pattern recognition services into the lower layers of the reference model for the Internet of Things, seeking to reduce the processing load on the upper layers. The research analysed reference models for the Internet of Things and platforms for developing applications in this context. The implemented architecture extends the LinkSmart middleware with a pattern recognition module that provides algorithms for value estimation, outlier detection, and group discovery over raw data coming from data sources. The new module was integrated with the Hadoop Big Data platform and uses the algorithm implementations of the Mahout framework. This work highlights the importance of the cross-layer communication integrated into this new architecture. The experiments carried out in the research used real data sets from the Smart Santander project in order to validate the new IoT architecture, integrated with the analysis and pattern recognition services and with the cross-layer communication.
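The sketch below illustrates, in plain Python rather than the Mahout/Hadoop stack the work actually uses, the three services the pattern recognition module provides over raw sensor data: value estimation, outlier detection, and group discovery. The readings, thresholds, and cluster count are made-up examples.

```python
import numpy as np

readings = np.array([21.0, 21.4, 20.9, 21.2, 35.0, 21.1, 20.8, 21.3])

# value estimation: smooth the stream with a short moving average
window = 3
estimate = np.convolve(readings, np.ones(window) / window, mode="valid")

# outlier detection: flag readings far from a robust centre (median/MAD)
med = np.median(readings)
mad = np.median(np.abs(readings - med))
outliers = np.abs(readings - med) > 5 * mad      # flags the 35.0 reading

# group discovery: naive 1-D k-means with k = 2
centres = np.array([readings.min(), readings.max()])
for _ in range(10):
    labels = np.abs(readings[:, None] - centres[None, :]).argmin(axis=1)
    centres = np.array([readings[labels == c].mean() for c in (0, 1)])

print(estimate, outliers, labels)
```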

Relevance:

100.00%

Publisher:

Abstract:

In recent decades, noise pollution has become a major problem for society, and industry has therefore increased its efforts to reduce noise emission. To do so, it is important to locate which parts of a sound source emit the most acoustic energy; knowing the emission points is necessary to control them and thus reduce the environmental acoustic impact. Techniques such as beamforming and Near-Field Acoustic Holography (NAH) make it possible to obtain acoustic images. These images are obtained using an array of microphones placed at some distance from the noise-emitting source; once the experimental data have been acquired, the location and magnitude of the main noise emission points can be determined. Being general-purpose tools, these techniques also help to locate aeroacoustic and vibroacoustic sources, which usually operate in different emission frequency ranges. Recently, the Kronecker array transform was developed for microphone arrays; it provides a significant reduction in computational cost when applied to several image reconstruction methods, provided that the microphones are distributed in a separable array. This master's project proposes to carry out measurements with real signals, evaluating several algorithms previously developed in a doctoral thesis with respect to the quality of the results and the computational complexity, and to develop alternatives for handling the data when some microphones in the array are defective. To reduce the impact of microphone failures while keeping the array separable, an alternative was developed that allows the fast algorithms to be used by discarding only the defective microphones, so that the final results are obtained taking all the microphones of the array into account.
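The computational saving exploited by the Kronecker array transform comes from separability: for a microphone grid, the array matrix factorises as a Kronecker product, so products with it can be computed with two small matrices instead of one large one. A minimal NumPy sketch of this identity follows; the dimensions and random matrices are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
Mx, My, Nx, Ny = 8, 8, 16, 16
Ax = rng.standard_normal((Mx, Nx)) + 1j * rng.standard_normal((Mx, Nx))
Ay = rng.standard_normal((My, Ny)) + 1j * rng.standard_normal((My, Ny))
q = rng.standard_normal(Nx * Ny) + 1j * rng.standard_normal(Nx * Ny)

# direct product with the full (Mx*My) x (Nx*Ny) matrix
full = np.kron(Ax, Ay) @ q

# same result via the factors: cost O(Mx*Nx*Ny + Mx*My*Ny)
# instead of O(Mx*My*Nx*Ny)
Q = q.reshape(Nx, Ny)
fast = (Ax @ Q @ Ay.T).reshape(-1)

print(np.allclose(full, fast))  # True
```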

Relevance:

100.00%

Publisher:

Abstract:

Subsidence is a natural hazard that affects wide areas of the world, causing significant economic losses every year. This phenomenon has occurred in the metropolitan area of Murcia City (SE Spain) as a result of groundwater overexploitation. In this work, aquifer-system subsidence is investigated using an advanced differential SAR interferometry remote sensing technique (A-DInSAR) called Stable Point Network (SPN). The SPN-derived displacement results, mainly the displacement velocity maps and the displacement time series, reveal that in the period 2004–2008 the rate of subsidence in the Murcia metropolitan area doubled with respect to the previous period from 1995 to 2005. The acceleration of the deformation phenomenon is explained by the drought period that started in 2006. The comparison of the temporal evolution of the displacements measured with the extensometers and with the SPN technique shows an average absolute error of 3.9±3.8 mm. Finally, the results of a finite element model developed to simulate the recorded subsidence time history from known water-table height changes compare well with the SPN displacement time series estimates. This demonstrates the potential of A-DInSAR techniques for validating subsidence prediction models, as an alternative to using ground-based instrumental techniques for validation.
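For concreteness, the extensometer-versus-SPN comparison reduces to an average absolute error between two displacement time series, as in the sketch below; the numbers are illustrative, not project data.

```python
import numpy as np

extensometer = np.array([0.0, -2.1, -4.0, -6.3, -8.2])  # cumulative mm
spn          = np.array([0.5, -1.8, -4.6, -5.9, -8.8])  # cumulative mm

abs_err = np.abs(spn - extensometer)
print(f"average absolute error: {abs_err.mean():.1f} ± {abs_err.std():.1f} mm")
```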

Relevance:

100.00%

Publisher:

Abstract:

3D sensors provide valuable information for mobile robotic tasks such as scene classification or object recognition, but they often produce noisy data that makes it impossible to apply classical keypoint detection and feature extraction techniques. Noise removal and downsampling have therefore become essential steps in 3D data processing. In this work, we propose the use of a 3D filtering and downsampling technique based on a Growing Neural Gas (GNG) network. The GNG method is able to deal with outliers present in the input data and can represent 3D spaces by obtaining an induced Delaunay triangulation of the input space. Experiments show how state-of-the-art keypoint detectors improve their performance when the GNG output representation is used as input data, and descriptors extracted at the improved keypoints achieve better matching in robotics applications such as 3D scene registration.
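A minimal sketch of the GNG idea follows: nodes grow and adapt to the input cloud, and their final positions form the filtered, downsampled representation. The hyperparameter values are illustrative assumptions, not the paper's, and pruning of isolated nodes is omitted for brevity.

```python
import numpy as np

def gng_downsample(points, max_nodes=100, iters=20000,
                   eps_b=0.05, eps_n=0.005, lam=100,
                   a_max=50, alpha=0.5, d=0.995, seed=0):
    """Fit GNG nodes to a noisy (N, 3) point cloud; the node positions
    are the filtered, downsampled representation."""
    rng = np.random.default_rng(seed)
    nodes = [points[rng.integers(len(points))].astype(float).copy()
             for _ in range(2)]
    error = [0.0, 0.0]
    edges = {}                                    # (i, j) with i < j -> age

    def key(i, j):
        return (i, j) if i < j else (j, i)

    for t in range(1, iters + 1):
        x = points[rng.integers(len(points))]
        dists = [float(np.sum((n - x) ** 2)) for n in nodes]
        s1, s2 = map(int, np.argsort(dists)[:2])  # two nearest nodes
        error[s1] += dists[s1]
        nodes[s1] += eps_b * (x - nodes[s1])      # adapt the winner...
        for (i, j) in list(edges):
            if s1 in (i, j):
                edges[(i, j)] += 1                # ...age its edges...
                other = j if i == s1 else i
                nodes[other] += eps_n * (x - nodes[other])  # ...and neighbours
        edges[key(s1, s2)] = 0                    # (re)connect the winners
        edges = {e: a for e, a in edges.items() if a <= a_max}  # prune old edges
        if t % lam == 0 and len(nodes) < max_nodes:
            q = int(np.argmax(error))             # node with most error
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: error[n])   # worst neighbour
                nodes.append(0.5 * (nodes[q] + nodes[f]))
                error[q] *= alpha
                error[f] *= alpha
                error.append(error[q])
                r = len(nodes) - 1
                edges.pop(key(q, f), None)
                edges[key(q, r)] = 0
                edges[key(f, r)] = 0
        error = [e * d for e in error]            # global error decay
    return np.array(nodes)
```

Usage would be along the lines of `filtered = gng_downsample(cloud, max_nodes=500)`, with keypoint detection then run on the filtered cloud.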

Relevance:

100.00%

Publisher:

Abstract:

The Gothic church of Santas Justa and Rufina (fourteenth century) has suffered several physical, mechanical, chemical, and biochemical pathologies throughout its history: rock alveolization, efflorescence, biological activity, and capillary rise of groundwater. During the last two decades, however, a new phenomenon has seriously affected the church: ground subsidence caused by aquifer overexploitation. Subsidence is a process that affects the whole Vega Baja of the Segura River basin and consists of the gradual sinking of the ground surface caused by soil consolidation due to a decrease in pore pressure. This phenomenon has been studied with differential synthetic aperture radar interferometry techniques, which show settlements of up to 100 mm across the whole city of Orihuela for the 1993–2009 period. Although no differential synthetic aperture radar interferometry information is available for the church itself, due to the loss of interferometric coherence, the spatial analysis of nearby deformation combined with fieldwork has advanced the current understanding of the mechanisms that affect the Santas Justa and Rufina church. These results show both the potential and the limitations of using this remote sensing technique as a complementary tool for the forensic analysis of building structures.

Relevance:

100.00%

Publisher:

Abstract:

Federal Highway Administration, Structures and Applied Mechanics Division, Washington, D.C.

Relevance:

100.00%

Publisher:

Abstract:

"Papers presented at the 35th annual Clinic on Library Applications of Data Processing, March 22-24, 1998."

Relevance:

100.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, East Liberty, Ohio

Relevance:

100.00%

Publisher:

Abstract:

Thesis (M. S.)--University of Illinois at Urbana-Champaign.

Relevance:

100.00%

Publisher:

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining the appropriate data quickly increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data; in contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted-average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries.
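As a rough illustration of the complexity measure, the sketch below computes Halstead program length, difficulty, and effort for an SQL query string; the operator list and the tokenizer are simplifying assumptions, not the study's instrument.

```python
import math
import re

SQL_OPERATORS = {
    "select", "from", "where", "join", "on", "group", "by", "having",
    "order", "and", "or", "not", "=", "<", ">", "<=", ">=", "<>",
    "(", ")", ",", ".", "*",
}

def halstead(query):
    # crude tokenizer: identifiers, numbers, and punctuation/operators
    tokens = re.findall(r"[A-Za-z_][A-Za-z_0-9]*|\d+|<=|>=|<>|[=<>(),.*]",
                        query.lower())
    operators = [t for t in tokens if t in SQL_OPERATORS]
    operands = [t for t in tokens if t not in SQL_OPERATORS]
    n1, n2 = len(set(operators)), len(set(operands))   # distinct counts
    N1, N2 = len(operators), len(operands)             # total counts
    length = N1 + N2                                   # program length N
    volume = length * math.log2(n1 + n2)               # V = N log2(n)
    difficulty = (n1 / 2) * (N2 / max(n2, 1))          # D = (n1/2)(N2/n2)
    effort = difficulty * volume                       # E = D * V
    return {"length": length, "difficulty": difficulty, "effort": effort}

print(halstead("SELECT name FROM employee WHERE dept = 10 AND salary > 50000"))
```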

Relevance:

100.00%

Publisher:

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures, standardized protocols, training, and quality control are steps towards optimizing data quality; quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
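The two sensitivity analyses described above amount to re-estimating a summary statistic after excluding low-scoring populations and after weighting by the scores, as in this illustrative sketch (the rates, scores, and cut-off are made-up examples):

```python
import numpy as np

rates  = np.array([4.1, 3.6, 5.2, 2.9])   # e.g. event rates per population
scores = np.array([0.9, 0.5, 0.8, 0.3])   # quality scores in [0, 1]

plain = rates.mean()                              # all populations, unweighted
excluded = rates[scores >= 0.6].mean()            # drop low-quality populations
weighted = np.average(rates, weights=scores)      # weight by quality score

print(f"unweighted {plain:.2f}, after exclusion {excluded:.2f}, "
      f"quality-weighted {weighted:.2f}")
```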

Relevance:

100.00%

Publisher:

Abstract:

Although managers consider accurate, timely, and relevant information critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein investigated and tracked data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the initiative identified non-trivial errors, the participating organisation introduced actions to correct them and to prevent similar errors in the future. Data quality metrics were taken quarterly to measure the improvements resulting from the activities undertaken during the project. The results indicated that, for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary, and communication among all stakeholders is required to ensure a common understanding of data quality improvement goals. The project also found that substantial further improvements in data quality sometimes require structural changes within the organisation and to its information systems. The major goal of the study is to increase the level of data quality awareness within organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.