887 results for DATA-ACQUISITION SYSTEM
Abstract:
The project concerns the development of software to control the measurement of the luminous intensity distribution of LED luminaires. The project presents theoretical foundations of basic photometry, from which the basic conditions for performing this measurement are derived. It also gives a brief description of the hardware used to build the machine, which is based on an Arduino Mega 2560 development board; thanks to the LabVIEW package LIFA (LabVIEW Interface For Arduino), the board can be used as a data-acquisition card to drive both sensors and actuators for the control tasks. The measuring instrument used in this project is the BTS256 from Gigahertz-Optik, for which a development kit is available in both C++ and LabVIEW, making it possible to program applications on top of this software and adapt it to the needs of the project. The software is developed on the LabVIEW 2013 platform, which is possible because both the instrument's development kit and the LIFA package are available. The overall objective of the project is to characterize LED luminaires by obtaining sufficient measurements of the luminous intensity distribution. The data are collected in a specific photometric file, following the IESNA 2002 standard for photometric file formats, which is later used to simulate and study real installations of the luminaire. The system proposed in this project is based on Type B photometry with VH coordinates; the measurement algorithm sweeps the luminaire through an angle of 180° on both axes, with a resolution of 5° on the vertical axis and 22.5° on the horizontal axis, storing the data in an array that is then written in the format required by the standard.
Once the data have been obtained with the developed instrument, the file generated by the measurement is simulated with the DIALux software; the illumination values obtained in the simulation are compared with the real measurements, attempting to reproduce the real measurement conditions in the simulation.
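The VH sweep described above can be sketched as a pair of nested angle loops. This is a minimal illustration, not the project's LabVIEW code; `read_intensity` is a hypothetical stand-in for the BTS256 driver call.

```python
# Sketch of the Type B (VH) measurement sweep: 0-180 degrees on both axes,
# 5-degree vertical and 22.5-degree horizontal resolution, one reading per
# (V, H) position, stored row-per-horizontal-angle as IES-style files expect.

def sweep(read_intensity, v_step=5.0, h_step=22.5, span=180.0):
    """Sweep both axes and collect one intensity reading per (V, H) pair."""
    v_angles = [i * v_step for i in range(int(span / v_step) + 1)]   # 0, 5, ..., 180
    h_angles = [i * h_step for i in range(int(span / h_step) + 1)]   # 0, 22.5, ..., 180
    data = [[read_intensity(v, h) for v in v_angles] for h in h_angles]
    return v_angles, h_angles, data
```

With the stated resolutions this yields a 9 x 37 array of readings, which would then be serialized in the IESNA file format.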
Abstract:
The work addresses the application of data reconciliation to the natural gas balance of an unprocessed-gas flow network, and also develops a method for quickly computing the inventory of a pipeline. Volumetric reconciliation at the standard measurement condition and mass reconciliation were applied separately; the results were compared against the original balance, and the resulting energy balance was checked in terms of higher heating value. Two sets of weights were applied: one assigned according to prior knowledge of the quality of the measurement system at each point, the other based on the inverse of the variance of the daily volumes recorded over the period. Both gave good results, and the second was considered the more appropriate. Using a thermodynamic approach, the potential impact on the balance of the condensation of part of the gas phase along the flow, and of the injection of unstabilized natural gas condensate by one of the sources, was evaluated. Both tend to affect the balance, the expected result being a smaller gas-phase volume, mass and energy at the outlet. Other factors with considerable impact on data quality and on the final reconciliation result are the quality of the system's outlet measurement and how representative the gas composition is at that point. The inventory is computed from a regression that assumes steady-state flow, which may show larger deviations when strong transients occur on the last day of the month; however, the inventory variation over the month has little impact on the balance. It was concluded that volumetric reconciliation is the most appropriate for this system, since the reconciled data bring the mass and heating-value energy balances, both for the gas phase, within the expected behavior profile.
Although a zero volumetric balance of the gas phase alone is not by itself the expected behavior once the described effects are considered, building a more robust balance requires accounting for the liquid fractions present in the system, which adds further difficulty to data acquisition and data quality.
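The inverse-variance weighting mentioned above follows the classical weighted-least-squares formulation of data reconciliation: each measured value is adjusted so the balance closes, with the correction shared in proportion to each meter's variance. A minimal single-constraint sketch, not the author's code:

```python
# Reconcile measured flows so that sum(sign_i * value_i) == 0, minimising
# sum(adjustment_i**2 / variance_i). Signs are +1 for inflows, -1 for
# outflows; a larger variance (worse meter) absorbs more of the correction.

def reconcile(values, signs, variances):
    residual = sum(s * v for s, v in zip(signs, values))
    denom = sum(s * s * var for s, var in zip(signs, variances))
    return [v - s * var * residual / denom
            for v, s, var in zip(values, signs, variances)]
```

For example, an inflow of 100 with variance 1 against an outflow of 105 with variance 4 reconciles to 101 on both sides: the poorer meter takes four fifths of the 5-unit imbalance.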
Abstract:
Electrical networks must have high reliability indices, so as to maintain the agility and maintenance needed for optimal operation. On the other hand, unexpected load growth, equipment failures and inadequate parameterization of protection functions make the analysis of protection events more complex and time-consuming. In addition, the amount of information that can be obtained from modern digital relays has been growing steadily. To enable fast decision-making and maintenance, this research project aimed to implement a complete diagnostic system that is activated automatically whenever a protection event occurs. The information to be analyzed is obtained from a database and from protection relays, via the IEC 61850 communication protocol and oscillography files. The work covers the complete Smart Grid system, including: data acquisition from the relays, detailing the communication system developed through one program with an IEC 61850 client and an OPC server and another with an OPC client, which is triggered by configured events (for example, protection operation); the data pre-treatment system, where data from the relays and protection equipment are filtered, pre-processed and formatted; and the diagnostic system. A central database keeps the data from all these stages up to date. The diagnostic system uses conventional algorithms and artificial intelligence techniques, in particular an expert system. The expert system was developed to handle different sets of input data and possible missing data, always guaranteeing that a diagnosis is delivered. Tests and simulations were carried out for short circuits (three-phase, phase-to-phase, double-phase-to-ground and phase-to-ground) on feeders, transformers and busbars of a substation.
These tests included different states of the protection system (correct and improper operation). The system proved fully effective with both complete and partial availability of information, always providing a diagnosis of the short circuit and analyzing the operation of the substation's protection functions. This enables much more efficient maintenance by power utilities, especially regarding the prevention of equipment defects, fast response to problems, and the need to re-parameterize protection functions. The system was successfully installed in a distribution substation of Companhia Paulista de Força e Luz.
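The graceful-degradation behavior described above (always delivering some diagnosis, even with missing inputs) can be sketched as an ordered rule table where each rule declares the fields it needs. The rule names, fields and verdicts below are purely illustrative, not those of the deployed system:

```python
# Hypothetical expert-system pass: fire the first rule whose required
# fields are all present and whose condition holds; fall back to weaker
# rules when data are missing, so a diagnosis is always returned.

RULES = [
    (("relay_trip", "oscillography_fault_type"),
     lambda d: d["relay_trip"] and d["oscillography_fault_type"] == "3ph",
     "three-phase short circuit, protection operated correctly"),
    (("relay_trip",),
     lambda d: d["relay_trip"],
     "short circuit detected; fault type undetermined (partial data)"),
]

def diagnose(data):
    for fields, cond, verdict in RULES:
        if all(f in data for f in fields) and cond(data):
            return verdict
    return "no conclusive diagnosis; manual analysis recommended"
```

Ordering rules from most to least specific is what guarantees an answer under partial information: dropping an input simply shifts the match down the table.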
Abstract:
In this project, we propose the implementation of a 3D object recognition system optimized to operate under demanding time constraints. The system must be robust, so that objects can be recognized properly in poor light conditions and in cluttered scenes with significant levels of occlusion. An important requirement must be met: the system must exhibit reasonable performance running on a low-power mobile GPU computing platform (NVIDIA Jetson TK1), so that it can be integrated into mobile robotics, ambient intelligence or ambient assisted living applications. The acquisition system is based on the color and depth (RGB-D) data streams provided by low-cost 3D sensors such as the Microsoft Kinect or PrimeSense Carmine. The range of algorithms and applications to be implemented and integrated is quite broad, ranging from the acquisition, outlier removal and filtering of the input data, and the segmentation and characterization of regions of interest in the scene, to object recognition and pose estimation themselves. Furthermore, in order to validate the proposed system, we will create a 3D object dataset. It will be composed of a set of 3D models, reconstructed from common household objects, as well as a handful of test scenes in which those objects appear. The scenes will be characterized by different levels of occlusion, diverse distances from the elements to the sensor, and variations in the pose of the target objects. The creation of this dataset implies the additional development of 3D data acquisition and 3D object reconstruction applications. The resulting system has many possible applications, ranging from mobile robot navigation and semantic scene labeling to human-computer interaction (HCI) systems based on visual information.
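The outlier-removal step mentioned in the pipeline is commonly done with statistical outlier removal: discard points whose mean distance to their nearest neighbours is far above the cloud-wide average. A brute-force plain-Python sketch of that generic technique (real pipelines use accelerated libraries such as PCL):

```python
import math
import statistics

def remove_outliers(points, k=2, std_ratio=1.0):
    """Keep points whose mean distance to their k nearest neighbours is
    within mean + std_ratio * stddev of that statistic over the cloud."""
    mean_knn = []
    for p in points:
        ds = sorted(math.dist(p, q) for q in points if q is not p)
        mean_knn.append(sum(ds[:k]) / k)
    mu = statistics.mean(mean_knn)
    sigma = statistics.pstdev(mean_knn)
    return [p for p, d in zip(points, mean_knn) if d <= mu + std_ratio * sigma]
```

The O(n^2) neighbour search here is only for clarity; on a depth stream one would use a k-d tree or the sensor's grid structure.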
Abstract:
Femicide, defined as the killing of females by males because they are females, is becoming recognized worldwide as an important ongoing manifestation of gender inequality. Despite its widespread prevalence, only a few countries have specific registries about this issue. This study aims to assemble expert opinion regarding the strategies which might feasibly be employed to promote, develop and implement an integrated and differentiated femicide data collection system in Europe at both the national and international levels. Concept mapping methodology was followed, involving 28 experts from 16 countries in generating strategies, sorting and rating them with respect to relevance and feasibility. The experts involved were all members of the EU-Cost-Action on femicide, which is a scientific network of experts on femicide and violence against women across Europe. As a result, a conceptual map emerged, consisting of 69 strategies organized in 10 clusters, which fit into two domains: “Political action” and “Technical steps”. There was consensus among participants regarding the high relevance of strategies to institutionalize national databases and raise public awareness through different stakeholders, while strategies to promote media involvement were identified as the most feasible. Differences in perceived priorities according to the level of human development index of the experts’ countries were also observed.
Abstract:
Federal Highway Administration, Office of Safety and Traffic Operations, Washington, D.C.
Abstract:
Virginia Department of Transportation, Richmond
Abstract:
Federal Highway Administration, Office of Safety and Traffic Operations Research and Development, McLean, Va.
Abstract:
Federal Highway Administration, Washington, D.C.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
This paper reviews the key features of an environment to support domain users in spatial information system (SIS) development. It presents a full design and prototype implementation of a repository system for the storage and management of metadata, focusing on a subset of spatial data integrity constraint classes. The system is designed to support spatial system development and customization by users within the domain in which the system will operate.
Abstract:
Objective: An estimation of cut-off points for the diagnosis of diabetes mellitus (DM) based on individual risk factors. Methods: A subset of the 1991 Oman National Diabetes Survey is used, including all patients with a 2-h post-glucose-load value >= 200 mg/dl (278 subjects) and a control group of 286 subjects. All subjects previously diagnosed as diabetic and all subjects with missing data values were excluded. The data set was analyzed with the SPSS Clementine data mining system. Decision tree learners (C5 and CART) and a method for mining association rules (the GRI algorithm) are used. Fasting plasma glucose (FPG), age, sex, family history of diabetes and body mass index (BMI) are the input risk factors (independent variables), while diabetes onset (the 2-h post-glucose-load value >= 200 mg/dl) is the output (dependent variable). All three techniques were tested by cross-validation (89.8%). Results: The rules produced for diabetes diagnosis are: A- the GRI algorithm: (1) FPG >= 108.9 mg/dl, (2) FPG >= 107.1 and age > 39.5 years. B- CART decision trees: FPG >= 110.7 mg/dl. C- the C5 decision tree learner: (1) FPG >= 95.5 and 54, (2) FPG >= 106 and 25.2 kg/m2, (3) FPG >= 106 and =133 mg/dl. The three techniques produced rules which cover a significant number of cases (82%), with confidence between 74 and 100%. Conclusion: Our approach supports the suggestion that the present cut-off value of fasting plasma glucose (126 mg/dl) for the diagnosis of diabetes mellitus needs revision, and that individual risk factors such as age and BMI should be considered in defining the new cut-off value.
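Applying mined cut-off rules of this kind to a new patient is a matter of simple threshold checks. The sketch below encodes only the two GRI rules, whose thresholds are stated fully in the abstract; the function itself is illustrative, not the SPSS Clementine output:

```python
# Apply the two GRI association rules from the abstract:
#   rule 1: FPG >= 108.9 mg/dl
#   rule 2: FPG >= 107.1 mg/dl and age > 39.5 years

def gri_flags(fpg_mg_dl, age_years):
    """Return (rule1_fires, rule2_fires) for one subject."""
    rule1 = fpg_mg_dl >= 108.9
    rule2 = fpg_mg_dl >= 107.1 and age_years > 39.5
    return rule1, rule2
```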
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high-level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
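The data-driven idea described above (blocks as table entries, so parameters and wiring can change at run time without recompilation) can be sketched in a few lines. This is an illustrative Python model only; the original system was written in CORAL 66, and the block names below are invented:

```python
# Each entry: (name, function, input block names, parameters).
# The "run-time" just walks the table in order, so editing a parameter
# or rewiring an input changes behaviour without regenerating code.

blocks = [
    ("setpoint", lambda ins, p: p["value"],       [],                     {"value": 50.0}),
    ("sensor",   lambda ins, p: p["reading"],     [],                     {"reading": 47.0}),
    ("error",    lambda ins, p: ins[0] - ins[1],  ["setpoint", "sensor"], {}),
    ("gain",     lambda ins, p: p["k"] * ins[0],  ["error"],              {"k": 2.0}),
]

def run(blocks):
    """One scan of the block table, in declaration order."""
    values = {}
    for name, fn, inputs, params in blocks:
        values[name] = fn([values[i] for i in inputs], params)
    return values
```

A composite block, in this model, would simply be a sub-table expanded in place by the compiler.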
Abstract:
The explosive growth in microprocessor technology and the increasing use of computers to store information have increased the demand for data communication channels. Because of this, data communication to mobile vehicles is increasing rapidly. In addition, data communication is seen as a method of relieving the current congestion of mobile radio telephone bands in the U.K. Highly reliable data communication over mobile radio channels is particularly difficult to achieve, primarily due to fading caused by multipath interference. In this thesis a data communication system is described for use over radio channels impaired by multipath interference. The thesis first describes radio communication in general, and multipath interference in particular. The practical aspects of fading channels are stressed because of their importance in the development of the system. The current U.K. land mobile radio scene is then reviewed, with particular emphasis on the use of existing mobile radio equipment for data communication purposes. The development of the data communication system is then described. This system is microprocessor based and uses an advanced form of automatic repeat request (ARQ) operation. It can be configured to use either existing radio-telephone equipment, totally new equipment specifically designed for data communication, or any combination of the two. Due to its adaptability, the system can automatically optimise itself for use over any channel, even if the channel parameters are changing rapidly. Results obtained from a particular implementation of the system, which is described in full, are presented. These show how the operation of the system has to change to accommodate changes in the channel. Comparisons are made between the practical results and the theoretical limits of the system.
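The ARQ principle the thesis builds on can be illustrated with the simplest variant, stop-and-wait: retransmit each frame until it is acknowledged. This is a generic sketch of that baseline, not the thesis's more advanced adaptive scheme; `send` and `recv_ack` are hypothetical channel callbacks.

```python
# Stop-and-wait ARQ: send a frame, wait for its acknowledgement, and
# retransmit on failure, up to a retry limit.

def arq_send(frames, send, recv_ack, max_retries=5):
    for seq, frame in enumerate(frames):
        for attempt in range(max_retries):
            send(seq, frame)
            if recv_ack(seq):      # ACK received: move to the next frame
                break
        else:
            raise TimeoutError(f"frame {seq} unacknowledged")
```

On a fading channel the retry rate rises with the error rate, which is why adaptive schemes vary frame length and redundancy as channel conditions change.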
Abstract:
This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although allowing databases to be explored and conclusions drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
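The core of a "matrix of density plots" is a 2-D histogram of record counts for each pair of fields. A tiny sketch of that generic building block (illustrative only; MADEN itself offers far richer interaction, and this version assumes each field spans a non-zero range):

```python
# Bin two fields of a record set into a bins x bins grid of counts,
# the raw material for one cell of a density-plot matrix.

def density_grid(xs, ys, bins=4):
    lo_x, hi_x = min(xs), max(xs)
    lo_y, hi_y = min(ys), max(ys)
    grid = [[0] * bins for _ in range(bins)]
    for x, y in zip(xs, ys):
        i = min(int((x - lo_x) / (hi_x - lo_x) * bins), bins - 1)
        j = min(int((y - lo_y) / (hi_y - lo_y) * bins), bins - 1)
        grid[j][i] += 1  # row = y bin, column = x bin
    return grid
```

Rendering one such grid per field pair, shaded by count, gives the full-database overview in a single window; an "enlargement" view is the same computation with more bins over a selected range.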