887 results for DATA-ACQUISITION SYSTEM
Abstract:
The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially-available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially-available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an 80,000 programme to implement the system proposed by the author.
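As a rough illustration of the proposed arrangement, the sketch below assembles a two-dimensional iso-lux data array from a one-dimensional detector array swept along a single mechanical axis. The names, array sizes and the dummy read-out function are assumptions for illustration only, not details of the prototype.

```python
import numpy as np

# Assumed geometry: a vertical line of detectors sampled once per horizontal scan step.
N_DETECTORS = 64   # detectors in the one-dimensional array (vertical axis)
N_STEPS = 256      # horizontal positions of the single mechanical scan axis

def read_detector_array(step):
    """Placeholder for one synchronous read of all detectors at a scan position."""
    # In a real rig this would come from the data-acquisition hardware.
    return np.random.rand(N_DETECTORS)  # dummy illuminance values (lux)

def acquire_iso_lux_map():
    """Build the 2D illuminance array: one column per horizontal scan step."""
    beam = np.zeros((N_DETECTORS, N_STEPS))
    for step in range(N_STEPS):
        beam[:, step] = read_detector_array(step)
    return beam

beam_map = acquire_iso_lux_map()
# Points above a chosen iso-lux level, e.g. 10 lux, form one contour region.
contour_10lux = beam_map >= 10.0
```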
Abstract:
This paper addresses the problem of obtaining complete, detailed reconstructions of textureless shiny objects. We present an algorithm which uses silhouettes of the object, as well as images obtained under changing illumination conditions. In contrast with previous photometric stereo techniques, ours is not limited to a single viewpoint but produces accurate reconstructions in full 3D. A number of images of the object are obtained from multiple viewpoints, under varying lighting conditions. Starting from the silhouettes, the algorithm recovers camera motion and constructs the object's visual hull. This is then used to recover the illumination and initialize a multiview photometric stereo scheme to obtain a closed surface reconstruction. There are two main contributions in this paper: First, we describe a robust technique to estimate light directions and intensities and, second, we introduce a novel formulation of photometric stereo which combines multiple viewpoints and, hence, allows closed surface reconstructions. The algorithm has been implemented as a practical model acquisition system. Here, a quantitative evaluation of the algorithm on synthetic data is presented together with complete reconstructions of challenging real objects. Finally, we show experimentally how, even in the case of highly textured objects, this technique can greatly improve on correspondence-based multiview stereo results.
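The multiview formulation in the paper is more involved, but the classical single-view building block of photometric stereo, recovering a scaled normal per pixel by least squares from intensities observed under known light directions, can be sketched briefly. The function below is a generic illustration, not the authors' implementation.

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Estimate per-pixel surface normals and albedo by least squares under a
    Lambertian model.
    intensities: (m, n_pixels) image intensities under m lighting conditions.
    light_dirs:  (m, 3) light directions with their intensities folded in."""
    # Solve light_dirs @ g = intensities for g = albedo * normal (per pixel).
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)  # (3, n_pixels)
    albedo = np.linalg.norm(g, axis=0)
    normals = g / np.maximum(albedo, 1e-12)
    return normals, albedo
```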
Abstract:
We present novel terahertz (THz) emitting, optically pumped quantum dot (QD) photoconductive (PC) materials and antenna structures based on them, for both pulsed and CW pumping regimes. Quantum dot and microantenna design - Presented here are design considerations for the semiconductor materials in our novel QD-based photoconductive antenna (PCA) structures, metallic microantenna designs, and their implementation as part of a complete THz source or transceiver system. Layers of implanted QDs can be used as the photocarrier lifetime shortening mechanism [1,2]. In our research we use InAs:GaAs QD structures of varying dot layer number and distributed Bragg reflector (DBR) reflectivity range. According to the observed dependence of carrier lifetimes on QD layer periodicity [3], it is reasonable to assume that electron lifetimes can potentially be reduced to 0.45 ps in such structures. Both of these features, long excitation wavelength and short carrier lifetime, suggest that QD antennas are feasible for THz generation and detection. In general, relatively simple antenna configurations were used here, including coplanar striplines (CPS), Hertzian-type dipoles and bow-ties for broadband operation, and log-spiral (LS) or log-periodic (LP) 'toothed' geometries for the CW operation regime. Experimental results - Several lasers are used for antenna pumping: a Ti:Sapphire femtosecond laser, as well as single- [4] and double- [5] wavelength and pulsed [6] QD lasers. For detection of the THz signal, different schemes and devices were used, e.g. a helium-cooled bolometer, a Golay cell, and a second PCA for coherent THz detection in a traditional time-domain measurement scheme. Fig. 1 shows the typical THz output power trend from a 5 um-gap LP QD PCA pumped using a tunable QD LD, with the optical pump spectrum shown in (b). Summary - QD-based THz systems have been demonstrated as a feasible and highly versatile solution. The implementation of QD LDs as pump sources could be a major step towards an ultra-compact, electrically controllable transceiver system that would increase the scope of data analysis due to the high pulse repetition rates of such LDs [3], allowing real-time THz TDS and data acquisition. Future steps in the development of such systems lie in the further investigation of QD-based THz PCA structures and devices, particularly with regard to their compatibility with QD LDs as pump sources. [1] E. U. Rafailov et al., "Fast quantum-dot saturable absorber for passive mode-locking of solid-state lasers," IEEE Photon. Technol. Lett., vol. 16, pp. 2439-2441 (2004). [2] E. Estacio et al., "Strong enhancement of terahertz emission from GaAs in InAs/GaAs quantum dot structures," Appl. Phys. Lett., vol. 94, 232104 (2009). [3] C. Kadow et al., "Self-assembled ErAs islands in GaAs: growth and subpicosecond carrier dynamics," Appl. Phys. Lett., vol. 75, pp. 3548-3550 (1999). [4] T. Kruczek, R. Leyman, D. Carnegie, N. Bazieva, G. Erbert, S. Schulz, C. Reardon, and E. U. Rafailov, "Continuous wave terahertz radiation from an InAs/GaAs quantum-dot photomixer device," Appl. Phys. Lett., vol. 101 (2012). [5] R. Leyman, D. I. Nikitichev, N. Bazieva, and E. U. Rafailov, "Multimodal spectral control of a quantum-dot diode laser for THz difference frequency generation," Appl. Phys. Lett., vol. 99 (2011). [6] K. G. Wilcox, M. Butkus, I. Farrer, D. A. Ritchie, A. Tropper, and E. U. Rafailov, "Subpicosecond quantum dot saturable absorber mode-locked semiconductor disk laser," Appl. Phys. Lett., vol. 94, 2511. © 2014 IEEE.
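For the CW (photomixing) regime, the emitted frequency is set by the difference between the two pump wavelengths. A minimal worked example, using assumed wavelengths rather than the values reported in the paper:

```python
# Difference-frequency estimate for CW THz generation from a dual-wavelength pump.
c = 2.998e8                        # speed of light, m/s
lam1, lam2 = 1.260e-6, 1.266e-6    # assumed pump wavelengths, m (illustrative only)
f_thz = c * abs(1.0 / lam1 - 1.0 / lam2)
print(f"Difference frequency: {f_thz / 1e12:.2f} THz")  # ~1.1 THz for these values
```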
Abstract:
The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving real-world applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work that needs to be done to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value citizen weather data brings to longstanding professional observing networks.
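One way to picture the bias-learning step is a conjugate Gaussian update of each station's calibration bias from its residuals against an interpolated professional reference. The class below is a minimal sketch of that idea under assumed prior and noise variances; it is not the project's actual quality control code.

```python
import numpy as np

class StationBias:
    """Recursive Gaussian estimate of a station's calibration bias (a sketch of
    the Bayesian idea in the abstract; prior and noise values are assumptions)."""
    def __init__(self, prior_mean=0.0, prior_var=1.0, obs_var=0.5):
        self.mean, self.var = prior_mean, prior_var
        self.obs_var = obs_var  # variance of (citizen obs - interpolated reference)

    def update(self, citizen_temp, reference_temp):
        residual = citizen_temp - reference_temp     # noisy observation of the bias
        k = self.var / (self.var + self.obs_var)     # Kalman-style gain
        self.mean += k * (residual - self.mean)
        self.var *= (1.0 - k)

    def correct(self, citizen_temp):
        # Bias-corrected value plus an uncertainty estimate for the observation.
        return citizen_temp - self.mean, np.sqrt(self.var + self.obs_var)

station = StationBias()
for obs, ref in [(21.4, 20.9), (18.2, 17.8), (15.1, 14.5)]:
    station.update(obs, ref)
print(station.correct(22.0))  # corrected observation and its uncertainty
```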
Abstract:
The aim of our research in the first half of 2011 was to identify the administrative, regulatory and other problems that are specific obstacles to local economic development, and the best practices that can be found in local economies. During our research we carried out interviews with leaders and local professionals in five medium-sized Hungarian towns. We found that the most obstructive factors were shortcomings in vocational training, excessively bureaucratic administrative proceedings (supplying of data, acquisition of authority permits, the attitude of the authorities, etc.) and the system of applying for and finding sources of funding. We think that the most innovative solutions are the good examples of institutionalized co-operation between local governments and local businesses. We have come to the conclusion that there is a need to reduce administrative burdens for the sake of local economic development.
Abstract:
Governmental accountability is the requirement of government entities to be accountable to the citizenry in order to justify the raising and expenditure of public resources. The concept of service efforts and accomplishments measurement for government programs was introduced by the Governmental Accounting Standards Board (GASB) in Service Efforts and Accomplishments Reporting: Its Time Has Come (1990). This research tested the feasibility of implementing the concept for the Federal-aid highway construction program and identified factors affecting implementation with a case study of the District of Columbia. Changes in condition and performance ratings for specific highway segments in 15 projects, before and after construction expenditures, were evaluated using data provided by the Federal Highway Administration. The results of the evaluation indicated difficulty in drawing conclusions on the state program performance as a whole. The state program reflects problems within the Federally administered program that severely limit implementation of outcome-oriented performance measurement. Major problems identified with data acquisition are data reliability, availability, compatibility and consistency among states. Other significant factors affecting implementation are institutional barriers and political barriers. Institutional issues in the Federal Highway Administration include the lack of integration of the fiscal project-specific database with the Highway Performance Monitoring System database. The Federal Highway Administration has the ability to resolve both of these data problems; however, interviews with key Federal informants indicate this will not occur without external directives and changes to the Federal “stewardship” approach to program administration. The findings indicate many issues must be resolved for successful implementation of outcome-oriented performance measures in the Federal-aid construction program. The issues are organizational and political in nature; however, in the current environment resolution is possible. Additional research is desirable and would be useful in overcoming the obstacles to successful implementation.
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system, establishing that segregating the data adaptation and prediction processes augments the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast convergent adaptation process is deployed, data reduction rates are significantly improved. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
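As a stand-in for the prediction side of such a data reduction scheme, the sketch below reports a sample to the sink only when it deviates from the last reported value by more than a threshold. The hold-last-value model and the threshold are assumptions for illustration, not the dissertation's scheme.

```python
def reduce_stream(samples, threshold=0.5):
    """Transmit a sample only when it deviates from the last reported value by
    more than `threshold`; the sink reconstructs the stream by holding the last
    reported value (a simple stand-in for a learned prediction model)."""
    last_reported = samples[0]
    transmitted = [(0, last_reported)]        # sink is seeded with the first sample
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last_reported) > threshold:
            transmitted.append((i, x))        # send the raw sample to the sink
            last_reported = x
    return transmitted

readings = [20.0, 20.1, 20.2, 21.5, 21.6, 23.0]
print(reduce_stream(readings))  # [(0, 20.0), (3, 21.5), (5, 23.0)]
```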
Abstract:
As massive data sets become increasingly available, people are facing the problem of how to effectively process and understand these data. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, both due to the large size of the data sets and their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first one deals with processing terabytes of raster data in a spatial data management system. Aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques that can be used to handle data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce based on carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
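For reference, the standard multiplicative-update form of nonnegative matrix factorization, whose matrix products are what a MapReduce implementation would distribute, looks like the following single-machine sketch. It is a generic baseline, not one of the dissertation's scaled-up variants.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-9):
    """Lee-Seung multiplicative updates for V ≈ W H (single-machine sketch; the
    large matrix products below are the parts a MapReduce version distributes)."""
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H
```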
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes, abnormal conditions, and stability and security information. These developments provide valuable measurements for technical power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed in the system in order to detect abnormal conditions and apply proper remedies to heal the system. A DC microgrid design was developed to integrate it with the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, allowing the use of such an architecture in system operation to help remedy system abnormal conditions to be studied. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage the power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
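A common building block behind DAQ-based phasor measurement is a single-cycle DFT estimate of magnitude and phase from sampled waveform data. The function below is a generic sketch of that technique; the parameter names and sample rates are assumptions, not the dissertation's configuration.

```python
import numpy as np

def estimate_phasor(samples, samples_per_cycle):
    """Single-cycle DFT phasor estimate: RMS magnitude and angle (degrees) of the
    fundamental, computed over the most recent cycle of waveform samples."""
    window = np.asarray(samples, dtype=float)[-samples_per_cycle:]
    n = np.arange(samples_per_cycle)
    phasor = (np.sqrt(2) / samples_per_cycle) * np.sum(
        window * np.exp(-1j * 2 * np.pi * n / samples_per_cycle))
    return np.abs(phasor), np.degrees(np.angle(phasor))

# Example: a 50 Hz cosine, peak 100, phase 30 degrees, sampled 32 times per cycle.
t = np.arange(64) / (50 * 32)
wave = 100 * np.cos(2 * np.pi * 50 * t + np.pi / 6)
print(estimate_phasor(wave, 32))  # ~ (70.7, 30.0): RMS magnitude and angle
```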
Abstract:
The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for their optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and secure real-time control and operation of hybrid power systems. To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system that is equipped with a state estimation scheme fed from the real-time data. This system is verified on a specially developed laboratory-based test bed facility, serving as a hardware and software platform, to emulate the actual scenarios of a real hybrid power system with the highest possible similarity and capability to practical utility systems. It includes phasor measurements at hundreds of measurement points on the system. These measurements were obtained from an especially developed laboratory-based Phasor Measurement Unit (PMU) that is utilized in addition to existing commercial PMUs; the developed PMU was used in conjunction with the interconnected system along with the commercial PMUs. The studies tested included a new technique for detecting partially islanded microgrids, in addition to several real-time techniques for synchronization and parameter identification of hybrid systems. Moreover, owing to the numerous integrations of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems. Furthermore, the increasing number of small, dispersed generating stations and their need to connect quickly and properly to the AC grid led this work to explore the challenges that arise in synchronizing generators to the grid and to introduce a Dynamic Brake system to improve the process of connecting distributed generators to the power grid. Real-time operation and control require data communication security. A research effort in this dissertation was developed based on a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security aspect of the power grid as a cyber-physical system. It is based on available GPS synchronization technology and provides protection against confidentiality attacks in critical power system infrastructures.
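State estimation from SCADA/PMU data is typically built on weighted least squares. The minimal linear version below is a textbook building block, sketched for illustration only; a full estimator for the nonlinear power flow equations would iterate a step of this form.

```python
import numpy as np

def wls_state_estimate(H, z, meas_var):
    """Linear weighted-least-squares estimate of the state x from measurements
    z = H x + e, weighting each measurement by the inverse of its variance."""
    R_inv = np.diag(1.0 / np.asarray(meas_var, dtype=float))
    G = H.T @ R_inv @ H                      # gain matrix
    return np.linalg.solve(G, H.T @ R_inv @ z)

# Example: two redundant measurements of a single state value.
H = np.array([[1.0], [1.0]])
z = np.array([0.98, 1.02])
print(wls_state_estimate(H, z, meas_var=[0.01, 0.04]))  # weighted toward 0.98
```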
Abstract:
Motion capture is a main tool for quantitative motion analysis. Since the nineteenth century, several motion capture systems have been developed for biomechanics studies, animation, games and movies. Biomechanics and kinesiology involve and depend on knowledge from distinct fields, namely engineering and the health sciences, and precise human motion analysis requires knowledge from both. Didactic tools and methods are therefore needed to aid research, teaching and learning. The motion analysis and capture devices currently found on the market and in educational institutions present difficulties for didactic practice: they are hard to transport, expensive, and give the user limited freedom over data acquisition. As a result, motion analysis is either performed qualitatively or performed quantitatively in highly complex laboratories. Based on these problems, this work presents the development of a motion capture system for didactic use: a cheap, light, portable and easy-to-use device with free software. The design includes the selection of the device, the development of software for it, and tests. The developed system uses Microsoft's Kinect device for its low cost, low weight, portability and ease of use, and it delivers three-dimensional data with only one peripheral device. The proposed programs use the hardware to capture motion, store and replay the recordings, process the motion data and present it graphically.
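A typical quantity when processing captured motion data is the angle at a joint computed from three tracked 3-D positions. The function below is a generic geometric sketch; it is not tied to any particular Kinect SDK call or to the project's own programs.

```python
import numpy as np

def joint_angle(p_proximal, p_joint, p_distal):
    """Angle (degrees) at a joint from three 3-D joint positions, e.g. the knee
    angle from hip, knee and ankle points of a tracked skeleton."""
    v1 = np.asarray(p_proximal, dtype=float) - np.asarray(p_joint, dtype=float)
    v2 = np.asarray(p_distal, dtype=float) - np.asarray(p_joint, dtype=float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example with made-up hip, knee and ankle coordinates (metres):
print(joint_angle([0.0, 1.0, 0.0], [0.0, 0.5, 0.05], [0.0, 0.0, 0.0]))  # ~169°, nearly straight leg
```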
Abstract:
The reduction of energy consumption is the main requirement to be satisfied in refrigeration and air conditioning by mechanical vapor compression systems, and automotive systems are no different. Thermal analysis of these systems is crucial for better performance of automotive air conditioners. This work aims to evaluate the operating conditions of the R134a refrigerant (used in vehicles) and compare it with R437A (an alternative refrigerant), varying the speed of the electric fan in the evaporator. All tests were performed on the ATR600 automotive air conditioning unit, simulating the thermal conditions of the system. The equipment is instrumented for data acquisition of temperature, condensation and evaporation pressures, and electrical power consumed, in order to determine the coefficient of performance of the cycle. The system was tested at rotations of 800, 1600 and 2400 rpm with a constant charge of R134a, and then under the same conditions with R437A, both as recommended by the manufacturer. The results show that the best system performance occurs at 800 rpm for both refrigerants.
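The coefficient of performance follows from the logged quantities as cooling capacity divided by electrical power consumed. The numbers in the short example below are illustrative assumptions, not the test results.

```python
# Coefficient of performance from logged quantities (illustrative values only).
m_dot = 0.03            # refrigerant mass flow rate, kg/s (assumed)
h_evap_in = 255.0       # enthalpy at evaporator inlet, kJ/kg (assumed)
h_evap_out = 400.0      # enthalpy at evaporator outlet, kJ/kg (assumed)
electrical_power = 1.5  # compressor electrical power from the data logger, kW (assumed)

cooling_capacity = m_dot * (h_evap_out - h_evap_in)   # kW
cop = cooling_capacity / electrical_power
print(f"Cooling capacity: {cooling_capacity:.2f} kW, COP: {cop:.2f}")
```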
Abstract:
A large series of laboratory ice crushing experiments was performed to investigate the effects of external boundary condition and indenter contact geometry on ice load magnitude under crushing conditions. Four boundary conditions were considered: dry cases, submerged cases, and cases with the presence of snow and granular ice material on the indenter surface. Indenter geometries were a flat plate, a wedge-shaped indenter, a (reverse) conical indenter, and a spherical indenter. These were impacted with artificially produced ice specimens of conical shape with 20° and 30° cone angles. All indenter-ice combinations were tested in dry and submerged environments at 1 mm/s and 100 mm/s indentation rates. Additional tests with the flat indentation plate were conducted at 10 mm/s impact velocity, and a subset of scenarios with snow and granular ice material was evaluated. The tests were performed using a material testing system (MTS) machine located inside a cold room at an ambient temperature of -7 °C. Data acquisition comprised time, vertical force, and displacement. In several tests with the flat plate and wedge-shaped indenter, supplementary information on local pressure patterns and contact area was obtained using tactile pressure sensors. All tests were recorded with a high-speed video camera and still photos were taken before and after each test. Thin sections were taken of some specimens as well. Ice loads were found to strongly depend on contact condition, interrelated with pre-existing confinement and indentation rate. Submergence yielded higher forces, especially at the high indentation rate. This was very evident for the flat indentation plate and spherical indenter, and with restrictions for the wedge-shaped indenter. No such indication was found for the conical indenter, for which it was concluded that the structural restriction due to the indenter geometry was dominant: the working surface for the water to act on was not sufficient to influence the failure processes and associated ice loads. The presence of snow and granular ice significantly increased the forces at the low indentation rate (with the flat indentation plate); these were higher than in the submerged cases and far above the dry contact condition. Contact area measurements revealed a correlation of higher forces with a concurrent increase in actual contact area that depended on the respective boundary condition. In submergence, the constitution of the ice debris was changed; ice extrusion, as well as crack development and propagation, were impeded. Snow and granular ice seemed to provide additional material sources for establishing larger contact areas. The dry contact condition generally had the smallest real contact area, as well as the lowest forces. The comparison of nominal and measured contact areas revealed distinct deviations. Incorporating those differences into process pressure-area relationships indicated that the overall process pressure was not substantially affected by the increased loads.
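The final comparison amounts to dividing the recorded force by the nominal and the measured contact areas. A short worked example with assumed values (not the experimental data) is given below.

```python
# Process pressure from logged force and contact area (illustrative values only).
force_kN = 45.0            # vertical force from the load record (assumed)
area_nominal_m2 = 0.012    # contact area from the nominal indenter/cone geometry (assumed)
area_measured_m2 = 0.009   # contact area from the tactile pressure sensor (assumed)

p_nominal = force_kN / (area_nominal_m2 * 1e3)    # kN/m^2 -> MPa
p_measured = force_kN / (area_measured_m2 * 1e3)  # kN/m^2 -> MPa
print(f"Nominal: {p_nominal:.2f} MPa, measured: {p_measured:.2f} MPa")
```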