22 results for Data quality problems
in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
This paper presents the results of investigations carried out to identify and quantify the power quality problems that result from actions taken to improve the efficiency of electric energy consumption. The efficiencies of several electric devices were evaluated, among them: fluorescent bulbs, electronic ballasts, soft-starters, temperature controllers for showers, dimmers and others. This evaluation made it possible to establish a cause-and-effect analysis of power quality.
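The paper itself publishes no code; as a hedged illustration of how such power quality effects are commonly quantified, the sketch below computes the total harmonic distortion (THD) of a sampled current waveform, the kind of distortion drawn by electronic ballasts and dimmers. The sampling rate and harmonic amplitudes are invented for the example.

```python
import numpy as np

# Synthetic 60 Hz current with 3rd and 5th harmonics, mimicking the
# distorted current drawn by electronic ballasts or dimmers (illustrative).
fs = 7680                                 # Hz: 128 samples per 60 Hz cycle
t = np.arange(0, 1, 1 / fs)
i = (10.0 * np.sin(2 * np.pi * 60 * t)
     + 2.0 * np.sin(2 * np.pi * 180 * t)
     + 1.0 * np.sin(2 * np.pi * 300 * t))

# One-sided amplitude spectrum; with a 1 s window the bins land exactly
# on multiples of 1 Hz, so harmonics of 60 Hz fall on exact bins.
spec = 2 * np.abs(np.fft.rfft(i)) / len(i)
fund_bin = 60                             # bin index of the fundamental
fundamental = spec[fund_bin]
harmonics = spec[2 * fund_bin::fund_bin]  # 2nd, 3rd, ... harmonic amplitudes

thd = np.sqrt(np.sum(harmonics ** 2)) / fundamental
print(f"THD = {100 * thd:.1f}%")          # ~22.4% for this waveform
```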
Abstract:
This work presents a software application developed to process solar radiation data. The software can be used in meteorological and climatic stations, and also to support solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations and validation of models, as well as easy data interchange. (C) 1999 Elsevier B.V. Ltd. All rights reserved.
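The abstract names data quality control as one of the software's functions. Below is a minimal sketch of one such check, a physical-limits test on global horizontal irradiance; the thresholds and function name are illustrative assumptions, not the software's actual rules.

```python
import numpy as np

SOLAR_CONSTANT = 1367.0  # W/m^2

def qc_flags(ghi, cos_zenith):
    """Flag global horizontal irradiance samples that fail a basic
    physical-limits test (one common QC step for solar radiation data;
    the thresholds here are illustrative)."""
    extraterrestrial = SOLAR_CONSTANT * np.clip(cos_zenith, 0.0, None)
    too_low = ghi < -4.0                            # allow small sensor offset
    too_high = ghi > 1.2 * extraterrestrial + 50.0  # above plausible clear sky
    return np.where(too_low | too_high, "fail", "pass")

# Invented sample values: a negative reading and a physically impossible one.
ghi = np.array([-10.0, 250.0, 900.0, 1500.0])
cosz = np.array([0.0, 0.3, 0.8, 0.6])
print(qc_flags(ghi, cosz))   # ['fail' 'pass' 'pass' 'fail']
```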
Abstract:
Nowadays, with the expansion of reference station networks, several positioning techniques have been developed and/or improved. Among them, the VRS (Virtual Reference Station) concept has been widely used. The goal of this paper is to generate VRS data with a modified technique. In the proposed methodology, the DD (double-difference) ambiguities are not computed; the network correction terms are obtained using only atmospheric (ionospheric and tropospheric) models. The experiments used data from five reference stations of the GPS Active Network of the West of São Paulo State plus an extra station. To evaluate the VRS data quality, three different strategies were used: PPP (Precise Point Positioning), Relative Positioning in static and kinematic modes, and DGPS (Differential GPS). Furthermore, the VRS data were generated at the position of a real reference station. The results provided by the VRS data agree quite well with those of the real data file.
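A minimal sketch of the non-DD idea described above: displace a real station's observable to the virtual position and apply only model-derived atmospheric corrections, with no ambiguity fixing. All names are hypothetical placeholders, and the actual ionospheric and tropospheric models the paper uses are not reproduced here.

```python
# Hedged sketch, not the paper's implementation: build a virtual
# pseudorange from a real reference station's observation.

def vrs_observable(master_obs, rho_master, rho_vrs,
                   iono_master, iono_vrs, tropo_master, tropo_vrs):
    """master_obs    : pseudorange observed at the real station (m)
    rho_*            : satellite-receiver geometric ranges (m)
    iono_*, tropo_*  : slant delays from atmospheric models (m)"""
    geometric_shift = rho_vrs - rho_master
    iono_shift = iono_vrs - iono_master      # from a regional ionospheric model
    tropo_shift = tropo_vrs - tropo_master   # e.g. Saastamoinen + mapping function
    return master_obs + geometric_shift + iono_shift + tropo_shift

# Invented numbers, for shape only (all in metres):
print(vrs_observable(21_000_123.45, 21_000_000.0, 21_000_050.0,
                     3.2, 3.4, 2.5, 2.6))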
Abstract:
In geophysics and seismology, raw data need to be processed to generate useful information that researchers can turn into knowledge. The number of sensors acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent querying and preparing datasets for analysis than acquiring raw data, and a lot of good-quality data acquired at great effort can be lost forever if not correctly stored. Local and international cooperation would probably be reduced, and a lot of data would never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of São Paulo (IAG-USP) has concentrated fully on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system that facilitates local and international cooperation. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia. All rights reserved.
Abstract:
Wireless Sensor Networks (WSNs) can be used to monitor hazardous and inaccessible areas. In these situations, the power supply (e.g. battery) of each node cannot be easily replaced. One solution to the limited capacity of current power supplies is to deploy a large number of sensor nodes, since the lifetime and dependability of the network increase through cooperation among nodes. Applications on WSNs may also have other concerns, such as meeting temporal deadlines on message transmissions and maximizing the quality of information. Data fusion is a well-known technique that can be useful both for enhancing data quality and for maximizing WSN lifetime. In this paper, we propose an approach that allows the implementation of parallel data fusion techniques in IEEE 802.15.4 networks. One of its main advantages is that it enables a trade-off between different user-defined metrics through the use of a genetic machine learning algorithm. Simulations and field experiments performed in different communication scenarios highlight significant improvements over, for instance, the Gur Game approach or conventional periodic communication techniques on IEEE 802.15.4 networks. © 2013 Elsevier B.V. All rights reserved.
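As a rough sketch of the trade-off idea only (not the paper's actual algorithm), the toy genetic algorithm below evolves the fraction of nodes that report each round, scoring candidates with a user-weighted mix of information quality and energy saving. The fitness model and weights are invented for illustration.

```python
import random

W_QUALITY, W_ENERGY = 0.5, 0.5   # assumed user-defined trade-off weights

def fitness(duty):
    """Score a candidate reporting fraction: diminishing returns on
    fused-data quality versus linear energy saving (toy model)."""
    quality = duty ** 0.5
    energy_saving = 1.0 - duty
    return W_QUALITY * quality + W_ENERGY * energy_saving

def evolve(pop_size=20, generations=50):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the better half
        children = [min(1.0, max(0.0, random.choice(parents)
                                 + random.gauss(0.0, 0.05)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children                 # refill with mutated copies
    return max(pop, key=fitness)

print(f"best reporting fraction ~ {evolve():.2f}")  # ~0.25 for these weights
```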
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In the poultry industry, the use of water of adequate physical, chemical and microbiological quality is of fundamental importance. Since many birds have access to the same water source, quality problems will affect a great number of animals. Drinking water plays an important role in the transmission of some bacterial, viral and protozoan diseases that are among the most common poultry diseases. Important measures to prevent waterborne diseases in broiler production are the protection of supply sources, water disinfection, and the quality control of microbiological, chemical and physical characteristics. Water is an essential nutrient for birds, and preserving its quality is therefore fundamental for good flock performance. By controlling the quality of the ingested water, the farmer may prevent many diseases in bird flocks, which will certainly result in decreased costs and increased profit, two essential aims of animal production nowadays.
Abstract:
The Brazilian Geodetic Network began to be established in the early 1940s, employing classical surveying methods such as triangulation and trilateration. With the introduction of satellite positioning systems such as TRANSIT and GPS, the network was densified. These data were adjusted by a variety of methods, yielding distortions in the network that need to be understood. In this work, we analyze and interpret case studies in an attempt to understand the distortions in the Brazilian network. For each case, we performed the network adjustment with the GHOST software suite. The results show that the distortion is least sensitive to the removal of invar baselines from the classical network; the network would be more affected by the absence of Laplace stations and Doppler control points, with differences of up to 4.5 m.
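Network adjustment of this kind reduces to weighted least squares on observation equations, which is what GHOST does at far larger scale. The toy sketch below adjusts a three-observation leveling net; all numbers are illustrative.

```python
import numpy as np

# Toy network: benchmark A is fixed at 0 m; unknowns x = [H_B, H_C].
# Each row of A maps the unknowns to one observed height difference.
A = np.array([[ 1.0, 0.0],    # A->B observes H_B
              [-1.0, 1.0],    # B->C observes H_C - H_B
              [ 0.0, 1.0]])   # A->C observes H_C
l = np.array([10.02, 5.05, 15.01])   # observed differences (m), invented
P = np.diag([4.0, 1.0, 2.0])         # weights ~ 1 / variance, invented

N = A.T @ P @ A                       # normal equations matrix
x = np.linalg.solve(N, A.T @ P @ l)   # adjusted heights
v = A @ x - l                         # residuals: where distortion shows up
print("heights:", x, "residuals:", v)
```

Removing or re-weighting rows of A and l is the same experiment the study performs on the real network (dropping invar baselines, Laplace stations or Doppler points) to see how much the adjusted coordinates move.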
Abstract:
The CMS High-Level Trigger (HLT) is responsible for ensuring that data samples with potentially interesting events are recorded with high efficiency and good quality. This paper gives an overview of the HLT and focuses on its commissioning using cosmic rays. The selection of triggers that were deployed is presented and the online grouping of triggered events into streams and primary datasets is discussed. Tools for online and offline data quality monitoring for the HLT are described, and the operational performance of the muon HLT algorithms is reviewed. The average time taken for the HLT selection and its dependence on detector and operating conditions are presented. The HLT performed reliably and helped provide a large dataset. This dataset has proven to be invaluable for understanding the performance of the trigger and the CMS experiment as a whole. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
Weather and climate have a direct influence on agriculture, affecting all stages of farming, from soil preparation to harvest. Meteorological data derived from automatic or conventional weather stations are used to monitor these effects, but such data suffer from difficult access and the low density of meteorological stations in Brazil. Meteorological data from atmospheric models, such as those of the ECMWF (European Centre for Medium-Range Weather Forecasts), can be an alternative. Thus, the aim of this study was to compare 10-day precipitation and maximum and minimum air temperature data from the ECMWF model with maps interpolated from 33 weather stations in São Paulo state between 2005 and 2010, generating statistical maps pixel by pixel. The statistical indices were spatially satisfactory over the period (most of the results with R² > 0.60, d > 0.7, RMSE < 5 °C and < 50 mm, Es < 5 °C and < 24 mm), and the ECMWF model can be recommended for use in São Paulo state.
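The indices quoted above are standard; as a minimal sketch, they can be computed for one pixel's series as below (RMSE, Willmott's index of agreement d, and R²; the sample values are invented).

```python
import numpy as np

def agreement_stats(model, obs):
    """RMSE, Willmott's index of agreement d, and R^2 between a modelled
    and an observed series; in the study this is applied pixel by pixel."""
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    obar = obs.mean()
    d = 1 - np.sum((model - obs) ** 2) / np.sum(
        (np.abs(model - obar) + np.abs(obs - obar)) ** 2)
    r = np.corrcoef(model, obs)[0, 1]
    return rmse, d, r ** 2

# Invented 10-day maximum temperatures (degrees C) for one pixel:
model = np.array([22.1, 25.3, 19.8, 30.0, 27.4])
obs   = np.array([21.0, 26.1, 18.5, 31.2, 26.0])
print(agreement_stats(model, obs))
```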
Abstract:
Acoustic Doppler current profilers are currently the main option for flow measurement and hydrodynamic monitoring of streams, replacing traditional methods. The spread of such equipment is mainly due to its operational advantages, which range from measurement speed to the detail and amount of information generated about the hydrodynamics of gauging sections. As with traditional methods and equipment, the use of acoustic Doppler profilers should be guided by the pursuit of data quality, since these data are the basis for the design and management of water resources structures and systems. In this context, the paper presents an analysis of the measurement uncertainties of a gauging campaign held on the Sapucaí River (Piranguinho-MG), using two different Doppler profilers: a Rio Grande ADCP 1200 kHz and a Qmetrix Qliner. Ten measurements were performed consecutively with each instrument, following quality protocols from the literature, and a Type A uncertainty analysis (statistical analysis of several independent observations of the input quantity under the same conditions) was then carried out. The measurements from the ADCP and the Qliner presented standard uncertainties of 0.679% and 0.508% of the respective averages. These results are satisfactory and acceptable compared with references in the literature, indicating that the use of Doppler profilers is valid for the expansion and upgrading of streamflow measurement networks and the generation of hydrological data.
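A Type A standard uncertainty follows the GUM recipe u = s/√n, with s the sample standard deviation of the n repeated observations. The sketch below reproduces the percentage-of-mean form used above, with ten hypothetical discharge values standing in for the campaign data.

```python
import numpy as np

def type_a_relative_uncertainty(q):
    """Type A standard uncertainty of the mean of repeated discharge
    measurements, as a percentage of the mean (GUM: u = s / sqrt(n))."""
    q = np.asarray(q, dtype=float)
    u = q.std(ddof=1) / np.sqrt(len(q))   # standard uncertainty of the mean
    return 100.0 * u / q.mean()

# Ten hypothetical consecutive discharge measurements (m^3/s):
q_adcp = [52.1, 51.8, 52.6, 51.5, 52.3, 52.0, 51.7, 52.4, 52.2, 51.9]
print(f"u = {type_a_relative_uncertainty(q_adcp):.3f}% of the mean")
```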
Abstract:
Pós-graduação em Matemática Universitária - IGCE
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Pós-graduação em Alimentos e Nutrição - FCFAR
Abstract:
Pós-graduação em Geociências e Meio Ambiente - IGCE