987 results for Measurement errors
Abstract:
A literature survey and a theoretical study were performed to characterize residential chimney conditions for flue gas flow measurements. The focus is on Pitot-static probes, to give a sufficient basis for the development and calibration of a velocity pressure averaging probe suitable for the continuous dynamic (i.e. non-steady-state) measurement of the low flow velocities present in residential chimneys. The flow conditions do not meet the requirements set in ISO 10780 and ISO 3966 for Pitot-static probe measurements, so those methods and their stated uncertainties are not valid. The flow velocities in residential chimneys from a heating boiler under normal operating conditions are shown to be so low that, in some conditions, they invalidate the inviscid-fluid assumption that justifies the quadratic Bernoulli equation. A non-linear, Reynolds-number-dependent calibration coefficient that corrects for viscous effects is needed to avoid significant measurement errors. The wide range of flow velocity during normal boiler operation also causes the flow type to change from laminar, across the laminar-to-turbulent transition region, to fully turbulent, producing significant changes of the velocity profile during dynamic measurements. In addition, the short duct lengths (and changes of flow direction and duct shape) used in practice mean that measurements are made in the hydrodynamic entrance region, where the flow velocity profiles are most likely neither symmetrical nor fully developed. A measurement method insensitive to velocity profile changes is thus needed if the flow velocity profile cannot otherwise be determined or predicted with reasonable accuracy over the whole measurement range.
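The correction the abstract calls for can be illustrated numerically: the ideal quadratic Bernoulli relation v = sqrt(2Δp/ρ) is multiplied by a calibration coefficient K(Re), and since Re depends on the unknown velocity, the relation must be solved iteratively. The following is a minimal sketch; the calibration curve `k_demo` and all numerical values are illustrative assumptions, not the probe calibration discussed in the abstract.

```python
import math

def pitot_velocity(dp, rho, nu, d_probe, k_of_re, v0=1.0, tol=1e-9):
    # Solve v = K(Re) * sqrt(2*dp/rho) with Re = v*d_probe/nu by
    # fixed-point iteration, since Re itself depends on the unknown v.
    v = v0
    for _ in range(200):
        re = max(v, 1e-12) * d_probe / nu
        v_new = k_of_re(re) * math.sqrt(2.0 * dp / rho)
        if abs(v_new - v) < tol:
            return v_new
        v = v_new
    return v

# Hypothetical calibration curve: approaches the ideal K = 1 at high Re
# and drops below 1 at low Re, where viscous effects inflate the
# measured impact pressure relative to the true dynamic pressure.
k_demo = lambda re: 1.0 / math.sqrt(1.0 + 10.0 / re)

# Illustrative flue-gas-like values: dp in Pa, rho in kg/m^3, nu in m^2/s.
v_ideal = pitot_velocity(0.5, 1.0, 2.5e-5, 0.003, lambda re: 1.0)
v_visc = pitot_velocity(0.5, 1.0, 2.5e-5, 0.003, k_demo)
```

At these low velocities the viscous correction changes the result by several percent, which is the order of error the abstract warns against.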
Because of particulate matter and condensing fluids in the flue gas it is beneficial if the probe can be constructed so that it can easily be taken out for cleaning, and equipped with a locking mechanism to always ensure the same alignment in the duct without affecting the calibration. The literature implies that there may be a significant time lag in the measurements of low flow rates due to viscous effects in the internal impact pressure passages of Pitot probes, and the significance in the discussed application should be studied experimentally. The measured differential pressures from Pitot-static probes in residential chimney flows are so low that the calibration and given uncertainties of commercially available pressure transducers are not adequate. The pressure transducers should be calibrated specifically for the application, preferably in combination with the probe, and the significance of all different error sources should be investigated carefully. Care should be taken also with the temperature measurement, e.g. with averaging of several sensors, as significant temperature gradients may be present in flue gas ducts.
Abstract:
GPS tracking of mobile objects provides spatial and temporal data for a broad range of applications, including traffic management and control and transportation routing and planning. Previous transport research has focused on GPS tracking data as an appealing alternative to travel diaries. Moreover, GPS-based data are gradually becoming a cornerstone of real-time traffic management. Vehicle tracking data from GPS devices are, however, susceptible to measurement errors, a neglected issue in transport research. By conducting a randomized experiment, we assess the reliability of GPS-based traffic data on geographical position, velocity, and altitude for three types of vehicles: bike, car, and bus. We find the geographical positioning reliable, but with an error greater than that postulated by the manufacturer and a non-negligible risk of aberrant positioning. Velocity is slightly underestimated, whereas altitude measurements are unreliable.
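Horizontal positioning error of the kind assessed above is typically quantified as the great-circle distance between each GPS fix and a surveyed reference point. A minimal sketch using the standard haversine formula (the function name and the usage below are illustrative, not the paper's procedure):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two points given in
    # decimal degrees -- a simple measure of horizontal GPS error
    # when one point is a known reference position.
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

# One degree of longitude at the equator is roughly 111.2 km.
err = haversine_m(0.0, 0.0, 0.0, 1.0)
```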
Abstract:
The accurate measurement of a vehicle's velocity is an essential feature of adaptive vehicle-activated sign systems. Since vehicle velocities are acquired from a continuous-wave Doppler radar, data collection is challenging. Data accuracy is sensitive to the calibration of the radar on the road; however, clear methodologies for in-field calibration have not been carefully established. The signs are often installed by subjective judgment, which results in measurement errors. This paper develops a calibration method based on mining the collected data and matching individual vehicles travelling between two radars. The data were prepared in two ways: by cleaning and by reconstruction. The results showed that the proposed correction factor derived from the cleaned data corresponded well with the experimental factor obtained on site. In addition, this factor showed superior performance to the one derived from the reconstructed data.
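The matching idea can be sketched as follows: each vehicle detected at one radar is paired with the detection at the second radar closest to the expected arrival time, and a correction factor is taken as a robust ratio of the paired speeds. All names, parameters, and the median-ratio estimator below are illustrative assumptions; the paper's actual cleaning and reconstruction procedures are not reproduced here.

```python
def correction_factor(upstream, downstream, travel_time, window):
    # upstream / downstream: lists of (timestamp_s, speed_kmh) detections.
    # A vehicle seen upstream at time t is matched to the downstream
    # detection closest to t + travel_time, within +/- window seconds.
    ratios = []
    for t_u, v_u in upstream:
        cands = [(abs(t_d - (t_u + travel_time)), v_d)
                 for t_d, v_d in downstream
                 if abs(t_d - (t_u + travel_time)) <= window]
        if cands and v_u > 0:
            _, v_d = min(cands)  # closest-in-time downstream detection
            ratios.append(v_d / v_u)
    if not ratios:
        return None
    # The median ratio is robust to mismatched vehicles and outliers.
    ratios.sort()
    return ratios[len(ratios) // 2]

# Two vehicles, both measured 10% faster by the downstream radar.
f = correction_factor([(0.0, 50.0), (10.0, 60.0)],
                      [(4.0, 55.0), (14.0, 66.0)],
                      travel_time=4.0, window=1.0)
```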
Abstract:
Starting from an assessment of the worldwide context of fiscal decentralization and democratization in which Brazil found itself at the end of the 20th century, the first part of this thesis presents an empirical analysis for developing countries showing how the type of government regime conditions the relationship between fiscal decentralization and government size. System-GMM estimations for developing countries show that there is a level of fiscal decentralization, between 20% and 30%, beyond which democracies have smaller governments than dictatorships. This result, which draws attention both to local governments and to the influence of democracy on public spending, motivated the continuation of the research into the efficiency of municipal spending in Brazil and its relationship with voting. Thus, the second essay computes indicators of the evolution of the efficiency and productivity of municipal spending (Malmquist factors) between 2004 and 2008 for the areas of health and education. The stochastic frontier analysis shows that, in both education and health, the production frontier advanced (average TFPC of 18.7% in education and 14.2% in health) through technical change (TC) rather than through gains in technical efficiency (Technical Efficiency Change, TEC). In the last essay, the efficiency and productivity indicators are used to test the hypothesis that municipal voters reward with their votes the mayors who improved the efficiency of education and/or health spending during their term. The results do not reject the hypothesis for education, but reject it for health. To address likely measurement errors in the productivity variables, the estimations are instrumented in two-stage regressions.
Abstract:
The concept of covered interest parity suggests that, in the absence of barriers to arbitrage between markets, the interest differential between two assets that are identical in all relevant respects except their currency of denomination should, in the absence of exchange-rate risk, equal zero. However, when non-diversifiable risks exist, represented by the country risk inherent to emerging economies, investors will demand an interest rate higher than the simple difference between the domestic and foreign rates. This study evaluates whether adjusting the covered interest parity condition for risk premia is sufficient to validate the no-arbitrage relation in the Brazilian market over the period from 2007 to 2010. Country risk contaminates all financial assets issued in a given economy and can be described as the sum of the default risk (or sovereign risk) and the convertibility risk perceived by the market. The no-arbitrage equation was estimated using Ordinary Least Squares, time-varying parameters (TVP), and Recursive Least Squares regressions, and the results obtained are inconclusive as to the validity of the covered interest parity relation, even after adjusting for the risk premium. Data measurement errors, transaction costs, and interventions and restrictive policies in the foreign exchange market may have contributed to this result.
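The condition being tested can be written as (1 + i_dom) = (F/S)(1 + i_for)(1 + ρ), where F is the forward exchange rate, S the spot rate, and ρ the country-risk premium; the residual deviation from this identity is what the study estimates. A minimal numerical sketch, with all figures purely illustrative:

```python
def cip_deviation(i_dom, i_for, spot, forward, risk_premium=0.0):
    # Covered interest parity adjusted for a country-risk premium:
    #   (1 + i_dom) = (forward / spot) * (1 + i_for) * (1 + risk_premium)
    # Returns the residual deviation from no-arbitrage; zero means the
    # parity condition holds exactly.
    implied = (forward / spot) * (1.0 + i_for) * (1.0 + risk_premium) - 1.0
    return i_dom - implied

# With the forward rate priced exactly at parity (no risk premium),
# the deviation collapses to zero.
forward_at_parity = 2.0 * (1.0 + 0.10) / (1.0 + 0.02)
dev = cip_deviation(0.10, 0.02, 2.0, forward_at_parity)
```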
Abstract:
Ensuring the integrity of the pipeline network is an extremely important factor in the oil and gas industry. Pipeline engineering uses sophisticated in-line robotic inspection tools known as instrumented pigs. Several factors make pipeline inspection difficult, especially in offshore fields, which use pipelines with multiple diameters, tight radii of curvature, wall thickness above the conventional, multi-phase flow, and so on. In this context, a new instrumented pig, called the Feeler PIG, was introduced for the detection and sizing of thickness loss in pipelines with internal damage. This tool was developed to overcome several limitations that conventional instrumented pigs face during inspection. Several factors influence the pig's measurement errors, affecting the reliability of the results. This work describes different operating conditions and presents a test rig for the feeler sensors of an inspection pig under different dynamic loads. The results of measuring shoulder- and hole-type damage on a cyclic flat surface are evaluated, along with a mathematical model of the sensor response and its errors relative to the actual behavior.
Abstract:
This work describes the use of a large-aperture PVDF receiver in the measurement of the density of liquids and the elastic constants of composite materials. The density of several liquids is measured with an accuracy better than 0.2% using a conventional NDT emitter transducer and a 70-mm diameter, 52-μm P(VDF-TrFE) membrane with gold electrodes. The determination of the elastic constants of composite materials is based on the measurement of phase velocity. It is shown that diffraction can lead to errors of around 1% in the velocity measurement when using a pair of ultrasonic transducers (1 MHz, 19 mm diameter) operating in transmission-reception mode separated by a distance of 100 mm. This effect is negligible when using a pair of 10-MHz transducers. On the other hand, dispersion at 10 MHz can result in errors of about 0.5% when measuring velocity in composite materials. The use of an 80-mm diameter, 52-μm thick PVDF membrane receiver allows the phase velocity to be measured without diffraction effects.
Abstract:
The upcoming solar maximum, which is expected to reach its peak around May 2013, occurs at a time when our reliance on high-precision GNSS has reached unprecedented proportions. The perturbations of the ionosphere caused by increased solar activity pose a major threat to these applications. This is particularly true in equatorial regions, where high exposure to solar-induced disturbances is coupled with explosive growth of precise GNSS applications. Among the various types of solar-induced ionospheric disturbances, strong scintillations are amongst the most challenging, causing effects ranging from phase measurement errors to complete loss of lock on several satellites. Brazil, which relies heavily on high-precision GNSS, is one of the most affected regions, notably due to its proximity to the southern crest of the ionospheric equatorial anomaly and to the South Atlantic Magnetic Anomaly. In the framework of the CIGALA project, we developed the PolaRxS™, a GNSS receiver dedicated to the monitoring of ionospheric scintillation indices not only in the GPS L1 band but for all operational and upcoming constellations and frequency bands. A network of these receivers was deployed across the whole Brazilian territory in order first to investigate and then to mitigate the impact of scintillation on the different signals, ensuring high-precision GNSS availability and integrity in the area. This paper reports on the validation of the PolaRxS™ receiver as an ionospheric scintillation monitor and on the first results of the analysis of the data collected with the CIGALA network.
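Among the scintillation indices such a monitoring receiver produces, the amplitude index S4 is conventionally defined as the normalized standard deviation of signal intensity over the averaging interval. A minimal sketch of that definition (detrending and the usual 60 s interval are omitted for brevity):

```python
def s4_index(intensity):
    # Amplitude scintillation index S4: the standard deviation of
    # signal intensity normalized by its mean, computed over the
    # samples of one averaging interval.
    n = len(intensity)
    mean = sum(intensity) / n
    mean_sq = sum(i * i for i in intensity) / n
    return ((mean_sq - mean * mean) / (mean * mean)) ** 0.5

# A perfectly steady signal has S4 = 0; fluctuation raises the index.
s4_calm = s4_index([1.0] * 10)
s4_active = s4_index([1.0, 3.0])
```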
Abstract:
The automatic characterization of particles in metallographic images has become paramount, mainly because quantifying such microstructures is important for assessing the mechanical properties of materials commonly used in industry. Automated characterization may avoid problems related to fatigue and possible measurement errors. In this paper, computer techniques are applied and assessed for accomplishing this crucial industrial goal in an efficient and robust manner, hence the use of the most actively pursued machine learning classification techniques. In particular, Support Vector Machine, Bayesian, and Optimum-Path Forest based classifiers are evaluated in the characterization of graphite particles in metallographic images, together with Otsu's method, which is commonly used in computer imaging to automatically binarize simple images and is used here to demonstrate the need for more complex methods. The statistical analysis performed confirmed that these computer techniques are efficient solutions for the intended characterization. Additionally, the Optimum-Path Forest based classifier demonstrated overall superior performance, both in terms of accuracy and speed. © 2012 Elsevier Ltd. All rights reserved.
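Of the methods compared, Otsu's thresholding is simple enough to sketch: it selects the gray-level threshold that maximizes the between-class variance of the resulting background and foreground pixel populations. A minimal pure-Python version for 8-bit images (a standard textbook formulation, not the paper's implementation):

```python
def otsu_threshold(pixels):
    # Otsu's method: scan all 256 candidate thresholds and keep the one
    # maximizing the between-class variance w_bg * w_fg * (mu_bg - mu_fg)^2.
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]          # pixels at or below threshold t
        if w_bg == 0:
            continue
        w_fg = total - w_bg      # pixels above threshold t
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# A cleanly bimodal image separates at the lower mode.
t = otsu_threshold([10] * 50 + [200] * 50)
```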
Abstract:
Although measurement errors can impair statistical analysis, reliability analysis has been neglected in applied microbiology. This study assessed the intra-rater reproducibility of the agar-based method for estimation of phospholipase activity (Pz). Pz readings were performed twice by two examiners (E1, E2), either directly on plates or on photos, using both black and white backgrounds. Pz values were taken from one colony or from triplicate colonies of each sample (n=30). Intra-examiner reproducibility was estimated using the Intraclass Correlation Coefficient (ICC). For both examiners, reading triplicate colonies (ICCE1=0.91, ICCE2=0.86) was more reproducible than reading a single colony (ICCE1=0.86, ICCE2=0.80). E1 showed excellent concordance when measurements were performed on photos with a white background (ICC=0.95) and good concordance under the other conditions.
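The reproducibility statistic used above, the Intraclass Correlation Coefficient, can be computed for repeated readings of each sample from a one-way random-effects ANOVA. Which ICC form the study used is not stated, so the one-way version below is an assumption for illustration:

```python
def icc_oneway(ratings):
    # ratings: one list of k repeated measurements per subject.
    # One-way random-effects ICC(1,1):
    #   (MS_between - MS_within) / (MS_between + (k - 1) * MS_within)
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]
    ss_between = k * sum((m - grand) ** 2 for m in means)
    ss_within = sum((x - m) ** 2 for r, m in zip(ratings, means) for x in r)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Perfectly reproduced readings give ICC = 1; readings that vary only
# within subjects (no between-subject signal) drive the ICC negative.
icc_perfect = icc_oneway([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
icc_noise = icc_oneway([[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]])
```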