969 results for Dimensional measurement accuracy


Relevance: 80.00%

Abstract:

With a wide range of applications benefiting from dense networks of air temperature observations, but with limitations of cost, existing siting guidelines and risk of damage to sensors, new methods are required to gain a high-resolution understanding of the spatio-temporal patterns of urban meteorological phenomena such as the urban heat island, or to serve precision-farming needs. With the launch of a new generation of low-cost sensors it is possible to deploy a network to monitor air temperature at finer spatial resolutions. Here we investigate the Aginova Sentinel Micro (ASM) sensor with a bespoke radiation shield (together < US$150), which can provide secure near-real-time air temperature data to a server using existing (or user-deployed) Wi-Fi networks. This makes it ideally suited for deployment where wireless communications readily exist, notably in urban areas. Assessment of the performance of the ASM relative to traceable standards in a water bath and an atmospheric chamber shows it to have good measurement accuracy, with mean errors < ±0.22 °C between -25 and 30 °C and a time constant in ambient air of 110 ± 15 s. Subsequent field tests of the sensor within the bespoke shield also showed excellent performance (root-mean-square error = 0.13 °C) over a range of meteorological conditions relative to a traceable operational UK Met Office platinum resistance thermometer. These results indicate that the ASM and bespoke shield are more than fit for purpose for dense network deployment in urban areas at relatively low cost compared with existing observation techniques.
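The two figures of merit quoted above (mean error and root-mean-square error) are straightforward to compute from paired sensor/reference readings. A minimal sketch, with hypothetical readings standing in for the actual ASM data:

```python
import math

def mean_error(measured, reference):
    """Mean signed error between sensor readings and a traceable reference."""
    return sum(m - r for m, r in zip(measured, reference)) / len(measured)

def rmse(measured, reference):
    """Root-mean-square error, as used for the field comparison."""
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference)) / len(measured))

# Hypothetical paired readings (sensor vs. reference thermometer), in deg C
sensor = [20.11, 20.52, 21.03, 19.48, 18.97]
reference = [20.0, 20.4, 21.0, 19.5, 19.0]

print(round(mean_error(sensor, reference), 3))
print(round(rmse(sensor, reference), 3))
```

The mean signed error exposes calibration bias, while the RMSE also captures random scatter; both are needed to judge whether a low-cost sensor meets a stated accuracy bound.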

Relevance: 80.00%

Abstract:

Stalagmites are natural archives containing detailed information on continental climate variability of the past. Microthermometric measurements of fluid inclusion homogenisation temperatures allow determination of stalagmite formation temperatures by measuring the radius of stable laser-induced vapour bubbles inside the inclusions. A reliable method for precisely measuring the radius of vapour bubbles is presented. The method is applied to stalagmite samples for which the formation temperature is known. An assessment of the bubble radius measurement accuracy, and of how this error influences the uncertainty in determining the formation temperature, is provided. We demonstrate that the nominal homogenisation temperature of a single inclusion can be determined with an accuracy of ±0.25 °C if the volume of the inclusion is larger than 10⁵ μm³. With this method, we could show in a proof-of-principle investigation that the formation temperature of 10–20 yr old inclusions in a stalagmite taken from the Milandre cave is 9.87 ± 0.80 °C, while the mean annual surface temperature, which in the case of the Milandre cave correlates well with the cave temperature, was 9.6 ± 0.15 °C, calculated from actual measurements at that time, showing very good agreement. Formation temperatures of inclusions formed during the last 450 yr are found in a temperature range between 8.4 and 9.6 °C, which corresponds to the calculated average surface temperature. Paleotemperatures can thus be determined to within ±1.0 °C.
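The quoted temperature accuracy follows from propagating the bubble-radius measurement error through the calibration curve relating radius to homogenisation temperature. A minimal sketch of first-order error propagation, using an illustrative linear calibration (not the paper's actual curve):

```python
def propagate_temperature_uncertainty(calib, r, sigma_r, dr=1e-3):
    """Propagate a bubble-radius uncertainty sigma_r into a temperature
    uncertainty via a finite-difference slope of the calibration curve."""
    slope = (calib(r + dr) - calib(r - dr)) / (2 * dr)
    return abs(slope) * sigma_r

# Hypothetical linear calibration: temperature falls with bubble radius
calib = lambda r_um: 25.0 - 2.0 * r_um   # deg C per micrometre, illustrative only

sigma_T = propagate_temperature_uncertainty(calib, r=7.5, sigma_r=0.125)
print(sigma_T)  # ~0.25 deg C for a 0.125 um radius uncertainty
```

Under these assumed numbers, a 0.125 μm radius uncertainty maps to roughly ±0.25 °C, illustrating why larger inclusions (whose bubble radii can be measured more precisely) yield tighter temperature estimates.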

Relevance: 80.00%

Abstract:

In this article, we present the EM algorithm for performing maximum likelihood estimation of an asymmetric linear calibration model under the assumption of skew-normally distributed errors. A simulation study is conducted to evaluate the performance of the calibration estimator in interpolation and extrapolation situations. As an application to a real data set, we fitted the model to a dimensional measurement method that estimates testicular volume with a caliper, calibrated against ultrasonography as the standard method. With this methodology, we do not need to transform the variables to obtain symmetric errors. Another interesting aspect of the approach is that the transformation developed to make the information matrix non-singular when the skewness parameter is near zero leaves the parameter of interest unchanged. Model fitting is implemented, and the best choice between the usual calibration model and the model proposed in this article is evaluated using the Akaike information criterion (AIC), Schwarz's Bayesian information criterion (BIC) and the Hannan-Quinn criterion (HQC).
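The three model-selection criteria mentioned at the end share the same penalised-likelihood form and are easy to compute once the maximised log-likelihood is known. A generic sketch with hypothetical fit values (not the paper's data):

```python
import math

def info_criteria(loglik, k, n):
    """Return (AIC, BIC, HQC) for a fitted model.
    loglik: maximised log-likelihood, k: number of parameters, n: sample size."""
    aic = 2 * k - 2 * loglik
    bic = k * math.log(n) - 2 * loglik
    hqc = 2 * k * math.log(math.log(n)) - 2 * loglik
    return aic, bic, hqc

# Hypothetical fits: the usual (normal-error) model vs. the skew-normal model
aic_n, bic_n, hqc_n = info_criteria(loglik=-120.3, k=3, n=42)
aic_s, bic_s, hqc_s = info_criteria(loglik=-112.7, k=4, n=42)
print(aic_s < aic_n, bic_s < bic_n)  # lower values indicate the preferred model
```

All three criteria trade fit quality against parameter count; BIC and HQC penalise the extra skewness parameter more heavily than AIC as the sample grows.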

Relevance: 80.00%

Abstract:

Smart microgrids offer a new and challenging domain for power theories and metering techniques because they include a variety of intermittent power sources which have a positive impact on power flow and distribution losses but may cause voltage asymmetry and frequency variation. In smart microgrids, the voltage distortion and asymmetry in the presence of poly-phase nonlinear loads can also be greater than in usual distribution lines fed by the utility, thus affecting measurement accuracy and possibly causing the tripping of protections. In such a context, a reconsideration of power theories is required, since they form the basis for supply and load characterization. A revision of revenue metering techniques is also suggested, to ensure a correct penalization of the loads for their responsibility in generating reactive power, voltage asymmetry, and distortion. This paper shows that the conservative power theory provides a suitable background to cope with the characterization and metering needs of smart grids. Simulation and experimental results show the properties of the proposed approach.

Relevance: 80.00%

Abstract:

Smart micro-grids offer a new and challenging domain for power theories and metering techniques, because they include a variety of intermittent power sources which have a positive impact on power flow and distribution losses, but may cause voltage asymmetry and frequency variation. Due to the limited power capability of smart micro-grids, the voltage distortion can also worsen when non-linear loads are supplied, affecting measurement accuracy and possibly causing the tripping of protections. In such a context, a reconsideration of power theories is required, since they form the basis for supply and load characterization. A revision of revenue metering techniques is also needed, to ensure a correct penalization of the loads for their responsibility in generating reactive power, voltage unbalance and distortion. This paper shows that the Conservative Power Theory (CPT) provides a suitable background to cope with the characterization and metering needs of smart grids. Experimental results validate the proposed approach. © 2010 IEEE.

Relevance: 80.00%

Abstract:

ABSTRACT Background: Patients with dementia may be unable to describe their symptoms, and caregivers frequently suffer an emotional burden that can interfere with judgment of the patient's behavior. The Neuropsychiatric Inventory-Clinician rating scale (NPI-C) was therefore developed as a comprehensive and versatile instrument to assess and accurately measure neuropsychiatric symptoms (NPS) in dementia, drawing on information from caregiver and patient interviews and any other relevant available data. The present study is a follow-up to the original cross-national NPI-C validation, evaluating the reliability and concurrent validity of the NPI-C in quantifying psychopathological symptoms in dementia in a large Brazilian cohort. Methods: Two blinded raters evaluated 312 participants (156 patient-knowledgeable informant dyads) using the NPI-C, for a total of 624 observations in five Brazilian centers. Inter-rater reliability was determined through intraclass correlation coefficients for the NPI-C domains and the traditional NPI. Convergent validity included correlations of specific domains of the NPI-C with the Brief Psychiatric Rating Scale (BPRS), the Cohen-Mansfield Agitation Inventory (CMAI), the Cornell Scale for Depression in Dementia (CSDD), and the Apathy Inventory (AI). Results: Inter-rater reliability was strong for all NPI-C domains. There were high correlations of NPI-C/delusions with the BPRS, NPI-C/apathy-indifference with the AI, NPI-C/depression-dysphoria with the CSDD, NPI-C/agitation with the CMAI, and NPI-C/aggression with the CMAI. There were moderate correlations of NPI-C/aberrant vocalizations with the CMAI and of NPI-C/hallucinations with the BPRS. Conclusion: The NPI-C is a comprehensive tool that provides accurate measurement of NPS in dementia, with high concurrent validity and inter-rater reliability in the Brazilian setting. In addition to universal assessment, the NPI-C can be completed by individual domains.
© International Psychogeriatric Association 2013.

Relevance: 80.00%

Abstract:

Graduate Program in Cartographic Sciences - FCT

Relevance: 80.00%

Abstract:

The objective of this study was to evaluate the measurement accuracy of endodontic files obtained by digital and conventional radiography in primary teeth. Kerr and Hedström files (#20), with the apparent length of the tooth as reference, were inserted into the root canals of 18 extracted primary teeth, which were radiographed by digital and conventional techniques. Measurements from a reference point to the apical end were carried out twice, one week apart, by an experienced operator. An electronic ruler was used for the digital method and a caliper for the conventional method. The data were subjected to the Pearson correlation test and Student's t test (p = 0.05). The correlation between the first and the second measurements was r = 0.99, regardless of the type of file and method. Comparing the measurements within the methods, the agreement was r = 0.96 for Kerr and r = 0.95 for Hedström files. The file lengths obtained from digital radiographs were statistically lower than those obtained from conventional radiographs (p = 0.02). However, the values obtained by the two methods were statistically similar to the real length of the teeth both for Kerr files (p = 0.29) and for Hedström files (p = 0.18). Digital radiography was the more trustworthy method for obtaining the lengths of endodontic files.
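The statistics used here (Pearson correlation and the paired Student's t test) can be sketched in a few lines; the file-length values below are hypothetical, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def paired_t(x, y):
    """t statistic for paired samples (df = n - 1)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md / (sd / math.sqrt(n))

digital = [15.2, 14.8, 16.1, 15.5]       # hypothetical file lengths, mm
conventional = [15.4, 15.0, 16.3, 15.6]
print(round(pearson_r(digital, conventional), 3))
print(round(paired_t(digital, conventional), 3))
```

The correlation quantifies agreement between the two radiographic methods, while the paired t statistic tests whether their mean lengths differ systematically.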

Relevance: 80.00%

Abstract:

This paper presents a proposal for automating the camera calibration process, locating and measuring image points in coded targets with sub-pixel precision. This automatic technique helps minimize localization errors, regardless of camera orientation and image scale. To develop this technique, several types of coded targets were analyzed, and the ARUCO type was chosen due to its simplicity, its ability to represent up to 1024 different targets, and the availability of source code implemented with the OpenCV library. ARUCO targets were generated and two calibration sheets were assembled to be used for the acquisition of images for camera calibration. The developed software locates targets in the acquired images and automatically extracts the coordinates of the four corners with sub-pixel accuracy. Experiments were conducted with real data, showing that the targets are correctly identified unless excessive noise or fragmentation occurs, mainly in the outer target square. The results with the calibration of a low-cost camera showed that the process works and that the corners are measured with sub-pixel precision.
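One common way to obtain sub-pixel corner coordinates, as the software described here does, is to interpolate the peak of a corner-response profile between pixel samples. A minimal 1-D sketch of quadratic (parabolic) peak interpolation; the paper's own refinement method may differ:

```python
def subpixel_peak(f_left, f_center, f_right):
    """Quadratic interpolation of a 1-D response peak: returns the sub-pixel
    offset (in pixels, in [-0.5, 0.5]) of the true extremum from the centre sample."""
    denom = f_left - 2.0 * f_center + f_right
    if denom == 0.0:
        return 0.0  # flat or degenerate neighbourhood: keep the integer position
    return 0.5 * (f_left - f_right) / denom

# Corner response sampled at three adjacent pixels around a maximum
offset = subpixel_peak(10.0, 30.0, 20.0)
print(offset)  # positive: the peak lies between the centre and right pixel
```

Applying the same interpolation along both image axes refines an integer-pixel corner detection to a fraction of a pixel, which is what drives the calibration accuracy reported above.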

Relevance: 80.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 80.00%

Abstract:

The use of computer-assisted technologies such as CAD (Computer-Aided Design), CAM (Computer-Aided Manufacturing), CAE (Computer-Aided Engineering) and CNC (Computer Numerical Control) is a priority for engineers and product designers. However, dimensional measurement between the virtual and the real product design requires research and the dissemination of procedures among its users. This work aims to apply these technologies through the analysis and measurement of a CNC milling machine designed and assembled at the university. By using 3D scanning and analyzing images of the machined samples together with their original virtual files, it was possible to compare the dimensions of these samples against the original virtual dimensions; the distortions between the real and virtual parts are within acceptable limits for this type of equipment. As a secondary objective, this work seeks to disseminate these technologies and make their use more accessible.
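The real-versus-virtual comparison described above amounts to computing a per-dimension deviation between scanned and nominal CAD values and checking it against a tolerance. A minimal sketch with hypothetical nominal and scanned dimensions:

```python
def dimensional_report(nominal, scanned, tolerance):
    """Compare scanned dimensions against nominal CAD values.
    Returns a list of (name, deviation, within_tolerance) tuples."""
    report = []
    for name in nominal:
        dev = scanned[name] - nominal[name]
        report.append((name, round(dev, 3), abs(dev) <= tolerance))
    return report

# Hypothetical sample: CAD dimensions vs. 3D-scan measurements, in mm
nominal = {"length": 50.00, "width": 30.00, "hole_dia": 8.00}
scanned = {"length": 50.04, "width": 29.97, "hole_dia": 8.06}

for name, dev, ok in dimensional_report(nominal, scanned, tolerance=0.05):
    print(f"{name}: {dev:+.3f} mm {'OK' if ok else 'OUT OF TOLERANCE'}")
```

The tolerance value is machine-specific; in practice it would come from the milling machine's specification rather than the fixed 0.05 mm assumed here.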

Relevance: 80.00%

Abstract:

The upper troposphere / lower stratosphere (UTLS) is the transition region between the dynamically, chemically and microphysically very different lowest atmospheric layers, the troposphere and the stratosphere. Radiatively active trace gases such as water vapour (H2O), ozone (O3) or carbon dioxide (CO2), together with clouds in the UTLS, influence the radiation budget of the atmosphere and the global climate. Possible changes in the distributions and concentrations of these trace gases modify the radiative forcing of the atmosphere and can contribute to the observed climate change. The aim of this work is to better understand exchange and mixing processes within the UTLS and thus to predict changes in the trace gas composition of this region more accurately. The basis for this is provided by airborne in-situ trace gas measurements in the UTLS carried out during the aircraft campaigns TACTS / ESMVal 2012 and AIRTOSS - ICE 2013. For the AIRTOSS - ICE 2013 measurements, the UMAQS (University of Mainz Airborne QCL-based Spectrometer) instrument, built as part of this work, was used to measure the tropospheric trace gases nitrous oxide (N2O) and carbon monoxide (CO). At a temporal resolution of 1 s it achieves a measurement uncertainty of 0.39 ppbv and 1.39 ppbv for the N2O and CO mixing ratios, respectively. The high time resolution and measurement accuracy of the N2O and CO data allow the investigation of small-scale exchange processes between troposphere and stratosphere near the tropopause on spatial scales below 200 m. Based on the N2O data from AIRTOSS - ICE 2013, in-situ detected cirrus particles can be demonstrated in ice-supersaturated air above the N2O-based chemical tropopause. Using the N2O-CO correlation together with the analysis of ECMWF model data and the calculation of backward trajectories, their existence can be attributed to the irreversible mixing of tropospheric and stratospheric air masses. With the in-situ measurements of N2O, CO and CH4 (methane) from TACTS and ESMVal 2012, the large-scale trace gas distributions in the extratropical stratosphere are investigated up to a potential temperature of Theta = 410 K. A rejuvenation of the air masses in the extratropical stratosphere with Delta Theta > 30 K (relative to the dynamical tropopause) over the period of the campaign (28.08.2012 - 27.09.2012) can be demonstrated. The correlation of N2O with O3 shows that this rejuvenation is caused by an enhanced influx of air masses from the tropical lower stratosphere, which are transported into the extratropical stratosphere via the shallow branch of the Brewer-Dobson circulation on time scales of a few weeks. Based on the analysis of the CO-O3 correlation of a measurement flight on 30.08.2012, the irreversible mixing of air masses from the tropical stratosphere into the extratropics on isentropes with Theta > 380 K is identified. Backward trajectories show that the origin of the mixed-in tropical air masses lies in the region of the summer anticyclone of the Asian monsoon.

Relevance: 80.00%

Abstract:

Nowadays, environmental issues and climate change play fundamental roles in the design of urban spaces. Our cities are growing in size, often following only immediate needs without a long-term vision. Consequently, sustainable development has become not only an ethical but also a strategic need: we can no longer afford uncontrolled urban expansion. One serious effect of the industrialisation of the territory is the increase of urban air and surface temperatures compared to the outlying rural surroundings. This difference in temperature is what constitutes an urban heat island (UHI). The purpose of this study is to clarify the role of urban surfacing materials in the thermal dynamics of an urban space, resulting in useful indications and advice for mitigating the UHI. With this aim, 4 coloured concrete bricks were tested, measuring their emissivity and building their heat-release curves using infrared thermography. Two emissivity evaluation procedures were carried out and subsequently compared. The samples' performances were assessed, and the influence of colour on thermal behaviour was investigated. In addition, some external pavements were analysed. Albedo and emissivity were evaluated in order to understand their thermal behaviour under different conditions. Surface temperatures were recorded in a one-day measurement campaign. The ENVI-met software was used to simulate how the tested materials would behave in two typical urban scenarios: an urban canyon and an urban heat basin. The improvements they can bring to the urban microclimate were investigated. The emissivities obtained for the bricks ranged between 0.92 and 0.97, suggesting a limited influence of colour on this parameter. Nonetheless, the white concrete brick showed the best thermal performance, while the black one showed the worst; the red and yellow ones followed nearly identical intermediate trends. In effect, colour affected the overall thermal behaviour. Emissivity was also measured in the outdoor work, yielding (as expected) high values for the asphalts. Albedo measurements, conducted with a sunshine pyranometer, demonstrated the improvement in solar reflection given by the yellow paint, as well as the adverse influence of haze on measurement accuracy. The ENVI-met simulations demonstrated the effectiveness of some of the tested materials in improving thermal conditions. In particular, the results showed good performance for white bricks and granite in the heat-basin scenario, and for painted concrete and macadam in the urban-canyon scenario. These materials can be considered valuable solutions for UHI mitigation.
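The emissivity values reported for the bricks translate directly into radiated heat flux through the Stefan-Boltzmann law, E = ε σ T⁴. A minimal sketch with an illustrative surface temperature (not a measured value from the campaign):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(emissivity, temp_c):
    """Hemispherical emissive power of a surface, E = eps * sigma * T^4."""
    t_k = temp_c + 273.15
    return emissivity * SIGMA * t_k ** 4

# Bricks at the same assumed surface temperature: emissivity drives heat release
for eps in (0.92, 0.97):
    print(f"eps = {eps}: {radiated_power(eps, 45.0):.1f} W/m^2")
```

At a given surface temperature, the spread between ε = 0.92 and ε = 0.97 changes the radiated flux by a few percent, which is why the measured emissivity range suggests a limited (but not negligible) role for colour in nocturnal heat release.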

Relevance: 80.00%

Abstract:

In laser sintering, the powder bed is preheated by radiant heaters in order to reach a temperature just below the material's melting point at the powder surface. The temperature distribution on the surface should be as homogeneous as possible in order to achieve uniform part properties throughout the build chamber and to keep part distortion low. Experience, however, shows very inhomogeneous temperature distributions, which is why the integration of new or optimized process-monitoring systems into the machines is often demanded. One potentially suitable system is the thermography camera, which allows two-dimensional recording of surface temperatures and thus provides information about the temperatures at the powder bed surface. Cold areas on the surface can thereby be identified and taken into account during process preparation. At the same time, thermography enables observation of the temperatures during laser exposure and thus the derivation of relationships between process parameters and melt temperatures. In the investigations carried out, an IR camera system was successfully integrated as a permanent installation into a laser sintering machine, and solutions were developed for the problems that arose. Subsequently, investigations of the temperature distribution on the powder bed surface and of the factors influencing its homogeneity were performed. In further investigations, the melt temperatures were determined as a function of various process parameters. Based on these measurement results, conclusions about required optimizations were drawn, and the usability of thermography in laser sintering for process monitoring, process control and machine maintenance was assessed as a first intermediate result of the investigations.

Relevance: 80.00%

Abstract:

Mass spectrometric analysis of the elemental and isotopic compositions of several NIST standards is performed with a miniature laser ablation/ionisation reflectron-type time-of-flight mass spectrometer (LMS) using a fs-laser ablation ion source (775 nm, 190 fs, 1 kHz). The results of the mass spectrometric studies indicate that, within a defined range of laser irradiance (fluence) and for a certain number of accumulations of single-laser-shot spectra, isotope abundances can be measured with an accuracy at the per mill level for isotope concentrations higher than 100 ppm and at the per cent level for concentrations lower than 100 ppm. Elemental analysis can also be performed with good accuracy. The LMS instrument combined with a fs-laser ablation ion source exhibits similar detection efficiency for both metallic and non-metallic elements. Relative sensitivity coefficients were determined and found to be close to one, which is of considerable importance for the development of standard-less instruments. Negligible thermal effects, minimal sample damage and the excellent characteristics of the fs-laser beam are thought to be the main reasons for the substantial improvement in instrumental performance compared to other laser ablation mass spectrometers.
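In its simplest form, a relative sensitivity coefficient (RSC) is the measured abundance of an element divided by its certified reference abundance; values close to one mean the instrument needs little matrix-specific correction. A minimal sketch with hypothetical element fractions (not the actual NIST certificate values):

```python
def rsc(measured_fraction, certified_fraction):
    """Relative sensitivity coefficient: measured abundance relative to the
    certified reference abundance; RSC close to 1 enables standard-less analysis."""
    return measured_fraction / certified_fraction

# Hypothetical element fractions from spectra vs. certified reference values
measured = {"Cu": 0.295, "Zn": 0.300, "Pb": 0.0105}
certified = {"Cu": 0.300, "Zn": 0.295, "Pb": 0.0100}

for el in measured:
    print(el, round(rsc(measured[el], certified[el]), 3))
```

In practice RSCs are often normalised to an internal reference element; the plain ratio above is the simplest variant and is only meant to illustrate why near-unity coefficients matter for standard-less operation.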