997 results for Calibration tests


Relevance:

70.00%

Publisher:

Abstract:

Theories can be produced by individuals seeking a good reputation for knowledge. Hence, a significant question is how to test theories anticipating that they might have been produced by (potentially uninformed) experts who prefer their theories not to be rejected. If a theory that predicts exactly like the data-generating process is not rejected with high probability, then the test is said to not reject the truth. On the other hand, if a false expert, with no knowledge of the data-generating process, can strategically select theories that will not be rejected, then the test can be ignorantly passed. Such tests have limited use because they cannot feasibly dismiss completely uninformed experts. Many tests proposed in the literature (e.g., calibration tests) can be ignorantly passed. Dekel and Feinberg (2006) introduced a class of tests that seemingly have some power to dismiss uninformed experts. We show that some tests from their class can also be ignorantly passed. One of those tests, however, does not reject the truth and cannot be ignorantly passed. Thus, this empirical test can dismiss false experts. We also show that a false reputation of knowledge can be strategically sustained for an arbitrary, but given, number of periods, no matter which test is used (provided that it does not reject the truth). However, false experts can be discredited, even with bounded data sets, if the domain of permissible theories is mildly restricted.
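
A minimal sketch, in Python, of the notion of a calibration test referred to above (an illustrative toy, not the paper's formal construction): the forecaster announces a probability each period, and the test checks that, conditional on each announced value, outcomes occur with roughly that frequency. The bucket width, tolerance and minimum count are arbitrary illustrative choices.

```python
import random
from collections import defaultdict

def calibration_test(forecasts, outcomes, tol=0.1, min_count=300):
    """Pass iff, for every forecast value announced often enough, the
    empirical outcome frequency is within tol of that forecast."""
    buckets = defaultdict(list)
    for p, y in zip(forecasts, outcomes):
        buckets[round(p, 1)].append(y)
    return all(abs(sum(ys) / len(ys) - p) <= tol
               for p, ys in buckets.items() if len(ys) >= min_count)

truth = 0.7
outcomes = [1 if random.random() < truth else 0 for _ in range(10000)]

# An informed expert who announces the true probability passes with high probability.
print(calibration_test([truth] * len(outcomes), outcomes))

# A forecaster with no knowledge of the process who simply announces the running
# empirical frequency of past outcomes also tends to pass -- illustrating the
# sense in which such tests can be "ignorantly" passed.
running, total = [], 0
for t, y in enumerate(outcomes):
    running.append(total / t if t else 0.5)   # forecast uses only past data
    total += y
print(calibration_test(running, outcomes))
```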

Relevance:

60.00%

Publisher:

Abstract:

Hardness is a property widely used in material specifications, in mechanical and metallurgical research, and in quality control of several materials. Specifically for timber, the Janka hardness test is simple, quick and easy, with good correlations with compression strength parallel to the grain, a strong reference in the structural classification of this material. More recently, international studies have reported the use of Brinell hardness for timber assessment, which retains the advantages mentioned for Janka hardness and is easier to perform in the field, especially because of the lower magnitude of the loads involved. A first generation of an instrument for field evaluation of hardness in wood - the Portable Hardness tester for wood - based on Brinell hardness has already been developed by the Research Group on Forest Products at FCA/UNESP, Brazil, with very good correlations between the evaluated hardness and several other mechanical properties of the material in tests with different species of native and reforested wood (traditionally used as ties - sleepers - in railways). This paper presents results obtained in the experimental program with the first generation of this instrument and preliminary tests with its second generation, which uses accelerometers to replace the indentation measurements in wood. For the first generation of the instrument, functional and calibration tests were carried out using 16 native and reforestation timber lots, among them E. citriodora, E. tereticornis, E. saligna, E. urophylla, E. grandis, Goupia glabra and Bagassa guianensis, with different origins and ages. The results obtained confirm its potential in the classification of specimens, with inclusion errors varying from 4.5% to 16.6%.
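
As a concrete reference for the quantity being measured, the sketch below evaluates the standard Brinell relation (load divided by the spherical surface area of the indentation). The load, ball diameter and indentation diameter are purely illustrative and are not taken from the experimental program described above.

```python
import math

def brinell_hardness(force_n, ball_diameter_mm, indent_diameter_mm):
    """Standard Brinell relation HB = 2F / (pi*D*(D - sqrt(D^2 - d^2))).
    With force in N and diameters in mm, the result is in N/mm^2."""
    D, d = ball_diameter_mm, indent_diameter_mm
    return 2.0 * force_n / (math.pi * D * (D - math.sqrt(D**2 - d**2)))

# Illustrative values only: a 10 mm ball pressed into wood with 1 kN,
# leaving a 4 mm diameter indentation.
print(round(brinell_hardness(1000.0, 10.0, 4.0), 1), "N/mm^2")
```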

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Energy in Agriculture) - FCA

Relevance:

60.00%

Publisher:

Abstract:

A substantial fraction of the organic carbon present in the atmosphere is found in the form of volatile organic compounds, which are released predominantly by the biosphere. Such biogenic emissions strongly influence the chemical and physical properties of the atmosphere by contributing to the formation of ground-level ozone and secondary organic aerosols. To better understand the formation of ground-level ozone and secondary organic aerosols, the technical capability to accurately measure the sum of these volatile organic compounds is required. Commonly used methods focus only on the detection of specific non-methane hydrocarbon compounds. The sum of these individual compounds may, however, represent only a lower limit of the atmospheric organic carbon concentration, since the available methods are unable to analyze all organic compounds in the atmosphere. A few studies have addressed the determination of the total carbon content of non-methane hydrocarbons in air, but measurements of the total exchange of non-methane organic compounds between vegetation and the atmosphere are lacking. We therefore investigated the total carbon determination of non-methane organic compounds from biogenic sources. The determination of total organic carbon was realized by collecting and enriching these compounds on a solid adsorbent. This first step was necessary to separate the stable gases CO, CO2 and CH4 from the organic carbon fraction. The organic compounds were then thermally desorbed and oxidized to CO2. The CO2 produced by this oxidation was collected on a further enrichment unit and analyzed by thermal desorption and subsequent detection with an infrared gas analyzer. The major difficulties we identified were (i) the separation of ambient CO2 from the organic carbon fraction during enrichment, (ii) the recovery rates of the various non-methane hydrocarbons from the adsorbent, (iii) the choice of catalyst, and (iv) interferences occurring at the detector of the total carbon analyzer. The choice of a Pt-Rd wire as catalyst led to significant progress in the correct determination of the CO2 background signal. This was necessary because small amounts of CO2 were also collected on the adsorption unit during enrichment of the volatile organic compounds. Catalytic materials with large surface areas proved unsuitable for this application because, despite high temperatures, uptake and later release of CO2 by the catalyst material was observed. The method was tested with various individual volatile organic compounds as well as in two plant-chamber experiments with a selection of VOC species emitted by different plants; the plant-chamber measurements were accompanied by GC-MS and PTR-MS measurements. In addition, calibration tests were carried out with various individual compounds from permeation/diffusion sources. The total carbon analyzer was able to confirm the diurnal course of the plant emissions. However, deviations of up to 50% in the total organic carbon mixing ratios were observed in comparison with the accompanying standard methods.
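
A minimal sketch of the final conversion step implied above, under assumed names and conditions: the CO2 recovered from oxidising the trapped organics is expressed as a total-organic-carbon mixing ratio (ppbC) relative to the amount of air sampled, since each mole of CO2 corresponds to one mole of organically bound carbon.

```python
R = 8.314  # gas constant, J mol^-1 K^-1

def ppb_carbon(n_co2_mol, sample_volume_l, pressure_pa=101325.0, temp_k=298.15):
    """Total organic carbon mixing ratio (ppbC) from the moles of CO2 produced
    by oxidising the enriched non-methane organic compounds."""
    n_air = pressure_pa * (sample_volume_l / 1000.0) / (R * temp_k)  # moles of air sampled
    return n_co2_mol / n_air * 1e9

# Example (illustrative numbers): 2 nmol of carbon recovered from a 2 L air sample.
print(round(ppb_carbon(2e-9, 2.0), 1), "ppbC")
```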

Relevance:

60.00%

Publisher:

Abstract:

This project evaluates the possibility of improving gas production in a reservoir whose remaining reserves are still sufficient to be exploited. To carry out this evaluation, the amount of gas extracted without any production-enhancement method is studied first; then the amount of gas that would be extracted if the stimulation method known as hydraulic fracturing were applied is assessed, on the basis of calibration tests carried out in one of the wells of the gas field. Finally, an economic comparison is made between gas production from the unstimulated and the stimulated reservoir. In addition, the theoretical basis of hydraulic fracturing and of reservoir engineering is briefly reviewed, together with a short environmental impact study of fracturing.
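
A rough sketch of the kind of economic comparison described, with entirely hypothetical figures: the discounted value of the incremental gas produced after stimulation is set against the cost of the fracturing job.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows, discounted at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def incremental_npv(prod_base, prod_frac, gas_price, frac_cost, rate=0.10):
    """NPV of the extra gas produced after stimulation minus the job cost.
    prod_base / prod_frac: yearly production forecasts without / with fracturing."""
    extra_revenue = [(qf - qb) * gas_price for qb, qf in zip(prod_base, prod_frac)]
    return npv(extra_revenue, rate) - frac_cost

# Hypothetical forecasts (MMscf/yr), gas price ($/MMscf) and job cost ($).
base = [400, 320, 260, 210, 170]
frac = [700, 520, 400, 310, 250]
print(round(incremental_npv(base, frac, 3000.0, 1.5e6)))
```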

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a new methodology for measuring the instantaneous average exhaust mass flow rate in reciprocating internal combustion engines, to be used to determine real driving emissions of light-duty vehicles as part of a Portable Emission Measurement System (PEMS). First, a flow meter, named the MIVECO flow meter, was designed based on a Pitot tube adapted to exhaust gases, which are characterized by moisture and particle content, rapid changes in flow rate and chemical composition, and pulsating and reverse flow at very low engine speeds. Then, an off-line methodology was developed to calculate the instantaneous average flow, considering the “square root error” phenomenon. The paper includes the theoretical fundamentals, the specifications of the developed flow meter, the calibration tests, the description of the proposed off-line methodology, and the results of the validation tests carried out on a chassis dynamometer, which demonstrate the validity of the mass flow meter and the methodology developed.
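
The "square root error" mentioned above arises because, for a Pitot-type device, mass flow scales with the square root of the differential pressure, so averaging the pressure before taking the square root biases the result under pulsating flow. A minimal sketch with synthetic pulsation and assumed constants (not the MIVECO calibration):

```python
import math

def mass_flow(dp_pa, rho, k=1.0):
    """Pitot-type relation: mass flow proportional to sqrt(2*rho*dp).
    k lumps probe area and calibration coefficient (assumed here)."""
    return k * math.sqrt(max(2.0 * rho * dp_pa, 0.0))

# Synthetic pulsating differential pressure over one engine cycle.
n = 1000
dp = [200.0 + 180.0 * math.sin(2 * math.pi * i / n) for i in range(n)]
rho = 0.95  # exhaust gas density, kg/m^3 (illustrative)

# Biased: average the pressure first, then take the square root.
flow_from_mean_dp = mass_flow(sum(dp) / n, rho)
# Correct "instantaneous average": convert each sample to flow, then average.
mean_flow = sum(mass_flow(x, rho) for x in dp) / n

print(flow_from_mean_dp, mean_flow)  # the first overestimates the true mean flow
```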

Relevance:

40.00%

Publisher:

Abstract:

We use sunspot group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups R_B above a variable cut-off threshold of observed total whole-spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of R_B are then re-scaled to the full observed RGO group number R_A using a variety of regression techniques. It is found that a very high correlation between R_A and R_B (r_AB > 0.98) does not prevent large errors in the intercalibration (for example, sunspot maximum values can be over 30 % too large even for such levels of r_AB). In generating the backbone sunspot number (R_BB), Svalgaard and Schatten (2015, this issue) force regression fits to pass through the scatter plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of Quantile-Quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least-squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method used is shown to be different when matching peak and average sunspot group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
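
A minimal sketch of the two regression choices discussed, on synthetic data rather than the RGO series: a fit forced through the origin versus an ordinary least-squares fit with an intercept, with the straightness of the Q-Q plot (via scipy's probplot correlation) used as the normality indicator for the residuals.

```python
# Synthetic stand-in: r_a is the "full" group count, r_b the count a
# lower-acuity observer would report; the relation has a non-zero offset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
r_b = rng.uniform(2, 12, 200)
r_a = 1.3 * r_b + 4.0 + rng.normal(0.0, 0.3, 200)

# Fit forced through the origin (slope only), as in the backbone construction.
slope_origin = np.sum(r_a * r_b) / np.sum(r_b * r_b)
res_origin = r_a - slope_origin * r_b

# Ordinary least squares with an intercept.
slope, intercept = np.polyfit(r_b, r_a, 1)
res_ols = r_a - (slope * r_b + intercept)

# probplot's third fitted value is the correlation of the Q-Q plot; values
# noticeably below 1 flag residuals that are not normally distributed.
_, (_, _, r_origin) = stats.probplot(res_origin, dist="norm")
_, (_, _, r_ols) = stats.probplot(res_ols, dist="norm")
print(r_origin, r_ols)   # the forced fit typically shows the poorer Q-Q correlation
```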

Relevance:

40.00%

Publisher:

Abstract:

An Adaptive Optics (AO) system is a fundamental requirement of 8 m-class telescopes. To reach the maximum resolution these telescopes allow, the atmospheric turbulence must be corrected. Thanks to adaptive optics we can exploit the full potential of these instruments and extract as much information as possible from astronomical sources. An AO system has two main components: the wavefront sensor (WFS), which measures the aberrations of the wavefront entering the telescope, and the deformable mirror (DM), which assumes a shape opposite to the one measured by the sensor. The two subsystems are connected by the reconstructor (REC). The REC requires a "common language" between these two main AO components, that is, a mapping between sensor space and mirror space called the interaction matrix (IM). Therefore, to operate correctly, an AO system has one main requirement: the measurement of an IM to calibrate the whole system. The IM measurement is a milestone for any AO system and must be performed regardless of the telescope size or class. Usually, this calibration step is done by adding an auxiliary artificial light source (i.e., a fiber) that illuminates both the deformable mirror and the sensor, allowing the AO system to be calibrated. For large telescopes (more than 8 m, such as Extremely Large Telescopes, ELTs), fiber-based IM measurement requires challenging optical setups that in some cases are impractical to build. In these cases, new techniques to measure the IM are needed. In this PhD work we investigate the possibility of a different calibration method that can be applied directly on sky, at the telescope, without any auxiliary source. Such a technique can be used to calibrate an AO system on a telescope of any size. We test the new calibration technique, called the "sinusoidal modulation technique", on the Large Binocular Telescope (LBT) AO system, which is already a complete AO system with the two main components: a secondary deformable mirror with 672 actuators and a pyramid wavefront sensor. The first phase of my PhD work was helping to implement the WFS board (containing the pyramid sensor and all the auxiliary optical components), working on both the optical alignment and tests of some optical components. Thanks to the "solar tower" facility of the Arcetri Astrophysical Observatory (Firenze), we were able to reproduce an environment very similar to that of the telescope, testing the main LBT AO components: the pyramid sensor and the secondary deformable mirror. This enabled the second phase of my PhD thesis: measuring the IM by applying the sinusoidal modulation technique. First we measured the IM using an auxiliary fiber source to calibrate the system, without any injected disturbance. After that, we used the same technique to measure the IM as if directly "on sky", adding an atmospheric disturbance to the AO system. The results obtained in this PhD work by measuring the IM directly in the Arcetri solar tower are crucial for future development: the ability to acquire the IM directly on sky means that an AO system can be calibrated also for the extremely large telescope class, where classic IM measurement techniques are problematic and sometimes impossible.
Finally, we should not forget the reason why we need all this: the main aim is to observe the universe. Thanks to this new class of large telescopes, and only by using their full capabilities, we will be able to increase our knowledge of the objects we observe, because we will be able to resolve finer details, discovering, analyzing and understanding the behavior of the components of the universe.
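
A toy numerical sketch of the idea behind the sinusoidal modulation technique (not the LBT pipeline): each actuator is driven with a sinusoid at its own frequency, and the corresponding interaction-matrix column is recovered by lock-in demodulation of the WFS slopes, so that an uncorrelated disturbance averages out. All dimensions, frequencies and noise levels below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_slopes, n_act = 40, 5
n_steps, dt = 20000, 1e-3                      # 20 s of WFS data at 1 kHz
im_true = rng.normal(size=(n_slopes, n_act))   # "true" interaction matrix (toy)

freqs = np.array([30.0, 45.0, 60.0, 75.0, 90.0])  # one modulation frequency per actuator (Hz)
amp = 0.1
t = np.arange(n_steps) * dt
commands = amp * np.sin(2 * np.pi * freqs[:, None] * t)        # (n_act, n_steps)

# WFS slopes: response to the modulation plus an uncorrelated "atmospheric" term.
slopes = im_true @ commands + 0.5 * rng.normal(size=(n_slopes, n_steps))

# Lock-in demodulation: project the slope time series onto each reference sinusoid.
im_est = np.empty_like(im_true)
for j, f in enumerate(freqs):
    ref = np.sin(2 * np.pi * f * t)
    im_est[:, j] = (slopes @ ref) / (amp * np.sum(ref * ref))

print(np.max(np.abs(im_est - im_true)))  # reconstruction error, small relative to the IM entries
```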

Relevance:

30.00%

Publisher:

Abstract:

Camera-laser calibration is necessary for many robotics and computer vision applications. However, existing calibration toolboxes still require laborious effort from the operator in order to achieve reliable and accurate results. This paper proposes algorithms that augment two existing, trusted calibration methods with automatic extraction of the calibration object from the sensor data. The result is a complete procedure that allows for automatic camera-laser calibration. The first stage of the procedure is automatic camera calibration, which is useful in its own right for many applications. The chessboard extraction algorithm it provides is shown to outperform openly available techniques. The second stage completes the procedure by providing automatic camera-laser calibration. The procedure has been verified by extensive experimental tests, with the proposed algorithms providing a major reduction in the time required from an operator in comparison to manual methods.
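
For the first (camera-intrinsics) stage, a minimal sketch using OpenCV's stock chessboard detector stands in for the paper's own extraction algorithm; the image paths and board size are assumptions.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                      # inner chessboard corners per row/column (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in glob.glob("calib_images/*.png"):       # hypothetical image set
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix and distortion coefficients from all detected boards.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print(K)
```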

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Serologic methods have been widely used to test for celiac disease and have gained importance in diagnostic definition and in new epidemiologic findings. However, there is no standardization, and there are no reference protocols and materials. METHODS: The European Working Group on Serological Screening for Celiac Disease has defined robust noncommercial test protocols for immunoglobulin (Ig)G and IgA gliadin antibodies and for IgA autoantibodies against endomysium and tissue transglutaminase. Standard curves were linear in the decisive range, and intra-assay coefficients of variation were less than 5% to 10%. Calibration was performed with a group reference serum, and joint cutoff limits were used. Seven laboratories took part in the final collaborative study on 252 randomized sera classified by histology (103 pediatric and adult patients with active celiac disease, 89 disease control subjects, and 60 blood donors). RESULTS: IgA autoantibodies against endomysium and tissue transglutaminase gave superior sensitivity (90% and 93%, respectively) and specificity (99% and 95%, respectively) compared with IgA and IgG gliadin antibodies. Tissue transglutaminase antibody testing showed superior receiver operating characteristic performance compared with gliadin antibodies. The kappa values for interlaboratory reproducibility showed superiority for IgA endomysium antibodies (0.93) in comparison with tissue transglutaminase antibodies (0.83) and gliadin antibodies (0.82 for IgG, 0.62 for IgA). CONCLUSIONS: Basic criteria of standardization and quality assessment must be fulfilled by any test protocol proposed for serologic investigation of celiac disease. The working group has produced robust test protocols and reference materials that are available for standardization, to further improve the reliability of serologic testing for celiac disease.
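
A minimal sketch of the figures of merit quoted above, computed on hypothetical counts rather than the study data: sensitivity and specificity against the histological reference, and Cohen's kappa for agreement between two laboratories.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Fraction of diseased sera detected, and of non-diseased sera correctly negative."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two labs' positive(1)/negative(0) calls."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                # observed agreement
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)                        # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical results on 252 sera (103 biopsy-positive, 149 controls).
print(sensitivity_specificity(tp=96, fn=7, tn=141, fp=8))

lab1 = [1] * 95 + [0] * 157
lab2 = [1] * 90 + [0] * 5 + [0] * 157
print(round(cohens_kappa(lab1, lab2), 2))
```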

Relevance:

30.00%

Publisher:

Abstract:

A new ultra-high-vacuum (UHV) reflectance spectrometer was successfully designed, making use of a Janis Industries ST-400 sample cryostat, an IR Labs bolometer, and a Bruker IFS 66v/S spectrometer. Two of the noteworthy features are an in situ gold evaporator and an internal reference path, both of which allow the experiment to proceed with a completely undisturbed sample position. As tested, the system was designed to operate between 4.2 K and 325 K over a frequency range of 60–670 cm⁻¹. This frequency range can easily be extended through the addition of applicable detectors. Tests were performed on SrTiO3, a highly ionic incipient ferroelectric insulator with a well-known reflectance. The presence and temperature dependence of the lowest-frequency "soft" mode were measured, as was the presence of the other two infrared modes. During the structural phase transition from cubic to tetragonal perovskite, the splitting of the second phonon mode was also observed. All of the collected data are in good agreement with previous measurements, with a minor discrepancy between the actual and recorded sample temperatures.
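
One standard way an in situ gold evaporator is used in reflectance work is to ratio the sample spectrum against the same sample after gold overcoating and multiply by the known reflectance of gold; the sketch below illustrates that correction with made-up spectra, and is an assumption about the procedure rather than a description of this instrument's protocol.

```python
import numpy as np

wavenumber = np.linspace(60, 670, 500)                 # cm^-1, instrument range
# Made-up raw single-beam spectra (stand-ins for measured interferogram ratios).
s_sample = np.random.default_rng(0).uniform(0.4, 0.9, wavenumber.size)
s_gold_coated = np.random.default_rng(1).uniform(0.8, 1.0, wavenumber.size)

r_gold = 0.995 * np.ones_like(wavenumber)              # approximate far-infrared reflectance of gold
# Absolute reflectance: sample spectrum / gold-coated spectrum, scaled by gold's reflectance.
reflectance = (s_sample / s_gold_coated) * r_gold

print(reflectance[:5])
```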

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses a new method of impedance control that has been successfully implemented on the master robot of a teleoperation system. The method involves calibrating the robot to quantify the effect of adjustable controller parameters on the impedances along its different axes. The empirical equations relating end-effector impedance to the controller's feedback gains are obtained by performing system identification tests along individual axes of the robot. With these equations, online control of end-effector stiffness and damping is possible without having to monitor joint torques or solve complex algorithms. Hard contact conditions and compliant interfaces have been effectively demonstrated on a telemanipulation test-bed using appropriate combinations of stiffness and damping settings obtained by this method.
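
A minimal sketch of the per-axis identification step described above, on synthetic data: the end-effector stiffness K and damping B in F = K*x + B*xdot are fitted by least squares for one controller-gain setting; repeating this over a grid of gain settings would yield the empirical gain-to-impedance equations.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 2000)
x = 0.02 * np.sin(2 * np.pi * 1.5 * t)          # displacement along one axis (m)
xdot = np.gradient(x, t)                        # velocity (m/s)

# "Unknown" impedance for this particular gain setting, plus force-sensor noise.
K_true, B_true = 800.0, 35.0
force = K_true * x + B_true * xdot + rng.normal(0.0, 0.5, t.size)   # measured force (N)

# Least-squares fit of F = K*x + B*xdot.
A = np.column_stack([x, xdot])
(K_est, B_est), *_ = np.linalg.lstsq(A, force, rcond=None)
print(K_est, B_est)
```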