5 results for Measurement in Bacteriology
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of those results. Many standards are available in every area concerned with noise measurement, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard on the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories must meet if they wish to demonstrate that they operate a quality system, are technically competent and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations. The standard requires appropriate methods of analysis to be used for estimating measurement uncertainty. From this point of view, for a testing laboratory performing sound power measurements according to specific ISO standards and European Directives, the evaluation of uncertainties is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult, because the results are affected by systematic errors and standard deviations that depend on the number of microphones placed on the surface, their spatial positions and the complexity of the sound field.
A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that affect this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
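The ISO 3744 evaluation the abstract refers to can be sketched as follows: the sound power level is the energy average of the sound pressure levels over the microphone array plus a surface term 10·log10(S/S0). The function name and the microphone readings below are illustrative assumptions, not data from the thesis.

```python
import math

def sound_power_level(spl_db, surface_area_m2):
    """Sound power level L_W (dB re 1 pW) from microphone SPLs over an
    enveloping measurement surface, following the ISO 3744 scheme:
    L_W = surface-averaged L_p + 10*log10(S / S0), with S0 = 1 m^2."""
    # Energy (not arithmetic) average of the microphone pressure levels.
    mean_lp = 10 * math.log10(sum(10 ** (lp / 10) for lp in spl_db) / len(spl_db))
    return mean_lp + 10 * math.log10(surface_area_m2 / 1.0)

# Hypothetical 10-microphone array on a hemisphere of radius 2 m.
mics = [78.2, 77.9, 78.5, 79.1, 77.4, 78.8, 78.0, 77.6, 78.3, 78.9]
surface = 2 * math.pi * 2.0 ** 2   # hemisphere area, S = 2*pi*r^2
lw = sound_power_level(mics, surface)
```

Repeating this computation for several arrays (different microphone counts and positions) is what makes the systematic-error comparison discussed above possible.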
Abstract:
In this thesis we describe in detail the Monte Carlo simulation (LVDG4) built to interpret the experimental data collected by LVD and to measure the muon-induced neutron yield in iron and liquid scintillator. A full Monte Carlo simulation, based on the Geant4 (v 9.3) toolkit, has been developed and validation tests have been performed. We used the LVDG4 to determine the active vetoing and the shielding power of LVD. The idea was to evaluate the feasibility of hosting a dark matter detector in the most internal part, called the Core Facility (LVD-CF). The first conclusion is that LVD is a good moderator, but the iron supporting structure produces a great number of neutrons near the core. The second conclusion is that if LVD is used as an active veto for muons, the neutron flux in the LVD-CF is reduced by a factor of 50, to the same order of magnitude as the neutron flux in the deepest laboratory in the world, Sudbury. Finally, the muon-induced neutron yield has been measured. In liquid scintillator we found $(3.2 \pm 0.2) \times 10^{-4}$ n/g/cm$^2$, in agreement with previous measurements performed at different depths and with the general trend predicted by theoretical calculations and Monte Carlo simulations. Moreover, we present the first measurement, to our knowledge, of the neutron yield in iron: $(1.9 \pm 0.1) \times 10^{-3}$ n/g/cm$^2$. That measurement provides an important check for the Monte Carlo modelling of neutron production in the heavy materials that are often used as shields in low-background experiments.
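A yield in these units is, in essence, detected neutron counts normalized by the muon exposure in traversed g/cm². The sketch below shows that normalization only; the variable names, efficiency value and counts are hypothetical assumptions, not the thesis's actual analysis chain.

```python
def neutron_yield(n_neutrons, n_muons, mean_path_cm, density_g_cm3, det_eff):
    """Muon-induced neutron yield Y in neutrons per muon per (g/cm^2):
    Y = N_n / (eps * N_mu * <x> * rho), where <x>*rho converts the mean
    muon track length into g/cm^2 of traversed material."""
    return n_neutrons / (det_eff * n_muons * mean_path_cm * density_g_cm3)

# Hypothetical example: 1e5 muons, 1 m mean track in liquid scintillator
# (rho ~ 0.8 g/cm^3), 50% detection efficiency, 1280 detected neutrons.
y = neutron_yield(1280, 1e5, 100.0, 0.8, 0.5)
```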
Abstract:
Admission blood lactate concentration has been shown to be a useful indicator of disease severity in human medicine, and numerous studies have associated hyperlactatemia with patients at high risk of death who should be treated aggressively regardless of the cause of the lactate generation. The degree and duration of hyperlactatemia have also been correlated with the subsequent development of organ failure. Similarly, in a small number of studies on equine colic, blood lactate concentration has been investigated as a useful prognostic variable. In neonatal foals, blood lactate was first studied by Magdesian (2003), who described venous blood lactate concentration in 14 normal foals during the initial 48 hours post partum. A preliminary study of lactate concentration in foals presenting to a neonatal intensive care unit reported that surviving foals had earlier lactate clearance. The measurement of blood lactate concentration is traditionally available with a wet-chemistry laboratory method or with blood-gas analyzers for clinicians working at universities or large private hospitals, but these methods may not be easily accessible to many practitioners in field conditions. Several relatively inexpensive, easy-to-use and rapid pocket-size monitors for measuring lactate concentration have been validated in human patients and athletes. None of these portable lactate analyzers has been evaluated in clinically normal neonatal foals or in foals referred to a neonatal intensive care unit. The aims of this study were to validate the Lactate Scout analyzer in neonatal foals, investigating the correlation between lactate concentration in whole blood measured with the portable monitor and in plasma measured with the reference laboratory analyzer. The effect of hematocrit (Hct) on the accuracy of the Lactate Scout was also evaluated.
Further, we determined the utility of venous lactate measurement in critically ill foals, describing lactate values in the most frequent neonatal pathologies, evaluating serial blood lactate measurements during hospitalization and investigating its prognostic value. The study also describes the normal range of lactate in healthy neonatal foals during the first 72 hours of life.
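A standard way to compare a portable monitor against a reference analyzer, alongside the correlation mentioned above, is a Bland-Altman analysis of paired readings (bias and 95% limits of agreement). Both the choice of this analysis and the lactate values below are illustrative assumptions, not results from the study.

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods
    (classic Bland-Altman method comparison; readings are paired)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired lactate readings (mmol/L): portable vs laboratory.
portable = [2.1, 3.4, 1.8, 5.2, 4.0, 2.7]
lab      = [2.0, 3.6, 1.9, 5.0, 4.3, 2.8]
bias, lo, hi = bland_altman(portable, lab)
```

A small bias with narrow limits of agreement would support the portable monitor as a field substitute for the laboratory method.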
Abstract:
Water resources management will become increasingly important in agriculture as global warming takes place. Cover crops are widely used in viticultural areas because of the many positive agronomic and environmental benefits they provide. However, it is not clear what effect a cover crop can have on water use in the vineyard. This study is designed to develop a further understanding of the role cover crops play in total vineyard water use and of their potential use as a water management tool. Two techniques were used to measure cover crop water use, mini-lysimeters and a portable open chamber, and data from both were compared to reference evapotranspiration (ETo; FAO guidelines). While the mini-lysimeters seemed limited in their ability to accurately represent the water use of the surrounding soil, the open chamber proved a reliable and suitable instrument for the accurate measurement of evapotranspiration. Further, the relationships between vineyard grass water use and the environmental factors thought to influence it were analyzed. A strong relationship between total available radiation and cover crop evapotranspiration was found, suggesting the possibility of an indirect method of measuring evapotranspiration in a vineyard grass cover crop. Mowing the cover crop was found to significantly affect transpiration, as shown by both the mini-lysimeters and the open chamber; however, the reduction was largely dependent on the growth rate of the grass.
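The reported radiation-evapotranspiration relationship suggests the indirect method would amount to a simple regression of measured ET on available radiation. A minimal ordinary-least-squares sketch follows; the daily values are invented for illustration and are not the study's data.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical daily totals: available radiation (MJ m^-2) vs
# cover crop evapotranspiration (mm), as an open-chamber series might look.
rad = [10.0, 14.0, 18.0, 22.0, 26.0]
et  = [1.1, 1.6, 2.0, 2.6, 3.0]
slope, intercept = linear_fit(rad, et)
```

With such a fit calibrated against chamber measurements, radiation alone could provide the indirect ET estimate the abstract mentions.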
Abstract:
Several MCAO systems are under study to improve the angular resolution of the current and future generations of large ground-based telescopes (diameters in the 8-40 m range). The subject of this PhD thesis is embedded in this context. Two MCAO systems, in different realization phases, are addressed in this thesis: NIRVANA, the 'double' MCAO system designed for one of the interferometric instruments of LBT, is in the integration and testing phase; MAORY, the future E-ELT MCAO module, is under preliminary study. These two systems tackle the sky coverage problem in two different ways. The layer-oriented approach of NIRVANA, coupled with multi-pyramid wavefront sensors, takes advantage of the optical co-addition of the signal coming from up to 12 NGS in an annular 2' to 6' technical FoV and up to 8 in the central 2' FoV. Summing the light coming from many natural sources makes it possible to increase the limiting magnitude of the single NGS and to improve the sky coverage considerably. One of the two wavefront sensors for the mid-high-altitude atmosphere analysis has been integrated and tested as a stand-alone unit in the laboratory at INAF-Osservatorio Astronomico di Bologna and afterwards delivered to the MPIA laboratories in Heidelberg, where it was integrated and aligned to the post-focal optical relay of one LINC-NIRVANA arm. A number of tests were performed in order to characterize and optimize the system functionalities and performance. A report on this work is presented in Chapter 2. In the MAORY case, to ensure correction uniformity and sky coverage, the LGS-based approach is the current baseline. However, since the Sodium layer is approximately 10 km thick, the artificial reference source looks elongated, especially when observed from the edge of a large aperture.
On a 30-40 m class telescope, for instance, the maximum elongation varies between a few arcsec and 10 arcsec, depending on the actual telescope diameter, on the Sodium layer properties and on the laser launcher position. The centroiding error in a Shack-Hartmann WFS increases proportionally to the elongation (in a photon-noise-dominated regime), strongly limiting the performance. To compensate for this effect, a straightforward solution is to increase the laser power, i.e. to increase the number of detected photons per subaperture. The scope of Chapter 3 is twofold: an analysis of the performance of three different algorithms (Weighted Center of Gravity, Correlation and Quad-cell) for the instantaneous LGS image position measurement in the presence of elongated spots, and the determination of the number of photons required to achieve a certain average wavefront error over the telescope aperture. An alternative optical solution to the spot elongation problem is proposed in Section 3.4. Starting from the considerations presented in Chapter 3, a first-order analysis of the LGS WFS for MAORY (number of subapertures, number of detected photons per subaperture, RON, focal plane sampling, subaperture FoV) is the subject of Chapter 4. An LGS WFS laboratory prototype was designed to reproduce the relevant aspects of an LGS SH WFS for the E-ELT and to evaluate the performance of different centroid algorithms in the presence of elongated spots, as investigated numerically and analytically in Chapter 3. This prototype makes it possible to simulate realistic Sodium profiles. A full testing plan for the prototype is set out in Chapter 4.
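Of the three centroiding algorithms named above, the Weighted Center of Gravity is the simplest to sketch: each pixel coordinate is averaged with weight W·I, and with uniform weights it reduces to the plain center of gravity. The toy 3x3 subaperture image below is an illustration, not data from the prototype.

```python
def weighted_cog(image, weights):
    """Weighted Center of Gravity of a spot image:
    x_c = sum(W*I*x) / sum(W*I), and likewise for y.
    With all weights equal to 1 this is the classical CoG."""
    wsum = xsum = ysum = 0.0
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            w = weights[y][x] * val
            wsum += w
            xsum += w * x
            ysum += w * y
    return xsum / wsum, ysum / wsum

# Toy 3x3 subaperture with a spot peaked at the centre pixel;
# uniform weights recover the plain centre of gravity.
spot = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]
ones = [[1] * 3 for _ in range(3)]
cx, cy = weighted_cog(spot, ones)
```

For elongated LGS spots, the weighting map would be shaped to de-emphasize the elongated wings, which is where the algorithm comparisons of Chapter 3 come in.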