913 results for Nondemolition Measurements
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of those results. Many standards are available in every area concerned with noise measurement, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard on the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent, and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of results, which may or may not contain opinions and interpretations. The standard requires appropriate methods of analysis to be used for estimating measurement uncertainty. From this point of view, for a testing laboratory performing sound power measurements according to specific ISO standards and European Directives, the evaluation of measurement uncertainty is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that depend on the number of microphones placed on the surface, their spatial positions, and the complexity of the sound field.
A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that afflict this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
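To make the array-averaging step concrete, the sketch below shows the standard energy-based averaging of sound pressure levels over the microphone positions and the conversion to a sound power level via the measurement-surface area, in the spirit of ISO 3744. The microphone readings and the surface area are made-up illustration values, not data from the thesis.

```python
import math

def mean_surface_spl(levels_db):
    """Energy-average sound pressure levels (dB) over N microphone positions."""
    n = len(levels_db)
    return 10 * math.log10(sum(10 ** (L / 10) for L in levels_db) / n)

def sound_power_level(levels_db, surface_area_m2):
    """L_W = mean surface SPL + 10*log10(S / S0), with reference area S0 = 1 m^2."""
    return mean_surface_spl(levels_db) + 10 * math.log10(surface_area_m2 / 1.0)

# Hypothetical readings from a 10-microphone enveloping array (dB)
spl = [72.1, 71.8, 73.0, 72.5, 71.5, 72.8, 73.2, 71.9, 72.4, 72.0]
Lw = sound_power_level(spl, surface_area_m2=6.28)  # e.g. hemisphere of radius ~1 m
```

Because the average is taken over squared pressures rather than decibel values, a few microphones in a louder region of a complex sound field dominate the result, which is one way the finite array introduces the systematic error discussed above.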
Abstract:
The aim of this thesis was to study the effects of extremely low frequency (ELF) electromagnetic fields on potassium currents in neural cell lines (neuroblastoma SK-N-BE), using the whole-cell patch clamp technique. This sophisticated technique is capable of investigating electrophysiological activity at the single-cell, and even single-channel, level. The total potassium ion current through the cell membrane was measured while exposing the cells to a combination of static (DC) and alternating (AC) magnetic fields according to the prediction of the so-called "Ion Resonance Hypothesis". For this purpose we designed and fabricated a magnetic field exposure system that reaches a good compromise between magnetic field homogeneity and accessibility to the biological sample under the microscope. The system consists of three large orthogonal pairs of square coils surrounding the patch clamp setup and connected to the signal generation unit, able to generate different combinations of static and/or alternating magnetic fields. The system was characterized in terms of field distribution and uniformity through computation and direct field measurements. No statistically significant changes in the potassium ion currents through the cell membrane were revealed when the cells were exposed to AC/DC magnetic field combinations according to the aforementioned Ion Resonance Hypothesis.
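The resonance condition at the heart of the Ion Resonance Hypothesis is that the AC field frequency matches the cyclotron frequency of the ion of interest in the static field. A minimal sketch of that textbook relation, with an illustrative (not thesis-specific) static field of 50 µT:

```python
import math

ELEMENTARY_CHARGE = 1.602176634e-19   # C
ATOMIC_MASS_UNIT = 1.66053906660e-27  # kg

def ion_cyclotron_frequency(charge_e, mass_amu, b_static_tesla):
    """Cyclotron frequency f_c = q*B / (2*pi*m) of an ion in a static field B."""
    q = charge_e * ELEMENTARY_CHARGE
    m = mass_amu * ATOMIC_MASS_UNIT
    return q * b_static_tesla / (2 * math.pi * m)

# K+ (mass ~38.96 amu) in a hypothetical 50 uT static field: resonance near 20 Hz
f_K = ion_cyclotron_frequency(1, 38.96, 50e-6)
```

This is why exposure systems of the kind described above must control both the DC magnitude (which sets the predicted resonance frequency) and the AC frequency independently.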
Abstract:
Compared with other mature engineering disciplines, fracture mechanics of concrete is still a developing field, and a very important one for structures such as bridges subject to dynamic loading. A historical overview of work in the field is provided, and then the project is presented. The project applies the Digital Image Correlation (DIC) technique to the detection of cracks at the surface of concrete prisms (500 mm x 100 mm x 100 mm) subject to flexural loading (four-point bending test). The technique provides displacement measurements over the region of interest; from this displacement field, information about the crack mouth opening displacement (CMOD) is obtained and related to the applied load. The evolution of the fracture process is shown through graphs and graphical maps of the displacement at selected steps of the loading process. The study shows that the DIC system can detect the appearance and evolution of cracks even before they become visually detectable.
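The core of DIC is matching a small speckle subset from the reference image against the deformed image to recover its displacement. The following is a minimal sketch of that idea using zero-normalized cross-correlation and an integer-pixel brute-force search on synthetic data; real DIC codes add sub-pixel interpolation and shape functions, which are omitted here.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equally sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_subset(ref, deformed, top, left, size, search=5):
    """Find the integer-pixel displacement of a square subset by brute-force ZNCC."""
    template = ref[top:top + size, left:left + size]
    best = (0, 0, -1.0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > deformed.shape[0] or x + size > deformed.shape[1]:
                continue
            score = zncc(template, deformed[y:y + size, x:x + size])
            if score > best[2]:
                best = (dy, dx, score)
    return best

# Synthetic check: shift a random speckle pattern by (2, 3) pixels
rng = np.random.default_rng(0)
ref = rng.random((40, 40))
deformed = np.roll(np.roll(ref, 2, axis=0), 3, axis=1)
dy, dx, score = track_subset(ref, deformed, top=10, left=10, size=11)
```

Tracking many such subsets across the specimen surface yields the displacement field from which a CMOD-versus-load curve can be extracted.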
Abstract:
Dynamic measurements with quartz resonators. The resonance frequency of quartz oscillators lies in the MHz range. The resonances have high quality factors and are therefore sensitive to small changes at the resonator surface. 1. A setup was developed to measure friction at high surface velocities (v = 1 m/s). As a sphere approaches, both the resonance frequency and the resonance bandwidth of the quartz increase. At larger normal forces an elastic contact forms, which explains the frequency increase. Shortly before this contact occurs, the damping passes through a maximum that is characteristic of the onset of friction. Increasing the thickness (0.4-2.5 nm) of a lubricant coating (perfluoropolyether) reduces both the height and the width of this maximum. It disappears at complete coverage with a monolayer (about 2 nm). This is explained by intermittent contact between the two surfaces. 2. The quartz surface was coated with polymer brushes of various thicknesses (12-230 nm). The solvent content of these films varies with the vapor pressure of the surrounding toluene atmosphere. On drying, the films pass through a solvent-induced glass transition. The sorption curves (solvent content versus vapor pressure) show a kink at the glass transition, while their derivatives show a step. For thinner films this step shifts to lower vapor pressure and lower solvent content; it also becomes broader and its height decreases.
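The mass sensitivity that makes such quartz resonators useful as probes is usually expressed through the Sauerbrey relation, which links a frequency shift to the areal mass of a thin rigid film. A minimal sketch with standard quartz material constants (the 5 MHz crystal and film loading below are illustrative values, not from the thesis):

```python
import math

RHO_Q = 2648.0   # quartz density, kg/m^3
MU_Q = 2.947e10  # quartz shear modulus, Pa

def sauerbrey_df(f0_hz, mass_per_area_kg_m2):
    """Frequency shift Df = -2*f0^2*(m/A) / sqrt(rho_q*mu_q) for a thin rigid film."""
    return -2.0 * f0_hz ** 2 * mass_per_area_kg_m2 / math.sqrt(RHO_Q * MU_Q)

# For a 5 MHz crystal, ~17.7 ng/cm^2 (= 1.77e-7 kg/m^2) gives about -1 Hz of shift
df = sauerbrey_df(5e6, 1.77e-7)
```

Sub-nanometer lubricant layers like those above correspond to only tens of ng/cm^2, i.e. frequency shifts of order 1 Hz out of 5 MHz, which is why the high quality factor of the resonance is essential.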
Abstract:
This thesis investigates phenomena of vortex dynamics in type II superconductors depending on the dimensionality of the flux-line system and the strength of the driving force. In the low-dissipative regime of Bi_2Sr_2CaCu_2O_{8+delta} (BSCCO) the influence of oxygen stoichiometry on flux-line tension was examined. An entanglement crossover of the vortex system at low magnetic fields was identified and a comprehensive B-T phase diagram of solid and fluid phases derived. In YBa_2Cu_3O_7 (YBCO), extremely long (>100 mm) high-quality measurement bridges made it possible to extend the electric-field window in transport measurements by up to three orders of magnitude. Complementary analyses of the data conclusively produced dynamic exponents of the glass transition z~9, considerably higher than theoretically predicted and previously reported. In high-dissipative measurements, a voltage instability appearing in the current-voltage characteristics of type II superconductors was observed for the first time in BSCCO and shown to result from a Larkin-Ovchinnikov flux-flow vortex instability under the influence of quasi-particle heating. However, in an analogous investigation of YBCO the instability was found to appear only in the temperature and magnetic-field regime of the vortex-glass state. Rapid-pulse measurements fully confirmed this correlation of vortex glass and instability in YBCO and revealed a constant rise time (~µs).
Abstract:
This thesis regards the Wireless Sensor Network (WSN) as one of the most important technologies of the twenty-first century and describes the implementation of different packet-correcting erasure codes to cope with the "bursty" nature of the transmission channel and the possibility of packet losses during transmission. The limited battery capacity of each sensor node makes minimizing power consumption one of the primary concerns in WSNs. Since in each sensor node communication is considerably more expensive than computation, the core idea is to invest computation within the network whenever possible in order to save on communication costs. The goal of the research was to evaluate a parameter, such as the Packet Erasure Ratio (PER), that permits verifying the functionality and behavior of the created network, validating the theoretical expectations, and evaluating the convenience of introducing packet-recovery techniques using different types of packet erasure codes in different types of networks. Thus, considering the energy constraints in WSNs, the topic of this thesis is to minimize energy consumption by introducing encoding/decoding algorithms into the transmission chain, in order to prevent the retransmission of packets erased in the Packet Erasure Channel and to save the energy used for each retransmitted packet. In this way it is possible to extend the lifetime of the entire network.
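As a minimal illustration of the trade (not one of the specific codes evaluated in the thesis), a single XOR parity packet lets a sink rebuild any one erased packet from the survivors, replacing a retransmission with a little extra computation and one extra transmitted packet:

```python
def xor_parity(packets):
    """Build one parity packet as the byte-wise XOR of equal-length data packets."""
    parity = bytearray(len(packets[0]))
    for p in packets:
        for i, byte in enumerate(p):
            parity[i] ^= byte
    return bytes(parity)

def recover_missing(survivors, parity):
    """Rebuild the single erased packet by XOR-ing the parity with all survivors."""
    missing = bytearray(parity)
    for p in survivors:
        for i, byte in enumerate(p):
            missing[i] ^= byte
    return bytes(missing)

# Hypothetical sensor readings, all padded to equal length
data = [b"temp=21.5", b"hum=40.2%", b"bat=3.30V"]
parity = xor_parity(data)
# Suppose packet 1 is erased in transit: the sink rebuilds it without retransmission
rebuilt = recover_missing([data[0], data[2]], parity)
```

This simple code tolerates only one erasure per group of k packets; the stronger erasure codes studied in WSN contexts generalize the same idea to bursts of losses at the cost of more redundancy and computation.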
Abstract:
The g-factor is a constant which connects the magnetic moment $\vec{\mu}$ of a charged particle, of charge q and mass m, with its angular momentum $\vec{J}$. Thus, the magnetic moment can be written $\vec{\mu}_J = g_J \frac{q}{2m} \vec{J}$. The g-factor of a free particle of spin s=1/2 should take the value g=2, but due to quantum electrodynamical effects it deviates from this value by a small amount, the so-called g-factor anomaly $a_e$, which is of the order of $10^{-3}$ for the free electron. This deviation is even bigger if the electron is exposed to high electric fields. Therefore highly charged ions, where the electric field strength reaches values of the order of $10^{13}$-$10^{16}$ V/cm at the position of the bound electron, are an interesting field of investigation for testing QED calculations. In previous experiments [Häf00, Ver04] using a single hydrogen-like ion confined in a Penning trap, an accuracy of a few parts in $10^{-9}$ was obtained. In the present work a new method for the precise measurement of the electronic g-factor of hydrogen-like ions is discussed. Due to the unavoidable magnetic field inhomogeneity in a Penning trap, a very important contribution to the systematic uncertainty in the previous measurements arose from the elevated energy of the ion required for the measurement of its motional frequencies; it was then necessary to extrapolate the results to vanishing energies. In the new method the energy in the cyclotron degree of freedom is reduced to the minimum attainable energy. The method consists in measuring the reduced cyclotron frequency $\nu_+$ indirectly by coupling the axial motion to the reduced cyclotron motion through irradiation at the radio frequency $\nu_{coup} = \nu_+ - \nu_{ax} + \delta$, where $\delta$ is, in principle, an unknown detuning that can be obtained from knowledge of the coupling process. The only unknown parameter is then the desired value of $\nu_+$.
As a test, a measurement with, for simplicity, artificially increased axial energy was performed, yielding the result $g_{exp} = 2.000~047~020~8(24)(44)$. This is in perfect agreement with both the theoretical result $g_{theo} = 2.000~047~020~2(6)$ and the previous experimental result $g_{exp1} = 2.000~047~025~4(15)(44)$. In the experimental results the second error bar is due to the uncertainty in the accepted value of the electron's mass. Thus, with the new method, a higher accuracy in the g-factor could, by comparison with the theoretical value, lead to an improved value of the electron's mass. [Häf00] H. Häffner et al., Phys. Rev. Lett. 85 (2000) 5308. [Ver04] J. Verdú et al., Phys. Rev. Lett. 92 (2004) 093002.
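The frequency bookkeeping behind the new method can be sketched numerically: invert the coupling relation for $\nu_+$, then combine the trap eigenfrequencies through the Brown-Gabrielse invariance theorem to get the free cyclotron frequency. The frequencies below are made-up round numbers of a plausible order of magnitude, not the experiment's actual values, and the detuning is taken as zero for illustration.

```python
import math

def reduced_cyclotron_from_coupling(nu_coup, nu_ax, delta):
    """Invert nu_coup = nu_+ - nu_ax + delta for the reduced cyclotron frequency."""
    return nu_coup + nu_ax - delta

def free_cyclotron(nu_plus, nu_ax, nu_minus):
    """Brown-Gabrielse invariance theorem: nu_c = sqrt(nu_+^2 + nu_z^2 + nu_-^2)."""
    return math.sqrt(nu_plus ** 2 + nu_ax ** 2 + nu_minus ** 2)

# Hypothetical trap frequencies (Hz)
nu_ax = 0.9e6                 # axial frequency
nu_minus = 16.7e3             # magnetron frequency
nu_coup = 23.39e6             # measured coupling frequency, delta assumed 0 here
nu_plus = reduced_cyclotron_from_coupling(nu_coup, nu_ax, delta=0.0)
nu_c = free_cyclotron(nu_plus, nu_ax, nu_minus)
```

The point of the indirect scheme is visible here: $\nu_+$ never has to be driven and read out directly at high cyclotron energy, yet it enters the invariance theorem exactly as if it had been.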
Abstract:
The measurement of luminosity is an important goal for all Standard Model physics and for the discovery of new physics, since it is related to the cross section (σ) and the production rate (R) of a given process by the relation R = L·σ. A dedicated luminosity monitor called LUCID (Luminosity measurement Using Cherenkov Integrating Detector) is installed in the ATLAS experiment at the LHC. Thanks to the data acquired during 2010, the off-line evaluation of LUCID's performance and the implementation of on-line checks on the quality of the collected data were possible. Real data were compared with Monte Carlo data, and the simulations were tuned to optimize the agreement between the two. The calibration of the relative luminosity, which yields an evaluation of the absolute luminosity, was made possible by the so-called van der Meer scans, through which a precision of 11% was obtained. The analysis of Z-decay physics is still in progress, with the aim of using the rate of this process to normalize the luminosity with a precision better than 5%.
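The rate-to-luminosity conversion above is a one-liner; a small sketch with illustrative numbers (a hypothetical 60 mb cross section observed at 6 x 10^5 Hz, not ATLAS values) makes the unit handling explicit:

```python
def luminosity_from_rate(rate_hz, cross_section_cm2):
    """Instantaneous luminosity L = R / sigma for a process of known cross section."""
    return rate_hz / cross_section_cm2

MB_TO_CM2 = 1e-27  # 1 millibarn in cm^2

# Hypothetical: sigma = 60 mb, observed rate 6e5 Hz -> L = 1e31 cm^-2 s^-1
L = luminosity_from_rate(6.0e5, 60 * MB_TO_CM2)
```

Normalizing with a well-known process such as Z production works the same way in reverse: the precisely predicted cross section plus the counted rate fixes L, which is why a sub-5% Z-rate measurement improves on the van der Meer calibration.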
Abstract:
The focus of this thesis was the in-situ application of the new analytical technique GCxGC in both the marine and continental boundary layer, as well as in the free troposphere. Biogenic and anthropogenic VOCs were analysed and used to characterise local chemistry at the individual measurement sites. The first part of the thesis work was the characterisation of a new set of columns that was to be used later in the field. To simplify identification, a time-of-flight mass spectrometer (TOF-MS) detector was coupled to the GCxGC. In the field the TOF-MS was replaced by a more robust and tractable flame ionisation detector (FID), which is better suited to quantitative measurements. During this process, a variety of volatile organic compounds could be assigned to different environmental sources, e.g. plankton, eucalyptus forest or urban centers. In-situ measurements of biogenic and anthropogenic VOCs were conducted at the Meteorological Observatory Hohenpeissenberg (MOHP), Germany, applying a thermodesorption-GCxGC-FID system. The measured VOCs were compared to GC-MS measurements routinely conducted at the MOHP as well as to PTR-MS measurements. Furthermore, a compressed ambient air standard was measured on three different gas chromatographic instruments and the results were compared. With few exceptions, the in-situ as well as the standard measurements revealed good agreement between the individual instruments. Diurnal cycles were observed, with differing patterns for the biogenic and the anthropogenic compounds. The variability-lifetime relationship of compounds with atmospheric lifetimes from a few hours to a few days in the presence of O3 and OH was examined. It revealed a weak but significant influence of chemistry on these short-lived VOCs at the site.
The relationship was also used to estimate the average OH radical concentration during the campaign, which was compared to in-situ OH measurements (1.7 x 10^6 molecules/cm^3, 0.071 ppt) for the first time. The OH concentration obtained with this method, ranging from 3.5 to 6.5 x 10^5 molecules/cm^3 (0.015 to 0.027 ppt), represents an approximation of the average OH concentration influencing the discussed VOCs from emission to measurement. Based on these findings, the average concentration of the nighttime NO3 radical was estimated using the same approach and found to range from 2.2 to 5.0 x 10^8 molecules/cm^3 (9.2 to 21.0 ppt). During the MINATROC field campaign, in-situ ambient air measurements with the GCxGC-FID were conducted at Tenerife, Spain. Although the station is mainly situated in the free troposphere, local influences of anthropogenic and biogenic VOCs were observed. A strong dust event originating from Western Africa made it possible to compare the mixing ratios during normal and elevated dust loading in the atmosphere. The mixing ratios during the dust event were found to be lower. However, this could not be attributed to heterogeneous reactions, as the wind direction changed from northwesterly to southeasterly during the dust event.
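The link between a VOC's lifetime against OH and the average OH concentration is the simple kinetic relation tau = 1/(k_OH·[OH]). A minimal sketch with an illustrative rate constant and lifetime (not values taken from the campaign data):

```python
def oh_concentration(k_oh_cm3_s, lifetime_s):
    """[OH] implied by a species' lifetime against OH: tau = 1 / (k_OH * [OH])."""
    return 1.0 / (k_oh_cm3_s * lifetime_s)

# Hypothetical VOC with k_OH = 5.6e-12 cm^3/(molecule s) and a 2-day lifetime
oh = oh_concentration(5.6e-12, 2 * 86400)  # molecules/cm^3, order 1e6
```

In the variability-lifetime method the lifetimes themselves are inferred from how strongly each compound's mixing ratio fluctuates, so the OH estimate is an average over the transport time from emission to measurement rather than an instantaneous value, exactly as stated above.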
Abstract:
I applied the SBAS-DInSAR method to the Mattinata Fault (MF) (Southern Italy) and to the Doruneh Fault System (DFS) (Central Iran). In the first case, I observed limited internal deformation and determined a right-lateral kinematic pattern with a compressional component in the northern sector of the fault. Using the Okada model I inverted the observed velocities, obtaining a right-lateral strike-slip solution for the MF. Even if it fits the data within the uncertainties, the modeled slip rate of 13-15 mm yr-1 seems too high with respect to the geological record. Concerning the western termination of the DFS, the SAR data confirm the mainly left-lateral transcurrent kinematics of this fault segment, but reveal a compressional component. My analytical model fits the observed data successfully and quantifies the slip at ~4 mm yr-1 of pure horizontal and ~2.5 mm yr-1 of vertical displacement; the horizontal velocity is compatible with the geological record. I applied classical SAR interferometry to the October-December 2008 Balochistan (Central Pakistan) seismic swarm, discerning the contributions of the three Mw > 5.7 earthquakes by determining fault positions, lengths, widths, depths and slip distributions, and constraining the other source parameters using different Global CMT solutions. A well-constrained solution was obtained for the 09/12/2008 aftershock, whereas I tested two possible fault solutions for the 28-29/10/2008 mainshocks; it is not possible to favor one of them without independent constraints from geological data. Finally, I approached the study of the earthquake cycle in transcurrent tectonic domains using analog modeling, with food-grade gelatin as the crust-analog material. I successfully combined the study of finite deformation with that of the earthquake cycle and sudden dislocation. Numerous seismic cycles were reproduced, in which a characteristic earthquake is recognizable in terms of displacement, coseismic velocity and recurrence time.
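For transcurrent faults like those studied here, the interseismic surface velocity across the fault is classically described by the Savage-Burford screw-dislocation profile, v(x) = (V/pi)·arctan(x/D), with V the deep slip rate and D the locking depth. A sketch with illustrative parameters (the 4 mm/yr echoes the DFS-scale rates above, but the locking depth is an assumed round number):

```python
import math

def interseismic_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    """Savage-Burford profile: fault-parallel velocity at distance x from the fault."""
    return (slip_rate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

# Hypothetical fault: 4 mm/yr deep slip rate, 15 km locking depth
profile = [interseismic_velocity(x, 4.0, 15.0) for x in (-100, -15, 0, 15, 100)]
```

Profiles of exactly this shape are what DInSAR velocity maps are compared against when estimating slip rates, which is why a modeled rate well above the geological record (as for the MF) is a red flag.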
Abstract:
The current design life of nuclear power plants (NPPs) could potentially be extended to 80 years. During this extended plant life, all safety- and operationally-relevant Instrumentation & Control (I&C) systems are required to meet their designed performance requirements to ensure safe and reliable operation of the NPP, both during normal operation and following design-basis events. This in turn requires an adequate and documented qualification and aging management program. It is known that the electrical insulation of I&C cables used in safety-related circuits can degrade during their life due to the aging effect of environmental stresses such as temperature, radiation and vibration, particularly for cables located in the containment area of the NPP. Thus, condition monitoring techniques are required to assess the state of the insulation. Such techniques can be used to establish a residual lifetime, based on the relationship between condition indicators and aging stresses, and hence to support a preventive and effective maintenance program. The object of this thesis is to investigate potential electrical aging indicators (diagnostic markers) by testing various I&C cable insulations subjected to accelerated multi-stress (thermal and radiation) aging.
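Accelerated thermal aging of this kind is conventionally planned with the Arrhenius model: the acceleration factor between service and oven temperatures follows from the insulation's activation energy. A minimal sketch with illustrative numbers (a 1.0 eV activation energy and the two temperatures are assumptions, not data from the thesis):

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_service_k, t_aging_k):
    """Acceleration factor AF = exp[(Ea/kB)*(1/T_service - 1/T_aging)]."""
    return math.exp((ea_ev / K_B_EV) * (1.0 / t_service_k - 1.0 / t_aging_k))

# Hypothetical: Ea = 1.0 eV, 50 C (323.15 K) service, 120 C (393.15 K) aging oven
af = arrhenius_af(1.0, 323.15, 393.15)
# An 80-year service life then compresses to 80 / af years of oven time
```

The same model underlies the residual-lifetime estimates mentioned above: once a diagnostic marker is correlated with aging time at a known temperature, the Arrhenius factor translates that correlation back to service conditions.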
Abstract:
Summary of the PhD thesis of Jan Pollmann: This thesis focuses on global-scale measurements of light, reactive non-methane hydrocarbons (NMHCs) in the volatility range from ethane to toluene, with a special focus on ethane, propane, isobutane, butane, isopentane and pentane. Even though they occur only at the ppt level (pmol mol-1) in the remote troposphere, these species can yield insight into key atmospheric processes. An analytical method was developed and subsequently evaluated to analyze NMHCs from the NOAA ESRL cooperative air sampling network. Potential analytical interferences from other atmospheric trace gases (water vapor and ozone) were carefully examined. The analytical parameters accuracy and precision were analyzed in detail, and it was proven that more than 90% of the data points meet the Global Atmosphere Watch (GAW) data quality objective. Trace gas measurements from 28 stations were used to derive global atmospheric distribution profiles for four NMHCs (ethane, propane, isobutane, butane). A close comparison of the derived ethane data with previously published reports showed that the northern hemispheric ethane background mixing ratio has declined by approximately 30% since 1990; no such change was observed for southern hemispheric ethane. The NMHC data and trace gas data supplied by NOAA ESRL were used to estimate local diurnally averaged hydroxyl radical (OH) mixing ratios by variability analysis. The variability-derived OH was found to be in good agreement with directly measured and modeled OH mixing ratios outside the tropics; tropical OH was on average two times higher than predicted by the model. Variability analysis was also used to assess the effect of chlorine radicals on atmospheric oxidation chemistry; it was found that Cl is probably not of significant relevance on a global scale.
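Variability analysis of this kind rests on the empirical power law sigma_lnX = A·tau^(-b): the standard deviation of a compound's log mixing ratio falls off with its atmospheric lifetime, and the fitted coefficients carry the oxidant information. A minimal sketch of the log-log least-squares fit on synthetic data (the lifetimes, A = 2.0 and b = 0.45 below are made-up illustration values):

```python
import math

def fit_power_law(lifetimes_h, sigmas):
    """Least-squares fit of ln(sigma) = ln(A) - b*ln(tau); returns (A, b)."""
    xs = [math.log(t) for t in lifetimes_h]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope

# Synthetic illustration: sigma = 2.0 * tau^-0.45, lifetimes in hours
taus = [6.0, 12.0, 48.0, 120.0, 480.0]
sigs = [2.0 * t ** -0.45 for t in taus]
A, b = fit_power_law(taus, sigs)  # recovers A ~ 2.0, b ~ 0.45
```

Since each compound's lifetime depends on the assumed OH (or Cl) concentration through its rate constant, the concentration that makes the fitted relationship most consistent across compounds serves as the variability-derived oxidant estimate.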