980 results for Field-measurements
Abstract:
Biomass burning represents one of the largest sources of particulate matter to the atmosphere, resulting in a significant perturbation of the Earth's radiative balance coupled with serious negative impacts on public health. Globally, biomass burning aerosols are thought to exert a small warming effect of 0.03 W m-2; however, the uncertainty is four times greater than the central estimate. On regional scales the impact is substantially greater, particularly in areas such as the Amazon Basin, where large, intense and frequent burning occurs annually for several months (usually from August to October). Furthermore, a growing number of people live within the Amazon region and are therefore subject to the deleterious health effects of exposure to substantial volumes of polluted air. Initial results from the South American Biomass Burning Analysis (SAMBBA) field experiment, which took place during September and October 2012 over Brazil, are presented here. A suite of instrumentation was flown on board the UK Facility for Airborne Atmospheric Measurement (FAAM) BAe-146 research aircraft and was supported by ground-based measurements, with extensive measurements made in Porto Velho, Rondonia. The aircraft sampled a range of conditions, including fresh biomass burning plumes, regional haze and elevated biomass burning layers within the free troposphere. The physical, chemical and optical properties of the aerosols across the region will be characterized in order to establish the impact of biomass burning on regional air quality, weather and climate.
Abstract:
A new methodology for wind field simulation or forecasting over complex terrain is introduced. The idea is to use wind measurements or predictions from the HARMONIE mesoscale model as the input data for an adaptive finite element mass-consistent wind model. The method has recently been implemented in the freely available Wind3D code. A description of the HARMONIE non-hydrostatic dynamics can be found in the literature. HARMONIE provides wind predictions with a maximum resolution of about 1 km, which is refined by the finite element model to the local scale (a few meters). An interface between both models is implemented such that the initial wind field approximation is obtained by a suitable interpolation of the HARMONIE results…
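As an illustration of the interface described above, a minimal sketch of how mesoscale wind components could be interpolated onto a fine local grid to form the initial guess for the mass-consistent model. Grid layout, names and numbers are illustrative assumptions, not the actual Wind3D interface:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def initial_wind_guess(meso_x, meso_y, meso_u, meso_v, fine_x, fine_y):
    """Interpolate mesoscale (HARMONIE-like) wind components onto a fine
    local grid; the interpolated field is only the *initial* guess that a
    mass-consistent finite element model would subsequently adjust."""
    interp_u = RegularGridInterpolator((meso_y, meso_x), meso_u)
    interp_v = RegularGridInterpolator((meso_y, meso_x), meso_v)
    YY, XX = np.meshgrid(fine_y, fine_x, indexing="ij")
    pts = np.column_stack([YY.ravel(), XX.ravel()])
    return interp_u(pts).reshape(YY.shape), interp_v(pts).reshape(YY.shape)

# ~1 km mesoscale grid refined to a ~10 m local grid (illustrative numbers)
rng = np.random.default_rng(0)
mx = my = np.arange(0.0, 10_000.0, 1_000.0)
u = rng.normal(5.0, 1.0, (my.size, mx.size))
v = rng.normal(1.0, 1.0, (my.size, mx.size))
fx = fy = np.arange(0.0, 9_000.0, 10.0)
u0, v0 = initial_wind_guess(mx, my, u, v, fx, fy)
```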
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of the results. In every area concerned with noise measurement many standards are available, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard on the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent, and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations of the results. The standard requires appropriate methods of analysis to be used for estimating measurement uncertainty. From this point of view, for a testing laboratory performing sound power measurement according to specific ISO standards and European Directives, the evaluation of measurement uncertainties is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that depend on the number of microphones on the surface, their spatial positions, and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that afflict this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
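For orientation, a minimal sketch of the basic ISO 3744-style computation the abstract refers to: the energetically averaged sound pressure level over the enveloping surface plus the surface-area term gives the sound power level (environmental and background-noise corrections omitted; numbers illustrative):

```python
import numpy as np

def sound_power_level(lp_db, surface_area_m2, s0_m2=1.0):
    """Sound power level from N microphone SPL readings over an enveloping
    measurement surface (basic ISO 3744-style relation; environmental and
    background-noise corrections omitted for brevity)."""
    lp_mean = 10 * np.log10(np.mean(10 ** (np.asarray(lp_db) / 10)))
    return lp_mean + 10 * np.log10(surface_area_m2 / s0_m2)

# ten microphones on a hemisphere of radius 2 m: S = 2*pi*r^2
lw = sound_power_level([72.1, 71.8, 73.0, 72.4, 71.5,
                        72.9, 72.2, 71.9, 72.6, 72.3], 2 * np.pi * 2.0**2)
print(f"Lw = {lw:.1f} dB re 1 pW")
```

Repeating such a computation over many simulated array configurations is one way to expose, statistically, the array-dependent systematic differences the thesis compares.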
Abstract:
Compared with other mature engineering disciplines, fracture mechanics of concrete is still a developing field, and it is very important for structures such as bridges that are subject to dynamic loading. A historical overview of work in the field is provided, and then the project is presented. The project applies the Digital Image Correlation (DIC) technique to the detection of cracks at the surface of concrete prisms (500 mm x 100 mm x 100 mm) subject to flexural loading (four-point bending test). The technique provides displacement measurements over the region of interest, and from this displacement field information about the crack mouth opening displacement (CMOD) is obtained and related to the applied load. The evolution of the fracture process is shown through graphs and graphical maps of the displacement at selected steps of the loading process. The study shows that the DIC system can detect the appearance and evolution of cracks even before they become visually detectable.
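A minimal sketch of the subset-matching idea behind DIC, assuming grayscale images as numpy arrays. Real DIC codes add subpixel interpolation and subset shape functions; this integer-pixel version only conveys the principle:

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation of two equally sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_subset(ref_img, def_img, y, x, half=15, search=10):
    """Integer-pixel displacement of the subset centred at (y, x): the
    reference subset is located in the deformed image by maximizing ZNCC
    over a small search window. CMOD then follows from displacement jumps
    across the crack."""
    ref = ref_img[y - half:y + half + 1, x - half:x + half + 1]
    best, best_dv = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = def_img[y + dy - half:y + dy + half + 1,
                           x + dx - half:x + dx + half + 1]
            c = zncc(ref, cand)
            if c > best:
                best, best_dv = c, (dy, dx)
    return best_dv

# synthetic check: a rigid 2-down, 3-right shift is recovered exactly
rng = np.random.default_rng(0)
ref = rng.normal(size=(200, 200))
deformed = np.roll(ref, shift=(2, 3), axis=(0, 1))
print(match_subset(ref, deformed, 100, 100))  # -> (2, 3)
```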
Abstract:
This thesis investigates phenomena of vortex dynamics in type II superconductors depending on the dimensionality of the flux-line system and the strength of the driving force. In the low-dissipative regime of Bi_2Sr_2CaCu_2O_{8+delta} (BSCCO) the influence of oxygen stoichiometry on the flux-line tension was examined. An entanglement crossover of the vortex system at low magnetic fields was identified and a comprehensive B-T phase diagram of solid and fluid phases derived. In YBa_2Cu_3O_7 (YBCO), extremely long (>100 mm) high-quality measurement bridges made it possible to extend the electric-field window in transport measurements by up to three orders of magnitude. Complementary analyses of the data conclusively produced dynamic exponents of the glass transition of z~9, considerably higher than theoretically predicted and previously reported. In highly dissipative measurements, a voltage instability appearing in the current-voltage characteristics of type II superconductors was observed for the first time in BSCCO and shown to result from a Larkin-Ovchinnikov flux-flow vortex instability under the influence of quasi-particle heating. In an analogous investigation of YBCO, however, the instability was found to appear only in the temperature and magnetic-field regime of the vortex-glass state. Rapid-pulse measurements fully confirmed this correlation of vortex glass and instability in YBCO and revealed a constant rise time (~µs).
Abstract:
The g-factor is a constant which connects the magnetic moment $\vec{\mu}$ of a charged particle, of charge q and mass m, with its angular momentum $\vec{J}$. Thus, the magnetic moment can be written $\vec{\mu}_J = g_J \frac{q}{2m} \vec{J}$. The g-factor for a free particle of spin s=1/2 should take the value g=2. But due to quantum-electrodynamical effects it deviates from this value by a small amount, the so-called g-factor anomaly $a_e$, which is of the order of $10^{-3}$ for the free electron. This deviation is even bigger if the electron is exposed to high electric fields. Therefore highly charged ions, where the electric field strength reaches values on the order of $10^{13}$-$10^{16}$ V/cm at the position of the bound electron, are an interesting field of investigation for testing QED calculations. In previous experiments [Häf00, Ver04] using a single hydrogen-like ion confined in a Penning trap, an accuracy of a few parts in $10^{-9}$ was obtained. In the present work a new method for the precise measurement of the electronic g-factor of hydrogen-like ions is discussed. Due to the unavoidable magnetic field inhomogeneity in a Penning trap, a very important contribution to the systematic uncertainty in the previous measurements arose from the elevated energy of the ion required for the measurement of its motional frequencies; it was then necessary to extrapolate the result to vanishing energies. In the new method the energy in the cyclotron degree of freedom is reduced to the minimum attainable energy. The method consists in measuring the reduced cyclotron frequency $\nu_+$ indirectly by coupling the axial to the reduced cyclotron motion by irradiation at the radio frequency $\nu_{coup} = \nu_+ - \nu_{ax} + \delta$, where $\delta$ is, in principle, an unknown detuning that can be obtained from knowledge of the coupling process. The only remaining unknown parameter is then the desired value of $\nu_+$. As a test, a measurement with (for simplicity) artificially increased axial energy was performed, yielding the result $g_{exp} = 2.000\,047\,020\,8\,(24)(44)$. This is in perfect agreement with both the theoretical result $g_{theo} = 2.000\,047\,020\,2\,(6)$ and the previous experimental result $g_{exp1} = 2.000\,047\,025\,4\,(15)(44)$. In the experimental results the second error bar is due to the uncertainty in the accepted value of the electron's mass. Thus, with the new method, a higher accuracy in the g-factor could, by comparison with the theoretical value, lead to an improved value of the electron's mass. [Häf00] H. Häffner et al., Phys. Rev. Lett. 85 (2000) 5308; [Ver04] J. Verdú et al., Phys. Rev. Lett. 92 (2004) 093002
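For reference, the standard Penning-trap relations behind such a measurement (textbook physics, not specific to this thesis): the g-factor follows from the ratio of the Larmor frequency to the free cyclotron frequency of the ion, and the sideband coupling described above yields the reduced cyclotron frequency:

```latex
\nu_L = \frac{g}{2}\,\frac{eB}{2\pi m_e}, \qquad
\nu_c = \frac{q_{\mathrm{ion}} B}{2\pi m_{\mathrm{ion}}}
\quad\Longrightarrow\quad
g = 2\,\frac{\nu_L}{\nu_c}\,\frac{q_{\mathrm{ion}}}{e}\,\frac{m_e}{m_{\mathrm{ion}}},
\qquad
\nu_+ = \nu_{\mathrm{coup}} + \nu_{\mathrm{ax}} - \delta
```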
Abstract:
The term "Brain Imaging" identifies a set of techniques to analyze the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are largely used in the study of brain activity. In addition to clinical usage, analysis of brain activity is gaining popularity in other recent fields, e.g. Brain Computer Interfaces (BCI) and the study of cognitive processes. In this context, the usage of classical solutions (e.g. fMRI, PET-CT) can be unfeasible, due to their low temporal resolution, high cost and limited portability. For these reasons alternative low-cost techniques are the object of research, typically based on simple recording hardware and on an intensive data elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG the potentials are directly generated by neuronal activity, while in EIT they result from the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body, obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a trade-off between physical accuracy and technical feasibility which currently severely limits the capabilities of these techniques. Moreover, elaboration of the recorded data requires computationally intensive regularization techniques, which affects applications with hard temporal constraints (such as BCI). This work focuses on the parallel implementation of a work-flow for EEG and EIT data processing. The resulting software is accelerated using multi-core GPUs, in order to provide solutions in reasonable times and address the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
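As a concrete illustration of the regularization step mentioned above, a minimal Tikhonov sketch (a generic scheme, not the thesis's actual work-flow; the numpy calls map almost one-to-one onto a GPU array library such as CuPy):

```python
import numpy as np

def tikhonov_solve(A, b, lam=1e-3):
    """Tikhonov-regularized least squares: argmin ||Ax - b||^2 + lam*||x||^2,
    solved via the normal equations. A is a lead-field/sensitivity matrix
    (n_electrodes x n_sources), b the measured electrode potentials."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# synthetic example: 64 scalp electrodes, 500 candidate sources
rng = np.random.default_rng(0)
A = rng.normal(size=(64, 500))
x_true = np.zeros(500)
x_true[42] = 1.0
b = A @ x_true + 0.01 * rng.normal(size=64)
x_hat = tikhonov_solve(A, b, lam=1.0)
```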
Abstract:
Sequence-specific biomolecular analysis methods have proven extremely useful, particularly with regard to the Human Genome Project, for the detection of single nucleotide polymorphisms (SNPs) and for the identification of genes. Owing to the large number of base pairs to be analyzed, sensitive and efficient screening methods are required that are capable of processing DNA samples in a suitable manner. Most detection schemes rely on the interaction of an anchored probe and the corresponding target with surfaces. The analysis of the kinetic behavior of the oligonucleotides at the sensor surface is therefore of utmost importance for the improvement of known detection schemes. Recently, surface plasmon field-enhanced fluorescence spectroscopy (SPFS) has been developed. It constitutes a kinetic analysis and detection method that operates with a dual recording of the interfacial phenomena, i.e. the change in reflectivity and the fluorescence signal. Using SPFS, kinetic measurements can be carried out at the sensor surface for the hybridization between peptide nucleic acid (PNA), a synthetic nucleic acid that mimics DNA and forms a more stable double helix, and DNA. By means of single, global and titration experiments with both a complementary matching sequence and a mismatch sequence, the rate constants for the binding reaction of the oligomer DNA target and of the PCR target to the PNA can be determined based on the Langmuir model. In addition, the influences of ionic strength and temperature on PNA/DNA hybridization were revealed in a kinetic analysis.
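A minimal sketch of the Langmuir analysis named above, with illustrative names and numbers: during association at target concentration c, the bound fraction follows θ(t) = θ_eq·(1 − exp(−(k_on·c + k_off)·t)), and the rate constants are recovered by least squares:

```python
import numpy as np
from scipy.optimize import curve_fit

C = 1e-7  # target concentration in M (illustrative)

def langmuir_association(t, k_on, k_off, s_max):
    """Bound-probe signal during the association phase under the Langmuir model."""
    k_obs = k_on * C + k_off
    return s_max * (k_on * C / k_obs) * (1 - np.exp(-k_obs * t))

t = np.linspace(0, 600, 120)  # time in s
rng = np.random.default_rng(0)
signal = langmuir_association(t, 1e5, 1e-3, 1.0) + 0.01 * rng.normal(size=t.size)

(k_on, k_off, s_max), _ = curve_fit(langmuir_association, t, signal,
                                    p0=(1e4, 1e-4, 1.0))
print(f"k_on ~ {k_on:.2e} 1/(M s), k_off ~ {k_off:.2e} 1/s, "
      f"K_A = {k_on / k_off:.2e} 1/M")
```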
Abstract:
The focus of this thesis was the in-situ application of the new analytical technique GCxGC in both the marine and continental boundary layer, as well as in the free troposphere. Biogenic and anthropogenic VOCs were analysed and used to characterise the local chemistry at the individual measurement sites. The first part of the thesis work was the characterisation of a new set of columns that was to be used later in the field. To simplify identification, a time-of-flight mass spectrometer (TOF-MS) detector was coupled to the GCxGC. In the field the TOF-MS was replaced by a more robust and easily handled flame ionisation detector (FID), which is more suitable for quantitative measurements. During this process, a variety of volatile organic compounds could be assigned to different environmental sources, e.g. plankton, eucalyptus forest or urban centres. In-situ measurements of biogenic and anthropogenic VOCs were conducted at the Meteorological Observatory Hohenpeissenberg (MOHP), Germany, applying a thermodesorption-GCxGC-FID system. The measured VOCs were compared to GC-MS measurements routinely conducted at the MOHP as well as to PTR-MS measurements. Furthermore, a compressed ambient air standard was measured with three different gas chromatographic instruments and the results were compared. With few exceptions, the in-situ as well as the standard measurements revealed good agreement between the individual instruments. Diurnal cycles were observed, with differing patterns for the biogenic and the anthropogenic compounds. The variability-lifetime relationship of compounds with atmospheric lifetimes from a few hours to a few days in the presence of O3 and OH was examined. It revealed a weak but significant influence of chemistry on these short-lived VOCs at the site. The relationship was also used to estimate the average OH radical concentration during the campaign, which was compared to in-situ OH measurements (1.7 x 10^6 molecules/cm^3, 0.071 ppt) for the first time. The OH concentration ranging from 3.5 to 6.5 x 10^5 molecules/cm^3 (0.015 to 0.027 ppt) obtained with this method represents an approximation of the average OH concentration influencing the discussed VOCs from emission to measurement. Based on these findings, the average concentration of nighttime NO3 radicals was estimated using the same approach and found to range from 2.2 to 5.0 x 10^8 molecules/cm^3 (9.2 to 21.0 ppt). During the MINATROC field campaign, in-situ ambient air measurements with the GCxGC-FID were conducted at Tenerife, Spain. Although the station is mainly situated in the free troposphere, local influences of anthropogenic and biogenic VOCs were observed. Due to a strong dust event originating from Western Africa, it was possible to compare the mixing ratios during normal and elevated dust loading in the atmosphere. The mixing ratios during the dust event were found to be lower; however, this could not be attributed to heterogeneous reactions, as there was a change in the wind direction from northwesterly to southeasterly during the dust event.
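A sketch of the variability-lifetime approach mentioned above, following the common parameterization s_lnX = A·τ^(−b): the coefficients are first calibrated on compounds with independently known lifetimes, then inverted for a compound whose loss is OH-dominated. The calibration lifetimes, variabilities and rate constant below are placeholders, not campaign data:

```python
import numpy as np

# Step 1: calibrate s_lnX = A * tau^(-b) on compounds with known lifetimes
tau_known = np.array([2e4, 8e4, 3e5, 1e6])      # lifetimes in s (placeholders)
s_known = np.array([1.10, 0.78, 0.55, 0.40])    # std dev of ln(mixing ratio)
slope, intercept = np.polyfit(np.log(tau_known), np.log(s_known), 1)
b, A = -slope, np.exp(intercept)                # ln s = ln A - b * ln tau

# Step 2: invert for an OH-dominated compound: tau = (s/A)^(-1/b) = 1/(k_OH*[OH])
k_oh, s_obs = 5.6e-11, 0.9                      # cm^3 s^-1, observed variability
tau = (s_obs / A) ** (-1.0 / b)
print(f"[OH] ~ {1.0 / (k_oh * tau):.2e} molecules/cm^3")
```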
Abstract:
A way to investigate turbulence is through experiments where hot-wire measurements are performed. The aim of this thesis work is the analysis of the influence of a temperature gradient on hot-wire measurements in turbulence. To the author's knowledge, this investigation is the first attempt to document, understand and ultimately correct the effect of temperature gradients on turbulence statistics. A numerical approach is used, since instantaneous temperature and streamwise velocity fields are required to evaluate this effect. A channel flow simulation at Re_tau = 180 is analyzed to make a first evaluation of the amount of error introduced by the temperature gradient inside the domain. The hot-wire data field is obtained by processing the numerical flow field through a suitable version of King's law, which connects voltage, velocity and temperature. A drift in the mean streamwise velocity profile and rms is observed when the temperature correction is performed by means of the centerline temperature. A correct mean velocity profile is achieved by correcting temperature through its mean value at each wall-normal position, but a non-negligible error is still present in the rms. The key point for properly correcting the velocity sensed by the hot wire is knowledge of the instantaneous temperature field. For this purpose three correction methods are proposed. Finally, a numerical simulation at Re_tau = 590 is also evaluated in order to confirm the results discussed earlier.
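A minimal sketch of the temperature-corrected King's-law inversion discussed above, using the common overheat form E² = (T_w − T)(A + B·U^n); calibration constants are illustrative, not from the thesis:

```python
import numpy as np

A_CAL, B_CAL, N_EXP, T_WIRE = 1.2, 0.9, 0.45, 500.0  # illustrative calibration

def king_voltage(u, t_fluid):
    """Hot-wire bridge voltage from King's law with an overheat (T_w - T) factor."""
    return np.sqrt((T_WIRE - t_fluid) * (A_CAL + B_CAL * u**N_EXP))

def sensed_velocity(e, t_assumed):
    """Invert King's law for velocity, given the temperature assumed for the
    correction; using the wrong t_assumed is exactly the error studied above."""
    return ((e**2 / (T_WIRE - t_assumed) - A_CAL) / B_CAL) ** (1.0 / N_EXP)

e = king_voltage(10.0, 300.0)        # 'true' sample taken at T = 300 K
u_wrong = sensed_velocity(e, 295.0)  # corrected with centerline T instead
print(f"10.0 m/s sensed as {u_wrong:.2f} m/s")
```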
Abstract:
This PhD thesis is embedded in the DFG research project SAMUM, the Saharan Mineral Dust Experiment, which was initiated with the goal of investigating the optical and microphysical properties of Saharan dust aerosol, its transport, and its radiative effect. This work describes the deployment of the Spectral Modular Airborne Radiation Measurement System (SMART-Albedometer) in SAMUM after its spectral range had been extended. The SAMUM field campaign was conducted in May and June 2006 in south-eastern Morocco. At two ground stations and aboard two aircraft, various measurements in an almost pure plume of Saharan dust were conducted. Airborne measurements of the spectral upwelling and downwelling irradiance are used to derive the spectral surface albedo in its typical range in the experiment region. Typical spectral types are presented and compared to the surface albedo derived from MISR satellite data. Furthermore, the radiative forcing of the observed Saharan dust is estimated as a function of the surface albedo and its regional variations. A strong dependence of the radiative forcing not only on the surface albedo but also on the optical properties of the dust aerosol is observed. It is unique to SAMUM that all these influential parameters were measured in near-source Saharan dust, which made the calculations shown in this work possible.
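For reference, the standard definitions the albedo retrieval and the forcing estimate rest on (generic radiative-transfer conventions, not instrument-specific): the spectral surface albedo is the ratio of upwelling to downwelling irradiance, and the aerosol radiative forcing is the change in net irradiance with and without the dust layer:

```latex
\alpha(\lambda) = \frac{F^{\uparrow}(\lambda)}{F^{\downarrow}(\lambda)},
\qquad
\Delta F = \left(F^{\downarrow} - F^{\uparrow}\right)_{\mathrm{dust}}
         - \left(F^{\downarrow} - F^{\uparrow}\right)_{\mathrm{clean}}
```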
Abstract:
The use of Penning traps in mass spectrometry has led to a unique leap in accuracy. As a result, the mass values of a wide variety of atoms have become important input parameters for an increasing number of physics questions. Penning-trap mass spectrometry is based on the determination of the free cyclotron frequency of an ion in a homogeneous magnetic field, $\nu_c = qB/(2\pi m)$. It is measured with the time-of-flight method (TOF-ICR), reaching a relative mass uncertainty δm/m of a few 10^-9 for nuclides with lifetimes of <500 ms. This became possible through the Ramsey method, applied for the first time in Penning-trap mass spectrometry within this work: temporally separated oscillating fields are used for the resonant ion excitation in order to improve the frequency determination by the time-of-flight method. With this technique, the masses of the nuclides 26,27Al and 38,39Ca were determined at the Penning-trap mass spectrometer ISOLTRAP at ISOLDE/CERN. All masses were embedded in the Atomic Mass Evaluation. The mass values of 26Al and 38Ca served in particular as tests of the Standard Model. In order to use mass values to test fundamental symmetries and quantum electrodynamics (QED) in extreme fields, a new Penning-trap project (PENTATRAP) for high-precision mass measurements on highly charged ions was conceived. This thesis mainly covers the development of the Penning traps. A novelty in Penning-trap experiments is the permanent monitoring of the magnetic field B and its temporal fluctuations by so-called "monitor traps".
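For context, the standard TOF-ICR evaluation behind such measurements (a textbook relation, neglecting electron binding energies; not the specific ISOLTRAP analysis): for singly charged ions the atomic mass follows from the ratio of the reference and ion cyclotron frequencies:

```latex
\nu_c = \frac{qB}{2\pi m}, \qquad
r = \frac{\nu_{c,\mathrm{ref}}}{\nu_c}, \qquad
m_{\mathrm{atom}} = r\,\bigl(m_{\mathrm{ref,atom}} - m_e\bigr) + m_e
```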
Abstract:
The Penning-trap mass spectrometer SHIPTRAP was built to perform high-precision mass measurements on heavy radionuclides that are produced in fusion reactions and separated from the primary beam by the velocity filter SHIP. It consists of a gas cell for stopping the high-energy reaction products, an RFQ cooler and buncher for cooling and accumulating the ions, and a double Penning-trap system for the mass measurements. The mass is determined by measuring the cyclotron frequency of the corresponding ion in a strong homogeneous magnetic field. This frequency is compared with the frequency of a well-known reference ion. With this method, relative uncertainties on the order of 10^-8 can be reached. Recently, the masses of the nobelium isotopes 252-254No (Z=102) and of the lawrencium isotope 255Lr (Z=103) were successfully measured for the first time. These were the first direct mass measurements on transuranium nuclides. The production rate of these atoms was about one per second or less. The results of the mass measurements on nobelium confirm the earlier mass values, which had been derived from Q_alpha measurements. In the case of 255Lr, the mass excess, which until then had only been estimated from systematic trends, was determined directly for the first time. These results are a first step towards the exploration of the transuranium region planned at SHIPTRAP. The main goal is the determination of the endpoints of the alpha-decay chains originating from superheavy elements in the vicinity of the predicted island of stability.
Abstract:
Nuclear masses are an important quantity for studying nuclear structure since they reflect the sum of all nucleonic interactions. Many experimental possibilities exist to measure masses precisely, among which the Penning trap is the tool that reaches the highest precision. Moreover, absolute mass measurements can be performed using carbon, the atomic-mass standard, as a reference. The new double-Penning-trap mass spectrometer TRIGA-TRAP has been installed and commissioned within this thesis work; it is the very first experimental setup of this kind located at a nuclear reactor. New technical developments have been carried out, such as a reliable non-resonant laser ablation ion source for the production of carbon cluster ions, and further ones are ongoing, such as a non-destructive ion detection technique for single-ion measurements. Neutron-rich fission products, which are important for nuclear astrophysics and especially the r-process, will be made available by the reactor. Prior to the on-line coupling to the reactor, TRIGA-TRAP has already performed off-line mass measurements on stable and long-lived isotopes and will continue this program. The main focus within this thesis was on certain rare-earth nuclides in the well-established region of deformation around N~90. Another field of interest is mass measurements on actinoids to test mass models and to provide direct links to the mass standard. Within this thesis, the mass of 241Am was measured directly for the first time.
Abstract:
Nitrogen is an essential nutrient. For humans, animals and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78 %), only a few microorganisms can use it directly. To be useful for higher plants and animals, elemental nitrogen must be converted to a reactive oxidized form. This conversion happens within the nitrogen cycle through free-living microorganisms, symbiotic Rhizobium bacteria, or lightning. Since the beginning of the 20th century, humans have been able to synthesize reactive nitrogen through the Haber-Bosch process. As a result, the food security of the world population could be improved noticeably. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity. Negative health effects for humans arose as well, such as fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycles of pollutants and nutritive substances.

Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidative processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other sources, in combustion processes by oxidation of atmospheric nitrogen, as well as by biological processes within soils. In the atmosphere NO is converted very quickly into NO2, which is then oxidized to nitrate (NO3-) and nitric acid (HNO3), which binds to aerosol particles. The bound nitrate is finally removed from the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or decomposing tropospheric ozone (O3). In the atmosphere, NO, NO2 and O3 are in photostationary equilibrium, which is why they are referred to as the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form additional NO2, altering the equilibrium of ozone formation.

The essential nutrient nitrogen is taken up by plants mainly as dissolved NO3- entering the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, via nitrogen fixation or ammonium formation and nitrification. Additionally, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast, NO2 disproportionates to nitrate and nitrite (NO2-), which can enter the plant's metabolic processes. The enzymes nitrate and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture and leaf resistances. Plant stomatal regulation is affected by climate factors such as light intensity, temperature and water vapor pressure deficit.

This thesis aims to contribute to the understanding of the role of vegetation in the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, the NO2 exchange between the atmosphere and spruce (Picea abies) was measured at the leaf level with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). Additionally, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measuring system used allowed the simultaneous determination of NO, NO2, O3, CO2 and H2O exchange rates.
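The photostationary state of the NO-NO2-O3 triad mentioned above is commonly summarized by the Leighton relationship (standard atmospheric chemistry, added here for reference):

```latex
\mathrm{NO_2} + h\nu \xrightarrow{\,j_{\mathrm{NO_2}}\,} \mathrm{NO} + \mathrm{O(^3P)},
\qquad
\mathrm{NO} + \mathrm{O_3} \xrightarrow{\,k\,} \mathrm{NO_2} + \mathrm{O_2}
\quad\Longrightarrow\quad
\frac{[\mathrm{NO_2}]}{[\mathrm{NO}]} \approx \frac{k\,[\mathrm{O_3}]}{j_{\mathrm{NO_2}}}
```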
Calculations of the NO, NO2 and O3 fluxes are based on the generally small differences (∆mi) measured between the inlet and the outlet of the chamber. Consequently, high accuracy and specificity of the analyzer are necessary. To meet these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for enduring measurement precision.

Data analysis yielded a significant mcomp,NO2 only if statistical significance of ∆mi was detected. Consequently, the significance of ∆mi was used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad in the volume of the dynamic plant chamber must be considered in the determination of the NO, NO2 and O3 exchange rates; otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 for spruce could be determined under laboratory conditions, but under field conditions mcomp,NO2 was identified between 0.17 and 0.65 ppb and vdep,NO2 between 0.07 and 0.42 mm s-1. Analyzing the field data for oak, no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing indication that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only when assuming high soil NO emissions can more NO2 be formed by reaction with O3 than the plants are able to take up. Under these circumstances forests can be a source of NO2.
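A sketch of the chamber flux evaluation described above, with illustrative numbers and the photochemical corrections omitted: the exchange flux follows from the inlet-outlet difference, and vdep,NO2 and mcomp,NO2 can be read off as the negated slope and the x-intercept of a linear fit of flux versus ambient concentration:

```python
import numpy as np

def chamber_flux(c_in, c_out, flow_m3_s, leaf_area_m2):
    """Exchange flux from the inlet-outlet concentration difference of a
    dynamic chamber (positive = emission, negative = uptake)."""
    return flow_m3_s * (np.asarray(c_out) - np.asarray(c_in)) / leaf_area_m2

# fit F = -v_dep * (m - m_comp): slope gives v_dep, x-intercept gives m_comp
m = np.array([0.2, 0.5, 1.0, 2.0, 4.0])        # ambient NO2, ppb (illustrative)
F = np.array([0.1, -0.05, -0.3, -0.75, -1.7])  # flux, arbitrary units
slope, intercept = np.polyfit(m, F, 1)
v_dep, m_comp = -slope, intercept / -slope
print(f"v_dep ~ {v_dep:.2f}, m_comp ~ {m_comp:.2f} ppb")
```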