975 results for Heat Flux Measurement
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Changes in oceanic heat storage (HS) can reveal important evidence of climate variability related to ocean heat fluxes. Specifically, long-term variations in HS are a powerful indicator of climate change, as HS represents the balance between the net surface energy flux and the poleward heat transported by the ocean currents. HS is estimated from the sea surface height anomaly measured by the TOPEX/Poseidon and Jason-1 altimeters from 1993 to 2006. To characterize and validate the altimeter-based HS in the Atlantic, we used data from the Pilot Research Moored Array in the Tropical Atlantic (PIRATA). Correlations and rms differences are used as statistical figures of merit to compare the HS estimates. The correlations range from 0.50 to 0.87 for the buoys located at the equator and in the southern part of the array; in that region the rms differences range between 0.40 and 0.51 × 10⁹ J m⁻². These results are encouraging and indicate that the altimeter has the precision necessary to capture the interannual trends in HS in the Atlantic. Albeit relatively small, salinity changes can also affect the sea surface height anomaly. To account for this effect, NCEP/GODAS reanalysis data are used to estimate the haline contraction. To understand which dynamical processes are involved in the HS variability, the total signal is decomposed into a nonpropagating basin-scale and seasonal component (HS_l), planetary waves, mesoscale eddies, and a small-scale residual. In general, HS_l is the dominant signal in the tropical region. Results show a warming trend of HS_l over the past 13 years almost everywhere in the Atlantic basin, with the most prominent slopes found at high latitudes. Positive interannual trends are found in the halosteric component at high latitudes of the South Atlantic and near the Labrador Sea. This could be an indication that the salinity anomaly increased in the upper layers during this period. The dynamics of the South Atlantic subtropical gyre could also be subject to low-frequency changes caused by a trend in the halosteric component on each side of the South Atlantic Current.
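The comparison statistics quoted above (correlations and rms differences between altimeter-based and buoy-based HS) are simple to reproduce for any pair of collocated time series. The following is a minimal sketch, not the authors' code; the synthetic series and variable names are illustrative only.

```python
# Hedged sketch: compare an altimeter-derived HS series against a buoy-derived
# one using the two figures of merit named in the abstract (correlation and
# rms difference). Synthetic data; not the study's processing chain.
import numpy as np

def compare_hs(hs_altimeter, hs_buoy):
    """Return (correlation, rms difference) for two HS series in J m^-2."""
    mask = ~np.isnan(hs_altimeter) & ~np.isnan(hs_buoy)  # common valid samples
    a, b = hs_altimeter[mask], hs_buoy[mask]
    corr = np.corrcoef(a, b)[0, 1]
    rms = np.sqrt(np.mean((a - b) ** 2))
    return corr, rms

# Illustrative monthly series for 1993-2006 (~168 months)
rng = np.random.default_rng(0)
hs_buoy = 1.0e9 * np.sin(np.linspace(0.0, 28.0 * np.pi, 168))  # seasonal-like signal
hs_alt = hs_buoy + rng.normal(0.0, 0.45e9, 168)                # altimeter estimate + noise
print(compare_hs(hs_alt, hs_buoy))
```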
Abstract:
Phosphorus is an essential element for plants and animals, playing a fundamental role in the production of biochemical energy. Despite its relevance, phosphorus is not commonly determined by instrumental neutron activation analysis (INAA), because ³²P does not emit gamma rays in its decay. There are alternative methods for the determination of phosphorus by INAA, such as beta counting or the measurement of the bremsstrahlung originating from the high-energy beta particles of ³²P. Here, the determination of phosphorus in plant materials by measuring bremsstrahlung production was further investigated in order to optimize an analytical protocol that minimizes interferences and overcomes the poor specificity. Eight certified reference materials of plant matrices, with phosphorus contents ranging between 171 and 5,180 mg kg⁻¹, were irradiated at a thermal neutron flux of 9.5 × 10¹² cm⁻² s⁻¹ and measured with an HPGe detector at decay times varying from 7 to 60 days. Phosphorus solutions added to a certified reference material at three levels were used for calibration. Counts accumulated in the baseline at four different regions of the gamma-ray spectra were tested for the determination of phosphorus, with the best results obtained for the 100 keV region. The Compton scattering contribution in the selected range was subtracted using an experimental peak-to-Compton factor and the net areas of all peaks in the spectra with energies higher than 218 keV, i.e. with a Compton edge above 100 keV. Amongst the interferences investigated, the production of ³²P from sulfur and the contribution of Compton scattering must be considered to obtain good results.
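One plausible reading of the Compton-correction step described above is a simple subtraction of an estimated continuum from the 100 keV baseline counts; the sketch below illustrates that idea only, with invented numbers and an assumed proportionality between peak areas and their Compton continuum via the experimental peak-to-Compton factor.

```python
# Hedged illustration (not the study's actual algorithm): correct the counts
# accumulated in the ~100 keV baseline region by subtracting the Compton
# continuum attributed to full-energy peaks above 218 keV, estimated with an
# experimental peak-to-Compton factor. All numbers are invented.

def corrected_baseline_counts(baseline_counts, peak_areas_above_218keV,
                              peak_to_compton_factor):
    """Baseline counts minus the estimated Compton contribution."""
    compton_contribution = sum(peak_areas_above_218keV) / peak_to_compton_factor
    return baseline_counts - compton_contribution

print(corrected_baseline_counts(baseline_counts=125_000,
                                peak_areas_above_218keV=[40_000, 15_000, 8_000],
                                peak_to_compton_factor=55.0))
```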
Abstract:
We report a measurement of the proton-air cross section for particle production at a center-of-mass energy per nucleon of 57 TeV. This is derived from the distribution of the depths of shower maxima observed with the Pierre Auger Observatory; systematic uncertainties are studied in detail. Analyzing the tail of the distribution of the shower maxima, a proton-air cross section of [505 ± 22 (stat) +28/−36 (syst)] mb is found.
Abstract:
Volatile organic compounds (VOCs) play a critical role in ozone formation and, together with OH radicals, drive the chemistry of the atmosphere. The simplest VOC, methane, is a climatologically important greenhouse gas and plays a key role in regulating water vapour in the stratosphere and hydroxyl radicals in the troposphere. The OH radical is the most important atmospheric oxidant, and knowledge of the atmospheric OH sink, together with the OH source and ambient OH concentrations, is essential for understanding the oxidative capacity of the atmosphere. Oceanic emission and/or uptake of methanol, acetone, acetaldehyde, isoprene and dimethyl sulphide (DMS) was characterized as a function of photosynthetically active radiation (PAR) and a suite of biological parameters in a mesocosm experiment conducted in a Norwegian fjord. High-frequency (ca. 1 min⁻¹) methane measurements were performed using a gas chromatograph with flame ionization detector (GC-FID) in the boreal forests of Finland and the tropical forests of Suriname. A new on-line method (Comparative Reactivity Method, CRM) was developed to directly measure the total OH reactivity (sink) of ambient air. It was observed that under conditions of high biological activity and a PAR of ~450 μmol photons m⁻² s⁻¹, the ocean acted as a net source of acetone. However, if either of these criteria was not fulfilled, the ocean acted as a net sink of acetone. This new insight into the biogeochemical cycling of acetone at the ocean-air interface has helped to resolve discrepancies between earlier works such as Jacob et al. (2002), who reported the ocean to be a net acetone source (27 Tg yr⁻¹), and Marandino et al. (2005), who reported the ocean to be a net sink of acetone (−48 Tg yr⁻¹). The ocean acted as a net source of isoprene, DMS and acetaldehyde but a net sink of methanol. Based on these findings, it is recommended that compound-specific PAR and biological dependencies be used for estimating the influence of the global ocean on atmospheric VOC budgets. Methane was observed to accumulate within the nocturnal boundary layer, clearly indicating emissions from the forest ecosystems. There was a remarkable similarity in the time series of the boreal and tropical forest ecosystems. The averages of the median mixing ratios during a typical diel cycle were 1.83 μmol mol⁻¹ and 1.74 μmol mol⁻¹ for the boreal and tropical forest ecosystems, respectively. A flux value of (3.62 ± 0.87) × 10¹¹ molecules cm⁻² s⁻¹ (or 45.5 ± 11 Tg CH₄ yr⁻¹ for the global boreal forest area) was derived, which highlights the importance of the boreal forest ecosystem for the global budget of methane (~600 Tg yr⁻¹). The newly developed CRM technique has a dynamic range of ~4 s⁻¹ to 300 s⁻¹ and an accuracy of ±25%. The system has been tested and calibrated with several single and mixed hydrocarbon standards, showing excellent linearity and agreement with the known reactivity of the standards. Field tests at an urban and a forest site illustrate the promise of the new method. The results from this study have improved the current understanding of VOC emissions and uptake by ocean and forest ecosystems. Moreover, a new technique for directly measuring the total OH reactivity of ambient air has been developed and validated, which will be a valuable addition to the existing suite of atmospheric measurement techniques.
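The scaling from the quoted area flux to a global boreal-forest CH₄ source can be checked with simple unit arithmetic; the sketch below assumes a global boreal forest area of roughly 1.5 × 10⁷ km², which is an illustrative assumption and not a figure given in the abstract.

```python
# Hedged arithmetic check: convert an area flux (molecules cm^-2 s^-1) into a
# global annual CH4 source (Tg yr^-1). The boreal forest area is an assumed
# round number, not a value taken from the abstract.
AVOGADRO = 6.022e23              # molecules mol^-1
M_CH4 = 16.04                    # g mol^-1
SECONDS_PER_YEAR = 3.156e7
BOREAL_AREA_CM2 = 1.5e7 * 1e10   # assumed ~1.5e7 km^2, converted to cm^2

flux = 3.62e11                   # molecules cm^-2 s^-1 (value quoted above)
grams_per_year = flux / AVOGADRO * M_CH4 * SECONDS_PER_YEAR * BOREAL_AREA_CM2
print(grams_per_year / 1e12)     # ~45 Tg yr^-1, close to the quoted 45.5
```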
Abstract:
Quality control of medical radiological systems is of fundamental importance and requires efficient methods to accurately determine the X-ray source spectrum. Direct measurements of X-ray spectra under standard operating conditions require limiting the high photon flux, and therefore the measurement has to be performed in a laboratory. However, optimal quality control requires frequent in situ measurements, which can only be performed with a portable system. To reduce the photon flux by three orders of magnitude, an indirect technique based on the scattering of the X-ray source beam by a solid target is used. The measured spectrum lacks information because of transport and detection effects. The solution is then unfolded by solving the matrix equation that formally represents the scattering problem. However, the algebraic system is ill-conditioned and, therefore, it is not possible to obtain a satisfactory solution directly. Special strategies are necessary to circumvent the ill-conditioning. Numerous attempts have been made to solve this problem using purely mathematical methods. In this thesis, a more physical point of view is adopted. The proposed method uses both the forward and the adjoint solutions of the Boltzmann transport equation to generate a better-conditioned linear algebraic system. The procedure was tested first on numerical experiments, giving excellent results. Then, the method was verified with experimental measurements performed at the Operational Unit of Health Physics of the University of Bologna. The reconstructed spectra were compared with those obtained by direct measurements, showing very good agreement.
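For context, a standard purely mathematical workaround for such ill-conditioned unfolding problems is regularization; the minimal Tikhonov-style sketch below illustrates that generic approach and is not the physics-based forward/adjoint method developed in the thesis.

```python
# Generic Tikhonov-regularized unfolding of an ill-conditioned system R s = m
# (R: response matrix, s: source spectrum, m: measured spectrum). Shown only
# as the kind of purely mathematical strategy the abstract contrasts with its
# forward/adjoint Boltzmann approach.
import numpy as np

def tikhonov_unfold(R, m, lam):
    """Solve min_s ||R s - m||^2 + lam^2 ||s||^2 via the normal equations."""
    n = R.shape[1]
    return np.linalg.solve(R.T @ R + lam ** 2 * np.eye(n), R.T @ m)

# Illustrative smooth (hence ill-conditioned) response and a noisy measurement
n = 50
idx = np.arange(n)
R = np.tril(np.exp(-0.2 * np.abs(np.subtract.outer(idx, idx))))
s_true = np.exp(-0.5 * ((idx - 25) / 6.0) ** 2)
m = R @ s_true + np.random.default_rng(1).normal(0.0, 1e-3, n)
s_est = tikhonov_unfold(R, m, lam=1e-2)
```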
Abstract:
Nowadays, environmental issues and climate change play fundamental roles in the design of urban spaces. Our cities are growing in size, often only following immediate needs without a long-term vision. Consequently, sustainable development has become not only an ethical but also a strategic need: we can no longer afford uncontrolled urban expansion. One serious effect of the territory industrialisation process is the increase in urban air and surface temperatures compared to the outlying rural surroundings. This difference in temperature is what constitutes an urban heat island (UHI). The purpose of this study is to clarify the role of urban surfacing materials in the thermal dynamics of an urban space, resulting in useful indications and advice for mitigating the UHI. With this aim, four coloured concrete bricks were tested, measuring their emissivity and building up their heat release curves using infrared thermography. Two emissivity evaluation procedures were carried out and subsequently compared. The samples' performances were assessed, and the influence of colour on thermal behaviour was investigated. In addition, some external pavements were analysed. Albedo and emissivity were evaluated in order to understand their thermal behaviour under different conditions. Surface temperatures were recorded in a one-day measurement campaign. The ENVI-met software was used to simulate how the tested materials would behave in two typical urban scenarios: an urban canyon and an urban heat basin. The improvements they can bring to the urban microclimate were investigated. The emissivities obtained for the bricks ranged between 0.92 and 0.97, suggesting a limited influence of colour on this parameter. Nonetheless, the white concrete brick showed the best thermal performance and the black one the worst, while the red and yellow ones showed nearly identical intermediate trends. In effect, colour affected the overall thermal behaviour. Emissivity was also measured in the outdoor work, yielding (as expected) high values for the asphalts. Albedo measurements, conducted with a sunshine pyranometer, demonstrated the improvement in solar reflection provided by the yellow paint, as well as the adverse effect of haze on measurement accuracy. The ENVI-met simulations demonstrated the effectiveness of some of the tested materials in improving the thermal environment. In particular, results showed good performances for the white bricks and granite in the heat basin scenario, and for painted concrete and macadam in the urban canyon scenario. These materials can be considered valuable solutions for UHI mitigation.
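As a rough quantitative illustration of why the measured emissivity range matters for heat release, the grey-body radiative flux can be estimated with the Stefan-Boltzmann law; the surface temperature used below is illustrative and not a value measured in the study.

```python
# Hedged illustration: radiative heat flux emitted by a grey pavement surface,
# q = eps * sigma * T^4. The 320 K (~47 degC) surface temperature is invented.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_flux(emissivity, temperature_K):
    """Grey-body radiative flux in W m^-2."""
    return emissivity * SIGMA * temperature_K ** 4

for eps in (0.92, 0.97):  # emissivity range measured for the concrete bricks
    print(eps, round(radiated_flux(eps, 320.0), 1))
```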
Abstract:
Groundwater represents one of the most important resources of the world, and it is essential to prevent its pollution and to consider remediation interventions in case of contamination. According to the scientific community, the characterization and management of contaminated sites should be performed in terms of contaminant fluxes, considering their spatial and temporal evolution. One of the most suitable approaches to determine the spatial distribution of pollutants and to quantify contaminant fluxes in groundwater is the use of control panels. The determination of contaminant mass flux requires measurement of the contaminant concentration in the moving phase (water) and of the velocity/flux of the groundwater. In this Master's thesis, a new solute mass flux measurement approach is proposed, based on an integrated control-panel-type methodology combined with the Finite Volume Point Dilution Method (FVPDM) for the monitoring of transient groundwater fluxes. Moreover, a new adsorption passive sampler, which captures the variation of solute concentration with time, is designed. The present work contributes to the development of this approach on three key points. First, the ability of the FVPDM to monitor transient groundwater fluxes was verified during a step-drawdown test at the experimental site of Hermalle Sous Argentau (Belgium). The results showed that this method can be used, with optimal results, to follow transient groundwater fluxes. Moreover, performing the FVPDM in several piezometers during a pumping test made it possible to determine the different flow rates and flow regimes that can occur in the various parts of an aquifer. The second field test, which aimed to determine the representativeness of a control panel for measuring mass flux in groundwater, underlined that incorrect evaluations of Darcy fluxes and discharge surfaces can lead to an incorrect estimation of mass fluxes, and that this technique must therefore be used with caution. Thus, a detailed geological and hydrogeological characterization must be conducted before applying this technique. Finally, the third outcome of this work concerns laboratory experiments. The tests conducted on several types of adsorption material (Oasis HLB cartridge, TDS-ORGANOSORB 10 and TDS-ORGANOSORB 10-AA), in order to determine the optimum medium for dimensioning the passive sampler, highlighted the need to find a material with a reversible adsorption tendency to fully satisfy the requirements of the new passive sampling technique.
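In its simplest form, the control-panel calculation referred to above sums, over the panel cells, the product of measured concentration, Darcy flux and cell area; the sketch below is a generic illustration with invented numbers, not data from the field tests.

```python
# Hedged sketch of a control-panel mass-discharge estimate:
# M_d = sum_i C_i * q_i * A_i over the cells of a plane transverse to flow.
# Concentrations, Darcy fluxes and areas are invented for illustration.

def mass_discharge(concentrations, darcy_fluxes, areas):
    """Mass discharge in g/day from per-cell C (g/m^3), q (m/day), A (m^2)."""
    return sum(c * q * a for c, q, a in zip(concentrations, darcy_fluxes, areas))

print(mass_discharge(concentrations=[0.8, 2.5, 1.1],   # g m^-3
                     darcy_fluxes=[0.05, 0.12, 0.07],  # m day^-1
                     areas=[4.0, 4.0, 4.0]))           # m^2 per cell
```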
Abstract:
To evaluate whether it is feasible to measure the segmental flux of small bowel content using MR phase-contrast (PC) pulse sequences.
Abstract:
Brain functions such as learning, orchestrating locomotion, memory recall, and processing information all require glucose as a source of energy. During these functions, the glucose concentration decreases as glucose is consumed by brain cells. By measuring this drop in concentration, it is possible to determine which parts of the brain are used during specific functions and, consequently, how much energy the brain requires to complete the function. One way to measure in vivo brain glucose levels is with a microdialysis probe. The drawback of this analytical procedure, as with many steady-state fluid flow systems, is that the probe fluid will not reach equilibrium with the brain fluid. Therefore, the brain concentration is inferred by taking samples at multiple inlet glucose concentrations and finding a point of convergence. The goal of this thesis is to create a three-dimensional, time-dependent, finite element representation of the brain-probe system in COMSOL 4.2 that describes the diffusion and convection of glucose. Once validated with experimental results, this model can then be used to test parameters that experiments cannot access. When simulations were run using published values for physical constants (i.e. diffusivities, density and viscosity), the resulting model glucose concentrations were within the error of the experimental data. This verifies that the model is an accurate representation of the physical system. In addition to accurately describing the experimental brain-probe system, the model I created is able to show the validity of zero-net-flux for a given experiment. A useful discovery is that the slope of the zero-net-flux line depends on the perfusate flow rate and the diffusion coefficients, but is independent of the brain glucose concentration. The model was simplified with the realization that the perfusate is in thermal equilibrium with the brain throughout the active region of the probe. This allowed for the assumption that all model parameters are temperature independent. The time to steady state for the probe is approximately one minute. However, the signal degrades in the exit tubing due to Taylor dispersion, on the order of two minutes for two meters of tubing. Given an analytical instrument requiring a five μL aliquot, the smallest brain process measurable for this system is 13 minutes.
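The zero-net-flux procedure mentioned above fits a line to the glucose gained or lost across the probe as a function of the inlet (perfusate) concentration; the concentration at which the fitted net flux vanishes is taken as the brain extracellular level. The sketch below uses invented data purely to show the regression step.

```python
# Hedged sketch of the zero-net-flux regression: plot (C_out - C_in) against
# C_in and take the x-intercept as the brain extracellular glucose estimate.
# The data points below are invented for illustration.
import numpy as np

c_in = np.array([0.0, 0.5, 1.0, 2.0, 3.0])        # perfusate glucose, mM
c_out = np.array([0.42, 0.71, 1.02, 1.58, 2.20])  # dialysate glucose, mM

slope, intercept = np.polyfit(c_in, c_out - c_in, 1)
brain_glucose = -intercept / slope                 # concentration of zero net flux
print(round(brain_glucose, 2), "mM")
```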
Abstract:
The 1s-2s interval has been measured in the muonium (μ⁺e⁻) atom by Doppler-free two-photon pulsed laser spectroscopy. The frequency separation of the states was determined to be 2 455 528 941.0(9.8) MHz, in good agreement with quantum electrodynamics. The result may be interpreted as a measurement of the muon-electron charge ratio as −1 − 1.1(2.1) × 10⁻⁹. We expect significantly higher accuracy from future high-flux muon sources and from cw laser technology.
Abstract:
The T2K collaboration reports a precision measurement of muon neutrino disappearance with an off-axis neutrino beam with a peak energy of 0.6 GeV. Near detector measurements are used to constrain the neutrino flux and cross section parameters. The Super-Kamiokande far detector, which is 295 km downstream of the neutrino production target, collected data corresponding to 3.01 × 10²⁰ protons on target. In the absence of neutrino oscillations, 205 ± 17 (syst.) events are expected to be detected and only 58 muon neutrino event candidates are observed. A fit to the neutrino rate and energy spectrum assuming three neutrino flavors, normal mass hierarchy and θ₂₃ ≤ π/4 yields a best-fit mixing angle sin²(2θ₂₃) = 1.000 and mass splitting |Δm²₃₂| = 2.44 × 10⁻³ eV²/c⁴. If θ₂₃ ≥ π/4 is assumed, the best-fit mixing angle changes to sin²(2θ₂₃) = 0.999 and the mass splitting remains unchanged.
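For orientation, a two-flavor approximation (not the full three-flavor spectral fit used by the collaboration) gives the muon-neutrino survival probability at the quoted baseline and peak energy:

```python
# Hedged two-flavor check, NOT the T2K analysis itself:
# P(nu_mu -> nu_mu) ~ 1 - sin^2(2*theta_23) * sin^2(1.27 * dm2 * L / E),
# with dm2 in eV^2, L in km and E in GeV.
import math

def survival_probability(sin2_2theta23, dm2_eV2, L_km, E_GeV):
    return 1.0 - sin2_2theta23 * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Best-fit values quoted above, evaluated at L = 295 km and E = 0.6 GeV:
print(survival_probability(1.000, 2.44e-3, 295.0, 0.6))  # ~0.002 at the spectral peak
```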
Abstract:
We have determined the flux of calcium, chloride and nitrate to the McMurdo Dry Valleys region by analysing snow pits for their chemical composition and snow accumulation, using multiple records spanning up to 48 years. The fluxes show patterns related to elevation and proximity to the ocean. In general, there is a strong relationship between the nitrate flux and snow accumulation, indicating that precipitation rates may have a great influence on the nitrogen concentrations in the soils of the valleys. Aeolian dust transport plays an important role in the deposition of some elements (e.g. Ca²⁺) into the McMurdo Dry Valleys' soils. Because of the antiquity of some of the soil surfaces in the McMurdo Dry Valleys region, the accumulated atmospheric flux of salts to the soils has important ecological consequences. Although precipitation may be an important mechanism of salt deposition to the McMurdo Dry Valley surfaces, it is poorly understood because of difficulties in measurement and high losses from sublimation.
Abstract:
The heat waves of 2003 in Western Europe and 2010 in Russia, commonly labelled as rare climatic anomalies outside of previous experience, are often taken as harbingers of more frequent extremes in a global-warming-influenced future. However, a recent reconstruction of spring-summer temperatures for Western Europe indicated the likelihood of significantly higher temperatures in 1540. In order to check the plausibility of this result, we investigated the severity of the 1540 drought, invoking the known soil desiccation-temperature feedback. Based on more than 300 first-hand documentary weather report sources originating from an area of 2 to 3 million km², we show that Europe was affected by an unprecedented 11-month-long megadrought. The estimated number of precipitation days and the precipitation amount for Central and Western Europe in 1540 are significantly lower than the 100-year minima of the instrumental measurement period for spring, summer and autumn. This result is supported by independent documentary evidence of extremely low river flows and Europe-wide wildfires, forest fires and settlement fires. We found that an event of this severity cannot be simulated by state-of-the-art climate models.