987 results for Radiometric calibration
Abstract:
Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in Western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and, consequently, water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate and sediment concentration) closely with the sparse observed data. A modelling grid cell resolution of 150 m produced appropriate results at an acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for limited calibration data. Several scenarios (actual, recommended and excessive manure applications, and point-source pollution from swine operations) could be compared with the model, using a relative ranking rather than quantitative predictions.
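The abstract mentions a pragmatic sensitivity analysis used to guide manual calibration; the sketch below illustrates, under assumptions, a one-at-a-time parameter screen of that kind. The function run_agnps is a hypothetical surrogate for an actual AgNPS run, and the parameter names and values are invented for illustration.

```python
# Minimal sketch of a one-at-a-time (OAT) sensitivity screen of the kind described
# in the abstract; run_agnps is a hypothetical stand-in for an actual AgNPS run and
# only illustrates the bookkeeping.

def run_agnps(params):
    # Hypothetical surrogate model: returns a runoff volume (mm) from two parameters.
    return params["curve_number"] * 0.12 + params["mannings_n"] * 5.0

base = {"curve_number": 75.0, "mannings_n": 0.05}
base_out = run_agnps(base)

for name in base:
    for factor in (0.9, 1.1):                       # perturb each parameter by +/-10 %
        perturbed = dict(base, **{name: base[name] * factor})
        out = run_agnps(perturbed)
        sensitivity = (out - base_out) / base_out / (factor - 1.0)   # relative sensitivity index
        print(f"{name} x{factor:.1f}: output {out:.2f}, sensitivity {sensitivity:+.2f}")
```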
Abstract:
OBJECTIVE: The measurement of cardiac output (CO) is a key element in the assessment of cardiac function. Recently, a pulse contour analysis-based device that requires no calibration became available (FloTrac/Vigileo, Edwards Lifesciences, Irvine, CA). This study was conducted to determine whether the arterial catheter site has an impact and to investigate the accuracy of this system compared with bolus thermodilution via a pulmonary artery catheter (PAC). DESIGN: Prospective study. SETTING: The operating room of 1 university hospital. PARTICIPANTS: Twenty patients undergoing cardiac surgery. INTERVENTIONS: CO was determined in parallel with the FloTrac/Vigileo system at the radial and femoral sites (CO_rad and CO_fem) and by PAC as the reference method. Data triplets were recorded at defined time points. The primary endpoint was the comparison of CO_rad and CO_fem, and the secondary endpoint was the comparison with the PAC. MEASUREMENTS AND MAIN RESULTS: Seventy-eight simultaneous data recordings were obtained. The Bland-Altman analysis for CO_fem and CO_rad showed a bias of 0.46 L/min, a precision of 0.85 L/min, and a percentage error of 34%. The Bland-Altman analysis for CO_rad and PAC showed a bias of -0.35 L/min, a precision of 1.88 L/min, and a percentage error of 76%. The Bland-Altman analysis for CO_fem and PAC showed a bias of 0.11 L/min, a precision of 1.8 L/min, and a percentage error of 69%. CONCLUSION: The FloTrac/Vigileo system was shown not to produce exactly the same CO data when used in the radial and femoral arteries, even though the percentage error was close to the clinically acceptable range. Thus, the impact of the introduction site of the arterial catheter is not negligible. The agreement with thermodilution was low.
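The bias, precision and percentage error figures quoted above follow the usual Bland-Altman treatment for cardiac output comparisons; the snippet below is a minimal sketch of that computation, assuming precision is reported as the standard deviation of the paired differences and the percentage error follows Critchley and Critchley (1.96 times the SD divided by the mean CO). The paired readings are invented, not study data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman comparison of two CO methods (e.g. CO_rad vs bolus thermodilution)."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()                                        # systematic offset (L/min)
    precision = diff.std(ddof=1)                              # SD of the differences (L/min)
    loa = (bias - 1.96 * precision, bias + 1.96 * precision)  # limits of agreement
    pct_error = 100.0 * 1.96 * precision / ((a + b) / 2.0).mean()   # Critchley & Critchley
    return bias, precision, loa, pct_error

# Illustrative paired CO readings (L/min), not study data
co_rad = [4.8, 5.6, 5.1, 6.2, 4.4, 5.9]
co_pac = [5.1, 5.2, 5.5, 5.8, 4.9, 6.3]
print(bland_altman(co_rad, co_pac))
```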
Abstract:
Among the types of remote sensing acquisitions, optical images are certainly one of the most widely relied upon data sources for Earth observation. They provide detailed measurements of the electromagnetic radiation reflected or emitted by each pixel in the scene. Through a process termed supervised land-cover classification, this makes it possible to distinguish objects at the surface of our planet automatically yet accurately. In this respect, when producing a land-cover map of the surveyed area, the availability of training examples representative of each thematic class is crucial for the success of the classification procedure. However, in real applications, due to several constraints on the sample collection process, labeled pixels are usually scarce. When analyzing an image for which those key samples are unavailable, a viable solution consists in resorting to the ground truth data of other, previously acquired images. This option is attractive, but several factors such as atmospheric, ground and acquisition conditions can cause radiometric differences between the images, therefore hindering the transfer of knowledge from one image to another. The goal of this Thesis is to supply remote sensing image analysts with suitable processing techniques to ensure a robust portability of classification models across different images. The ultimate purpose is to map the land-cover classes over large spatial and temporal extents with minimal ground information. To overcome, or simply quantify, the observed shifts in the statistical distribution of the spectra of the materials, we study four approaches drawn from the field of machine learning. First, we propose a strategy to intelligently sample the image of interest so that labels are collected only for the most useful pixels. This iterative routine is based on continually evaluating how pertinent the initial training data, which actually belong to a different image, are to the new image. Second, an approach to reduce the radiometric differences among the images by projecting the respective pixels into a common new data space is presented. We analyze a kernel-based feature extraction framework suited for such problems, showing that, after this relative normalization, the cross-image generalization abilities of a classifier are greatly increased. Third, we test a new data-driven measure of distance between probability distributions to assess the distortions caused by differences in the acquisition geometry affecting series of multi-angle images. We also gauge the portability of classification models through the sequences. In both exercises, the efficacy of classic physically- and statistically-based normalization methods is discussed. Finally, we explore a new family of approaches based on sparse representations of the samples to reciprocally convert the data space of two images. The projection function bridging the images allows the synthesis of new pixels with more similar characteristics, ultimately facilitating land-cover mapping across images.
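The abstract does not name the data-driven distance measure it tests; purely as an illustration of quantifying a distribution shift between the spectra of two acquisitions, the sketch below computes a kernel maximum mean discrepancy (MMD) on synthetic pixel spectra. The kernel bandwidth and the data are assumptions.

```python
import numpy as np

def rbf_mmd2(X, Y, gamma=1.0):
    """Squared maximum mean discrepancy between two pixel-spectra samples, RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
img1 = rng.normal(0.0, 1.0, size=(200, 4))   # spectra sampled from a source image
img2 = rng.normal(0.3, 1.2, size=(200, 4))   # radiometrically shifted target image
print(rbf_mmd2(img1, img2))                  # larger value -> larger distribution shift
```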
Abstract:
The deficiency or excess of micronutrients has been determined by analyses of soil and plant tissue. In Brazil, the lack of studies that define and standardize extraction and determination methods, as well as the lack of correlation and calibration studies, makes it difficult to establish limits of concentration classes for the interpretation of analyses and fertilizer recommendations for crops. A specific extractor for soil analysis is sometimes chosen for its ease of use in the laboratory rather than for its efficiency in determining the bioavailable nutrient. The objectives of this study were to: (a) evaluate B concentrations in the soil as related to fertilizer rate, soil depth and extractor; (b) verify nutrient movement in the soil profile; and (c) evaluate the efficiency of hot water, Mehlich-1 and Mehlich-3 as extractors of available B, using sunflower as a test plant. The experimental design consisted of randomized complete blocks with four replications and treatments of five B rates (0, 2, 4, 6, and 8 kg ha-1) applied to the soil surface and evaluated at six depths (0-0.05, 0.05-0.10, 0.10-0.15, 0.15-0.20, 0.20-0.30, and 0.30-0.40 m). Boron concentrations in the soil extracted by hot water, Mehlich-1 and Mehlich-3 increased linearly with B rates at all depths evaluated, indicating B mobility in the profile. The extractors had different B extraction capacities, but all were efficient in evaluating the bioavailability of the nutrient to sunflower. Mehlich-1 and Mehlich-3 can therefore be used to analyze B just as well as hot water.
Abstract:
The growing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science reflects the enthusiasm and attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own analytical strategy for calibration, sequence design, standard utilisation and data treatment, without a clear consensus. Drawing on the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to support retrospective analyses or interlaboratory comparisons.
Abstract:
Bacterial degradation of polycyclic aromatic hydrocarbons (PAHs), ubiquitous contaminants from oil and coal, is typically limited by poor accessibility of the contaminant to the bacteria. In order to measure PAH availability in complex systems, we designed a number of diffusion-based assays with the double-tagged bacterial reporter strain Burkholderia sartisoli RP037-mChe. The reporter strain is capable of mineralizing phenanthrene (PHE) and induces the expression of enhanced green fluorescent protein (eGFP) as a function of the PAH flux to the cell. At the same time, it produces a second autofluorescent protein (mCherry) in a constitutive manner. Quantitative epifluorescence imaging was used to record reporter signals as a function of PAH availability. The reporter strain expressed eGFP proportionally to dosages of naphthalene or PHE in batch liquid cultures. To detect PAH diffusion from solid materials, the reporter cells were embedded in 2 cm-sized agarose gel patches, and fluorescence was recorded over time for both markers as a function of distance to the PAH source. eGFP fluorescence gradients measured on known amounts of naphthalene or PHE served as a calibration for quantifying PAH availability from contaminated soils. To detect reporter gene expression at even smaller diffusion distances, we mixed and immobilized cells with contaminated soils in an agarose gel. eGFP fluorescence measurements confirmed the gel patch diffusion results, showing that exposure to 2-3 mg of lampblack soil gave four times higher expression than exposure to material contaminated with 10 or 1 mg PHE g(-1).
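As a hedged illustration of how a calibration of eGFP signal against known PAH amounts could be turned into an availability estimate for an unknown soil, the sketch below fits a linear calibration curve and inverts it; all ratios and amounts are invented, not the reported measurements.

```python
import numpy as np

# Hypothetical calibration: eGFP/mCherry ratios measured against known PHE amounts,
# then inverted to estimate available PHE in an unknown soil sample.
phe_ug = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # known PHE amounts (assumed values)
ratio  = np.array([0.10, 0.28, 0.47, 0.85, 1.62])    # eGFP/mCherry signal (assumed values)

slope, intercept = np.polyfit(phe_ug, ratio, 1)       # linear calibration fit
unknown_ratio = 0.60
available_phe = (unknown_ratio - intercept) / slope   # inverse prediction
print(f"estimated available PHE: {available_phe:.1f} ug")
```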
Abstract:
Calibrated BOLD fMRI is a promising alternative to the classic BOLD contrast due to its reduced venous sensitivity and greater physiological specificity. The delayed adoption of this technique for cognitive studies may stem partly from a lack of information on the reproducibility of these measures in the context of cognitive tasks. In this study we explored the applicability and reproducibility of a state-of-the-art calibrated BOLD technique using a complex functional task at 7 tesla. Reproducibility measures of BOLD, CBF, CMRO2, flow-metabolism coupling n and the calibration parameter M were compared and interpreted for three ROIs. We found an average intra-subject variation of CMRO2 of 8% across runs and 33% across days. BOLD (46% across runs, 36% across days), CBF (33% across runs, 46% across days) and M (41% across days) showed significantly higher intra-subject variability. Inter-subject variability was found to be high for all quantities, though CMRO2 was the most consistent across brain regions. The results of this study provide evidence that calibrated BOLD may be a viable alternative for longitudinal and cognitive MRI studies.
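The abstract does not spell out the calibration model linking BOLD, CBF, M and CMRO2; a common choice in calibrated BOLD work is the Davis model, sketched below with assumed values for the exponents alpha and beta and for the example responses.

```python
def rel_cmro2(dbold, dcbf, M, alpha=0.2, beta=1.3):
    """Relative CMRO2 change from fractional BOLD (dbold) and CBF (dcbf) changes,
    using the Davis calibration model; alpha and beta are assumed constants."""
    cbf_ratio = 1.0 + dcbf
    cmro2_ratio = (1.0 - dbold / M) ** (1.0 / beta) * cbf_ratio ** (1.0 - alpha / beta)
    return cmro2_ratio - 1.0

# Example: a 2% BOLD and 40% CBF increase with an assumed M of 0.08
dcmro2 = rel_cmro2(0.02, 0.40, 0.08)
print(dcmro2, 0.40 / dcmro2)   # fractional CMRO2 change and flow-metabolism coupling n
```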
Abstract:
Soil moisture is the property that most greatly influences the soil dielectric constant, which is also influenced by soil mineralogy. The aim of this study was to determine mathematical models relating soil moisture to the dielectric constant (Ka) for a Hapludalf, two clayey Hapludoxes and a very clayey Hapludox, and to test the reliability of universal models, such as those proposed by Topp and Ledieu and their co-workers in the 1980s, and of specific models for estimating soil moisture with a TDR. Soil samples were collected from the 0 to 0.30 m layer, sieved through a mesh of 0.002 m diameter and packed in PVC cylinders of 0.1 m diameter and 0.3 m height. Seven samples of each soil class were saturated by capillarity and a probe composed of two rods was inserted in each of them. Moisture readings began with the saturated soil and concluded when the soil was near the permanent wilting point. At each step, the samples were weighed on a precision scale to calculate volumetric moisture. Linear and polynomial models relating soil moisture to the dielectric constant were fitted for each soil class and for all soils together. The accuracy of the models was evaluated by the coefficient of determination, the standard error of estimate and the 1:1 line. The models proposed by Topp and Ledieu and their co-workers were not adequate for estimating the moisture of the soil classes studied. The linear and polynomial models fitted to the entire data set of the four soil classes were not sufficiently accurate for estimating soil moisture. The greater the soil clay and Fe oxide content, the greater the dielectric constant of the medium for a given volumetric moisture. The specific models, θ = 0.40283 - 0.04231 Ka + 0.00194 Ka² - 0.000022 Ka³ (Hapludox), θ = 0.01971 + 0.02902 Ka - 0.00086 Ka² + 0.000012 Ka³ (Hapludox-PF), θ = 0.01692 - 0.00507 Ka (Hapludalf) and θ = 0.08471 + 0.01145 Ka (Hapludox-CA), show greater accuracy and reliability for estimating soil moisture in the soil classes studied.
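The soil-specific calibration equations quoted above can be applied directly to a TDR reading of Ka; the snippet below simply codes them as given in the abstract, with arbitrary example Ka values.

```python
# Calibration equations quoted in the abstract, coded so a TDR reading of the
# dielectric constant Ka can be converted to volumetric moisture (theta, m3 m-3).
def theta_hapludox(ka):     return 0.40283 - 0.04231 * ka + 0.00194 * ka**2 - 0.000022 * ka**3
def theta_hapludox_pf(ka):  return 0.01971 + 0.02902 * ka - 0.00086 * ka**2 + 0.000012 * ka**3
def theta_hapludalf(ka):    return 0.01692 - 0.00507 * ka
def theta_hapludox_ca(ka):  return 0.08471 + 0.01145 * ka

for ka in (5.0, 15.0, 25.0):                   # arbitrary example readings
    print(ka, round(theta_hapludox_pf(ka), 3))
```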
Abstract:
Gaseous N losses from soil are considerable, resulting mostly from ammonia volatilization linked to agricultural activities such as pasture fertilization. The use of simple and accessible methods for measuring such losses is fundamental in the evaluation of the N cycle in agricultural systems. The purpose of this study was to evaluate methods for quantifying NH3 volatilization from soil surface-fertilized with urea, with minimal influence on the volatilization processes. The greenhouse experiment was arranged in a completely randomized design with 13 treatments and five replications, the treatments being: (1) polyurethane foam (density 20 kg m-3) with phosphoric acid solution absorber (foam absorber), installed 1, 5, 10 and 20 cm above the soil surface; (2) paper filter with sulfuric acid solution absorber (paper absorber, 1, 5, 10 and 20 cm above the soil surface); (3) sulfuric acid solution absorber (1, 5 and 10 cm above the soil surface); (4) semi-open static collector; (5) 15N balance (control). The foam absorber placed 1 cm above the soil surface estimated the real daily rate and accumulated loss of NH3-N and proved efficient in capturing NH3 volatilized from urea-treated soil. The estimates based on acid absorbers 1, 5 and 10 cm above the soil surface and paper absorbers 1 and 5 cm above the soil surface were only realistic for accumulated NH3-N losses. Foam absorbers can be recommended for quantifying accumulated and daily rates of NH3 volatilization losses similarly to an open static chamber, making calibration equations or correction factors unnecessary.
Abstract:
OBJECTIVE: To describe a method to obtain a profile of the duration and intensity (speed) of walking periods over 24 hours in women under free-living conditions. DESIGN: A new method based on accelerometry was designed for analyzing walking activity. In order to take into account the inter-individual variability of acceleration, an individual calibration process was used. Different experiments were performed to highlight the variability of the acceleration vs walking speed relationship, to analyze the speed prediction accuracy of the method, and to test the assessment of walking distance and duration over 24 h. SUBJECTS: Twenty-eight women were studied (mean+/-s.d.): age 39.3+/-8.9 y; body mass 79.7+/-11.1 kg; body height 162.9+/-5.4 cm; and body mass index (BMI) 30.0+/-3.8 kg/m(2). RESULTS: Accelerometer output was significantly correlated with speed during treadmill walking (r=0.95, P<0.01) and short unconstrained walks (r=0.86, P<0.01), although with a large inter-individual variation in the regression parameters. By using individual calibration, it was possible to predict walking speed on a standard urban circuit (predicted vs measured r=0.93, P<0.01, s.e.e.=0.51 km/h). In the free-living experiment, women spent on average 79.9+/-36.0 (range: 31.7-168.2) min/day in displacement activities, of which discontinuous short walking activities represented about two-thirds and continuous ones one-third. Total walking distance averaged 2.1+/-1.2 (range: 0.4-4.7) km/day and was performed at an average speed of 5.0+/-0.5 (range: 4.1-6.0) km/h. CONCLUSION: An accelerometer measuring the anteroposterior acceleration of the body can estimate walking speed together with the pattern, intensity and duration of daily walking activity.
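The individual calibration step described above amounts to fitting a per-subject regression of treadmill speed on accelerometer output and then using it to predict free-living walking speed; the sketch below shows that idea with invented counts and speeds, not the study's data.

```python
import numpy as np

# Per-subject calibration: linear fit of treadmill speed vs accelerometer output,
# then prediction of walking speed for free-living epochs. Values are invented.
counts = np.array([120.0, 180.0, 250.0, 330.0, 410.0])   # accelerometer output per epoch
speed  = np.array([3.0, 4.0, 5.0, 6.0, 7.0])             # treadmill speed (km/h)

slope, intercept = np.polyfit(counts, speed, 1)           # subject-specific calibration
free_living_counts = np.array([200.0, 290.0, 150.0])
print(slope * free_living_counts + intercept)             # predicted walking speeds (km/h)
```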
Abstract:
In response to the Federal Highway Administration (FHWA) mandate requiring Load and Resistance Factor Design (LRFD) on all new bridge projects initiated after October 1, 2007, the Iowa Highway Research Board (IHRB) sponsored these research projects to develop regional LRFD recommendations. The LRFD development was performed using the Iowa Department of Transportation (DOT) Pile Load Test database (PILOT). To increase the data points for LRFD development, develop LRFD recommendations for dynamic methods, and validate the results of LRFD calibration, 10 full-scale field tests on the most commonly used steel H-piles (e.g., HP 10 x 42) were conducted throughout Iowa. Detailed in situ soil investigations were carried out, push-in pressure cells were installed, and laboratory soil tests were performed. Pile responses during driving, at the end of driving (EOD), and at re-strikes were monitored using the Pile Driving Analyzer (PDA), followed by CAse Pile Wave Analysis Program (CAPWAP) analyses. The hammer blow counts were recorded for the Wave Equation Analysis Program (WEAP) and dynamic formulas. Static load tests (SLTs) were performed and the pile capacities were determined based on Davisson's criterion. The extensive experimental research studies generated important data for analytical and computational investigations. The load-displacement responses measured in the SLTs were compared with simulated results obtained using the TZPILE program and the modified borehole shear test method. Two analytical pile setup quantification methods, in terms of soil properties, were developed and validated. A new calibration procedure was developed to incorporate pile setup into LRFD.
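The abstract does not detail the calibration procedure itself; as a hedged illustration of the reliability idea behind an LRFD resistance-factor calibration, the sketch below runs a Monte Carlo check of the reliability index obtained for a few candidate resistance factors. The bias factors, coefficients of variation, load factor and target index are assumptions, not values from PILOT or the field tests.

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo reliability check for candidate resistance factors phi. Resistance and
# load are lognormal; lambda = bias (measured/predicted), cov = coefficient of variation.
rng = np.random.default_rng(1)
n = 200_000
lam_R, cov_R = 1.10, 0.35            # assumed resistance bias and COV
lam_Q, cov_Q = 1.05, 0.15            # assumed load bias and COV
gamma, target_beta = 1.4, 2.33       # assumed combined load factor and target reliability index

for phi in (0.45, 0.55, 0.65, 0.75):
    Rn = gamma / phi                 # nominal resistance required by the design check (Qn = 1)
    R = rng.lognormal(np.log(lam_R * Rn) - 0.5 * np.log(1 + cov_R**2),
                      np.sqrt(np.log(1 + cov_R**2)), n)
    Q = rng.lognormal(np.log(lam_Q) - 0.5 * np.log(1 + cov_Q**2),
                      np.sqrt(np.log(1 + cov_Q**2)), n)
    beta = -norm.ppf(np.mean(R < Q))  # reliability index implied by the failure rate
    print(f"phi={phi:.2f}  beta={beta:.2f}  (target {target_beta})")
```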
Abstract:
Mathematical models have great potential to support land use planning with the goal of improving water and land quality. Before a model is used, however, it must be shown to correctly simulate the hydrological and erosive processes of a given site. The SWAT (Soil and Water Assessment Tool) model was developed in the United States to evaluate the effects of conservation agriculture on hydrological processes and water quality at the watershed scale. This model was initially proposed for use without calibration, which would eliminate the need for measured hydro-sedimentological data. In this study, the SWAT model was evaluated in a small rural watershed (1.19 km²) located on the basalt slopes of the state of Rio Grande do Sul in southern Brazil, where farmers have been using cover crops associated with minimum tillage to control soil erosion. Values simulated by the model were compared with measured hydro-sedimentological data. Results for surface and total runoff on a daily basis were considered unsatisfactory (Nash-Sutcliffe efficiency coefficient, NSE < 0.5). However, simulation results on monthly and annual scales were significantly better. With regard to the erosion process, the simulated sediment yields for all years of the study were unsatisfactory in comparison with the observed values on a daily and monthly basis (NSE < -6), and the annual sediment yield was overestimated by more than 100%.
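The satisfactory/unsatisfactory judgements above rest on the Nash-Sutcliffe efficiency; the snippet below shows the standard NSE computation on invented daily runoff values.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative daily runoff values (mm), not the watershed data
obs = [0.0, 2.1, 5.4, 1.2, 0.3, 8.7]
sim = [0.1, 1.5, 7.9, 0.8, 0.5, 4.2]
print(nse(obs, sim))   # values above 0.5 are often taken as satisfactory on this scale
```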
Abstract:
The paper gives the theory of airborne GPS as related to photogrammetry and the results of a self-calibration used to validate the theory. Accordingly, no ground control points are required for mapping using a strip or block of photographs, provided the site is within 10 km of the calibration site.
Abstract:
The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
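The cited papers define the actual probabilistic model; purely as an illustration of a score-based approach to evidential value, the sketch below converts an ink-comparison score into a likelihood ratio using kernel density estimates of within-source and between-source score populations. All scores are invented.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Score-based likelihood ratio: how much more probable is the observed comparison
# score if the two ink samples share a source than if they do not. The score
# populations below are invented, not from the HPTLC study.
rng = np.random.default_rng(2)
same_source_scores = rng.normal(0.90, 0.04, 300)   # scores from known same-ink pairs
diff_source_scores = rng.normal(0.60, 0.12, 300)   # scores from known different-ink pairs

f_same, f_diff = gaussian_kde(same_source_scores), gaussian_kde(diff_source_scores)
observed_score = 0.85
lr = f_same(observed_score)[0] / f_diff(observed_score)[0]
print(f"likelihood ratio = {lr:.1f}")               # > 1 supports the same-source proposition
```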
Abstract:
A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1 × 100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as the mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to the Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover large concentration ranges of 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline, 1-1000 ng/ml for fluoxetine and fluvoxamine, and 2-1000 ng/ml for norfluoxetine. Good quantitative performance was achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) fell mainly within the acceptance limits of ±30% for biological samples. The method was successfully applied to the routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months. The β-expectation tolerance intervals determined during the validation phase were consistent with the results of quality control samples analyzed during routine use. This method is therefore precise and suitable both for therapeutic drug monitoring and pharmacokinetic studies in most clinical laboratories.
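As a hedged sketch of the validation statistics named above (trueness, repeatability, intermediate precision), the snippet below computes them from hypothetical quality-control replicates; a formal variance-component (ANOVA) treatment would normally be used for intermediate precision.

```python
import numpy as np

# Hypothetical QC replicates (3 runs x 3 replicates) at a nominal 50 ng/ml level.
nominal = 50.0
qc = np.array([[48.1, 51.0, 49.4],    # run 1
               [52.3, 50.8, 53.1],    # run 2
               [47.6, 49.0, 48.2]])   # run 3

trueness = 100.0 * qc.mean() / nominal                                           # mean recovery, %
repeatability_cv = 100.0 * np.sqrt(np.mean(qc.var(axis=1, ddof=1))) / qc.mean()  # pooled within-run CV, %
intermediate_cv = 100.0 * qc.std(ddof=1) / qc.mean()  # overall CV, a simple stand-in for ANOVA-based intermediate precision
print(f"trueness {trueness:.1f}%, repeatability {repeatability_cv:.1f}%, intermediate precision {intermediate_cv:.1f}%")
```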