978 results for sampling techniques
Abstract:
Nitrous oxide (N2O) emissions from soil are often measured using the manual static chamber method. Manual gas sampling is labour intensive, so a minimal sampling frequency that maintains the accuracy of measurements would be desirable. However, the high temporal (diurnal, daily and seasonal) variability of N2O emissions can compromise the accuracy of measurements if not addressed adequately when formulating a sampling schedule. Assessments of sampling strategies to date have focussed on relatively low emission systems with high episodicity, where a small number of the highest emission peaks can be critically important in the measurement of whole-season cumulative emissions. Using year-long, automated sub-daily N2O measurements from three fertilised sugarcane fields, we evaluated optimum gas sampling strategies in high emission systems with relatively long emission episodes. The results indicated that sampling in the morning between 09:00 and 12:00, when soil temperature was generally close to the daily average, best approximated the daily mean N2O emission, coming within 4–7% of the 'actual' daily emissions measured by automated sampling. Weekly sampling, with twice-weekly sampling for one week after >20 mm of rainfall, was the recommended sampling regime. It resulted in no extreme (>20%) deviations from the 'actuals', had a high probability of estimating the annual cumulative emissions within 10% precision, and required a practicable number of samples in comparison to other sampling regimes. This provides robust and useful guidance for manual gas sampling in sugarcane cropping systems, although further adjustments by operators in terms of expected measurement accuracy and resource availability are encouraged. By implementing these sampling strategies together, labour inputs and errors in measured cumulative N2O emissions can be minimised. Further research is needed to quantify the spatial variability of N2O emissions within sugarcane cropping and to develop techniques for effectively addressing both spatial and temporal variabilities simultaneously.
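Purely as an illustration of how such a regime can be checked against automated sub-daily data, a minimal Python sketch follows; the hourly flux series, rainfall record, and the 10:00 sampling hour are invented placeholders, not the study's data:

```python
# Minimal sketch (not the study's code): compare a manual sampling regime
# against the 'actual' cumulative N2O emission from hourly automated data.
import numpy as np

rng = np.random.default_rng(0)
hours = 365 * 24
flux = rng.lognormal(mean=0.0, sigma=1.0, size=hours)     # hypothetical hourly fluxes
rain = rng.choice([0.0, 25.0], size=365, p=[0.95, 0.05])  # hypothetical daily rainfall (mm)

actual = flux.sum()  # 'actual' cumulative emission from automated hourly sampling

# Manual regime: one sample per week at 10:00, plus roughly twice-weekly
# sampling for one week after any day with >20 mm rainfall.
sample_days = set(range(0, 365, 7))
for d, r in enumerate(rain):
    if r > 20.0:
        sample_days.update(range(d, min(d + 7, 365), 3))  # extra samples that week

days = np.array(sorted(sample_days))
sampled = flux[days * 24 + 10]            # flux measured at 10:00 on sampled days
estimate = np.trapz(sampled, days) * 24   # interpolate between sampling days

print(f"deviation from actual: {100 * (estimate - actual) / actual:+.1f}%")
```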
Abstract:
Excess nutrient loads carried by streams and rivers are a great concern for environmental resource managers. In agricultural regions, excess loads are transported downstream to receiving water bodies, potentially causing algal blooms, which can lead to numerous ecological problems. To better understand nutrient load transport, and to develop appropriate water management plans, it is important to have accurate estimates of annual nutrient loads. This study used a Monte Carlo sub-sampling method and error-corrected statistical models to estimate annual nitrate-N loads from two watersheds in central Illinois. The performance of three load estimation methods (the seven-parameter log-linear model, the ratio estimator, and the flow-weighted averaging estimator) applied at one-, two-, four-, six-, and eight-week sampling frequencies was compared. Five error correction techniques (the existing composite method and four new error correction techniques developed in this study) were applied to each combination of sampling frequency and load estimation method. On average, the most accurate error correction technique (proportional rectangular) produced load estimates 15% and 30% more accurate than those of the most accurate uncorrected load estimation method (the ratio estimator) for the two watersheds. Using error correction methods, it is possible to design more cost-effective monitoring plans by achieving the same load estimation accuracy with fewer observations. Finally, the optimum combinations of monitoring threshold and sampling frequency that minimize the number of samples required to achieve specified levels of accuracy in load estimation were determined. For one- to three-week sampling frequencies, combined threshold/fixed-interval monitoring approaches produced the best outcomes, while fixed-interval-only approaches produced the most accurate results for four- to eight-week sampling frequencies.
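As a hedged illustration of the simplest estimator compared here, a sketch of the ratio estimator applied to simulated daily discharge with sparsely sampled concentration; the series, units, and two-week interval below are invented placeholders:

```python
# Sketch (not the study's code) of the ratio estimator for annual
# nitrate-N load: sparse concentration samples, continuous discharge.
import numpy as np

rng = np.random.default_rng(1)
q_daily = rng.gamma(shape=2.0, scale=5.0, size=365)         # daily discharge (m^3/s)
c_daily = 2.0 + 0.1 * q_daily + rng.normal(0, 0.2, 365)     # daily nitrate-N (mg/L)

daily_load = c_daily * q_daily * 86.4     # kg/day (mg/L * m^3/s * 86.4)
true_load = daily_load.sum()              # 'true' annual load, kg

idx = np.arange(0, 365, 14)               # concentration sampled every two weeks
ratio = daily_load[idx].mean() / q_daily[idx].mean()
estimate = ratio * q_daily.mean() * 365   # ratio estimator, kg/yr

print(f"true {true_load:.0f} kg, estimate {estimate:.0f} kg "
      f"({100 * (estimate - true_load) / true_load:+.1f}%)")
```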
Abstract:
The occurrence frequency of failure events serves as a critical index of the safety status of dam-reservoir systems. Although overtopping is the most common failure mode, with significant consequences, this type of event in most cases has a small probability. Estimating such rare-event risks for dam-reservoir systems with crude Monte Carlo (CMC) simulation requires a prohibitively large number of trials and significant computational resources to reach satisfactory results; otherwise, the estimates are not accurate enough. In order to reduce the computational expense and improve the efficiency of risk estimation, an importance sampling (IS) based simulation approach is proposed in this dissertation to address the overtopping risks of dam-reservoir systems. Deliverables of this study mainly include the following five aspects: 1) the reservoir inflow hydrograph model; 2) the dam-reservoir system operation model; 3) the CMC simulation framework; 4) the IS-based Monte Carlo (ISMC) simulation framework; and 5) a comparison of the overtopping risk estimates from the CMC and ISMC simulations. In a broader sense, this study meets the following three expectations: 1) to address the natural stochastic characteristics of the dam-reservoir system, such as the reservoir inflow rate; 2) to build the fundamental CMC and ISMC simulation frameworks of the dam-reservoir system in order to estimate the overtopping risks; and 3) to compare the simulation results and the computational performance in order to demonstrate the advantages of ISMC simulation. The estimates of overtopping probability can be used to guide future dam safety investigations and studies, and to supplement conventional analyses in decision making on dam-reservoir system improvements. The proposed ISMC simulation methodology is reasonably robust and was shown to improve overtopping risk estimation. The more accurate estimates, smaller variance, and reduced CPU time expand the application of the Monte Carlo (MC) technique to evaluating rare-event risks for infrastructure.
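A minimal sketch of the CMC-versus-IS contrast on a generic rare-event probability; the standard-normal 'peak reservoir level' model and the crest threshold are assumptions for illustration, not the dissertation's hydrograph or operation models:

```python
# Crude Monte Carlo (CMC) vs importance sampling (IS) for a rare event,
# standing in for an overtopping probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
crest = 4.0                              # hypothetical crest (standardised level)
n = 100_000

# CMC: sample peak levels from the nominal distribution.
x = rng.standard_normal(n)
p_cmc = np.mean(x > crest)               # noisy or zero at this sample size

# IS: sample from a proposal centred on the failure region, reweight by p/q.
y = rng.normal(loc=crest, scale=1.0, size=n)
w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=crest, scale=1.0)
p_is = np.mean((y > crest) * w)          # low-variance unbiased estimate

print(p_cmc, p_is, stats.norm.sf(crest))  # exact tail probability for comparison
```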
Abstract:
This dissertation presents the design of three high-performance successive-approximation-register (SAR) analog-to-digital converters (ADCs) using distinct digital background calibration techniques under the framework of a generalized code-domain linear equalizer. These digital calibration techniques effectively and efficiently remove the static mismatch errors in the analog-to-digital (A/D) conversion. They enable aggressive scaling of the capacitive digital-to-analog converter (DAC), which also serves as the sampling capacitor, to the kT/C limit. As a result, outstanding conversion linearity, high signal-to-noise ratio (SNR), high conversion speed, robustness, superb energy efficiency, and minimal chip area are accomplished simultaneously. The first design is a 12-bit 22.5/45-MS/s SAR ADC in a 0.13-μm CMOS process. It employs a perturbation-based calibration, based on the superposition property of linear systems, to digitally correct the capacitor mismatch error in the weighted DAC. With 3.0-mW power dissipation at a 1.2-V power supply and a 22.5-MS/s sample rate, it achieves a 71.1-dB signal-to-noise-plus-distortion ratio (SNDR) and a 94.6-dB spurious-free dynamic range (SFDR). At the Nyquist frequency, the conversion figure of merit (FoM) is 50.8 fJ/conversion-step, the best FoM to date (2010) for 12-bit ADCs. The SAR ADC core occupies 0.06 mm², while the estimated area of the calibration circuits is 0.03 mm². The second proposed digital calibration technique is a bit-wise-correlation-based digital calibration. It utilizes the statistical independence of an injected pseudo-random signal and the input signal to correct the DAC mismatch in SAR ADCs. This idea is experimentally verified in a 12-bit 37-MS/s SAR ADC fabricated in 65-nm CMOS, implemented by Pingli Huang. This prototype chip achieves a 70.23-dB peak SNDR and an 81.02-dB peak SFDR, while occupying 0.12 mm² of silicon area and dissipating 9.14 mW from a 1.2-V supply, with the synthesized digital calibration circuits included. The third work is an 8-bit, 600-MS/s, 10-way time-interleaved SAR ADC array fabricated in a 0.13-μm CMOS process. This work employs an adaptive digital equalization approach to calibrate both intra-channel nonlinearities and inter-channel mismatch errors. The prototype chip achieves 47.4-dB SNDR, 63.6-dB SFDR, less than 0.30-LSB differential nonlinearity (DNL), and less than 0.23-LSB integral nonlinearity (INL). The ADC array occupies an active area of 1.35 mm² and dissipates 30.3 mW, including the synthesized digital calibration circuits and an on-chip dual-loop delay-locked loop (DLL) for clock generation and synchronization.
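To illustrate the code-domain idea behind these calibrations (raw comparator decisions re-weighted digitally with estimated rather than ideal bit weights), a toy sketch follows; the mismatch magnitude and the assumption that the true weights are already known are illustrative only, not the dissertation's calibration algorithms:

```python
# Toy model of code-domain weight correction in a SAR ADC: the raw bit
# decisions are reconstructed with calibrated bit weights instead of
# ideal binary weights. Mismatch values are made up for illustration.
import numpy as np

rng = np.random.default_rng(3)
bits = 12
ideal = 2.0 ** np.arange(bits - 1, -1, -1)           # ideal binary bit weights
actual = ideal * (1 + rng.normal(0, 0.002, bits))    # assumed 0.2% capacitor mismatch

def sar_convert(vin, weights):
    """Greedy successive approximation using the DAC's *actual* weights."""
    code, residue = np.zeros(bits, dtype=int), vin
    for i in range(bits):
        trial = residue - weights[i]
        if trial >= 0:
            code[i], residue = 1, trial
    return code

vin = 1234.56                 # input in LSB units of the ideal DAC
code = sar_convert(vin, actual)
raw = code @ ideal            # uncalibrated output: ideal weights assumed
calibrated = code @ actual    # calibrated output: estimated true weights

print(f"raw error {raw - vin:+.2f} LSB, calibrated error {calibrated - vin:+.2f} LSB")
```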
Abstract:
Dissertation for the Integrated Master's degree in Veterinary Medicine.
Abstract:
Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to studying semivolatile organic compounds (SOCs) in air, with applications related to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for the analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min-1) multicapillary denuder was designed for the measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas- and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) model is described. The model predicts the near-surface variation of several quantities with fetch in coastal, offshore flow: 1) modification of potential temperature and gas mixing ratio; 2) surface fluxes of sensible heat, water vapor, and trace gases, using the NOAA COARE Bulk Algorithm and Gas Transfer Model; and 3) vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of the air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
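The Abraham solvation parameter model has the general linear free-energy form log K = c + eE + sS + aA + bB + lL. A minimal sketch of this form follows; the system coefficients are placeholders, not the fitted PDMS-gas values from the dissertation:

```python
# Abraham LFER sketch for a PDMS-gas partition coefficient:
#   log K_pdms = c + e*E + s*S + a*A + b*B + l*L
# Coefficients below are placeholders, NOT fitted values.
def log_k_pdms(E, S, A, B, L,
               c=0.0, e=0.0, s=0.0, a=0.0, b=0.0, l=1.0):
    """Solute descriptors E, S, A, B, L -> log K (PDMS/gas)."""
    return c + e * E + s * S + a * A + b * B + l * L

# Hypothetical descriptor values for an example solute:
print(log_k_pdms(E=1.0, S=0.8, A=0.0, B=0.1, L=6.0))
```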
Abstract:
Sampling and preconcentration techniques play a critical role in headspace analysis in analytical chemistry. My dissertation presents a novel sampling design, capillary microextraction of volatiles (CMV), that improves the preconcentration of volatiles and semivolatiles in a headspace, offering high throughput, near-quantitative analysis, high recovery, and unambiguous identification of compounds when coupled to mass spectrometry. The CMV devices use sol-gel polydimethylsiloxane (PDMS)-coated microglass fibers as the sampling/preconcentration sorbent, with the fibers stacked into open-ended capillary tubes. The design allows for dynamic headspace sampling by connecting the device to a hand-held vacuum pump. The inexpensive device can be fitted into a thermal desorption probe for thermal desorption of the extracted volatile compounds into a gas chromatograph-mass spectrometer (GC-MS). The performance of the CMV devices was compared with that of two existing preconcentration techniques, solid phase microextraction (SPME) and planar solid phase microextraction (PSPME). Compared to SPME fibers, the CMV devices improve the surface area and phase volume by factors of 5000 and 80, respectively. One (1) minute of dynamic CMV air sampling gave performance similar to a 30-min static extraction using a SPME fiber. The PSPME devices have been fashioned to easily interface with ion mobility spectrometers (IMS) for explosives or drugs detection. The CMV devices offer dynamic sampling and can now be coupled to COTS GC-MS instruments. Several compound classes representing explosives have been analyzed with minimal breakthrough even after a 60-min sampling time. The extracted volatile compounds were retained in the CMV devices when preserved in aluminum foil after sampling. Finally, the CMV sampling devices were used for several different headspace profiling applications, which involved sampling a shipping facility, six illicit drugs, seven military explosives and eighteen different bacterial strains. Successful detection of the target signature volatile compounds at ng levels in these applications suggests that the CMV devices can provide high-throughput qualitative and quantitative analysis with high recovery and unambiguous identification of analytes.
Abstract:
Predictive models of species distributions are important tools for fisheries management. Unfortunately, these predictive models can be difficult to build for large waterbodies, where fish are difficult to detect and exhaustive sampling is not possible. In recent years, the development of Geographic Information Systems (GIS) and new occupancy modelling techniques has improved our ability to predict distributions across landscapes and to account for imperfect detection. I surveyed the nearshore fish community at 105 sites between Kingston, Ontario and Rockport, Ontario with the objective of modelling the geographic and environmental characteristics associated with littoral fish distributions. Occupancy modelling was performed on Round Goby, Yellow Perch, and Lepomis spp. Modelling with geographic and environmental covariates revealed the effect of shoreline exposure on nearshore habitat characteristics and on the occupancy of Round Goby. Yellow Perch and Lepomis spp. occupancy was most strongly, and negatively, associated with distance to a wetland. These results are consistent with past research on large lake systems and indicate the importance of wetlands and shoreline exposure in determining the fish community of the littoral zone. By examining three species with varying rates of occupancy and detection, this study was also able to demonstrate the variable utility of occupancy modelling.
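A minimal sketch of the class of model used here, a single-season occupancy model with constant occupancy (psi) and detection (p) probabilities fitted by maximum likelihood; the detection histories below are simulated, not the survey data, and covariates are omitted:

```python
# Single-season occupancy model sketch: sites are occupied with
# probability psi; occupied sites are detected on each visit with
# probability p. Never-detected sites may still be occupied.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
sites, visits = 105, 4
psi_true, p_true = 0.6, 0.4
z = rng.random(sites) < psi_true                          # true occupancy state
y = (rng.random((sites, visits)) < p_true) & z[:, None]   # detection histories

def neg_log_lik(theta):
    psi, p = 1 / (1 + np.exp(-theta))     # logit scale -> probabilities
    det = y.sum(axis=1)
    # Detected at least once: occupied and detected that many times.
    # Never detected: occupied but missed, or truly unoccupied.
    lik = np.where(
        det > 0,
        psi * p**det * (1 - p)**(visits - det),
        psi * (1 - p)**visits + (1 - psi),
    )
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
print(1 / (1 + np.exp(-fit.x)))           # estimated (psi, p)
```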
Abstract:
Many real-world decision-making problems are defined in terms of forecast parameters: for example, one may plan an urban route by relying on traffic predictions. In these cases, the conventional approach consists of training a predictor and then solving an optimization problem. This may be problematic, since mistakes made by the predictor may trick the optimizer into taking dramatically wrong decisions. Recently, the field of Decision-Focused Learning has overcome this limitation by merging the two stages at training time, so that predictions are rewarded and penalized based on their outcome in the optimization problem. There are, however, still significant challenges to widespread adoption of the method, mostly related to its limitations in generality and scalability. One possible way of dealing with the scalability problem is to introduce a caching-based approach that speeds up the training process. This project investigates these techniques with the aim of reducing the number of solver calls even further. For each considered method, we designed a particular smart sampling approach based on its characteristics. In the case of the SPO method, we discovered that it is sufficient to initialize the cache with only a few solutions: those needed to filter the elements that we still need to learn properly. For the Blackbox method, we designed a smart sampling approach based on inferred solutions.
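A hedged sketch of the general caching idea (not the project's implementation): reuse cached solutions for most training steps and invoke the exact solver only occasionally. The solver stub, the tuple representation of solutions, and the solve probability are all assumptions for illustration:

```python
# Solution cache for decision-focused learning: instead of calling the
# exact solver for every predicted cost vector, look up the cheapest
# cached solution and fall back to the solver with small probability.
import random

class SolutionCache:
    def __init__(self, solver, solve_prob=0.05):
        self.solver = solver          # exact solver: costs -> solution tuple
        self.cache = []               # previously seen solutions
        self.solve_prob = solve_prob  # assumed fallback rate

    def best(self, costs):
        if not self.cache or random.random() < self.solve_prob:
            sol = self.solver(costs)  # expensive exact call
            if sol not in self.cache:
                self.cache.append(sol)
            return sol
        # Cheap lookup: cached solution minimizing the predicted cost.
        return min(self.cache, key=lambda s: sum(c * x for c, x in zip(costs, s)))

# Toy usage with a stub solver that selects negative-cost items:
cache = SolutionCache(solver=lambda c: tuple(int(ci < 0) for ci in c))
print(cache.best([1.0, -2.0, 0.5]))
```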
Abstract:
The aim of this investigation was to compare the skeletal stability of three different rigid fixation methods after mandibular advancement. Fifty-five class II malocclusion patients treated with bilateral sagittal split ramus osteotomy and mandibular advancement were selected for this retrospective study. Group 1 (n = 17) had miniplates with monocortical screws, Group 2 (n = 16) had bicortical screws, and Group 3 (n = 22) had the osteotomy fixed by means of the hybrid technique. Cephalograms were taken preoperatively, 1 week postoperatively, and 6 months after the orthognathic surgery. Linear and angular changes of the cephalometric landmarks of the chin region were measured at each period, and the changes at each cephalometric landmark were determined for the intervening time gaps. Postoperative changes in mandibular shape were analyzed to determine the stability of the fixation methods. There was minimal difference in relapse of the mandibular advancement among the three groups, and statistical analysis showed no significant difference in postoperative stability. However, a positive correlation between the amount of advancement and the amount of postoperative relapse was demonstrated by the linear multiple regression test (p < 0.05). It can be concluded that all three techniques can be used to obtain stable postoperative results in mandibular advancement after 6 months.
Abstract:
Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques, such as patches or whole-body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles underlying sample collection. A critical review was thus performed of the main techniques for quantifying dermal exposure, calling attention to this issue and to the need to establish a single methodology for quantifying dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.
Abstract:
The Centers for High-Cost Medication (Centros de Medicação de Alto Custo, CEDMAC) of the São Paulo State Health Department were instituted through a project in partnership with the Clinical Hospital of the Faculty of Medicine, USP, sponsored by the Foundation for Research Support of the State of São Paulo (Fundação de Amparo à Pesquisa do Estado de São Paulo, FAPESP), and aimed at forming a statewide network for the comprehensive care of patients referred for the use of immunobiological agents in rheumatological diseases. The CEDMAC of the Hospital de Clínicas, Universidade Estadual de Campinas (HC-Unicamp), implemented by the Division of Rheumatology, Faculty of Medical Sciences, identified the need to standardize the conduct of the multidisciplinary team, given the specificity of its care procedures, and recognized the importance of describing its operational and technical processes in manual format. The aim of this study is to present the methodology applied to the elaboration of the CEDMAC/HC-Unicamp Manual as an institutional tool for offering the best care and administrative quality. In the methodology used for preparing manuals at HC-Unicamp since 2008, the premise has been to obtain a document that is participatory and multidisciplinary, focused on work processes integrated with institutional rules, with objective and didactic descriptions, in a standardized format, and disseminated electronically. The CEDMAC/HC-Unicamp Manual was elaborated over 10 months, with the involvement of the entire multidisciplinary team, and comprises 19 chapters on work processes and techniques, in addition to those concerning the organizational structure, plus its annexes. Published in the electronic portal of HC Manuals in July 2012 as an e-Book (ISBN 978-85-63274-17-5), the manual has been a valuable instrument for guiding professionals in healthcare, teaching and research activities.
Abstract:
The aim of this study was to evaluate three transfer techniques used to obtain working casts for implant-supported prostheses, by means of the marginal misfit and the strain induced in a metallic framework. Thirty working casts were obtained from a metallic master cast, each containing two implant analogues simulating a clinical situation of a three-unit implant-supported fixed prosthesis, according to the following transfer impression techniques: Group A, squared transfers splinted with dental floss and acrylic resin, sectioned and re-splinted; Group B, squared transfers splinted with dental floss and bis-acrylic resin; and Group N, squared transfers not splinted. A metallic framework was made from the metallic master cast for the marginal misfit and strain measurements. The misfit between the metallic framework and the working casts was evaluated with an optical microscope following the single-screw test protocol. Under the same conditions, the strain was evaluated using strain gauges placed on the metallic framework. The data were submitted to one-way ANOVA followed by Tukey's test (α = 5%). For both marginal misfit and strain, there were statistically significant differences between Groups A and N (p < 0.01) and between Groups B and N (p < 0.01), with greater values for Group N. According to Pearson's test, there was a positive correlation between misfit and strain (r = 0.5642). The results of this study showed that the impression techniques with splinted transfers promoted better accuracy than the non-splinted one, regardless of the splinting material utilized.
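A sketch of the statistical pipeline described (one-way ANOVA followed by Tukey's test at α = 5%); the misfit values below are invented placeholders, not the study's measurements:

```python
# One-way ANOVA followed by Tukey's HSD on three groups of (invented)
# marginal misfit values, mirroring the analysis described above.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(5)
misfit = {
    "A": rng.normal(30, 8, 10),   # hypothetical misfit (um), splinted/acrylic
    "B": rng.normal(32, 8, 10),   # splinted/bis-acrylic
    "N": rng.normal(55, 8, 10),   # non-splinted
}

print(stats.f_oneway(*misfit.values()))      # overall F-test
values = np.concatenate(list(misfit.values()))
groups = np.repeat(list(misfit.keys()), [len(v) for v in misfit.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05))  # pairwise comparisons
```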
Abstract:
The El Niño Southern Oscillation (ENSO) is a climatic phenomenon related to the inter-annual variability of global meteorological patterns, influencing sea surface temperature and rainfall variability. It influences human health indirectly through extreme temperature and moisture conditions that may accelerate the spread of some vector-borne viral diseases, such as dengue fever (DF). This work examines the spatial distribution of the association between ENSO and DF in the countries of the Americas during 1995-2004, a period that includes the 1997-1998 El Niño, one of the most important climatic events of the 20th century. Data on the Southern Oscillation Index (SOI), indicating El Niño-La Niña activity, were obtained from the Australian Bureau of Meteorology. The annual DF incidence (AIy) by country was computed using Pan American Health Organization data. SOI and AIy values were standardised as deviations from the mean and plotted as bar-line graphs. The regression coefficients between SOI and AIy (rSOI,AI) were calculated and spatially interpolated by an inverse distance weighted algorithm. The results indicate that among the five years registering the highest numbers of cases (1998, 2002, 2001, 2003 and 1997), four had El Niño activity. In the southern hemisphere, the annual spatially weighted mean centre of epidemics moved southward, from 6° 31' S in 1995 to 21° 12' S in 1999, and the rSOI,AI values were negative in Cuba, Belize, Guyana and Costa Rica, indicating synchrony between higher DF incidence rates and higher El Niño activity. The rSOI,AI map allows visualisation of a graded surface, with higher values of the ENSO-DF association for Mexico, Central America, the northern Caribbean islands and the extreme north-northwest of South America.
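A minimal sketch of inverse distance weighted (IDW) interpolation, the method used here to map the rSOI,AI coefficients; the coordinates and coefficient values below are invented placeholders:

```python
# IDW interpolation: estimate values at query points as distance-weighted
# averages of known values, with weights proportional to 1/d^power.
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Interpolate values at query points by inverse distance weighting."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)      # guard against division by zero
    w = 1.0 / d**power
    return (w * values).sum(axis=1) / w.sum(axis=1)

stations = np.array([[-80.0, 22.0], [-88.5, 17.2], [-58.9, 4.9]])  # lon, lat
r_values = np.array([-0.45, -0.38, -0.41])   # hypothetical rSOI,AI values
grid = np.array([[-75.0, 15.0], [-85.0, 20.0]])
print(idw(stations, r_values, grid))
```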
Abstract:
The aim of this study was to compare the performance of the following techniques for isolating volatiles of importance for the aroma/flavor of fresh cashew apple juice: dynamic headspace analysis using Porapak Q® as the trap, solvent extraction with and without further concentration of the isolate, and solid-phase microextraction (DVB/CAR/PDMS fiber). A total of 181 compounds were identified, of which 44 were esters, 20 terpenes, 19 alcohols, 17 hydrocarbons, 15 ketones and 14 aldehydes, among others. Sensory evaluation of the gas chromatography effluents revealed esters (n = 24) and terpenes (n = 10) as the most important aroma compounds. All four techniques were efficient in isolating esters, a chemical class of high impact in cashew aroma/flavor. However, the dynamic headspace methodology produced an isolate in which the analytes were in greater concentration, which facilitates their identification (by gas chromatography-mass spectrometry) and their sensory evaluation in the chromatographic effluents. Solvent extraction (dichloromethane) without further concentration of the isolate was the most efficient methodology for the isolation of terpenes. Because these two techniques also isolated, in greater concentration, the volatiles from other chemical classes important to cashew aroma, such as aldehydes and alcohols, they were considered the most advantageous for the study of cashew aroma/flavor.