978 results for sampling techniques


Relevance: 30.00%

Publisher: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

Different geoenvironmental site investigation techniques to assess contamination from a municipal solid waste disposal site in Brazil are presented here. Surface geophysical investigation (geoelectrical survey), resistivity piezocone penetration tests (RCPTU), soil samples collected with direct-push samplers and water samples collected from monitoring wells were used in this study. The application of the geoelectrical method was indispensable for identifying the presence and flow direction of contamination plumes (leachate), as well as for indicating the most suitable locations for RCPTU tests and for soil and water sampling. Chemical analyses of groundwater samples contributed to a better understanding of the flow of the contamination plume. The piezocone presented some limitations in tropical soils, since the groundwater level sometimes lies deeper than the layer that is impenetrable to the cone, and soil genesis and unsaturated conditions affect soil behavior. The combined analysis of geoelectrical measurements and soil and water samples underpinned the interpretation of the RCPTU tests. Interpretation of all the test results indicates that the contamination plume has already spread beyond the landfill's west-northwest borders. Geoenvironmental laboratory test results suggest that contamination from the solid waste disposal site has been developing gradually, indicating the need for continuous monitoring of the groundwater.

Relevance: 30.00%

Abstract:

Species richness is central to ecological theory, with practical applications in conservation, environmental management and monitoring. Several techniques are available for measuring species richness and composition of amphibians in breeding pools, but the relative efficacy of these methods for sampling the high-diversity Neotropical amphibian fauna is poorly understood. I evaluated seven studies from southern and south-eastern Brazil to compare the relative and combined effectiveness of two methods for measuring species richness at anuran breeding pools: acoustic surveys with visual encounters of adults, and dipnet surveys of larvae. I also compared the relative efficacy of each survey method in detecting species with different reproductive modes. Results showed that both survey methods underestimated the number of species when used separately; however, a close approximation of the actual number of species in each breeding pool was obtained when the methods were combined. There was no difference between survey methods in detecting species with different reproductive modes. These results indicate that researchers should employ multiple survey methods that target both adult and larval life history stages in order to accurately assess anuran species richness at breeding pools in the Neotropics.

Relevance: 30.00%

Abstract:

Although visualization in the field of dentistry shares some requirements with medicine, the difference in goals demands specific approaches. This paper reports on the implementation of two fundamentally different approaches to the reconstruction of structures from planar cross sections and their application to dentistry data. One approach implements a distance-based sampling technique; the other is a new algorithm based on the Delaunay triangulation. Both were tested using contour data of teeth, and the results are compared here in the light of the target applications, which are teaching and training in dentistry, as well as simulation of dental procedures and illnesses. Widely reported problems encountered with local reconstruction methods such as marching cubes in these cases are clearly illustrated in this paper, and a very satisfactory alternative is given. © 2000 SPIE and IS&T.
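
The abstract does not specify the paper's Delaunay-based algorithm in detail; below is a rough sketch of one classical projection-based variant (Delaunay triangulation of two adjacent cross sections in the plane, keeping the triangles that bridge the slices), assuming scipy and synthetic circular contours standing in for tooth data:

```python
import numpy as np
from scipy.spatial import Delaunay

def contour(n, radius, z):
    """n points on a circle at height z -- a stand-in for a tooth cross section."""
    a = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    return np.column_stack([radius * np.cos(a), radius * np.sin(a), np.full(n, z)])

lower, upper = contour(40, 1.0, z=0.0), contour(40, 0.8, z=0.1)
points = np.vstack([lower, upper])

# Delaunay-triangulate the (x, y) projection of both contours together and
# keep only the triangles that connect the two slices; these tile the
# surface band between adjacent cross sections.
tri = Delaunay(points[:, :2])
z_id = (tri.simplices >= len(lower)).sum(axis=1)   # vertices on the upper slice
band = tri.simplices[(z_id > 0) & (z_id < 3)]      # mixed triangles only
print(f"{len(band)} triangles connect the two cross sections")
```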

Relevance: 30.00%

Abstract:

Recent research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm through population initialization and different search strategies. In a different approach to achieving similar results, this paper presents a technique to discover promising regions in the continuous search space of an optimization problem. Using machine-learning techniques, the algorithm, named Smart Sampling (SS), finds regions with a high probability of containing a global optimum. A metaheuristic can then be initialized inside each region to find that optimum. SS and DE were combined (originating the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE and UQODE authors. Results show that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced, and that the success rate improves if SS is employed first. These results are also consistent with the literature, which stresses the importance of an adequate starting population. Moreover, SS is more effective at finding high-quality initial populations than the other three algorithms that employ oppositional learning. Finally, and most importantly, the performance of SS in finding promising regions is independent of the metaheuristic with which it is combined, making SS suitable for improving the performance of a large variety of optimization techniques. © 2012 Elsevier Inc. All rights reserved.
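
The abstract does not detail how Smart Sampling discovers promising regions; the following is a minimal sketch of the general idea only, assuming k-means clustering of the best probe points as the machine-learning step and scipy's differential evolution seeded inside the best region (the test function, region radius and cluster count are illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import differential_evolution

def sphere(x):
    return float(np.sum(x ** 2))

rng = np.random.default_rng(42)
dim, n_probe = 5, 500
bounds = np.array([[-100.0, 100.0]] * dim)

# Probe the search space uniformly and keep the best 10% of points.
probes = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_probe, dim))
scores = np.apply_along_axis(sphere, 1, probes)
elite = probes[np.argsort(scores)[: n_probe // 10]]

# Stand-in for the ML step: cluster the elite points; each cluster is a
# candidate promising region (hypothetical choice of k=3 clusters).
centers = KMeans(n_clusters=3, n_init=10, random_state=0).fit(elite).cluster_centers_

# Initialize DE inside the region around the best cluster center.
best = centers[np.argmin([sphere(c) for c in centers])]
radius = 10.0  # assumed region half-width
init_pop = rng.uniform(best - radius, best + radius, size=(15 * dim, dim))
init_pop = np.clip(init_pop, bounds[:, 0], bounds[:, 1])

result = differential_evolution(sphere, bounds, init=init_pop, seed=1)
print(result.x, result.fun)
```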

Relevance: 30.00%

Abstract:

This thesis covers sampling and analytical procedures for isocyanates (R-NCO) and amines (R-NH₂), two kinds of chemicals frequently used in association with the polymeric material polyurethane (PUR). Exposure to isocyanates may result in respiratory disorders and dermal sensitisation, and they are one of the main causes of occupational asthma. Several of the aromatic diamines associated with PUR production are classified as suspected carcinogens. Hence, the presence of these chemicals in different exposure situations must be monitored. In the context of determining isocyanates in air, the methodologies included derivatisation with the reagent di-n-butylamine (DBA) upon collection and subsequent determination using liquid chromatography (LC) and mass spectrometric detection (MS). A user-friendly solvent-free sampler for collection of airborne isocyanates was developed as an alternative to a more cumbersome impinger-filter sampling technique. The combination of the DBA reagent together with MS detection techniques revealed several new exposure situations for isocyanates, such as isocyanic acid during thermal degradation of PUR and urea-based resins. Further, a method for characterising isocyanates in technical products used in the production of PUR was developed. This enabled determination of isocyanates in air for which pure analytical standards are missing. Tandem MS (MS/MS) determination of isocyanates in air below 10⁻⁶ of the threshold limit values was achieved. As for the determination of amines, the analytical methods included derivatisation into pentafluoropropionic amide or ethyl carbamate ester derivatives and subsequent MS analysis. Several amines in biological fluids, as markers of exposure for either the amines themselves or the corresponding isocyanates, were determined by LC-MS/MS at the amol level. In aqueous extraction solutions of flexible PUR foam products, toluene diamine and related compounds were found. In conclusion, this thesis demonstrates the usefulness of well characterised analytical procedures and techniques for determination of hazardous compounds. Without reliable and robust methodologies there is a risk that exposure levels will be underestimated or, even worse, that relevant compounds will be completely missed.

Relevance: 30.00%

Abstract:

Gossip protocols have proved to be a viable solution for setting up and managing large-scale P2P services or applications in a fully decentralised scenario. The gossip, or epidemic, communication scheme is heavily based on stochastic behavior and is the fundamental idea behind many large-scale P2P protocols. It provides many remarkable features, such as scalability, robustness to failures, emergent load-balancing capabilities, fast spreading, and redundancy of information. In some sense, these services or protocols mimic the behavior of natural systems in order to achieve their goals. The key idea of this work is that the remarkable properties of gossip hold only when all participants follow the rules dictated by the protocols; if one or more malicious nodes join the network and start cheating according to some strategy, the result can be catastrophic. In order to study how serious the threat posed by malicious nodes can be, and what can be done to prevent attackers from cheating, we focused on a general attack model aimed at defeating a key service in gossip overlay networks (the Peer Sampling Service [JGKvS04]). We also focused on the problem of protecting against forged information exchanged in gossip services. We propose a solution technique for each problem; both techniques are general enough to be applied to distinct service implementations. Like gossip protocols, our solutions are based on stochastic behavior and are fully decentralized. In addition, each technique's behavior is abstracted by a general primitive function extending the basic gossip scheme; this approach allows the adoption of our solutions with minimal changes in different scenarios. We provide an extensive experimental evaluation to support the effectiveness of our techniques. Essentially, these techniques aim to serve as building blocks, or P2P architecture guidelines, for building more resilient and more secure P2P services.
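
As a rough illustration of the peer sampling service that the attack model targets, here is a minimal single-process sketch of a push-pull gossip view shuffle; the view size and merge policy are illustrative assumptions, not the [JGKvS04] protocol verbatim:

```python
import random

VIEW_SIZE = 4  # assumed partial-view size

class Node:
    def __init__(self, node_id, seed_peers):
        self.id = node_id
        self.view = list(seed_peers)[:VIEW_SIZE]  # partial view of peer ids

    def gossip_round(self, nodes):
        """One push-pull shuffle with a random peer from the local view."""
        if not self.view:
            return
        peer = nodes[random.choice(self.view)]
        # Exchange half of each view, then merge and truncate.
        sent = random.sample(self.view, k=max(1, len(self.view) // 2))
        received = random.sample(peer.view, k=max(1, len(peer.view) // 2))
        for other, incoming in ((self, received), (peer, sent)):
            merged = [p for p in other.view + incoming if p != other.id]
            random.shuffle(merged)
            # Deduplicate while keeping the randomized order, then truncate.
            other.view = list(dict.fromkeys(merged))[:VIEW_SIZE]

# Tiny simulation: a ring bootstrap drifting toward a random overlay.
n = 20
nodes = {i: Node(i, [(i + 1) % n, (i + 2) % n]) for i in range(n)}
for _ in range(50):
    for node in nodes.values():
        node.gossip_round(nodes)
print(nodes[0].view)
```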

Relevance: 30.00%

Abstract:

In the present thesis, a new diagnosis methodology based on advanced time-frequency analysis is presented. Specifically, a new fault index is defined that allows tracking individual fault components within a single frequency band. A frequency sliding is applied to the signals being analyzed (currents, voltages, vibration signals), so that each fault frequency component is shifted into a prefixed single frequency band. The discrete wavelet transform is then applied to the resulting signal to extract the fault signature in the chosen frequency band. Once the state of the machine has been qualitatively diagnosed, a quantitative evaluation of the fault degree is necessary. For this purpose, a fault index based on the energy of the approximation and/or detail signals resulting from the wavelet decomposition is introduced to quantify the fault extent. The main advantages of the new method over existing diagnosis techniques are the following (a minimal sketch of the frequency-sliding and wavelet steps appears after the list):

- Capability of monitoring the fault evolution continuously over time under any transient operating condition;
- No requirement for speed/slip measurement or estimation;
- Higher accuracy in filtering frequency components around the fundamental in the case of rotor faults;
- Reduced likelihood of false indications by avoiding confusion with other fault harmonics (the contributions of the most relevant fault frequency components under speed-varying conditions are confined to a single frequency band);
- Low memory requirement due to the low sampling frequency;
- Reduced latency of time processing (no repeated sampling operations required).
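
The sketch below illustrates the two steps named above (frequency sliding followed by discrete wavelet decomposition, with an energy-based index), assuming PyWavelets and a synthetic current signal; the frequencies, wavelet and decomposition level are illustrative, not the thesis's parameters:

```python
import numpy as np
import pywt  # PyWavelets

fs = 2000.0                      # sampling frequency [Hz], assumed
t = np.arange(0, 2.0, 1.0 / fs)
f_fund, f_fault = 50.0, 44.0     # fundamental and one fault sideband, illustrative
current = np.sin(2 * np.pi * f_fund * t) + 0.05 * np.sin(2 * np.pi * f_fault * t)

# Frequency sliding: a complex mixer shifts the fault component to DC,
# so it falls inside the lowest (approximation) wavelet band.
shifted = current * np.exp(-2j * np.pi * f_fault * t)

def band_energies(x, wavelet="db8", level=8):
    """Energy per DWT band; real and imaginary parts decomposed separately."""
    cr = pywt.wavedec(x.real, wavelet, level=level)
    ci = pywt.wavedec(x.imag, wavelet, level=level)
    return [np.sum(a ** 2) + np.sum(b ** 2) for a, b in zip(cr, ci)]

energies = band_energies(shifted)
# Fault index: energy of the approximation band (where the slid fault
# frequency now lives) relative to total energy. Wavelet bands are not
# brick-wall filters, so some leakage from nearby components remains.
print("fault index:", energies[0] / sum(energies))
```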

Relevance: 30.00%

Abstract:

Aerosol particles are strongly related to climate, air quality, visibility and human health issues. They contribute the largest uncertainty in the assessment of the Earth's radiative budget, directly by scattering or absorbing solar radiation or indirectly by nucleating cloud droplets. The influence of aerosol particles on cloud-related climatic effects essentially depends upon their number concentration, size and chemical composition. A major part of submicron aerosol consists of secondary organic aerosol (SOA) that is formed in the atmosphere by the oxidation of volatile organic compounds. SOA can comprise a highly diverse spectrum of compounds that undergo continuous chemical transformations in the atmosphere.

The aim of this work was to obtain insights into the complexity of ambient SOA by the application of advanced mass spectrometric techniques. Therefore, an atmospheric pressure chemical ionization ion trap mass spectrometer (APCI-IT-MS) was applied in the field, facilitating the measurement of ions of the intact molecular organic species. Furthermore, the high measurement frequency provided insights into SOA composition and chemical transformation processes at high temporal resolution. Within different comprehensive field campaigns, online measurements of particular biogenic organic acids were achieved by combining an online aerosol concentrator with the APCI-IT-MS. A holistic picture of the ambient organic aerosol was obtained through the co-located application of other complementary MS techniques, such as aerosol mass spectrometry (AMS) or filter sampling for analysis by liquid chromatography / ultrahigh resolution mass spectrometry (LC/UHRMS).

In particular, during a summertime field study at the pristine boreal forest station in Hyytiälä, Finland, the partitioning of organic acids between gas and particle phase was quantified, based on the online APCI-IT-MS and AMS measurements. It was found that low-volatility compounds reside to a large extent in the gas phase. This observation can be interpreted as a consequence of large aerosol equilibration timescales, which build up due to the continuous production of low-volatility compounds in the gas phase and/or a semi-solid phase state of the ambient aerosol. Furthermore, in-situ structural information on particular compounds was obtained by using the MS/MS mode of the ion trap. The comparison to MS/MS spectra from laboratory-generated SOA of specific monoterpene precursors indicated that laboratory SOA barely depicts the complexity of ambient SOA. Moreover, it was shown that the mass spectra of the laboratory SOA more closely resemble the ambient gas phase composition, indicating that the oxidation state of the ambient organic compounds in the particle phase is underestimated by the comparison to laboratory ozonolysis. These observations suggest that micro-scale processes, such as the chemistry of aerosol aging or gas-to-particle partitioning, need to be better understood in order to predict SOA concentrations more reliably.

During a field study at Mt. Kleiner Feldberg, Germany, a slightly different aerosol concentrator / APCI-IT-MS setup made the online analysis of new particle formation possible. During a particular nucleation event, the online mass spectra indicated that organic compounds of approximately 300 Da are main constituents of the bulk aerosol during ambient new particle formation. Co-located filter analysis by LC/UHRMS supported these findings and furthermore allowed the determination of the molecular formulas of the involved organic compounds. The unambiguous identification of several oxidized C15 compounds indicated that oxidation products of sesquiterpenes can be important compounds for the initial formation and subsequent growth of atmospheric nanoparticles.

The LC/UHRMS analysis furthermore revealed that considerable amounts of organosulfates and nitrooxy organosulfates were detected on the filter samples. Indeed, it was found that several nitrooxy organosulfate related APCI-IT-MS mass traces were simultaneously enhanced. Concurrent particle-phase ion chromatography and AMS measurements indicated a strong bias between inorganic sulfate and total sulfate concentrations, supporting the assumption that substantial amounts of sulfate were bonded to organic molecules.

Finally, the comprehensive chemical analysis of the aerosol composition was compared to the hygroscopicity parameter κ, which was derived from cloud condensation nuclei (CCN) measurements. Simultaneously, organic aerosol aging was observed via the evolution of a ratio between a second- and a first-generation biogenic oxidation product. It was found that this aging proxy positively correlates with increasing hygroscopicity. Moreover, it was observed that the bonding of sulfate to organic molecules leads to a significant reduction of κ, compared to an internal mixture of the same mass fractions of purely inorganic sulfate and organic molecules. In conclusion, this thesis has shown that the application of modern mass spectrometric techniques allows for detailed insights into chemical and physico-chemical processes of atmospheric aerosols.

Relevance: 30.00%

Abstract:

Image-based Relighting (IBRL) has recently attracted a lot of research interest for its ability to relight real objects or scenes with novel illuminations captured in natural or synthetic environments. Complex lighting effects such as subsurface scattering, interreflection, shadowing, mesostructural self-occlusion, refraction and other relevant phenomena can be generated using IBRL. The main advantage of image-based graphics is that the rendering time is independent of scene complexity, as rendering is actually a process of manipulating image pixels instead of simulating light transport. The goal of this paper is to provide a complete and systematic overview of the research in Image-based Relighting. We observe that essentially all IBRL techniques can be broadly classified into three categories (Fig. 9), based on how the scene/illumination information is captured: reflectance function-based, basis function-based and plenoptic function-based. We discuss the characteristics of each of these categories and their representative methods. We also discuss the sampling density and the types of light source(s), and other issues relevant to IBRL.
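
At the pixel level the abstract alludes to, relighting from a reflectance-function representation reduces to a per-pixel weighted combination of basis images captured under known lights; here is a minimal numpy sketch with synthetic stand-ins for the basis images and lighting weights:

```python
import numpy as np

rng = np.random.default_rng(0)
h, w, n_lights = 64, 64, 8

# Basis images: the scene photographed under n_lights known directional
# lights (synthetic stand-ins here).
basis = rng.uniform(0.0, 1.0, size=(n_lights, h, w, 3))

# A novel illumination expressed as non-negative weights on the basis
# lights (e.g., sampled from an environment map).
weights = rng.uniform(0.0, 1.0, size=n_lights)

# Relighting is a per-pixel linear combination: the cost depends on image
# size and the number of lights, not on scene geometry.
relit = np.einsum("l,lhwc->hwc", weights, basis)
relit = np.clip(relit, 0.0, 1.0)
print(relit.shape)  # (64, 64, 3)
```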

Relevance: 30.00%

Abstract:

We present results from an intercomparison program of CO₂, δ(O₂/N₂) and δ¹³CO₂ measurements from atmospheric flask samples. Flask samples are collected on a bi-weekly basis at the High Altitude Research Station Jungfraujoch in Switzerland for three European laboratories: the University of Bern, Switzerland, the University of Groningen, the Netherlands, and the Max Planck Institute for Biogeochemistry in Jena, Germany. Almost 4 years of measurements of CO₂, δ(O₂/N₂) and δ¹³CO₂ are compared in this paper to assess the measurement compatibility of the three laboratories. While the average difference for the CO₂ measurements between the laboratories in Bern and Jena meets the required compatibility goal as defined by the World Meteorological Organization, the standard deviation of the average differences between all laboratories is not within the required goal. However, the obtained annual trend and seasonalities are the same within their estimated uncertainties. For δ(O₂/N₂), significant differences are observed between the three laboratories. The comparison for δ¹³CO₂ yields the least compatible results, and the required goals are not met between the three laboratories. Our study shows the importance of regular intercomparison exercises to identify potential biases between laboratories and the need to improve the quality of atmospheric measurements.
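
A sketch of the basic compatibility check described above (mean pairwise difference and its scatter against a target), with made-up flask values; the ±0.1 ppm figure is quoted here as an assumed WMO compatibility goal for CO₂, not a number taken from this paper:

```python
import numpy as np

# Hypothetical co-located flask CO2 values [ppm] from two laboratories.
lab_a = np.array([398.12, 399.05, 401.33, 397.80, 400.21])
lab_b = np.array([398.01, 399.18, 401.40, 397.95, 400.10])

diff = lab_a - lab_b
mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)

GOAL = 0.1  # assumed WMO compatibility goal for CO2 [ppm]
print(f"mean difference: {mean_diff:+.3f} ppm (goal |bias| <= {GOAL} ppm)")
print(f"std of differences: {sd_diff:.3f} ppm")
print("bias within goal:", abs(mean_diff) <= GOAL)
```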

Relevance: 30.00%

Abstract:

Time-based localization techniques such as multilateration are favoured for positioning with wide-band signals. Applying the same techniques to narrow-band signals such as GSM is not so trivial. The process is challenged by the need for both synchronization accuracy and timestamp resolution in the nanosecond range. We propose approaches to deal with both challenges. On the one hand, we introduce a method to eliminate the negative effect of synchronization offset on time measurements. On the other hand, we achieve timestamps with nanosecond accuracy by using timing information from the signal processing chain. For a set of experiments ranging from suburban to indoor environments, we show that our proposed approaches improve the localization accuracy of TDOA approaches by several factors. We even demonstrate errors as small as 10 meters in outdoor settings with narrow-band signals.
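
The multilateration step can be posed as a nonlinear least-squares problem on the measured time differences of arrival; here is a minimal 2-D sketch with synthetic receivers and a noiseless emitter (geometry and starting guess are illustrative, and this is the generic TDOA formulation, not the paper's specific pipeline):

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light [m/s]

# Known receiver positions [m] and an emitter to recover (synthetic).
anchors = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 900.0], [700.0, 850.0]])
emitter = np.array([310.0, 420.0])

# TDOAs relative to anchor 0, as they would be measured.
dists = np.linalg.norm(anchors - emitter, axis=1)
tdoa = (dists[1:] - dists[0]) / C

def residuals(x):
    """Mismatch between predicted and measured range differences."""
    d = np.linalg.norm(anchors - x, axis=1)
    return (d[1:] - d[0]) - C * tdoa

sol = least_squares(residuals, x0=np.array([100.0, 100.0]))
print("estimated position:", sol.x)  # ~[310, 420]
```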

Relevance: 30.00%

Abstract:

Many techniques based on data drawn by the Ranked Set Sampling (RSS) scheme assume that the ranking of observations is perfect. It is therefore essential to develop methods for testing this assumption. In this article, we propose a parametric location-scale-free test for assessing the assumption of perfect ranking. The results of a simulation study in the two special cases of the normal and exponential distributions indicate that the proposed test performs well in comparison with its leading competitors.
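
For readers unfamiliar with the scheme: ranked set sampling with set size k draws k sets of k units per cycle, ranks each set, and keeps the i-th order statistic of the i-th set. A minimal sketch under the perfect-ranking assumption that the paper's test targets (distribution and sizes are illustrative):

```python
import numpy as np

def ranked_set_sample(population_draw, k, cycles, rng):
    """One RSS sample: per cycle, draw k sets of k, keep i-th ranked of set i.

    Ranking here uses the measured values themselves, i.e., perfect ranking;
    the test in the paper targets departures from that assumption.
    """
    sample = []
    for _ in range(cycles):
        for i in range(k):
            batch = population_draw(k, rng)
            sample.append(np.sort(batch)[i])
    return np.array(sample)

rng = np.random.default_rng(7)
draw_normal = lambda n, r: r.normal(loc=10.0, scale=2.0, size=n)

rss = ranked_set_sample(draw_normal, k=4, cycles=25, rng=rng)
srs = draw_normal(100, rng)  # simple random sample of equal size

# RSS typically estimates the mean with lower variance than SRS.
print("RSS mean:", rss.mean(), " SRS mean:", srs.mean())
```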

Relevance: 30.00%

Abstract:

We present three methods for the distortion-free enhancement of THz signals measured by electro-optic sampling in zinc blende-type detector crystals, e.g., ZnTe or GaP. A technique commonly used in optically heterodyne-detected optical Kerr effect spectroscopy is introduced, which is based on two measurements at opposite optical biases near the zero transmission point in a crossed-polarizer detection geometry. In contrast to other techniques for undistorted THz signal enhancement, it also works in a balanced detection scheme and does not require an elaborate procedure for the reconstruction of the true signal, as the two measured waveforms are simply subtracted to remove distortions. We study three different approaches for setting an optical bias using the Jones matrix formalism and discuss them also in the framework of optical heterodyne detection. We show that there is an optimal bias point in realistic situations where a small fraction of the probe light is scattered by optical components. The experimental demonstration will be given in the second part of this two-paper series [J. Opt. Soc. Am. B, doc. ID 204877 (2014, posted online)].
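
The subtraction step can be illustrated with a toy detector model in which the measured waveform contains a signal term that flips sign with the optical bias and a distortion term that does not; the waveform and the quadratic distortion below are assumptions for illustration only, not the paper's Jones-matrix treatment:

```python
import numpy as np

t = np.linspace(-2.0, 2.0, 1000)          # time axis [ps], synthetic
thz = np.exp(-t ** 2) * np.sin(6.0 * t)   # toy THz waveform

def measure(bias, signal):
    """Toy detector: a term linear in (bias * signal) plus a bias-even
    quadratic distortion that does not flip sign with the bias."""
    return bias * signal + 0.2 * signal ** 2

m_plus = measure(+1.0, thz)    # measurement at positive optical bias
m_minus = measure(-1.0, thz)   # measurement at the opposite bias

# Subtracting the two waveforms cancels the bias-even distortion and
# doubles the linear signal -- no reconstruction procedure needed.
recovered = 0.5 * (m_plus - m_minus)
print("max residual vs. true signal:", np.max(np.abs(recovered - thz)))
```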

Relevance: 30.00%

Abstract:

Monte Carlo integration is firmly established as the basis for most practical realistic image synthesis algorithms because of its flexibility and generality. However, the visual quality of rendered images often suffers from estimator variance, which appears as visually distracting noise. Adaptive sampling and reconstruction algorithms reduce variance by controlling the sampling density and aggregating samples in a reconstruction step, possibly over large image regions. In this paper we survey recent advances in this area. We distinguish between “a priori” methods that analyze the light transport equations and derive sampling rates and reconstruction filters from this analysis, and “a posteriori” methods that apply statistical techniques to sets of samples to drive the adaptive sampling and reconstruction process. They typically estimate the errors of several reconstruction filters, and select the best filter locally to minimize error. We discuss advantages and disadvantages of recent state-of-the-art techniques, and provide visual and quantitative comparisons. Some of these techniques are proving useful in real-world applications, and we aim to provide an overview for practitioners and researchers to assess these approaches. In addition, we discuss directions for potential further improvements.
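
A toy "a posteriori" example in the spirit of the surveyed methods: estimate a 1-D integral by Monte Carlo, then spend a second sample budget where the empirical per-cell variance is highest (the integrand, grid and budgets are illustrative, and real renderers apply this per pixel with reconstruction filters):

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: np.where(x < 0.5, 0.1, np.sin(20 * x) ** 2)  # flat half, oscillatory half

cells = 16
edges = np.linspace(0.0, 1.0, cells + 1)

# Pilot pass: equal budget per cell to estimate the local variance.
pilot = 32
samples = [f(rng.uniform(edges[i], edges[i + 1], pilot)) for i in range(cells)]
var = np.array([s.var(ddof=1) for s in samples])

# Adaptive pass: extra budget allocated proportionally to cell variance,
# i.e., sampling density driven by where the estimator is noisiest.
extra = 2048
alloc = np.maximum(1, (extra * var / var.sum()).astype(int))
for i in range(cells):
    more = f(rng.uniform(edges[i], edges[i + 1], alloc[i]))
    samples[i] = np.concatenate([samples[i], more])

cell_means = np.array([s.mean() for s in samples])
print("adaptive estimate:", np.sum(cell_means * np.diff(edges)))
```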