937 results for SAMPLING SUFFICIENCY


Relevance:

20.00%

Abstract:

The recently reported Monte Carlo Random Path Sampling method (RPS) is here improved and its application is expanded to the study of the 2D and 3D Ising and discrete Heisenberg models. The methodology was implemented to allow use in both CPU-based high-performance computing infrastructures (C/MPI) and GPU-based (CUDA) parallel computation, with significant computational performance gains. Convergence is discussed, both in terms of the free energy and of the field/temperature dependence of the magnetization. From the calculated magnetization-energy joint density of states, fast calculations of field- and temperature-dependent thermodynamic properties are performed, including the effects of anisotropy on coercivity and the magnetocaloric effect. The emergence of first-order magneto-volume transitions in the compressible Ising model is interpreted using the Landau theory of phase transitions. Using metallic gadolinium as a real-world example, the possibility of using RPS as a tool for computational magnetic materials design is discussed. Experimental magnetic and structural properties of a gadolinium single crystal are compared to RPS-based calculations using microscopic parameters obtained from Density Functional Theory.
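The central step described above — turning a magnetization-energy joint density of states into field- and temperature-dependent thermodynamic quantities — amounts to reweighting g(E, M) with Boltzmann factors at each (T, H). A minimal sketch of that reweighting (not the authors' RPS implementation; the toy g(E, M), units and field/temperature values are illustrative assumptions):

```python
import numpy as np

def thermo_from_joint_dos(E, M, gEM, T, H, kB=1.0):
    """Field/temperature-dependent averages from a joint density of states g(E, M).

    E, M : 1D arrays of energy and magnetization bin centres
    gEM  : 2D array, gEM[i, j] = density of states at (E[i], M[j])
    T, H : temperature and applied magnetic field (kB = 1 units assumed here)
    """
    # Effective energy of each (E, M) bin in an applied field: E - H*M (Zeeman term)
    Eeff = E[:, None] - H * M[None, :]
    # Boltzmann-weighted log-terms, with log-sum-exp for numerical stability at low T
    logw = np.log(np.where(gEM > 0, gEM, 1e-300)) - Eeff / (kB * T)
    logZ = logw.max() + np.log(np.exp(logw - logw.max()).sum())
    w = np.exp(logw - logZ)                       # normalised weights, sum to 1
    mag = (w * M[None, :]).sum()                  # <M>(T, H)
    energy = (w * E[:, None]).sum()               # <E>(T, H), excluding the Zeeman term
    free_energy = -kB * T * logZ                  # F(T, H), up to an additive constant
    return mag, energy, free_energy

# Purely illustrative stand-in for an RPS-estimated joint density of states
E = np.linspace(-1.0, 1.0, 51)
M = np.linspace(-1.0, 1.0, 51)
gEM = np.exp(-(E[:, None]**2 + M[None, :]**2) * 10)
print(thermo_from_joint_dos(E, M, gEM, T=0.5, H=0.1))
```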

Relevance:

20.00%

Abstract:

Nitrous oxide (N2O) emissions from soil are often measured using the manual static chamber method. Manual gas sampling is labour intensive, so a minimal sampling frequency that maintains the accuracy of measurements would be desirable. However, the high temporal (diurnal, daily and seasonal) variability of N2O emissions can compromise the accuracy of measurements if not addressed adequately when formulating a sampling schedule. Assessments of sampling strategies to date have focussed on relatively low-emission systems with high episodicity, where a small number of the highest emission peaks can be critically important in the measurement of whole-season cumulative emissions. Using year-long, automated sub-daily N2O measurements from three fertilised sugarcane fields, we evaluated optimum gas sampling strategies in high-emission systems with relatively long emission episodes. The results indicated that sampling in the morning between 09:00 and 12:00, when soil temperature was generally close to the daily average, best approximated the daily mean N2O emission, falling within 4–7% of the 'actual' daily emissions measured by automated sampling. The recommended regime was weekly sampling, intensified to twice-weekly for one week after >20 mm of rainfall. It resulted in no extreme (>20%) deviations from the 'actuals', had a high probability of estimating the annual cumulative emissions to within 10%, and required a practicable number of sampling events compared to other regimes. This provides robust and useful guidance for manual gas sampling in sugarcane cropping systems, although operators are encouraged to make further adjustments according to the expected measurement accuracy and available resources. Implementing these sampling strategies together can minimise labour inputs and errors in measured cumulative N2O emissions. Further research is needed to quantify the spatial variability of N2O emissions within sugarcane cropping and to develop techniques for effectively addressing both spatial and temporal variability simultaneously.
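The kind of evaluation described above — checking how well a sparse manual schedule reproduces the cumulative emissions obtained from automated sub-daily measurements — can be sketched as a subsample-and-integrate comparison. A minimal illustration (the synthetic flux and rainfall series, the rain-triggered intensification and all numeric values are assumptions for demonstration only, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for one year of daily N2O flux and rainfall (illustrative only)
days = 365
rain = rng.gamma(0.3, 20, size=days)                        # mm/day
flux = 50 + 400 * np.convolve(rain > 20, np.ones(7), "same") + rng.normal(0, 20, days)
flux = np.clip(flux, 0, None)                               # g N2O-N / ha / day

def cumulative_from_schedule(flux, sample_days):
    """Interpolate daily flux between sampled days and sum to a cumulative total."""
    sample_days = np.unique(sample_days)
    daily_estimate = np.interp(np.arange(len(flux)), sample_days, flux[sample_days])
    return daily_estimate.sum()

# Weekly schedule, intensified for one week after each >20 mm rainfall event
schedule = set(range(0, days, 7))
for d in np.flatnonzero(rain > 20):
    schedule.update(range(int(d), min(int(d) + 8, days), 3))

actual = flux.sum()                                         # 'actual' from daily data
estimate = cumulative_from_schedule(flux, np.fromiter(schedule, dtype=int))
print(f"deviation from 'actual': {100 * (estimate - actual) / actual:+.1f} %")
```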

Relevance:

20.00%

Abstract:

This paper, based on the outcome of discussions at a NORMAN Network-supported workshop in Lyon (France) in November 2014, aims to provide a common position of passive sampling community experts regarding concrete actions required to foster the use of passive sampling techniques in support of contaminant risk assessment and management and for routine monitoring of contaminants in aquatic systems. The brief roadmap presented here focusses on the identification of robust passive sampling methodology, technology that requires further development or has yet to be developed, our current knowledge of the evaluation of uncertainties when calculating a freely dissolved concentration, and the relationship between passive sampling (PS) data and data obtained through biomonitoring. A tiered approach to identifying areas of potential environmental quality standard (EQS) exceedances is also shown. Finally, we propose a list of recommended actions to improve the acceptance of passive sampling by policy-makers. These include drafting guidelines and quality assurance and control procedures, developing demonstration projects in which biomonitoring and passive sampling are undertaken alongside each other, organising proficiency testing schemes and interlaboratory comparisons and, finally, establishing passive sampler-based assessment criteria in relation to existing EQS.

Relevance:

20.00%

Abstract:

The task-based approach involves identifying all the tasks performed in each workplace in order to refine the exposure characterization. Its starting point is the recognition that only through a more detailed and comprehensive understanding of tasks is it possible to understand the exposure scenario in detail. It also allows identification of the most suitable risk management measures. This approach can likewise be used when workplace surfaces need to be identified for sampling chemicals for which the dermal route is the most important exposure route. In this case, detailed observation of task performance makes it possible to identify the surfaces that workers contact most frequently and that may be contaminated. The aim of this work was to identify the surfaces to sample when performing occupational exposure assessment of antineoplastic agents, with surface selection based on the task-based approach.

Relevance:

20.00%

Abstract:

Sampling the total airborne concentration of particulate matter (PM) provides only a basic estimate of exposure that normally cannot be correlated with observed health effects. It is therefore extremely important to know the particle size distribution and, in particular, the exposure to fine particles (≤ 2.5 µm), which correspond to the respirable fraction. Besides local effects, this particle fraction can cause systemic effects due to particle deposition in and clearance from the lungs and transport within the organism. This study aimed to describe occupational exposure to PM2.5 in three different units located near Lisbon and associated with occupational exposure to organic dust, namely swine and poultry feed production and waste management.

Relevance:

20.00%

Abstract:

Division of Fisheries, Illinois Department of Natural Resources Grant/Contract No: Federal Aid Project F-123 R-15

Relevance:

20.00%

Abstract:

The occurrence frequency of failure events serves as a critical index of the safety status of dam-reservoir systems. Although overtopping is the most common failure mode with significant consequences, this type of event in most cases has a small probability. Estimating such rare-event risks for dam-reservoir systems with crude Monte Carlo (CMC) simulation requires a prohibitively large number of trials, and hence significant computational resources, to reach satisfactory results; otherwise, the estimates would not be accurate enough. To reduce the computational expense and improve the efficiency of risk estimation, an importance sampling (IS) based simulation approach is proposed in this dissertation to address the overtopping risks of dam-reservoir systems. Deliverables of this study mainly include the following five aspects: 1) the reservoir inflow hydrograph model; 2) the dam-reservoir system operation model; 3) the CMC simulation framework; 4) the IS-based Monte Carlo (ISMC) simulation framework; and 5) a comparison of the overtopping risk estimates from the CMC and ISMC simulations. In a broader sense, this study meets the following three expectations: 1) to address the natural stochastic characteristics of the dam-reservoir system, such as the reservoir inflow rate; 2) to build the fundamental CMC and ISMC simulation frameworks of the dam-reservoir system in order to estimate the overtopping risks; and 3) to compare the simulation results and the computational performance in order to demonstrate the advantages of ISMC simulation. The estimated overtopping probabilities could be used to guide future dam safety investigations and studies, and to supplement conventional analyses in decision-making on dam-reservoir system improvements. At the same time, the proposed ISMC simulation methodology is reasonably robust and was shown to improve overtopping risk estimation. The more accurate estimates, smaller variance and reduced CPU time expand the application of the Monte Carlo (MC) technique to evaluating rare-event risks for infrastructure.
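The contrast drawn above between CMC and IS for rare events can be illustrated on a toy exceedance problem: shift the sampling distribution toward the failure region and reweight each sample by the likelihood ratio of the nominal to the proposal density. A minimal sketch (the lognormal peak-inflow model, threshold and proposal shift are illustrative assumptions, not the dissertation's dam-reservoir model):

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)

# Toy model: peak reservoir inflow is lognormal; "overtopping" when it exceeds a threshold
mu, sigma = 5.0, 0.6                          # log-mean and log-std of peak inflow
threshold = np.exp(mu + 4.0 * sigma)          # roughly a 4-sigma event in log space
n = 200_000

# Crude Monte Carlo: sample from the nominal distribution and count exceedances
x_cmc = rng.lognormal(mu, sigma, n)
p_cmc = np.mean(x_cmc > threshold)

# Importance sampling: shift the log-mean toward the failure region and reweight
shift = 4.0 * sigma
x_is = rng.lognormal(mu + shift, sigma, n)
z = np.log(x_is)
# Likelihood ratio nominal/proposal reduces to a ratio of normal densities in log space
log_w = (-(z - mu) ** 2 + (z - mu - shift) ** 2) / (2 * sigma ** 2)
w = np.exp(log_w)
p_is = np.mean(w * (x_is > threshold))
se_is = np.std(w * (x_is > threshold), ddof=1) / np.sqrt(n)

print(f"CMC estimate: {p_cmc:.2e}")
print(f"IS  estimate: {p_is:.2e} +/- {se_is:.1e}")
print(f"exact value:  {0.5 * erfc(4.0 / sqrt(2)):.2e}")   # P(Z > 4) for standard normal Z
```

With the same number of trials, the CMC count of exceedances is tiny and its estimate is erratic, while the reweighted IS estimate concentrates tightly around the exact value, which is the variance-reduction argument the abstract makes.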

Relevance:

20.00%

Abstract:

Passive sampling devices (PS) are widely used for pollutant monitoring in water, but estimation of measurement uncertainties by PS has seldom been undertaken. The aim of this work was to identify key parameters governing PS measurements of metals and their dispersion. We report the results of an in situ intercomparison exercise on diffusive gradients in thin films (DGT) in surface waters. Interlaboratory uncertainties of time-weighted average (TWA) concentrations were satisfactory (from 28% to 112%) given the number of participating laboratories (10) and the ultra-trace metal concentrations involved. Data dispersion of TWA concentrations was mainly explained by uncertainties generated during the DGT handling and analytical procedure steps. We highlight that DGT handling is critical for metals such as Cd, Cr and Zn, implying that DGT assembly/dismantling should be performed under very clean conditions. Using a unique dataset, we demonstrated that DGT markedly lowered the limit of quantification (LOQ) in comparison to spot sampling and stressed the need for accurate data calculation.
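For context, the time-weighted average concentration that DGT yields is obtained from the standard DGT equation C = M·Δg / (D·A·t), where M is the mass accumulated on the binding gel, Δg the diffusive layer thickness, D the diffusion coefficient of the metal in the gel, A the exposure area and t the deployment time. A minimal sketch of that calculation (the parameter values below are illustrative assumptions, not data from the intercomparison exercise):

```python
def dgt_twa_concentration(mass_ng, delta_g_cm, D_cm2_s, area_cm2, time_s):
    """Standard DGT equation: C = M * dg / (D * A * t), returned in ng/mL (= ug/L)."""
    return mass_ng * delta_g_cm / (D_cm2_s * area_cm2 * time_s)

# Illustrative deployment (values are assumptions, not from the exercise)
mass_ng  = 25.0            # Cd accumulated on the resin gel, ng
delta_g  = 0.094           # diffusive gel + filter thickness, cm
D_cd     = 5.0e-6          # diffusion coefficient of Cd in the gel, cm^2/s (order of magnitude)
area     = 3.14            # exposure window area, cm^2
t_deploy = 14 * 24 * 3600  # 14-day deployment, s

c_ng_per_mL = dgt_twa_concentration(mass_ng, delta_g, D_cd, area, t_deploy)
print(f"TWA concentration ~ {1000 * c_ng_per_mL:.1f} ng/L")
```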

Relevance:

20.00%

Abstract:

This document is the Online Supplement to ‘Myopic Allocation Policy with Asymptotically Optimal Sampling Rate,’ to be published in the IEEE Transactions on Automatic Control in 2017.

Relevance:

20.00%

Abstract:

Activity of 7-ethoxyresorufin-O-deethylase (EROD) in fish is certainly the best-studied biomarker of exposure applied in the field to evaluate biological effects of contamination in the marine environment. Since 1991, a feasibility study for a monitoring network using this biomarker of exposure has been conducted along French coasts. Using data obtained during several cruises, this study aims to determine the number of fish required to detect a given difference between two mean EROD activities, i.e. to achieve an a priori fixed statistical power (1 - beta) at a given significance level (alpha), with variance estimates and a projected ratio of unequal sample sizes (k). Mean EROD activity and standard error were estimated at each of 82 sampling stations. The inter-individual variance component was dominant in estimating the variance of mean EROD activity. The influences of alpha, beta, k and variability on sample sizes are illustrated and discussed in terms of costs. In particular, sample sizes do not have to be equal, especially if such a requirement would lead to a significant cost in sampling extra material. Finally, the feasibility of long-term monitoring is discussed.
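The sample-size question posed above — how many fish are needed to detect a given difference between two mean EROD activities at significance level alpha, power 1 - beta and allocation ratio k = n2/n1 — follows the standard normal-approximation formula n1 = (s1^2 + s2^2/k)(z_{1-alpha/2} + z_{1-beta})^2 / d^2, with n2 = k·n1. A minimal sketch (the difference d, standard deviations and ratio k below are illustrative assumptions, not values from the cruises):

```python
import math
from scipy.stats import norm

def two_sample_sizes(delta, sd1, sd2, alpha=0.05, power=0.80, k=1.0):
    """Sample sizes needed to detect a difference `delta` between two means.

    Normal approximation for a two-sided test; k = n2 / n1 is the allocation ratio.
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n1 = (sd1 ** 2 + sd2 ** 2 / k) * z ** 2 / delta ** 2
    return math.ceil(n1), math.ceil(k * n1)

# Illustrative numbers only: detect a difference of 0.3 (in EROD activity units)
# between two stations whose inter-individual standard deviations are both 0.5
print(two_sample_sizes(delta=0.3, sd1=0.5, sd2=0.5))          # balanced design
print(two_sample_sizes(delta=0.3, sd1=0.5, sd2=0.5, k=2.0))   # twice as many fish at station 2
```

Comparing the balanced and k = 2 designs shows the point made in the abstract: unequal allocation changes the per-station numbers (and total cost) without changing the detectable difference or power.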

Relevance:

20.00%

Abstract:

Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, and yet allow perfect reconstruction of the spectra of wide sense stationary (WSS) signals. However, theoretical guarantees for these samplers assume ideal conditions such as synchronous sampling and the ability to compute statistical expectations perfectly. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher Information matrix for perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variable and reduce the bi-affine problem to a linear underdetermined (sparse) problem in source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and sampling jitter by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of WSS signals, sharp bounds on the estimation error are established, which indicate that the error decays exponentially with the number of samples. The theoretical claims are supported by extensive numerical experiments.
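The property underlying both samplers is that two interleaved uniform grids with coprime spacings generate, through their difference set (the co-array), many more distinct lags than there are physical sensors or samples. A minimal sketch that enumerates the difference co-array of a coprime pair (the choice M = 4, N = 5 and the subarray sizes follow the common extended-coprime construction and are illustrative assumptions):

```python
import numpy as np

def coprime_positions(M, N):
    """Element positions of an extended coprime pair: one subarray at multiples of M
    (N elements), the other at multiples of N (2M elements), with M and N coprime."""
    sub1 = M * np.arange(N)
    sub2 = N * np.arange(2 * M)
    return np.union1d(sub1, sub2)

def difference_coarray(positions):
    """All pairwise differences (lags) reachable by the sampler."""
    diffs = positions[:, None] - positions[None, :]
    return np.unique(diffs)

M, N = 4, 5                                    # illustrative coprime pair
pos = coprime_positions(M, N)
lags = difference_coarray(pos)
wanted = np.arange(-M * N, M * N + 1)          # the contiguous segment -MN..MN
print(f"{len(pos)} physical elements -> {len(lags)} distinct lags")
print("all lags from -MN to MN present:", bool(np.all(np.isin(wanted, lags))))
```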