942 results for SPME sampling


Relevance:

20.00%

Publisher:

Abstract:

Division of Fisheries, Illinois Department of Natural Resources Grant/Contract No: Federal Aid Project F-123 R-15

Relevance:

20.00%

Publisher:

Abstract:

The occurrence frequency of failure events serves as a critical index of the safety status of dam-reservoir systems. Although overtopping is the most common failure mode and has significant consequences, this type of event in most cases has a small probability. Estimating such rare-event risks for dam-reservoir systems with crude Monte Carlo (CMC) simulation requires a prohibitively large number of trials, and hence significant computational resources, to reach satisfactory results; otherwise, the estimates are not accurate enough. To reduce the computational expense and improve the efficiency of risk estimation, an importance sampling (IS) based simulation approach is proposed in this dissertation to address the overtopping risks of dam-reservoir systems. Deliverables of this study mainly include the following five aspects: 1) the reservoir inflow hydrograph model; 2) the dam-reservoir system operation model; 3) the CMC simulation framework; 4) the IS-based Monte Carlo (ISMC) simulation framework; and 5) a comparison of the overtopping risk estimates obtained from the CMC and ISMC simulations. In a broader sense, this study meets the following three expectations: 1) to address the natural stochastic characteristics of the dam-reservoir system, such as the reservoir inflow rate; 2) to build the fundamental CMC and ISMC simulation frameworks of the dam-reservoir system in order to estimate the overtopping risks; and 3) to compare the simulation results and the computational performance in order to demonstrate the advantages of ISMC simulation. The estimates of overtopping probability could be used to guide future dam safety investigations and to supplement conventional analyses in decision making on dam-reservoir system improvements. At the same time, the proposed ISMC simulation methodology is reasonably robust and is shown to improve overtopping risk estimation. The more accurate estimates, smaller variance, and reduced CPU time broaden the application of the Monte Carlo (MC) technique to evaluating rare-event risks for infrastructure.
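
As a hedged illustration of why IS outperforms CMC on rare events, the sketch below estimates a small exceedance probability with both estimators. It uses a hypothetical one-dimensional exponential surrogate for the peak reservoir level and an exponential proposal centered at the failure threshold; it does not reproduce the dissertation's hydrograph or operation models.

    import numpy as np

    rng = np.random.default_rng(0)

    def crude_mc(n, threshold, scale=1.0):
        # Crude Monte Carlo: count how often the surrogate peak level
        # exceeds the crest; needs on the order of 1/p trials to see
        # any failure at all.
        x = rng.exponential(scale, n)
        return (x > threshold).mean()

    def importance_mc(n, threshold, scale=1.0):
        # Importance sampling: draw from a proposal whose mean sits at
        # the threshold, then reweight each sample by the likelihood
        # ratio f(x)/g(x) of the nominal and proposal densities.
        prop = threshold
        x = rng.exponential(prop, n)
        w = (prop / scale) * np.exp(-x * (1.0 / scale - 1.0 / prop))
        return ((x > threshold) * w).mean()

    # True probability exp(-15) ~ 3.1e-7: CMC with 1e5 trials usually
    # returns 0, while IS recovers the right order of magnitude.
    print(crude_mc(100_000, 15.0), importance_mc(100_000, 15.0))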

Relevance:

20.00%

Publisher:

Abstract:

Passive sampling devices (PS) are widely used for pollutant monitoring in water, but the estimation of measurement uncertainties for PS has seldom been undertaken. The aim of this work was to identify the key parameters governing PS measurements of metals and their dispersion. We report the results of an in situ intercomparison exercise on diffusive gradients in thin films (DGT) in surface waters. Interlaboratory uncertainties of time-weighted average (TWA) concentrations were satisfactory (from 28% to 112%) given the number of participating laboratories (10) and the ultra-trace metal concentrations involved. The dispersion of TWA concentrations was mainly explained by uncertainties generated during the DGT handling and analytical procedure steps. We highlight that DGT handling is critical for metals such as Cd, Cr and Zn, implying that DGT assembly/dismantling should be performed under very clean conditions. Using a unique dataset, we demonstrated that DGT markedly lowers the LOQ in comparison to spot sampling, and we stress the need for accurate data calculation.
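
The TWA concentration in a DGT measurement is conventionally computed from the accumulated mass via C = M * dg / (D * A * t). The helper below is a minimal sketch of that calculation; the diffusion coefficient, gel thickness, exposure area and accumulated mass are indicative placeholder values, not data from this exercise.

    def dgt_twa_concentration(mass_ng, delta_g_cm, diff_coeff_cm2_s,
                              area_cm2, time_s):
        # Standard DGT equation: C = M * dg / (D * A * t).
        # With mass in ng and lengths in cm, the result is in ng/cm^3,
        # which is numerically equal to ug/L.
        return mass_ng * delta_g_cm / (diff_coeff_cm2_s * area_cm2 * time_s)

    # Illustrative deployment: 25 ng of Cd accumulated over 14 days
    # (all values assumed for the example only).
    c_twa = dgt_twa_concentration(mass_ng=25.0, delta_g_cm=0.094,
                                  diff_coeff_cm2_s=6.0e-6, area_cm2=3.14,
                                  time_s=14 * 24 * 3600)
    print(f"TWA concentration: {c_twa:.4f} ug/L")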

Relevance:

20.00%

Publisher:

Abstract:

This document is the online supplement to 'Myopic Allocation Policy with Asymptotically Optimal Sampling Rate,' to be published in the IEEE Transactions on Automatic Control in 2017.

Relevance:

20.00%

Publisher:

Abstract:

Activity of 7-ethoxyresorufin O-deethylase (EROD) in fish is certainly the best-studied biomarker of exposure applied in the field to evaluate the biological effects of contamination in the marine environment. Since 1991, a feasibility study for a monitoring network using this biomarker of exposure has been conducted along the French coasts. Using data obtained during several cruises, this study aims to determine the number of fish required to detect a given difference between two mean EROD activities, i.e. to achieve an a priori fixed statistical power (1 - beta) given the significance level (alpha), variance estimates, and a projected ratio of unequal sample sizes (k). Mean EROD activity and its standard error were estimated at each of 82 sampling stations. The inter-individual variance component was dominant in estimating the variance of mean EROD activity. The influences of alpha, beta, k and variability on sample sizes are illustrated and discussed in terms of costs. In particular, sample sizes do not have to be equal, especially if such a requirement would lead to a significant cost in sampling extra material. Finally, the feasibility of long-term monitoring is discussed.
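
The sample-size question posed here is a standard two-sample power calculation. The sketch below solves it for a two-sided z-test with unequal group sizes (n2 = k * n1); it is a generic illustration under a normality assumption, not the exact computation used in the study, and the effect size and standard deviations are placeholders.

    import math
    from scipy.stats import norm

    def sample_sizes(delta, sd1, sd2, alpha=0.05, power=0.80, k=1.0):
        # Smallest n1 (with n2 = k * n1) that detects a difference
        # `delta` between two means at significance `alpha` with the
        # requested power 1 - beta.
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        n1 = (z_alpha + z_beta) ** 2 * (sd1 ** 2 + sd2 ** 2 / k) / delta ** 2
        return math.ceil(n1), math.ceil(k * n1)

    # E.g., detect a difference of one standard deviation in mean EROD
    # activity with twice as many fish in the second group (values
    # assumed for illustration).
    print(sample_sizes(delta=1.0, sd1=1.0, sd2=1.0, k=2.0))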

Relevance:

20.00%

Publisher:

Abstract:

Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, and yet allow perfect reconstruction of the spectra of wide-sense stationary signals. However, theoretical guarantees for these samplers assume ideal conditions such as synchronous sampling and the ability to compute statistical expectations perfectly. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher information matrix of perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that, even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variable and reduce the bi-affine problem to a linear underdetermined (sparse) problem in the source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and to sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of WSS signals, sharp bounds on the estimation error are established, which indicate that the error decays exponentially with the number of samples. The theoretical claims are supported by extensive numerical experiments.
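
The co-array that drives these results is easy to compute. The sketch below builds a standard coprime array (one uniform subarray with N sensors at spacing M, one with 2M sensors at spacing N) and reports the contiguous lag segment of its difference co-array; the pair (M, N) = (3, 5) is chosen arbitrarily for illustration.

    import numpy as np

    def coprime_positions(M, N):
        # N sensors at multiples of M, plus 2M sensors at multiples
        # of N: the usual coprime pair of uniform subarrays.
        sub1 = {M * n for n in range(N)}
        sub2 = {N * m for m in range(2 * M)}
        return np.array(sorted(sub1 | sub2))

    def contiguous_coarray(pos):
        # Difference co-array, then the longest run of consecutive
        # integer lags around zero -- the segment usable for
        # resolving more sources than physical sensors.
        diffs = {int(a - b) for a in pos for b in pos}
        hi = 0
        while hi + 1 in diffs:
            hi += 1
        return -hi, hi

    pos = coprime_positions(3, 5)        # 10 physical sensors
    print(pos, contiguous_coarray(pos))  # contiguous lags -17..17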

Relevance:

20.00%

Publisher:

Abstract:

Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of wide-sense stationary (WSS) signals, as though they were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and for phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value. The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that, with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
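
To make the difference-set idea concrete, the sketch below estimates the full autocorrelation sequence of a WSS (AR(1)) signal from plain nested samples and assembles the Toeplitz covariance estimate. It illustrates only the classical two-level nested sampler, not the GNS or PNFS constructions introduced in the thesis, and all parameters are illustrative.

    import numpy as np
    from scipy.linalg import toeplitz

    def nested_positions(n1, n2):
        # Dense level at 0..n1-1 plus a sparse level at multiples of
        # (n1 + 1); the differences cover every lag 0..n2*(n1+1)-1.
        return np.array(sorted(set(range(n1)) |
                               {(n1 + 1) * m - 1 for m in range(1, n2 + 1)}))

    def autocorr_from_nested(x, pos, block, n_lags):
        # Average lag products over blocks: one estimate per co-array lag.
        acc, cnt = np.zeros(n_lags), np.zeros(n_lags)
        for b in range(len(x) // block):
            s = x[b * block:(b + 1) * block][pos]
            for i, pi in enumerate(pos):
                for j, pj in enumerate(pos):
                    lag = pi - pj
                    if 0 <= lag < n_lags:
                        acc[lag] += s[i] * s[j]
                        cnt[lag] += 1
        return acc / cnt

    rng = np.random.default_rng(1)
    x = rng.standard_normal(400_000)
    for t in range(1, len(x)):              # AR(1): r[k]/r[0] = 0.9**k
        x[t] += 0.9 * x[t - 1]
    pos = nested_positions(3, 3)            # 6 samples per block of 12
    r = autocorr_from_nested(x, pos, block=12, n_lags=12)
    R = toeplitz(r)                         # estimated Toeplitz covariance
    print(np.round(r / r[0], 2))            # compare with 0.9**k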

Relevance:

20.00%

Publisher:

Abstract:

Integrated Master's dissertation in Veterinary Medicine.

Relevance:

20.00%

Publisher:

Abstract:

A Similar Exposure Group (SEG) can be created through the evaluation of workers performing the same or similar tasks, the hazards they are exposed to, the frequency and duration of their exposures, the engineering controls available during their operations, the personal protective equipment used, and exposure data. For this report, samples from one facility, which has collected nearly 40,000 samples of various types, will be evaluated to determine whether the creation of a SEG can be supported. The data will be reviewed for consistency of collection methods and laboratory detection limits, and a subset of the samples may be selected based on that review. The data will also be evaluated statistically to determine whether they are sufficient to terminate the sampling. IHDataAnalyst V1.27 will be used to assess the data; this program uses Bayesian analysis to assist in making determinations. The 95 percent confidence interval will be calculated and evaluated in making decisions. This evaluation will be used to determine whether a SEG can be created for any of the workers and to determine the need for future sample collection. The data and evaluation presented in this report have been selected and evaluated specifically for the purposes of this project.
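
IHDataAnalyst's Bayesian routine is not documented here, so the sketch below shows only the conventional lognormal treatment of the same decision statistics: geometric mean, geometric standard deviation, the estimated 95th percentile, and the exceedance fraction against an assumed occupational exposure limit. Both the sample values and the OEL are hypothetical.

    import numpy as np
    from scipy import stats

    def exposure_summary(conc, oel):
        # Industrial-hygiene convention: model exposures as lognormal.
        logs = np.log(conc)
        mu, sd = logs.mean(), logs.std(ddof=1)
        gm, gsd = np.exp(mu), np.exp(sd)
        p95 = np.exp(mu + 1.645 * sd)       # 95th percentile of the fit
        exceed = 1 - stats.norm.cdf((np.log(oel) - mu) / sd)
        return gm, gsd, p95, exceed

    # Hypothetical shift-long samples (mg/m^3) against an assumed OEL.
    conc = np.array([0.12, 0.31, 0.08, 0.22, 0.15, 0.41, 0.19])
    gm, gsd, p95, exceed = exposure_summary(conc, oel=1.0)
    print(f"GM={gm:.2f}  GSD={gsd:.2f}  P95={p95:.2f}  "
          f"exceedance={exceed:.1%}")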

Relevance:

20.00%

Publisher:

Abstract:

Direct sampling methods are increasingly being used to solve the inverse medium scattering problem, i.e. to estimate the shape of the scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin and Zou. In this report, we performed some analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
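
The indicator at the heart of direct sampling methods of this type is a normalized correlation of the measured scattered field with the fundamental solution of the Helmholtz equation, evaluated at every probe point of a sampling grid. The sketch below implements such an indicator in 2-D as a hedged illustration; the measurement geometry, field data and wavenumber are placeholders, not the report's exact setup.

    import numpy as np
    from scipy.special import hankel1

    def dsm_indicator(u_s, obs_pts, probe_pts, k):
        # For each probe point z, correlate the scattered field u_s
        # (measured at obs_pts) with the 2-D Helmholtz fundamental
        # solution G(x, z) = (i/4) H0^(1)(k |x - z|); large values of
        # the normalized correlation flag points inside the scatterer.
        out = np.empty(len(probe_pts))
        for i, z in enumerate(probe_pts):
            g = 0.25j * hankel1(0, k * np.linalg.norm(obs_pts - z, axis=1))
            out[i] = abs(np.vdot(g, u_s)) / (np.linalg.norm(g) *
                                             np.linalg.norm(u_s))
        return out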

Relevance:

20.00%

Publisher:

Abstract:

Several deterministic and probabilistic methods are used to evaluate the probability of seismically induced liquefaction of a soil. The probabilistic models usually involve uncertainty in the model itself and uncertainties in the parameters used to develop it. These model uncertainties vary from one statistical model to another. Most of the model uncertainties are epistemic and can be addressed through appropriate knowledge of the statistical model. One such epistemic model uncertainty in evaluating liquefaction potential using a probabilistic model such as logistic regression is sampling bias: the difference between the class distribution in the sample used for developing the statistical model and the true population distribution of liquefaction and non-liquefaction instances. Recent studies have shown that sampling bias can significantly affect the probability predicted by a statistical model. To address this epistemic uncertainty, a new approach was developed for evaluating the probability of seismically induced soil liquefaction, in which a logistic regression model was used in combination with the Hosmer-Lemeshow statistic. This approach was used to estimate the population (true) distribution of liquefaction to non-liquefaction instances from the most up-to-date standard penetration test (SPT) and cone penetration test (CPT) case histories. In addition, other model uncertainties, such as the distribution and the significance of the explanatory variables, were addressed using the Kolmogorov-Smirnov (KS) test and the Wald statistic, respectively. Moreover, based on the estimated population distribution, logistic regression equations were proposed to calculate the probability of liquefaction for both the SPT and CPT based case histories. Finally, the proposed probability curves were compared with existing probability curves based on the SPT and CPT case histories.
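
Once a population ratio of liquefaction to non-liquefaction instances has been estimated, a standard way to remove sampling bias from a fitted logistic model is the prior-correction adjustment of King and Zeng, which shifts only the intercept. The sketch below applies that adjustment; the coefficients, feature values and rates are placeholders, not the equations proposed in the thesis.

    import numpy as np

    def corrected_intercept(beta0, sample_rate, population_rate):
        # Prior correction: remove the offset introduced by the sample's
        # class balance so predictions reflect the population balance.
        return beta0 - np.log((sample_rate / (1 - sample_rate)) *
                              ((1 - population_rate) / population_rate))

    def p_liquefaction(x, beta0, betas, sample_rate, population_rate):
        # Logistic probability with the bias-corrected intercept.
        b0 = corrected_intercept(beta0, sample_rate, population_rate)
        return 1.0 / (1.0 + np.exp(-(b0 + np.dot(betas, x))))

    # E.g., a case-history database with 60% liquefaction cases, while
    # the estimated population rate is 25% (illustrative values only).
    print(p_liquefaction(np.array([1.2, -0.5]), beta0=0.3,
                         betas=np.array([0.8, 1.1]),
                         sample_rate=0.60, population_rate=0.25))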