938 results for nonuniform sampling


Relevance: 20.00%

Abstract:

Activity of 7-ethoxyresorufin-O-deethylase (EROD) in fish is certainly the best-studied biomarker of exposure applied in the field to evaluate biological effects of contamination in the marine environment. Since 1991, a feasibility study for a monitoring network using this biomarker of exposure has been conducted along the French coasts. Using data obtained during several cruises, this study aims to determine the number of fish required to detect a given difference between 2 mean EROD activities, i.e. to achieve an a priori fixed statistical power (1-beta) given the significance level (alpha), variance estimates and the projected ratio of unequal sample sizes (k). Mean EROD activity and standard error were estimated at each of 82 sampling stations. The inter-individual variance component was dominant in estimating the variance of mean EROD activity. Influences of alpha, beta, k and variability on sample sizes are illustrated and discussed in terms of costs. In particular, sample sizes do not have to be equal, especially if such a requirement would lead to a significant cost in sampling extra material. Finally, the feasibility of long-term monitoring is discussed.
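The sample-size question posed in this abstract can be sketched with the standard two-sample normal approximation. The function below is illustrative only; the alpha, power, variance and k values are placeholders, not figures from the study:

```python
import math
from statistics import NormalDist

def sample_sizes(delta, var1, var2, alpha=0.05, power=0.80, k=1.0):
    """Sizes (n1, n2 = k*n1) needed to detect a difference `delta`
    between two means with a two-sided z-test (normal approximation)."""
    z = NormalDist().inv_cdf
    n1 = (z(1 - alpha / 2) + z(power)) ** 2 * (var1 + var2 / k) / delta ** 2
    return math.ceil(n1), math.ceil(k * n1)

equal = sample_sizes(1.0, 1.0, 1.0)           # equal group sizes
unequal = sample_sizes(1.0, 1.0, 1.0, k=2.0)  # second group twice as large
```

Allowing k != 1 trades a few extra samples overall for flexibility in how they are split between groups, which is exactly the cost argument the abstract makes.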

Relevance: 20.00%

Abstract:

Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, and yet allow perfect reconstruction of the spectra of wide-sense stationary signals. However, theoretical guarantees for these samplers assume ideal conditions such as synchronous sampling and the ability to perfectly compute statistical expectations. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher Information matrix for perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variable and reduce the bi-affine problem to a linear underdetermined (sparse) problem in the source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and to sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of WSS signals, sharp bounds on the estimation error are established, which indicate that the error decays exponentially with the number of samples.
The theoretical claims are supported by extensive numerical experiments.
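The $O(M^2)$ resolution claim rests on the difference co-array. As a toy illustration (sensor counts chosen arbitrarily), a two-level nested array with only 6 physical sensors already produces a hole-free co-array of 23 contiguous lags:

```python
def nested_array(n1, n2):
    """Unit-spacing positions of a two-level nested array: a dense inner
    ULA of n1 sensors plus a sparse outer ULA of n2 sensors."""
    inner = list(range(1, n1 + 1))
    outer = [(n1 + 1) * m for m in range(1, n2 + 1)]
    return inner + outer

def difference_coarray(positions):
    """All distinct pairwise differences: the virtual co-array lags."""
    return sorted({a - b for a in positions for b in positions})

pos = nested_array(3, 3)        # 6 physical sensors
lags = difference_coarray(pos)  # contiguous lags from -11 to 11
```

The number of contiguous lags grows quadratically in the number of physical sensors, which is the source of the enhanced degrees of freedom the thesis analyzes under perturbations.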

Relevance: 20.00%

Abstract:

Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of wide-sense stationary (WSS) signals, as though it were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and for phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler called the Generalized Nested Sampler (GNS), which can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value.
The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that with probability one it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
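The role of the difference set in covariance compression can be sketched in a simplified, noise-free setting: if the pairwise differences of the chosen index set cover every lag, every entry of the full Toeplitz covariance is recoverable from the compressed sketch. The index set and lag profile below are made up for illustration, and this is not the GNS algorithm itself:

```python
import numpy as np

def recover_lags(R_c, idx, n):
    """Average entries of the compressed covariance R_c = R[idx][:, idx]
    that share the same lag i - j, recovering r[0..n-1] of a Toeplitz R."""
    r, cnt = np.zeros(n), np.zeros(n)
    for a, i in enumerate(idx):
        for b, j in enumerate(idx):
            if 0 <= i - j < n:
                r[i - j] += R_c[a, b]
                cnt[i - j] += 1
    return r / cnt

n = 12
r_true = 0.9 ** np.arange(n)                       # assumed lag profile
R = np.array([[r_true[abs(i - j)] for j in range(n)] for i in range(n)])
idx = [0, 1, 2, 3, 7, 11]                          # differences cover 0..11
r_hat = recover_lags(R[np.ix_(idx, idx)], idx, n)  # all 12 lags from 6 indices
```

Here a 6x6 sketch suffices for a 12x12 Toeplitz matrix; in noisy settings the redundant lag estimates would be averaged rather than read off exactly.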

Relevance: 20.00%

Abstract:

Dissertation for the Integrated Master's degree in Veterinary Medicine

Relevance: 20.00%

Abstract:

A Similar Exposure Group (SEG) can be created through the evaluation of workers performing the same or similar tasks, the hazards they are exposed to, the frequency and duration of their exposures, the engineering controls available during their operations, the personal protective equipment used, and exposure data. For this report, samples from one facility, which has collected nearly 40,000 samples of various types, will be evaluated to determine whether the creation of a SEG can be supported. The data will be reviewed for consistency with collection methods and laboratory detection limits. A subset of the samples may be selected based on the review. The data will also be statistically evaluated in order to determine whether they are sufficient to terminate the sampling. IHDataAnalyst V1.27 will be used to assess the data. This program uses Bayesian analysis to assist in making determinations. The 95 percent confidence interval will be calculated and evaluated in making decisions. This evaluation will be used to determine whether a SEG can be created for any of the workers and to determine the need for future sample collection. The data and evaluation presented in this report have been selected and evaluated specifically for the purposes of this project.
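For intuition, one common decision statistic in industrial-hygiene data analysis is the exceedance fraction under a lognormal exposure model. The sketch below uses only the standard library and invented readings; it does not reproduce IHDataAnalyst's Bayesian calculation:

```python
import math
from statistics import NormalDist, mean, stdev

def exceedance_fraction(samples, oel):
    """Estimated fraction of the fitted lognormal exposure distribution
    exceeding the occupational exposure limit (moments in log space)."""
    logs = [math.log(x) for x in samples]
    return 1 - NormalDist(mean(logs), stdev(logs)).cdf(math.log(oel))

readings = [math.exp(v) for v in (-1.0, 0.0, 1.0)]  # toy data: GM = 1
frac = exceedance_fraction(readings, oel=1.0)       # 0.5 by symmetry
```

A SEG decision would compare such statistics (and their confidence intervals) against an acceptability criterion before terminating sampling.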

Relevance: 20.00%

Abstract:

Direct sampling methods are increasingly being used to solve the inverse medium scattering problem of estimating the shape of the scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin and Zou. In this report, we performed analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
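A minimal far-field variant of a direct sampling index can be sketched with synthetic data from a single point scatterer. The wavenumber, measurement directions and scatterer location are all assumptions, and this simplifies the Ito-Jin-Zou formulation to its core idea (correlate the data with a test function at each probe point):

```python
import numpy as np

k = 10.0                                    # assumed wavenumber
z0 = np.array([0.5, -0.3])                  # hypothetical point scatterer
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
xhat = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit directions
u_far = np.exp(-1j * k * xhat @ z0)         # synthetic far-field data

def indicator(z):
    """Direct-sampling index: large when the probe point z is near
    the scatterer, small elsewhere."""
    return abs(np.sum(u_far * np.exp(1j * k * xhat @ z)))

grid = [np.array([x, y]) for x in np.arange(-1, 1.05, 0.1)
        for y in np.arange(-1, 1.05, 0.1)]
best = max(grid, key=indicator)             # grid point nearest z0
```

Plotting the indicator over the grid would show a sharp peak at the scatterer, which is the behavior the report's numerical tests examine.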

Relevance: 20.00%

Abstract:

Several deterministic and probabilistic methods are used to evaluate the probability of seismically induced liquefaction of a soil. Probabilistic models usually possess uncertainty both in the model itself and in the parameters used to develop it. These model uncertainties vary from one statistical model to another. Most of the model uncertainties are epistemic and can be addressed through appropriate knowledge of the statistical model. One such epistemic model uncertainty in evaluating liquefaction potential using a probabilistic model such as logistic regression is sampling bias. Sampling bias is the difference between the class distribution in the sample used for developing the statistical model and the true population distribution of liquefaction and non-liquefaction instances. Recent studies have shown that sampling bias can significantly affect the probability predicted by a statistical model. To address this epistemic uncertainty, a new approach was developed for evaluating the probability of seismically induced soil liquefaction, in which a logistic regression model was used in combination with the Hosmer-Lemeshow statistic. This approach was used to estimate the population (true) distribution of liquefaction to non-liquefaction instances in the most up-to-date standard penetration test (SPT) and cone penetration test (CPT) based case histories. Apart from this, other model uncertainties, such as the distribution and significance of the explanatory variables, were also addressed using the KS test and the Wald statistic, respectively. Moreover, based on the estimated population distribution, logistic regression equations were proposed to calculate the probability of liquefaction for both SPT- and CPT-based case histories. Additionally, the proposed probability curves were compared with existing probability curves based on SPT and CPT case histories.
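The standard intercept ("prior") correction for sampling bias in logistic regression can be sketched as follows. This is a generic illustration, not the specific equations proposed in the study; `eta` stands for the model's linear predictor, and the class fractions are placeholders:

```python
import math

def corrected_probability(eta, p_sample, p_population):
    """Shift the logistic intercept by the log-odds difference between
    the population and sample class distributions, then apply the sigmoid."""
    offset = math.log(p_population / (1 - p_population)) \
           - math.log(p_sample / (1 - p_sample))
    return 1.0 / (1.0 + math.exp(-(eta + offset)))

# If the model was trained on a balanced sample but liquefaction is rarer
# in the population, the corrected probability is pulled downward:
p = corrected_probability(0.0, p_sample=0.5, p_population=0.2)
```

When the sample and population fractions agree, the offset vanishes and the model's raw output is returned unchanged.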

Relevance: 20.00%

Abstract:

Mobile sensor networks have unique advantages compared with wireless sensor networks. Their mobility enables mobile sensors to flexibly reconfigure themselves to meet sensing requirements. In this dissertation, an adaptive sampling method for mobile sensor networks is presented. Based on the consideration of sensing resource constraints, computing abilities, and onboard energy limitations, the adaptive sampling method follows a down-sampling scheme, which reduces the total number of measurements and lowers the sampling cost. Compressive sensing is a recently developed down-sampling method that uses a small number of randomly distributed measurements for signal reconstruction. However, original signals cannot be reconstructed from such condensed measurements in the classical sense of Shannon sampling theory. Measurements have to be processed under a sparse domain, and convex optimization methods must be applied to reconstruct the original signals. The restricted isometry property guarantees that signals can be recovered with little information loss. While compressive sensing can effectively lower sampling cost, signal reconstruction remains a great research challenge. Compressive sensing always collects random measurements, whose information content cannot be determined a priori. If each measurement is instead optimized to be the most informative one, reconstruction performance can be much better. Based on the above considerations, this dissertation focuses on an adaptive sampling approach that finds the most informative measurements in unknown environments and reconstructs the original signals. With mobile sensors, measurements are collected sequentially, giving the chance to uniquely optimize each of them. When mobile sensors are about to collect a new measurement from the surrounding environment, existing information is shared among the networked sensors so that each sensor has a global view of the entire environment.
Shared information is analyzed in the Haar wavelet domain, under which most natural signals appear sparse, to infer a model of the environment. The most informative measurements can then be determined by optimizing the model parameters. As a result, all the measurements collected by the mobile sensor network are the most informative measurements given existing information, and a perfect reconstruction would be expected. To present the adaptive sampling method, a series of research issues is addressed, including measurement evaluation and collection, mobile network establishment, data fusion, sensor motion, signal reconstruction, etc. A two-dimensional scalar field is reconstructed using the proposed method. Both single mobile sensors and mobile sensor networks are deployed in the environment, and the reconstruction performance of both is compared. In addition, a particular mobile sensor, a quadrotor UAV, is developed so that the adaptive sampling method can be used in three-dimensional scenarios.
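The claim that natural signals appear sparse in the Haar wavelet domain can be illustrated with a one-level transform of a piecewise-constant signal (a minimal sketch; the signal values are arbitrary):

```python
import numpy as np

def haar_1d(x):
    """One-level orthonormal Haar transform: pairwise averages (approximation)
    and pairwise differences (detail)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

sig = np.array([4., 4., 4., 4., 1., 1., 1., 1.])  # piecewise-constant signal
approx, detail = haar_1d(sig)                     # detail is all zeros: sparse
```

Because only a few coefficients are nonzero, far fewer (well-chosen) measurements than signal samples suffice for reconstruction, which is the premise of the adaptive sampling approach.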

Relevance: 20.00%

Abstract:

Sampling and preconcentration techniques play a critical role in headspace analysis in analytical chemistry. My dissertation presents a novel sampling design, capillary microextraction of volatiles (CMV), that improves the preconcentration of volatiles and semivolatiles in a headspace, providing high throughput, near-quantitative analysis, high recovery and unambiguous identification of compounds when coupled to mass spectrometry. The CMV devices use sol-gel polydimethylsiloxane (PDMS) coated microglass fibers as the sampling/preconcentration sorbent, with the fibers stacked into open-ended capillary tubes. The design allows for dynamic headspace sampling by connecting the device to a hand-held vacuum pump. The inexpensive device can be fitted into a thermal desorption probe for thermal desorption of the extracted volatile compounds into a gas chromatography-mass spectrometer (GC-MS). The performance of the CMV devices was compared with two other existing preconcentration techniques, solid phase microextraction (SPME) and planar solid phase microextraction (PSPME). Compared to SPME fibers, the CMV devices have a surface area and phase volume improved by factors of 5000 and 80, respectively. One (1) minute of dynamic CMV air sampling resulted in performance similar to a 30 min static extraction using a SPME fiber. The PSPME devices have been fashioned to easily interface with ion mobility spectrometers (IMS) for explosives or drugs detection. The CMV devices are shown to offer dynamic sampling and can now be coupled to COTS GC-MS instruments. Several compound classes representing explosives have been analyzed with minimal breakthrough even after a 60 min sampling time. The extracted volatile compounds were retained in the CMV devices when preserved in aluminum foil after sampling.
Finally, the CMV sampling devices were used for several different headspace profiling applications, which involved sampling a shipping facility, six illicit drugs, seven military explosives and eighteen different bacteria strains. Successful detection of the target signature volatile compounds at ng levels in these applications suggests that the CMV devices can provide high-throughput qualitative and quantitative analysis with high recovery and unambiguous identification of analytes.

Relevance: 20.00%

Abstract:

Many studies are documenting positive large-scale species–people correlations (Luck, 2007; Schuldt & Assmann, 2010). The issue is scale dependent: the local association of species richness and people is in many cases a negative one (Pautasso, 2007; Pecher et al., 2010). This biogeographical pattern is thus important for conservation. If species-rich regions are also densely populated, preserving biodiversity becomes more difficult, ceteris paribus, than if species-rich regions were sparsely populated. At the same time, positive, regional species–people correlations are an opportunity for the biodiversity education of the majority of the human population and underline the importance of conservation in human-modified landscapes (e.g. Sheil & Meijaard, 2010; Ward, 2010).

Relevance: 20.00%

Abstract:

We analysed the viscera of 534 moles (Talpa spp.) from 30 of the 47 provinces of peninsular Spain, including 255 individuals of T. europaea from eight provinces, 154 individuals of T. occidentalis from 20 provinces, and 125 unidentified Talpa individuals from two provinces. We identified their helminth parasites and determined parasite species richness. We related parasite species richness to sampling effort using both a linear and a logarithmic function. We then performed stepwise linear regressions to predict mole parasite species richness from a small set of selected predictor variables that included sampling effort. We applied the resulting models to forecast T. europaea, T. occidentalis, and Talpa spp. parasite species richness in all provinces with recorded host presence, assuming different levels of sampling effort. Finally, we used partial regression analysis to partition the variation explained by each of the selected variables in the models. We found that mole parasite species richness is strongly conditioned by sampling effort, but that other factors such as cropland area and environmental disturbance have significant independent effects.
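The logarithmic richness-versus-effort relationship used above can be sketched as an ordinary least-squares fit of S = a + b·ln(effort). The data points below are invented so that they lie exactly on a known line:

```python
import math

def fit_log_model(effort, richness):
    """Ordinary least squares for S = a + b * ln(effort); returns (a, b)."""
    xs = [math.log(e) for e in effort]
    n = len(xs)
    mx, my = sum(xs) / n, sum(richness) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, richness)) \
      / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic data lying exactly on S = 1 + 2 ln(effort):
a, b = fit_log_model([1.0, math.e, math.e ** 2], [1.0, 3.0, 5.0])
```

The logarithmic form captures the saturating pattern the study reports: each additional unit of sampling effort yields progressively fewer newly recorded parasite species.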

Relevance: 20.00%

Abstract:

Knowledge of the geographical distribution of timber tree species in the Amazon is still scarce. This is especially true at the local level, thereby limiting natural resource management actions. Forest inventories are key sources of information on the occurrence of such species. However, areas with approved forest management plans are mostly located near access roads and the main industrial centers. The present study aimed to assess the spatial scale effects of forest inventories used as sources of occurrence data in the interpolation of potential species distribution models. The occurrence data of a group of six forest tree species were divided into four geographical areas during the modeling process. Several sampling schemes were then tested applying the maximum entropy algorithm, using the following predictor variables: elevation, slope, exposure, normalized difference vegetation index (NDVI) and height above the nearest drainage (HAND). The results revealed that using occurrence data from only one geographical area with unique environmental characteristics increased both model overfitting to input data and omission error rates. The use of a diagonal systematic sampling scheme and lower threshold values led to improved model performance. Forest inventories may be used to predict areas with a high probability of species occurrence, provided they are located in forest management plan regions representative of the environmental range of the model projection area.
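The omission-error/threshold trade-off discussed above can be sketched in a few lines (the suitability scores at presence sites are hypothetical):

```python
def omission_rate(presence_scores, threshold):
    """Omission error: fraction of known presence sites whose predicted
    suitability falls below the chosen threshold."""
    return sum(s < threshold for s in presence_scores) / len(presence_scores)

scores = [0.9, 0.7, 0.4, 0.2]        # model suitability at known presences
strict = omission_rate(scores, 0.5)  # stricter threshold omits half the sites
loose = omission_rate(scores, 0.1)   # lower threshold omits none
```

Lowering the threshold reduces omission of known occurrences at the cost of predicting larger areas as suitable, which is the balance the study tuned.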