837 results for Entropy of a sampling design


Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the potential use of zero-crossing information for speech sample estimation. It provides a new method to estimate speech samples using composite zero-crossings. A simple linear interpolation technique is developed for this purpose. By using this method the A/D converter can be avoided in a speech coder. The newly proposed zero-crossing sampling theory is supported with results of computer simulations using real speech data. The thesis also presents two methods for voiced/unvoiced classification. One of these methods is based on a distance measure which is a function of the short-time zero-crossing rate and short-time energy of the signal. The other is based on the attractor dimension and entropy of the signal. Of the two methods, the first is simpler and requires far fewer computations; it is used in a later chapter to design an enhanced Adaptive Transform Coder. The later part of the thesis addresses a few problems in Adaptive Transform Coding and presents an improved ATC. The transform coefficient with maximum amplitude is treated as 'side information', which enables more accurate bit assignment and step-size computation. A new bit reassignment scheme is also introduced in this work. Finally, an ATC which switches between the Discrete Cosine Transform and the Discrete Walsh-Hadamard Transform for voiced and unvoiced speech segments, respectively, is presented. Simulation results are provided to show the improved performance of the coder.
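
The first voiced/unvoiced classifier mentioned above rests on two standard short-time features. A minimal Python sketch of those features is given below; the frame length, hop size and decision thresholds are illustrative assumptions, and the thesis's actual distance measure is not reproduced here.

```python
import numpy as np

def short_time_features(x, frame_len=240, hop=120):
    """Short-time zero-crossing rate and energy per frame.

    Illustrative only: frame length and hop are assumptions,
    not values taken from the thesis.
    """
    zcr, energy = [], []
    for start in range(0, len(x) - frame_len + 1, hop):
        frame = x[start:start + frame_len]
        # Zero-crossing rate: fraction of adjacent samples with a sign change.
        zcr.append(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        # Short-time energy of the frame.
        energy.append(np.mean(frame ** 2))
    return np.array(zcr), np.array(energy)

def classify_frames(zcr, energy, zcr_thresh=0.25, energy_thresh=1e-3):
    # Voiced speech tends to show high energy and low zero-crossing rate,
    # unvoiced speech the opposite; thresholds here are hypothetical.
    return np.where((energy > energy_thresh) & (zcr < zcr_thresh),
                    "voiced", "unvoiced")
```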

Relevance:

100.00%

Publisher:

Abstract:

Long-term monitoring of forest soils as part of a pan-European network to detect environmental change depends on an accurate determination of the mean of the soil properties at each monitoring event. However, forest soil is known to be highly variable spatially. A study was undertaken to explore and quantify this variability at three forest monitoring plots in Britain. Detailed soil sampling was carried out, and the data from the chemical analyses were analysed by classical statistics and geostatistics. An analysis of variance showed that there were no consistent effects from the sample sites in relation to the position of the trees. The variogram analysis showed that there was spatial dependence at each site for several variables, and some varied in an apparently periodic way. An optimal sampling analysis based on the multivariate variogram for each site suggested that a bulked sample from 36 cores would reduce error to an acceptable level. Future sampling should be designed so that it neither targets nor avoids trees and disturbed ground. This can best be achieved by using a stratified random sampling design.
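
To make the geostatistical step concrete, below is a minimal sketch of the Matheron (method-of-moments) experimental variogram that such an analysis starts from; the bin width and maximum lag are arbitrary placeholders, not values from the study.

```python
import numpy as np

def empirical_variogram(coords, values, lag_width=5.0, max_lag=100.0):
    """Method-of-moments estimator of the isotropic experimental variogram.

    coords: (n, 2) array of sample locations (m); values: (n,) soil property.
    Bin width and maximum lag are illustrative, not taken from the study.
    """
    n = len(values)
    # All pairwise distances and squared half-differences.
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)
    d, sq = d[iu], sq[iu]
    lags, semivars = [], []
    for lo in np.arange(0.0, max_lag, lag_width):
        mask = (d >= lo) & (d < lo + lag_width)
        if mask.any():
            lags.append(d[mask].mean())
            semivars.append(sq[mask].mean())
    return np.array(lags), np.array(semivars)
```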

Relevance:

100.00%

Publisher:

Abstract:

As part of the European Commission (EC)'s revision of the Sewage Sludge Directive and the development of a Biowaste Directive, there was recognition of the difficulty of comparing data from Member States (MSs) because of differences in sampling and analytical procedures. The 'HORIZONTAL' initiative, funded by the EC and MSs, seeks to address these differences in approach and to produce standardised procedures in the form of CEN standards. This article is a preliminary investigation into aspects of the sampling of biosolids, composts and soils with a history of biosolid application. It provides information on the measurement uncertainty associated with sampling from heaps, large bags, pipes and soils in the landscape under a limited set of conditions, using sampling approaches in space and time, and sample numbers, based on procedures widely used in the relevant industries when sampling similar materials. These preliminary results suggest that considerably more information is required before the appropriate sample design, the optimum number of samples, the number of samples comprising a composite, and the temporal and spatial frequency of sampling can be recommended so as to achieve consistent results with a high level of precision and confidence.

Relevance:

100.00%

Publisher:

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
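
As an illustration of how a hierarchical analysis of variance yields a first estimate of the variogram, the sketch below accumulates REML variance components from the finest stage upward, following the usual nested-sampling reasoning; the spacings and component values in the example are hypothetical, not results from this study.

```python
import numpy as np

def variogram_from_nested_components(spacings, variance_components):
    """First estimate of the variogram from a hierarchical (nested) ANOVA.

    spacings: stage separating distances, coarsest to finest (m).
    variance_components: REML variance component for each stage, same order.
    The semivariance at a stage's spacing is approximated by the sum of that
    stage's component and all finer-stage components.
    """
    comps = np.asarray(variance_components, dtype=float)
    # Reverse cumulative sum: gamma(d_i) ~ components from stage i downward.
    gamma = comps[::-1].cumsum()[::-1]
    return np.asarray(spacings, dtype=float), gamma

# Example with hypothetical spacings (m) and variance components:
lags, gamma = variogram_from_nested_components(
    [60.0, 20.0, 6.0, 2.0], [0.10, 0.25, 0.30, 0.15])
```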

Relevance:

100.00%

Publisher:

Abstract:

Cloud cover is conventionally estimated from satellite images as the observed fraction of cloudy pixels. Active instruments such as radar and Lidar observe in narrow transects that sample only a small percentage of the area over which the cloud fraction is estimated. As a consequence, the fraction estimate has an associated sampling uncertainty, which usually remains unspecified. This paper extends a Bayesian method of cloud fraction estimation, which also provides an analytical estimate of the sampling error. This method is applied to test the sensitivity of this error to sampling characteristics, such as the number of observed transects and the variability of the underlying cloud field. The dependence of the uncertainty on these characteristics is investigated using synthetic data simulated to have properties closely resembling observations of the spaceborne Lidar NASA-LITE mission. Results suggest that the variance of the cloud fraction is greatest for medium cloud cover and least when conditions are mostly cloudy or clear. However, there is a bias in the estimation, which is greatest around 25% and 75% cloud cover. The sampling uncertainty is also affected by the mean lengths of clouds and of clear intervals; shorter lengths decrease uncertainty, primarily because there are more cloud observations in a transect of a given length. Uncertainty also falls with increasing number of transects. Therefore a sampling strategy aimed at minimizing the uncertainty in transect-derived cloud fraction will have to take into account both the cloud and clear-sky length distributions as well as the cloud fraction of the observed field. These conclusions have implications for the design of future satellite missions. This paper describes the first integrated methodology for the analytical assessment of sampling uncertainty in cloud fraction observations from forthcoming spaceborne radar and Lidar missions such as NASA's Calipso and CloudSat.
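
A heavily simplified sketch of the Bayesian idea is shown below: a Beta-Binomial posterior for the cloud fraction computed from the cloudy-pixel count in the observed transects. The paper's estimator additionally accounts for the cloud and clear-sky length distributions, which the independence assumption here ignores, so this variance understates the true sampling uncertainty.

```python
from scipy import stats

def cloud_fraction_posterior(n_cloudy, n_total, a_prior=1.0, b_prior=1.0):
    """Simplified Beta-Binomial posterior for cloud fraction from transect pixels.

    Assumes independent pixels and a Beta(a, b) prior -- a deliberate
    simplification of the paper's method, which models the correlation
    implied by cloud and clear-sky length distributions.
    """
    a = a_prior + n_cloudy
    b = b_prior + (n_total - n_cloudy)
    post = stats.beta(a, b)
    return post.mean(), post.var()   # point estimate and its sampling variance

# Hypothetical counts: 420 cloudy pixels out of 1000 transect pixels.
mean_cf, var_cf = cloud_fraction_posterior(n_cloudy=420, n_total=1000)
```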

Relevance:

100.00%

Publisher:

Abstract:

This paper first explores the potential of a new evolutionary method, the Cross-Entropy (CE) method, for solving continuous inverse electromagnetic problems. For this purpose, an adaptive updating formula for the smoothing parameter, a mutation operation, and a new termination criterion are proposed. The proposed CE-based metaheuristic is applied to reduce the ripple of the magnetic levitation forces of a prototype Maglev system. The numerical results show that the ripple of the magnetic levitation forces of the prototype system is reduced significantly after design optimization using the proposed algorithm.
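
A minimal sketch of a Cross-Entropy loop for continuous minimization is given below. The Gaussian sampling model, elite fraction, smoothing schedule, noise-injection "mutation" and stopping rule are generic stand-ins for the adaptive updating formula, mutation operation and termination criterion proposed in the paper, and the quadratic objective is a placeholder for the levitation-force ripple evaluated by a field solver.

```python
import numpy as np

def cross_entropy_minimize(f, mu0, sigma0, n_samples=100, n_elite=10,
                           alpha=0.7, iters=50, seed=None):
    """Minimal Cross-Entropy (CE) loop for continuous minimization.

    Fixed smoothing on the mean and decaying smoothing on the standard
    deviation stand in for the paper's adaptive updating formula; the
    small noise injection plays the role of a mutation operation.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = np.array(mu0, float), np.array(sigma0, float)
    for t in range(1, iters + 1):
        x = rng.normal(mu, sigma, size=(n_samples, mu.size))
        elite = x[np.argsort([f(xi) for xi in x])[:n_elite]]
        beta = alpha * (1.0 - (1.0 - 1.0 / t) ** 5)    # decays over iterations
        mu = alpha * elite.mean(axis=0) + (1 - alpha) * mu
        sigma = beta * elite.std(axis=0) + (1 - beta) * sigma
        sigma += 1e-3 / t                              # mutation: keep exploring
        if sigma.max() < 1e-6:                         # simple stopping rule
            break
    return mu

# Placeholder objective standing in for the levitation-force ripple:
best = cross_entropy_minimize(lambda v: np.sum((v - 1.2) ** 2),
                              mu0=np.zeros(3), sigma0=np.ones(3))
```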

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

This paper deals with the joint economic design of x̄ and R charts when the occurrence times of assignable causes follow Weibull distributions with increasing failure rates. The variable quality characteristic is assumed to be normally distributed and the process is subject to two independent assignable causes (such as tool wear-out, overheating, or vibration). One cause changes the process mean and the other changes the process variance. However, the occurrence of one kind of assignable cause does not preclude the occurrence of the other. A cost model is developed and a non-uniform sampling interval scheme is adopted. A two-step search procedure is employed to determine the optimum design parameters. Finally, a sensitivity analysis of the model is conducted, and the cost savings associated with the use of non-uniform sampling intervals instead of constant sampling intervals are evaluated.
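
One well-known way to construct such non-uniform intervals, sketched below, keeps the integrated Weibull hazard (and hence the conditional probability of an assignable cause occurring) constant over every interval, so the intervals shrink when the shape parameter exceeds one. Whether this coincides exactly with the intervals optimized in the paper is an assumption; the scheme is shown only as an illustration.

```python
import numpy as np

def weibull_sampling_times(h1, beta, n_intervals):
    """Non-uniform sampling times with constant integrated Weibull hazard.

    For a Weibull shape beta > 1 (increasing failure rate), choosing
    t_j = h1 * j**(1/beta) gives every interval the same conditional
    probability of containing the assignable cause, so intervals shrink
    as the process ages. h1 is the first sampling interval. Illustrative
    of non-uniform schemes in the economic-design literature, not
    necessarily the exact intervals used in this paper.
    """
    j = np.arange(1, n_intervals + 1)
    times = h1 * j ** (1.0 / beta)
    intervals = np.diff(np.concatenate(([0.0], times)))
    return times, intervals

# Example: shape beta = 2, first interval of 1 hour, eight samples.
times, intervals = weibull_sampling_times(h1=1.0, beta=2.0, n_intervals=8)
```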

Relevance:

100.00%

Publisher:

Abstract:

We report dramatic sensitivity enhancements in multidimensional MAS NMR spectra by the use of nonuniform sampling (NUS) and introduce maximum entropy interpolation (MINT) processing, which assures the linearity between the time and frequency domains of the NUS-acquired data sets. A systematic analysis of sensitivity and resolution in 2D and 3D NUS spectra reveals that with NUS, at least 1.5- to 2-fold sensitivity enhancement can be attained in each indirect dimension without compromising the spectral resolution. These enhancements are similar to or higher than those attained by the newest-generation commercial cryogenic probes. We explore the benefits of this NUS/MaxEnt approach in proteins and protein assemblies using 1-73-(U-13C,15N)/74-108-(U-15N) Escherichia coli thioredoxin reassembly. We demonstrate that in thioredoxin reassembly, NUS permits acquisition of high-quality 3D NCACX spectra, which are inaccessible with conventional sampling due to prohibitively long experiment times. Of critical importance, issues that hinder NUS-based SNR enhancement in 3D NMR of liquids are mitigated in the study of solid samples, in which theoretical enhancements on the order of 3- to 4-fold are accessible by compounding the NUS-based SNR enhancement of each indirect dimension. NUS/MINT is anticipated to be widely applicable and advantageous for multidimensional heteronuclear MAS NMR spectroscopy of proteins, protein assemblies, and other biological systems.
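
For concreteness, below is a sketch of one common way to build a non-uniform sampling schedule for an indirect dimension, with sampling probabilities biased exponentially toward early, high-signal increments; the sampled fraction and decay constant are illustrative assumptions, and the schedules actually used in the paper may differ.

```python
import numpy as np

def nus_schedule(n_points, fraction=0.3, t2_decay=0.3, seed=None):
    """Generate a 1D non-uniform sampling (NUS) schedule.

    Points are drawn without replacement with exponentially biased
    probabilities (matched to an assumed signal decay), a common NUS
    strategy; fraction and decay constant are illustrative only.
    """
    rng = np.random.default_rng(seed)
    idx = np.arange(n_points)
    weights = np.exp(-t2_decay * idx / n_points)
    probs = weights / weights.sum()
    n_keep = max(1, int(round(fraction * n_points)))
    keep = rng.choice(idx, size=n_keep, replace=False, p=probs)
    return np.sort(keep)

# e.g. keep 30% of 128 increments, biased toward early points.
schedule = nus_schedule(128, fraction=0.3)
```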

Relevance:

100.00%

Publisher:

Abstract:

The goal of this manuscript is to introduce a framework for the consideration of designs for population pharmacokinetic or pharmacokinetic-pharmacodynamic studies. A standard one-compartment pharmacokinetic model with first-order input and elimination is considered. A series of theoretical designs is considered that explores the influence of optimizing the allocation of sampling times, allocating patients to elementary designs, the use of sparse sampling and unbalanced designs, and the influence of single- versus multiple-dose designs. It was found that what appears to be relatively sparse sampling (fewer blood samples per patient than the number of fixed-effects parameters to estimate) can also be highly informative. Overall, it is evident that exploring the population design space can yield many parsimonious designs that are efficient for parameter estimation and that might not otherwise have been considered without the aid of optimal design theory.
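
The model referred to above has a closed-form concentration profile; a small sketch of it is given below, with a hypothetical dose, parameter values and a deliberately sparse pair of sampling times (none of these values come from the paper).

```python
import numpy as np

def one_compartment_oral(t, dose, ka, ke, V, F=1.0):
    """One-compartment model with first-order input (ka) and elimination (ke).

    C(t) = F*dose*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)),  ka != ke.
    Parameter values used below are placeholders, not estimates from the paper.
    """
    t = np.asarray(t, dtype=float)
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# A sparse elementary design: two samples per subject at hypothetical times (h),
# fewer samples than the number of fixed-effects parameters (ka, ke, V).
sparse_times = np.array([1.0, 8.0])
conc = one_compartment_oral(sparse_times, dose=100.0, ka=1.0, ke=0.1, V=20.0)
```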

Relevance:

100.00%

Publisher:

Abstract:

Sampling and preconcentration techniques play a critical role in headspace analysis in analytical chemistry. My dissertation presents a novel sampling design, capillary microextraction of volatiles (CMV), that improves the preconcentration of volatiles and semivolatiles in a headspace, offering high throughput, near-quantitative analysis, high recovery and unambiguous identification of compounds when coupled to mass spectrometry. The CMV devices use sol-gel polydimethylsiloxane (PDMS) coated microglass fibers as the sampling/preconcentration sorbent; these fibers are stacked into open-ended capillary tubes. The design allows for dynamic headspace sampling by connecting the device to a hand-held vacuum pump. The inexpensive device can be fitted into a thermal desorption probe for thermal desorption of the extracted volatile compounds into a gas chromatograph-mass spectrometer (GC-MS). The performance of the CMV devices was compared with that of two existing preconcentration techniques, solid phase microextraction (SPME) and planar solid phase microextraction (PSPME). Compared to SPME fibers, the CMV devices have a surface area and phase volume improved by factors of 5000 and 80, respectively. One (1) minute of dynamic CMV air sampling gave performance similar to a 30 min static extraction using a SPME fiber. The PSPME devices have been fashioned to interface easily with ion mobility spectrometers (IMS) for explosives or drugs detection. The CMV devices are shown to offer dynamic sampling and can now be coupled to COTS GC-MS instruments. Several compound classes representing explosives have been analyzed with minimal breakthrough even after a 60 min sampling time. The extracted volatile compounds were retained in the CMV devices when preserved in aluminum foil after sampling. Finally, the CMV sampling devices were used for several different headspace profiling applications, which involved sampling a shipping facility, six illicit drugs, seven military explosives and eighteen different bacteria strains. Successful detection of the target analytes at ng levels of the target signature volatile compounds in these applications suggests that the CMV devices can provide high-throughput qualitative and quantitative analysis with high recovery and unambiguous identification of analytes.

Relevance:

100.00%

Publisher:

Abstract:

Concept evaluation in the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage comes mainly from experts' judgments, which are subjective and imprecise. How to manage this subjectivity to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory and rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle the vagueness of a group decision-making environment. A rough-number-based information entropy method is proposed to determine the relative weights of the evaluation criteria. The composite performance values based on rough numbers are then calculated to rank the candidate design concepts. The results from a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen the objectivity across the decision-making processes.
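
The entropy-based weighting step can be illustrated with the classical crisp Shannon-entropy weights shown below; the paper's contribution is to carry this out on rough-number intervals aggregated from expert judgments, a step omitted here, and the example scores are hypothetical.

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Classical Shannon-entropy criterion weights for a crisp decision matrix.

    Rows are candidate concepts, columns are evaluation criteria (benefit-type,
    positive values assumed). The rough-number aggregation used in the paper
    is omitted here for brevity.
    """
    x = np.asarray(decision_matrix, dtype=float)
    m, n = x.shape
    p = x / x.sum(axis=0, keepdims=True)            # column-normalize scores
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)  # treat 0*log 0 as 0
    e = -plogp.sum(axis=0) / np.log(m)               # entropy of each criterion
    d = 1.0 - e                                      # degree of divergence
    return d / d.sum()                               # relative criterion weights

# Hypothetical scores of three robot concepts on four criteria:
w = entropy_weights([[7, 5, 9, 6], [8, 6, 7, 7], [6, 8, 8, 5]])
```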

Relevance:

100.00%

Publisher:

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as the “networks on wheels”, can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, impose critical challenges of high efficiency and reliability on the implementation of VANETs. This dissertation is motivated by the great application potential of VANETs for the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four lines: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to reliably notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications to exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative position based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone-of-relevance under varying traffic density. Because numerous vehicular sensor data are available in VANETs, the scheme of compressive sampling based data collection (CS-DC) is proposed to efficiently collect spatially relevant data on a large scale, especially in dense traffic. In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation helps push VANETs further toward the stage of massive deployment.
