998 results for Digital Sampling


Relevance: 30.00%

Abstract:

Non-uniform sampling (NUS) has been established as a route to obtaining true sensitivity enhancements when recording indirect dimensions of decaying signals in the same total experimental time as traditional uniform incrementation of the indirect evolution period. Theory and experiments have shown that NUS can yield up to two-fold improvements in the intrinsic signal-to-noise ratio (SNR) of each dimension, while even conservative protocols can yield 20-40% improvements in the intrinsic SNR of NMR data. Applications of biological NMR that can benefit from these improvements are emerging, and in this work we develop some practical aspects of applying NUS nD-NMR to studies that approach the traditional detection limit of nD-NMR spectroscopy. Conditions for obtaining high NUS sensitivity enhancements are considered here in the context of enabling H-1,N-15-HSQC experiments on natural abundance protein samples and H-1,C-13-HMBC experiments on a challenging natural product. Through systematic studies we arrive at more precise guidelines that weigh sensitivity enhancements against reduced line shape constraints, and we report an alternative sampling density based on a quarter-wave sinusoidal distribution that returns the highest fidelity we have seen to date in line shapes obtained by maximum entropy processing of non-uniformly sampled data.
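The quarter-wave sinusoidal density described above can be sketched as a weighted draw over the indirect-dimension increments. The snippet below is a minimal illustration, assuming a density proportional to a cosine over a quarter period; the paper's exact weighting and normalization are not reproduced here, and all names are illustrative:

```python
import numpy as np

def quarter_sine_schedule(n_total, n_keep, seed=0):
    """Draw n_keep of n_total indirect-dimension increments without
    replacement, weighting early increments more heavily with a
    quarter-wave sinusoidal (cosine) density.  Illustrative only."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_total)
    # density is highest at t = 0 and falls toward 0 at the last increment
    weights = np.cos(0.5 * np.pi * t / (n_total - 1))
    p = weights / weights.sum()
    return np.sort(rng.choice(n_total, size=n_keep, replace=False, p=p))

sched = quarter_sine_schedule(256, 64)   # 25% sampling of 256 increments
```

Because the density decays toward the end of the evolution period, most of the retained increments fall where the decaying signal is strongest.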

Relevance: 30.00%

Abstract:

Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Developing an all-encompassing definition for riparian ecotones is challenging because of their high variability. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have utilized fixed-width buffers, but this methodology has proven inadequate, as it takes only the watercourse into consideration and ignores critical geomorphology, associated vegetation and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetland Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model utilizes spatial data readily available from Federal and State agencies and geospatial clearinghouses. An accuracy assessment was performed to quantify the impact of varying the 50-year flood height, of changing the DEM spatial resolution (1, 3, 5 and 10 m), and of positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotones. The result of this study is a robust, automated GIS-based model attached to ESRI ArcMap software to delineate and classify variable-width riparian ecotones.
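As a toy illustration of the floodplain step only, the sketch below flags DEM cells at or below an assumed 50-year flood stage. This is a hypothetical simplification: the thesis model works per stream reach and refines the boundary with SSURGO/NWI data, none of which is reproduced here, and all names are illustrative:

```python
import numpy as np

def floodplain_mask(dem, stream_mask, flood_height):
    """Flag DEM cells lying at or below an approximate 50-year flood
    stage, taken here as the minimum stream elevation plus flood_height.
    Hypothetical simplification of the thesis's per-reach model."""
    stream_elev = dem[stream_mask].min()
    return dem <= stream_elev + flood_height

dem = np.array([[10., 11., 15.],
                [ 9., 12., 16.],
                [ 8., 13., 17.]])
stream = np.zeros_like(dem, dtype=bool)
stream[2, 0] = True          # stream cell at elevation 8 m
mask = floodplain_mask(dem, stream, flood_height=2.0)
```

Varying `flood_height` and the DEM resolution, as in the thesis's accuracy assessment, shifts which cells fall inside the mask.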

Relevance: 30.00%

Abstract:

Proteins are linear chain molecules made of amino acids. Only when they fold into their native states do they become functional. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable study of the protein folding problem in silico. We have developed an enhanced solvation model based on the solution of the Poisson-Boltzmann equation in order to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion and cavitation. All the terms were implemented, analyzed and parametrized individually to obtain a high level of accuracy. In order to describe the thermodynamics of proteins, their conformational space needs to be sampled thoroughly. Simulations of proteins are hampered by slow relaxation due to their rugged free-energy landscape, with the barriers between minima being higher than the thermal energy at physiological temperatures. A number of approaches have been proposed to overcome this problem, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is its easily tunable, high acceptance rate for replica exchange. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its implementation, and present its application to the Trp-cage mini-protein in implicit solvent. We have been able to correctly predict the folding thermodynamics of this protein using our approach.
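For context, the standard canonical replica-exchange (Metropolis) acceptance test that MREMD builds on can be sketched as below; the dissertation's microcanonical variant, which makes the acceptance rate tunable, is not reproduced here:

```python
import math
import random

def rem_accept(E_i, E_j, T_i, T_j, kB=1.0):
    """Metropolis criterion for swapping replicas i and j in canonical
    replica exchange: accept with probability
    min(1, exp[(beta_i - beta_j) * (E_i - E_j)]) where beta = 1/(kB*T)."""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    return delta >= 0 or random.random() < math.exp(delta)
```

A swap is always accepted when the colder replica would receive the lower energy; otherwise it is accepted with Boltzmann-weighted probability, which preserves the canonical ensemble at each temperature.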

Relevance: 30.00%

Abstract:

We present a technique for online compression of ECG signals using the Golomb-Rice encoding algorithm. This is facilitated by a novel time-encoding asynchronous analog-to-digital converter targeted at low-power, implantable, long-term biomedical sensing applications. In contrast to capturing the actual signal (voltage) values, the asynchronous time encoder captures and encodes the times at which predefined changes occur in the signal, thereby minimizing the sensor's energy use and the number of bits stored to represent the information, since unnecessary samples are never captured. The time encoder transforms the ECG signal into pure time information with a geometric distribution, so that the Golomb-Rice encoding algorithm can be used to further compress the data. An overall online compression ratio of about 6:1 is achievable without the computations usually associated with most compression methods.
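The Golomb-Rice step itself is standard: a nonnegative integer n is split by a power-of-two divisor 2^k into a unary-coded quotient and a k-bit binary remainder. A minimal sketch, with the parameter k chosen illustratively rather than from the paper:

```python
def rice_encode(n, k):
    """Golomb-Rice code of a nonnegative integer n with divisor 2**k:
    quotient q in unary ('1' * q + '0'), remainder in k binary bits.
    Geometrically distributed inter-event times, like those produced
    by the time encoder, compress well under this code."""
    q = n >> k                      # quotient
    r = n & ((1 << k) - 1)          # remainder
    return '1' * q + '0' + (format(r, '0{}b'.format(k)) if k else '')
```

For example, an inter-event interval of 9 sample periods with k = 2 encodes as '110' (quotient 2) followed by '01' (remainder 1), five bits in total.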

Relevance: 30.00%

Abstract:

This thesis investigates changes in the length, area and volume of Gepatschferner since the last glacier maximum in 1850. Changes are discussed for the following time periods: 1850-1922, 1922-1971, 1971-1997 and 1997-2006. Digital elevation models were created for 1850 from geomorphological data and for 1922 and 1971 from historical maps. Existing DEMs for 1997 and 2006 were further analysed. Since 1850 Gepatschferner has retreated by 2 km in length and has lost 32% of its area and 36% of its volume. The rate of volume loss is increasing faster than the rate of area loss, and losses in the upper regions of the glacier are becoming increasingly important to overall losses. The largest losses per 50 m elevation increment occur at the tongue. These losses are greatest in the most recent time step studied, 1997-2006, and exceed previous values by 40% and more. The data set includes the glacier margins and elevation models compiled within the thesis (the DEMs of 1997 and 2006 are part of the glacier inventories; the length changes are part of the length change database of the Austrian Alpine Club).

Relevance: 30.00%

Abstract:

This data set provides a high-resolution digital elevation model (DEM) of a thermokarst depression (~7 km²) on ice-complex deposits in the Arctic Lena Delta, Siberia. The DEM is based on a geodetic field survey and was used for quantitative land surface analyses and a detailed description of the thermokarst depression morphology. Detailed morphometric analyses, volume calculations, and solar radiation modeling were performed and statistically analyzed by Ulrich et al. (2010) to investigate the asymmetrical thermokarst depression development and directed lake migration previously proposed by Morgenstern et al. (2008). Furthermore, the high-resolution DEM in combination with satellite data allowed detailed analyses of spatial and temporal landscape changes due to thermokarst development (Günther, 2009).

Relevance: 30.00%

Abstract:

We consider the problem of developing efficient sampling schemes for multiband sparse signals. Previous results on multicoset sampling implementations that lead to universal sampling patterns (which guarantee perfect reconstruction) are based on a set of appropriately interleaved analog-to-digital converters, all of them operating at the same sampling frequency. In this paper we propose an alternative multirate synchronous implementation of multicoset codes; that is, all the analog-to-digital converters in the sampling scheme operate at different sampling frequencies, without the need to introduce any delays. The interleaving is achieved through the use of different rates, whose sum is significantly lower than the Nyquist rate of the multiband signal. To obtain universal patterns, the sampling matrix is formulated and analyzed. Appropriate choices of the parameters, namely the block length and the sampling rates, are also proposed.
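A multicoset scheme that keeps p of L interleaved phases of the Nyquist grid has an average rate of p/L times the Nyquist rate. The sketch below computes this for an illustrative pattern; the paper's multirate implementation, which distributes this budget across ADCs running at different rates, is not reproduced here:

```python
from fractions import Fraction

def multicoset_rate(L, cosets, f_nyq):
    """Average sampling rate of a multicoset scheme keeping the listed
    cosets out of L interleaved phases of a Nyquist-rate grid: each
    kept coset contributes f_nyq / L to the average rate."""
    assert len(set(cosets)) == len(cosets) and all(0 <= c < L for c in cosets)
    return Fraction(len(cosets), L) * f_nyq

# keeping 4 of 16 cosets of a 320 MHz Nyquist grid needs only 80 MHz on average
rate = multicoset_rate(16, (0, 3, 7, 12), 320)
```

Universality then amounts to choosing the coset pattern so that the resulting sampling matrix permits perfect reconstruction for any band occupancy of the given sparsity.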

Relevance: 30.00%

Abstract:

Many problems in digital communications involve wideband radio signals. As the most recent example, the impressive advances in Cognitive Radio systems make the development of sampling schemes for wideband radio signals with spectral holes even more necessary. This is equivalent to considering a sparse multiband signal in the framework of Compressive Sampling theory. Starting from previous results on multicoset sampling and recent advances in compressive sampling, we analyze the matrix involved in the corresponding reconstruction equation and define a new method for the design of universal multicoset codes, that is, codes guaranteeing perfect reconstruction of the sparse multiband signal.

Relevance: 30.00%

Abstract:

Acknowledgements We would like to thank Erik Rexstad and Rob Williams for useful reviews of this manuscript. The collection of visual and acoustic data was funded by the UK Department of Energy & Climate Change, the Scottish Government, Collaborative Offshore Wind Research into the Environment (COWRIE) and Oil & Gas UK. Digital aerial surveys were funded by Moray Offshore Renewables Ltd and additional funding for analysis of the combined datasets was provided by Marine Scotland. Collaboration between the University of Aberdeen and Marine Scotland was supported by MarCRF. We thank colleagues at the University of Aberdeen, Moray First Marine, NERI, Hi-Def Aerial Surveying Ltd and Ravenair for essential support in the field, particularly Tim Barton, Bill Ruck, Rasmus Nielson and Dave Rutter. Thanks also to Andy Webb, David Borchers, Len Thomas, Kelly McLeod, David L. Miller, Dinara Sadykova and Thomas Cornulier for advice on survey design and statistical approaches.

Data Accessibility Data are available from the Dryad Digital Repository: http://dx.doi.org/10.5061/dryad.cf04g

Relevance: 30.00%

Abstract:

The world's largest fossil oyster reef, formed by the giant oyster Crassostrea gryphoides and located in Stetten (north of Vienna, Austria), has been studied by Harzhauser et al. (2015, 2016) and Djuricic et al. (2016). Digital documentation of this unique geological site is provided by terrestrial laser scanning (TLS) at the millimeter scale. Obtaining meaningful results is not merely a matter of data acquisition with a suitable device; it requires proper planning, data management, and postprocessing. Terrestrial laser scanning technology has a high potential for providing precise 3D mapping that serves as the basis for automatic object detection in different scenarios; however, it faces challenges in the presence of large amounts of data and the irregular geometry of an oyster reef. We provide a detailed description of the techniques and strategy used for data collection and processing in Djuricic et al. (2016). Laser scanning provided the ability to measure surface points of an estimated 46,840 shells. These oyster specimens are up to 60 cm long, and their surfaces are modeled with a high accuracy of 1 mm. In addition to the laser scanning measurements, more than 300 photographs were captured, and an orthophoto mosaic was generated with a ground sampling distance (GSD) of 0.5 mm. This high-resolution 3D information and the photographic texture serve as the basis for ongoing and future geological and paleontological analyses. Moreover, they provide unprecedented documentation for conservation issues at a unique natural heritage site.

Relevance: 30.00%

Abstract:

The need for low bit-rate speech coding is the result of growing demand on the available radio bandwidth for mobile communications, both for military purposes and for the public sector. To meet this growing demand, the available bandwidth must be utilized in the most economic way to accommodate more services. Two low bit-rate speech coders have been built and tested in this project. The two coders combine predictive coding with delta modulation, a property which enables them to meet the low bit-rate and good speech quality requirements simultaneously. To enhance their efficiency, the predictor coefficients and the quantizer step size are updated periodically in each coder. This enables the coders to keep up with changes in the characteristics of the speech signal over time and with changes in the dynamic range of the speech waveform. However, the two coders differ in the method of updating their predictor coefficients. One updates the coefficients once every one hundred sampling periods and extracts the coefficients from input speech samples; this is known in this project as the Forward Adaptive Coder. Since the coefficients are extracted from input speech samples, they must be transmitted to the receiver to reconstruct the transmitted speech, thus adding to the transmission bit rate. The other updates its coefficients every sampling period, based on information in the output data; this coder is known as the Backward Adaptive Coder. Results of subjective tests showed both coders to be reasonably robust to quantization noise. Both were graded quite good, with the Forward Adaptive Coder performing slightly better, but at a slightly higher transmission bit rate for the same speech quality, than its Backward counterpart. The coders yielded acceptable speech quality at 9.6 kbps for the Forward Adaptive Coder and 8 kbps for the Backward Adaptive Coder.
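A backward-adaptive delta modulator of the kind described, where the step size is adapted purely from past output bits so no side information needs to be transmitted, can be sketched as follows; the adaptation constants are illustrative, not taken from the thesis:

```python
def adm_encode(samples, step=1.0, grow=1.5, shrink=0.66):
    """Backward-adaptive delta modulation sketch: a 1-bit quantizer
    whose step size is adapted only from past output bits, as in the
    Backward Adaptive Coder (no side information transmitted).
    The grow/shrink constants are illustrative assumptions."""
    bits, est, prev = [], 0.0, None
    for x in samples:
        b = 1 if x >= est else 0    # 1-bit quantization of the error sign
        est += step if b else -step # track the input with +/- one step
        # grow the step during slope overload (repeated bits), else shrink
        step *= grow if b == prev else shrink
        prev = b
        bits.append(b)
    return bits
```

The receiver runs the same estimate-and-adapt loop driven by the received bits, so encoder and decoder step sizes stay in lockstep without any transmitted coefficients.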

Relevance: 30.00%

Abstract:

Fire debris evidence is submitted to crime laboratories to determine if an ignitable liquid (IL) accelerant was used to commit arson. An ignitable liquid residue (ILR) may be difficult to analyze due to interferences, complex matrices, degradation, and low concentrations of analytes. Debris from an explosion and pre-detonated explosive compounds are not trivial to detect and identify due to sampling difficulties, complex matrices, and extremely low amounts (nanograms) of material present. The focus of this research is improving the sampling and detection of ILR and explosives through enhanced sensitivity, selectivity, and field-portable instrumentation. Solid Phase MicroExtraction (SPME) enhanced the extraction of ILR by two orders of magnitude over conventional activated charcoal strip (ACS) extraction. Gas chromatography tandem mass spectrometry (GC/MS/MS) improved the sensitivity for ILR by one order of magnitude and for explosives by two orders of magnitude compared to gas chromatography mass spectrometry (GC/MS). The improvements in sensitivity were attributed to enhanced selectivity. An interface joining SPME to ion mobility spectrometry (IMS) has been constructed and evaluated to improve field detection of hidden explosives. The SPME-IMS interface improved the detection of volatile and semi-volatile explosive compounds and successfully adapted the IMS from a particle sampler into a vapor sampler.

Relevance: 30.00%

Abstract:

The 9/11 Act mandates the inspection of 100% of cargo shipments entering the U.S. by 2012 and 100% inspection of air cargo by March 2010. So far, only 5% of inbound shipping containers are inspected thoroughly while air cargo inspections have fared better at 50%. Government officials have admitted that these milestones cannot be met since the appropriate technology does not exist. This research presents a novel planar solid phase microextraction (PSPME) device with enhanced surface area and capacity for collection of the volatile chemical signatures in air that are emitted from illicit compounds for direct introduction into ion mobility spectrometers (IMS) for detection. These IMS detectors are widely used to detect particles of illicit substances and do not have to be adapted specifically to this technology. For static extractions, PDMS and sol-gel PDMS PSPME devices provide significant increases in sensitivity over conventional fiber SPME. Results show a 50–400 times increase in mass detected of piperonal and a 2–4 times increase for TNT. In a blind study of 6 cases suspected to contain varying amounts of MDMA, PSPME-IMS correctly detected 5 positive cases with no false positives or negatives. One of these cases had minimal amounts of MDMA resulting in a false negative response for fiber SPME-IMS. A La (dihed) phase chemistry has shown an increase in the extraction efficiency of TNT and 2,4-DNT and enhanced retention over time. An alternative PSPME device was also developed for the rapid (seconds) dynamic sampling and preconcentration of large volumes of air for direct thermal desorption into an IMS. This device affords high extraction efficiencies due to strong retention properties under ambient conditions resulting in ppt detection limits when 3.5 L of air are sampled over the course of 10 seconds. 
Dynamic PSPME was used to sample the headspace over the following: MDMA tablets (12–40 ng of piperonal detected), high explosives (Pentolite) (0.6 ng of TNT detected), and several smokeless powders (26–35 ng of 2,4-DNT and 11–74 ng of DPA detected). PSPME-IMS technology is flexible to end-user needs, low-cost, rapid, sensitive, easy to use, easy to implement, and effective.

Relevance: 30.00%

Abstract:

Human scent and human remains detection canines are used to locate living or deceased humans under many circumstances. Human scent canines locate individual humans on the basis of their unique scent profile, while human remains detection canines locate the general scent of decomposing human remains. Scent evidence is often collected by law enforcement agencies using a Scent Transfer Unit (STU-100), a dynamic headspace concentration device. The goals of this research were to evaluate the STU-100 for the collection of human scent samples, to apply this method to the collection of living and deceased human samples, and to create canine training aids. The airflow rate and collection material used with the STU-100 were evaluated using a novel scent delivery method: Controlled Odor Mimic Permeation Systems were created containing representative standard compounds delivered at known rates, improving the reproducibility of the optimization experiments. Flow rates and collection materials were compared. Higher airflow rates usually yielded significantly fewer total volatile compounds due to compound breakthrough through the collection material. Collection from polymer- and cellulose-based materials demonstrated that the molecular backbone of the material is a factor in the trapping and releasing of compounds. The weave of the material also affects compound collection, as materials with a tighter weave demonstrated enhanced collection efficiencies. Using the optimized method, volatiles were efficiently collected from living and deceased humans. Replicates of the living human samples showed good reproducibility; however, the odor profiles of individuals were not always distinguishable from one another. Analysis of the human remains samples revealed similarity in the type and ratio of compounds.
Two types of prototype training aids were developed using combinations of pure compounds, as well as volatiles from actual human samples concentrated onto sorbents, and were subsequently used in field tests. The pseudo-scent aids had moderate success in field tests, and the odor pad aids had significant success. This research demonstrates that the STU-100 is a valuable tool for dog handlers and as a field instrument; however, modifications are warranted to improve its performance as a method for instrumental detection.