7 results for Sampling method

in Digital Commons - Michigan Tech


Relevance:

100.00%

Publisher:

Abstract:

Direct sampling methods are increasingly being used to solve the inverse medium scattering problem, i.e., to estimate the shape of a scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin, and Zou. In this report, we present analytic and numerical studies of this direct sampling method. The method was found to be effective in general, although the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
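The core of a direct sampling method of this type is an indicator function that correlates the measured scattered field with a point-source field at each sampling point; by the Cauchy-Schwarz inequality the normalized index is at most 1 and peaks near the scatterer. The minimal numpy sketch below uses a simplified far-field kernel in place of the exact Hankel-function fundamental solution, and Born-type synthetic point-scatterer data; the scatterer location, receiver layout, and wavenumber are illustrative assumptions, not values from the report:

```python
import numpy as np

def kernel(k, pts, z):
    """Simplified far-field stand-in for the 2D Helmholtz fundamental
    solution: exp(ik|x - z|) / sqrt(|x - z|)."""
    r = np.linalg.norm(pts - z, axis=-1)
    return np.exp(1j * k * r) / np.sqrt(r)

def dsm_index(k, receivers, u_s, z):
    """Normalized indicator |<u_s, g_z>| / (||u_s|| ||g_z||)."""
    g = kernel(k, receivers, z)
    return np.abs(np.vdot(g, u_s)) / (np.linalg.norm(u_s) * np.linalg.norm(g))

# One incident wave; Born-type data from a point scatterer at z0,
# recorded at 64 receivers on a circle of radius 5 (all illustrative).
k = 2 * np.pi
z0 = np.array([0.6, -0.3])
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
receivers = 5.0 * np.column_stack([np.cos(theta), np.sin(theta)])
u_s = kernel(k, receivers, z0)

# Scan a sampling grid; the index should peak at the point nearest z0.
grid = np.linspace(-1.0, 1.0, 41)
best_z, best_val = None, -1.0
for gx in grid:
    for gy in grid:
        val = dsm_index(k, receivers, u_s, np.array([gx, gy]))
        if val > best_val:
            best_z, best_val = (gx, gy), val
```

With data of this synthetic form the indicator equals 1 exactly at the true location, so the grid maximum recovers z0; with real scattered-field data the peak is only approximate, which is where the exceptions studied in the report arise.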

Relevance:

70.00%

Publisher:

Abstract:

Mobile sensor networks have unique advantages over static wireless sensor networks: mobility enables the sensors to flexibly reconfigure themselves to meet sensing requirements. In this dissertation, an adaptive sampling method for mobile sensor networks is presented. To account for sensing resource constraints, limited computing ability, and onboard energy limitations, the adaptive sampling method follows a down-sampling scheme, which reduces the total number of measurements and lowers sampling cost. Compressive sensing is a recently developed down-sampling technique that uses a small number of randomly distributed measurements for signal reconstruction. Such condensed measurements fall below the rate required by the Shannon sampling theorem, so the original signal cannot be reconstructed from them directly; instead, the measurements must be processed in a domain where the signal is sparse, and convex optimization methods applied to reconstruct the original signal. The restricted isometry property guarantees that signals can be recovered with little information loss. While compressive sensing can effectively lower sampling cost, signal reconstruction remains a great research challenge. Compressive sensing collects random measurements, whose information content cannot be determined a priori. If each measurement is instead optimized to be the most informative one, reconstruction performance improves substantially. Based on these considerations, this dissertation focuses on an adaptive sampling approach that finds the most informative measurements in unknown environments and reconstructs the original signals. With mobile sensors, measurements are collected sequentially, giving the chance to optimize each of them individually. When a mobile sensor is about to collect a new measurement from the surrounding environment, existing information is shared among the networked sensors so that each sensor has a global view of the entire environment.
The shared information is analyzed in the Haar wavelet domain, in which most natural signals appear sparse, to infer a model of the environment. The most informative measurement can then be determined by optimizing the model parameters. As a result, every measurement collected by the mobile sensor network is the most informative one given the existing information, and a perfect reconstruction can be expected. To present the adaptive sampling method, a series of research issues is addressed, including measurement evaluation and collection, mobile network establishment, data fusion, sensor motion, and signal reconstruction. Two-dimensional scalar fields are reconstructed using the proposed method. Both single mobile sensors and mobile sensor networks are deployed in the environment, and the reconstruction performance of the two is compared. In addition, a particular mobile sensor, a quadrotor UAV, is developed so that the adaptive sampling method can be used in three-dimensional scenarios.
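The compressive sensing pipeline described above — random condensed measurements, a sparse representation, and an optimization-based decoder — can be sketched in a few lines. This toy example uses a Gaussian sensing matrix (which satisfies the restricted isometry property with high probability) and, for brevity, substitutes greedy orthogonal matching pursuit for convex l1 minimization and an identity sparsifying basis for the Haar wavelet transform; all sizes and the random seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 128, 50, 4                  # signal length, measurements, sparsity

# An s-sparse ground-truth signal and its condensed random measurements.
x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
y = A @ x                                       # m << n measurements

def omp(A, y, s):
    """Greedy sparse recovery: pick the column most correlated with the
    residual, then re-fit by least squares on the chosen support."""
    residual, support = y.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, s)
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)   # near-exact recovery
```

With m well above the s log(n) regime, the greedy decoder recovers the support exactly and the relative error is at numerical precision; the adaptive approach in the dissertation aims to do better still by choosing each measurement rather than drawing it at random.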

Relevance:

60.00%

Publisher:

Abstract:

Assessment of soil disturbance on the Custer National Forest was conducted over two summers to determine whether the U.S. Forest Service Forest Soil Disturbance Monitoring Protocol (FSDMP) could distinguish post-harvest soil conditions in a chronological sequence of sites harvested using different ground-based logging systems. Results from the first year of sampling suggested that the FSDMP point sampling method may not be sensitive enough to measure post-harvest disturbance in stands with low levels of disturbance. Therefore, a revised random-transect method was used during the second sampling season to determine the actual extent of soil disturbance in these cutting units. Using the combined data from both summers, I detected statistically significant differences (p < 0.05) in fine-fraction bulk density between FSDMP disturbance classes across all sites. Disturbance class 3 (most severe) had the highest bulk density, which suggests that the FSDMP visual class estimates are defined well enough to allow correlations between visual disturbance and actual soil physical characteristics. Forest site productivity can be defined by a site's ability to retain carbon and convert it to above- and belowground biomass. However, forest management activities that alter basic site characteristics have the potential to alter productivity. Soil compaction is one critical management impact that is important to understand: compaction has been shown to impede the root-growth potential of plants, reduce water infiltration rates (increasing erosion potential), and alter plant-available water and nutrients, depending on soil texture. A new method to assess ground cover, erosion, and other soil disturbances, the Forest Soil Disturbance Monitoring Protocol (FSDMP), was recently published by the U.S. Forest Service.
The FSDMP allows soil scientists to visually assign a disturbance class estimate (0 = none to 3 = severe) based on field measures of consistently defined soil disturbance indicators (erosion, fire, rutting, compaction, and platy/massive/puddled structure) in small circular (15 cm) plots, so that soil quality properties can be compared between pre- and post-harvest conditions. Using this protocol, we determined that ground-based timber harvesting activities on the Custer National Forest are not reaching the 15% maximum threshold for detrimental soil disturbance outlined by the Region 1 Soil Quality Standards.
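The Region 1 comparison at the end of the abstract amounts to a simple proportion check against the 15% ceiling. A minimal sketch, with entirely hypothetical disturbance-class calls and the assumption (for illustration only) that classes 2 and 3 count as detrimental:

```python
# Hypothetical FSDMP class calls (0 = none ... 3 = severe) along one transect.
calls = [0, 0, 1, 0, 2, 0, 0, 1, 0, 3, 0, 0, 0, 1, 0, 0, 2, 0, 0, 0]

DETRIMENTAL = {2, 3}   # assumed detrimental classes, for illustration
pct_detrimental = 100.0 * sum(c in DETRIMENTAL for c in calls) / len(calls)

# Region 1 Soil Quality Standards cap detrimental disturbance at 15%.
meets_region1 = pct_detrimental <= 15.0
```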

Relevância:

60.00% 60.00%

Publicador:

Resumo:

Analyzing “nuggety” gold samples commonly produces erratic fire-assay results, due to the random inclusion or exclusion of coarse gold in analytical samples. Preconcentrating gold samples might allow the nuggets to be concentrated and fire assayed separately. In this investigation, synthetic gold samples were made using silica and tungsten powder, which has a similar density to gold, and were preconcentrated using two approaches: an air jig and an air classifier. The current analytical gold sampling method is time- and labor-intensive, and our aim was to design a set-up for rapid testing. The preliminary air classifier design showed more promise than the air jig in terms of control over mineral recovery and preconcentrating bulk ore sub-samples. Hence the air classifier was modified with the goal of producing 10-30 gram samples capturing all of the high-density metallic particles (tungsten in this case). The effects of air velocity and feed rate on the recovery of tungsten from synthetic tungsten-silica mixtures were studied. The air classifier achieved an optimal high-density metal recovery of 97.7% at an air velocity of 0.72 m/s and a feed rate of 160 g/min. The effect of density on classification was investigated by using iron as the dense metal instead of tungsten; recovery dropped from 96.13% to 20.82%. These preliminary investigations suggest that preconcentration of gold samples is feasible using the laboratory-designed air classifier.

Relevance:

60.00%

Publisher:

Abstract:

Peatlands cover only ~3% of the global land area but store ~30% of the world's soil carbon. There are many different peat types that store different amounts of carbon. Most inventories of carbon storage in northern peatlands have been conducted in the expansive Sphagnum-dominated peatlands. However, northern white cedar (NW cedar, Thuja occidentalis L.) peatlands are also one of the most common peatland types in the Great Lakes Region, occupying more than 2 million hectares. NW cedar swamps are understudied, due in part to the difficulty of collection methods; the general lack of rapid, consistent sampling methods has also contributed to the lack of carbon stock quantification for many peatlands. The main objectives of this thesis are: 1) to evaluate peat sampling methods, and 2) to quantify the amount of carbon stored in, and the rates of long-term carbon accumulation in, NW cedar peatlands. We sampled 38 peatlands, separated into four categories (black ash, NW cedar swamp, sedge, and Sphagnum), during the summers of 2011 and 2012 across northern Minnesota and the Upper Peninsula of Michigan. Basal dates of peat indicate that the cedar peatlands were between 1970 and 7790 years old. Cedar peat is generally shallower than Sphagnum peat but, owing to its higher bulk density, holds similar amounts of carbon, with our sites averaging ~800 Mg C ha⁻¹. We estimate that NW cedar peatlands store over 1.7 Gt of carbon in the Great Lakes Region. Each of the six methods evaluated had a different level of accuracy and required varying levels of effort and resources. The depth-only method and the intermittent sampling method were the most accurate methods of peatland sampling.
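Per-hectare peat carbon stocks of this kind come from the usual depth x bulk density x carbon-fraction calculation. A small sketch with illustrative inputs (the depth, bulk density, and carbon fraction below are hypothetical, chosen only to land near the ~800 Mg C ha⁻¹ site average; they are not the thesis's measured values):

```python
def peat_c_stock_mg_ha(depth_m, bulk_density_kg_m3, c_fraction):
    """Carbon stock of a uniform peat layer in Mg C per hectare.
    depth * bulk density * C fraction gives kg C per m^2,
    and 1 kg/m^2 = 10 Mg/ha."""
    return depth_m * bulk_density_kg_m3 * c_fraction * 10.0

# Illustrative cedar-like core: 1 m deep, 160 kg/m^3, 50% carbon by mass.
stock = peat_c_stock_mg_ha(1.0, 160.0, 0.5)   # -> 800.0 Mg C/ha
```

The depth-only method mentioned above effectively fixes the bulk density and carbon fraction at assumed values and varies only the measured depth.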

Relevance:

30.00%

Publisher:

Abstract:

Proteins are linear chain molecules made of amino acids; they become functional only when they fold to their native states. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable in silico study of the protein folding problem. We developed an enhanced solvation model based on the solution to the Poisson-Boltzmann equation to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion, and cavitation. Each term was implemented, analyzed, and parametrized individually to obtain a high level of accuracy. To describe the thermodynamics of proteins, their conformational space must be sampled thoroughly. Protein simulations are hampered by slow relaxation due to a rugged free-energy landscape, with barriers between minima higher than the thermal energy at physiological temperatures. A number of approaches have been proposed to overcome this problem, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is an easily tunable, high acceptance rate for replica exchange. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its implementation, and apply it to the Trp-cage mini-protein in implicit solvent. Using this approach, we were able to correctly predict the folding thermodynamics of this protein.
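For context, the acceptance step at the heart of standard temperature replica exchange — the canonical REM baseline the dissertation builds on, not the MREMD variant itself — is a Metropolis test on the swapped Boltzmann weights:

```python
import math
import random

def swap_accept(beta_i, beta_j, E_i, E_j, u=None):
    """Metropolis criterion for exchanging configurations between replicas
    at inverse temperatures beta_i, beta_j with potential energies E_i, E_j:
    accept with probability min(1, exp(delta)),
    where delta = (beta_i - beta_j) * (E_i - E_j)."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    if delta >= 0:
        return True
    u = random.random() if u is None else u
    return u < math.exp(delta)

# A swap that lets the colder replica (larger beta) shed the higher energy
# is always accepted; the reverse swap succeeds only with probability exp(delta).
always = swap_accept(1.0, 0.5, -5.0, -10.0)          # delta = +2.5 -> True
rarely = swap_accept(1.0, 0.5, -10.0, -5.0, u=0.9)   # exp(-2.5) ~ 0.08 -> False
```

The acceptance rate of this canonical test drops as the temperature ladder gets sparse, which is exactly the knob the MREMD variant claims to make easier to tune.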

Relevance:

30.00%

Publisher:

Abstract:

Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on, and threats to, forest products and services. Forest managers and planners must evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that generalizes sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly, and as a consequence their spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach that couples remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. Forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method called variable plot sampling, in the Ford Forest of Michigan Tech, to derive a standing volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and with extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or to estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
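kNN imputation of this kind attaches field-measured attributes to each target pixel from its nearest reference plots in predictor space. The sketch below substitutes plain Euclidean distance on the predictors for the random-forest proximity that the RF-kNN approach actually uses, and all plot values are made up for illustration:

```python
import numpy as np

def knn_impute(ref_X, ref_y, target_X, k=3):
    """Impute an attribute for each target unit as the mean of the k
    reference plots nearest to it in predictor (feature) space."""
    preds = []
    for x in target_X:
        dist = np.linalg.norm(ref_X - x, axis=1)
        nearest = np.argsort(dist)[:k]
        preds.append(ref_y[nearest].mean())
    return np.array(preds)

# Toy reference plots: one spectral predictor, biomass (Mg/ha) response.
ref_X = np.array([[0.1], [0.2], [0.3], [0.4], [0.5]])
ref_y = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# A target pixel with predictor 0.25 borrows from the 0.2 and 0.3 plots.
pred = knn_impute(ref_X, ref_y, np.array([[0.25]]), k=2)   # -> [25.0]
```

In the RF-kNN variant, the Euclidean distance above is replaced by a dissimilarity derived from how often two observations land in the same random-forest leaves, which lets the distance adapt to whichever predictors matter for the attribute being imputed.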