996 results for adaptive sampling


Relevance: 60.00%

Abstract:

This paper describes a novel Autonomous Surface Vehicle (ASV) capable of navigating complex inland water storages while measuring a range of water quality properties and greenhouse gas emissions. The 16 ft long, solar-powered catamaran can collect this information throughout the water column whilst the vehicle is moving. A unique feature of this ASV is its integration into a storage-scale floating sensor network, which allows remote mission uploads, data download, and adaptive sampling strategies. This paper provides an overview of the vehicle design and operation, including control, laser-based obstacle avoidance, and vision-based inspection capabilities. Experimental results illustrate its ability to continuously collect key water quality parameters and complement intensive manual monitoring campaigns.

Relevance: 60.00%

Abstract:

Water-to-air methane emissions from freshwater reservoirs can be dominated by sediment bubbling (ebullitive) events. Previous work to quantify methane bubbling from a number of Australian subtropical reservoirs has shown that this pathway can contribute as much as 95% of total emissions. These bubbling events are controlled by a variety of factors, including water depth, surface and internal waves, wind seiching, atmospheric pressure changes, and water level changes. Key to quantifying the magnitude of this emission pathway is estimating both the bubbling rate and the areal extent of bubbling. Neither is constant, so persistent monitoring over extended time periods is required before true estimates can be generated. In this paper we present a novel system for persistent monitoring of both bubbling rate and areal extent using multiple robotic surface chambers and adaptive sampling ("grazing") algorithms to automate the quantification process. Individual chambers are self-propelled and self-guided and communicate with each other without the need for supervised control. They can maintain station at a sampling site for a desired incubation period and continuously monitor, record, and report fluxes during the incubation. To exploit the methane sensor's detection capabilities, a chamber can be automatically lowered to decrease the headspace and increase concentration. The grazing algorithms assign a hierarchical order to chambers within a preselected zone. Chambers then converge on the individual recording the highest 15-minute bubbling rate. Individuals maintain a specified distance from each other during each sampling period, after which all individuals are required to move to different locations chosen by a sampling algorithm (systematic or adaptive) exploiting prior measurements. This system has been field tested on a large subtropical reservoir, Little Nerang Dam, over monthly timescales.
Using this technique, localised bubbling zones on the water storage were found to produce over 50,000 mg m⁻² d⁻¹, and the areal extent ranged from 1.8% to 7% of the total reservoir area. The drivers behind these changes, as well as lessons learnt from the system implementation, are presented. The system exploits relatively cheap materials, sensing, and computing, and can be applied to a wide variety of aquatic and terrestrial systems.
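The chamber coordination described above can be sketched as a simple convergence rule: chambers move toward the individual reporting the highest bubbling rate while keeping a standoff distance. This is an illustrative reconstruction, not the authors' code; the function name, 2-D coordinates, and spacing parameter are all hypothetical.

```python
def grazing_step(positions, rates, spacing=10.0):
    """One step of a hypothetical grazing algorithm: every chamber moves
    toward the chamber reporting the highest bubbling rate, stopping at a
    minimum standoff distance (here in metres, a placeholder value)."""
    leader = max(range(len(positions)), key=lambda i: rates[i])
    lx, ly = positions[leader]
    new_positions = []
    for i, (x, y) in enumerate(positions):
        if i == leader:
            new_positions.append((x, y))         # leader holds station
            continue
        dx, dy = lx - x, ly - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= spacing:                      # already at standoff ring
            new_positions.append((x, y))
        else:
            step = dist - spacing                # move up to the ring, not onto the leader
            new_positions.append((x + dx / dist * step, y + dy / dist * step))
    return new_positions
```

Repeating this step after every incubation period gives the convergence behaviour described above; the relocation between periods (systematic or adaptive) would replace the positions wholesale.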

Relevance: 60.00%

Abstract:

We use Bayesian model selection techniques to test extensions of the standard flat LambdaCDM paradigm. Dark-energy and curvature scenarios, and primordial perturbation models, are considered. To that end, we calculate the Bayesian evidence in favour of each model using Population Monte Carlo (PMC), a new adaptive sampling technique which was recently applied in a cosmological context. The Bayesian evidence is immediately available from the PMC sample used for parameter estimation, without further computational effort, and it comes with an associated error estimate. In addition, it provides an unbiased estimator of the evidence after any fixed number of iterations, and it is naturally parallelizable, in contrast with MCMC and nested sampling methods. By comparison with analytical predictions for simulated data, we show that our results obtained with PMC are reliable and robust. The variability of the evidence evaluation and its stability for various cases are estimated both from simulations and from data. For the cases we consider, the log-evidence is calculated with a precision of better than 0.08. Using a combined set of recent CMB, SNIa and BAO data, we find inconclusive evidence between flat LambdaCDM and simple dark-energy models. A curved Universe is moderately to strongly disfavoured with respect to a flat cosmology. Using physically well-motivated priors within the slow-roll approximation of inflation, we find a weak preference for a running spectral index. A Harrison-Zel'dovich spectrum is weakly disfavoured. With the current data, tensor modes are not detected; the large prior volume on the tensor-to-scalar ratio r results in moderate evidence in favour of r = 0.
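The key property exploited above — that the evidence falls out of the importance-sampling weights "for free" — can be illustrated in a few lines. With samples drawn from a proposal q, the evidence is the mean of prior × likelihood over q. The sketch below is a single-iteration toy with assumed interfaces (`log_target` returns log prior + log likelihood); real PMC adapts q over several iterations.

```python
import math
import random

def log_evidence_is(log_target, sample_q, logpdf_q, n=20000, seed=42):
    """One-shot importance-sampling log-evidence estimate:
    Z ~ (1/n) * sum_i target(x_i) / q(x_i), computed in log space for
    numerical stability. PMC would additionally refine q between iterations."""
    rng = random.Random(seed)
    logs = []
    for _ in range(n):
        x = sample_q(rng)
        logs.append(log_target(x) - logpdf_q(x))
    m = max(logs)                                   # log-sum-exp trick
    return m + math.log(sum(math.exp(v - m) for v in logs) / n)
```

For an unnormalized 1-D Gaussian target exp(-x²/2), whose true log-evidence is log √(2π) ≈ 0.919, the estimator recovers the answer to well within a percent with a reasonable proposal.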

Relevance: 60.00%

Abstract:

The abundances and distributions of coastal pelagic fish species in the California Current Ecosystem, from San Diego to southern Vancouver Island, were estimated from combined acoustic and trawl surveys conducted in the spring of 2006, 2008, and 2010. Pacific sardine (Sardinops sagax), jack mackerel (Trachurus symmetricus), and Pacific mackerel (Scomber japonicus) were the dominant coastal pelagic fish species, in that order. Northern anchovy (Engraulis mordax) and Pacific herring (Clupea pallasii) were sampled only sporadically, and therefore estimates for these species were unreliable. The estimates of sardine biomass compared well with those of the annual assessments and confirmed a declining trajectory of the "northern stock" since 2006. During the sampling period, the biomass of jack mackerel was stable or increasing, and that of Pacific mackerel was low and variable. The uncertainties in these estimates are mostly the result of spatial patchiness, which increased from sardine to mackerels to anchovy and herring. Future surveys of coastal pelagic fish species in the California Current Ecosystem should benefit from adaptive sampling based on modeled habitat; increased echosounder and trawl sampling, particularly for the patchiest and most nearshore species; and directed trawl sampling for improved species identification and estimation of their acoustic target strength.

Relevance: 60.00%

Abstract:

Field campaigns are instrumental in providing ground truth for understanding and modeling global ocean biogeochemical budgets. A survey, however, can only inspect a fraction of the global oceans, typically a region hundreds of kilometers wide for a temporal window of the order of (at most) several weeks. This spatiotemporal domain is also the one in which mesoscale activity induces, through horizontal stirring, strong variability in biogeochemical tracers, with ephemeral, local contrasts that can easily mask regional and seasonal gradients. Therefore, whenever local in situ measurements are used to infer larger-scale budgets, one faces the challenge of identifying the mesoscale structuring effect, if not simply filtering it out. In the case of the KEOPS2 investigation of biogeochemical responses to natural iron fertilization, this problem was tackled by designing an adaptive sampling strategy based on regionally optimized multisatellite products analyzed in real time by specifically designed Lagrangian diagnostics. This strategy identified the different mesoscale and stirring structures present in the region and tracked the dynamical frontiers among them. It also enabled back trajectories of the ship-sampled stations to be estimated, providing important insights into the timing and pathways of iron supply, which were explored further using a model based on first-order iron removal. This context was essential for the interpretation of the field results. The mesoscale circulation-based strategy was also validated post-cruise by comparing the Lagrangian maps derived from satellites with the patterns of more than one hundred drifters, including some adaptively released during KEOPS2 and a subsequent research voyage. The KEOPS2 strategy was adapted to the specific biogeochemical characteristics of the region, but its principles are general and will be useful for future in situ biogeochemical surveys.
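The "first-order iron removal" model mentioned above amounts to exponential decay of the iron concentration along a water parcel's back trajectory. A minimal sketch, with an illustrative source concentration and removal rate rather than KEOPS2 values:

```python
import math

def iron_concentration(c0, k, ages_days):
    """First-order removal: dC/dt = -k*C gives C(t) = C0 * exp(-k*t),
    where c0 is the concentration at the fertilization source and t is
    the Lagrangian age of the parcel (time since it left the source)."""
    return [c0 * math.exp(-k * t) for t in ages_days]
```

A parcel sampled 10 days downstream with k = 0.1 d⁻¹ would retain exp(-1) ≈ 37% of its source concentration, which is the kind of timing-and-pathway inference the back trajectories enable.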

Relevance: 60.00%

Abstract:

Karaoke singing is a popular form of entertainment in several parts of the world. Since this genre of performance attracts amateurs, the singing often has artifacts related to scale, tempo, and synchrony. We have developed an approach to correct these artifacts using cross-modal multimedia stream information. We first perform adaptive sampling on the user's rendition, and then use the original singer's rendition, as well as the video caption highlighting information, to correct the pitch, tempo, and loudness. A method of analogies is employed to perform this correction. The basic idea is to manipulate the user's rendition to make it as similar as possible to the original singing. A pre-processing step that removes noise due to feedback and huffing also helps improve the quality of the user's audio. The results described in the paper show the effectiveness of this multimedia approach.

Relevance: 60.00%

Abstract:

For the computation of limit cycle oscillations (LCOs) at transonic speeds, CFD is required to capture the nonlinear flow features present. The Harmonic Balance method provides an effective means for the computation of LCOs, and this paper exploits its efficiency to investigate the impact of variability (both structural and aerodynamic) on the aeroelastic behaviour of a two-degree-of-freedom aerofoil. A Harmonic Balance inviscid CFD solver is coupled with the structural equations and is validated against time-marching analyses. Polynomial chaos expansions are employed for the stochastic investigation as a faster alternative to Monte Carlo analysis. Adaptive sampling is employed when discontinuities are present. Uncertainties in aerodynamic parameters are considered first, followed by the inclusion of structural variability. Results show the nonlinear effect of Mach number and its interaction with the structural parameters on supercritical LCOs. The bifurcation boundaries are well captured by the polynomial chaos expansions.
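The non-intrusive polynomial chaos idea used here can be sketched generically: evaluate the solver at sample points of the uncertain parameter, then fit coefficients of an orthogonal-polynomial basis by least squares. The snippet below is a 1-D Legendre illustration with made-up names, not the paper's solver; the "responses" stand in for quantities such as LCO amplitude at sampled Mach numbers.

```python
import numpy as np

def fit_pce(samples, responses, degree=3):
    """Least-squares fit of a 1-D Legendre polynomial chaos expansion for a
    uniform uncertain parameter mapped to [-1, 1]."""
    V = np.polynomial.legendre.legvander(samples, degree)  # basis matrix
    coeffs, *_ = np.linalg.lstsq(V, responses, rcond=None)
    return coeffs

def eval_pce(coeffs, x):
    """Evaluate the fitted expansion at new parameter values."""
    return np.polynomial.legendre.legval(x, coeffs)
```

Once fitted, the surrogate is cheap to evaluate everywhere; for a uniform input the mean response is simply the leading coefficient, since all higher Legendre modes average to zero.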

Relevance: 60.00%

Abstract:

This work investigates limit cycle oscillations in the transonic regime. A novel approach to predicting limit cycle oscillations using high-fidelity analysis is exploited to accelerate calculations. The method used is an aeroelastic Harmonic Balance approach, which has been proven to be efficient and able to predict periodic phenomena. The behaviour of limit cycle oscillations is analysed using uncertainty quantification tools based on polynomial chaos expansions. To improve the efficiency of the sampling process for the polynomial chaos expansions, an adaptive sampling procedure is used. These methods are exercised on two problems: a pitch/plunge aerofoil and a delta wing. Results indicate that Mach number variability determines the amplitude of the LCO for the 2D test case, whereas for the wing case analysed here, Mach number variability has an almost negligible influence on amplitude variation, and the LCO frequency variability has an almost linear relation with Mach number. Further test cases are required to understand the generality of these results.

Relevance: 60.00%

Abstract:

In realistic image synthesis, the final intensity of a pixel is computed by estimating a multi-dimensional rendering integral. A large portion of the research in this field seeks new techniques to reduce the computational cost of rendering while preserving the fidelity and correctness of the resulting images. In attempting to reduce computational costs to approach real-time rendering, certain complex realistic effects are often left aside or replaced by ingenious but mathematically incorrect tricks. To accelerate rendering, several lines of work have either directly addressed the computation of individual pixels by improving the underlying numerical integration routines, or sought to amortize the cost per image region using adaptive methods based on predictive models of light transport. The objective of this thesis, and of the resulting article, is to build on a method of the latter type [Durand2005] and to advance research in fast adaptive realistic rendering using a Fourier-based analysis of light transport to guide and prioritize ray tracing. We propose an adaptive sampling and reconstruction approach for rendering animated scenes illuminated by environment maps, allowing the reconstruction of effects such as shadows and reflections at all frequency levels, while preserving temporal coherence.

Relevance: 60.00%

Abstract:

In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his or her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with other Bayesian paradigms obtained from optimal design (MacKay, 1992).
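For the monotone case, the flavour of the sequential scheme can be shown with a toy rule: always bisect the interval whose worst-case uncertainty — proportional to width times rise for a monotone function — is largest. This is a paraphrase of the idea for illustration, not the paper's exact algorithm.

```python
def active_sample_monotone(f, n, a=0.0, b=1.0):
    """Actively sample a monotonically increasing f on [a, b]: repeatedly
    bisect the interval with the largest width * rise, i.e. the largest
    worst-case area in which a monotone function could still wander."""
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(n):
        # choose the interval with the largest remaining uncertainty
        i = max(range(len(xs) - 1),
                key=lambda j: (xs[j + 1] - xs[j]) * (ys[j + 1] - ys[j]))
        mid = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, mid)
        ys.insert(i + 1, f(mid))
    return xs, ys
```

A passive learner would place the same n samples uniformly; the active rule instead concentrates samples where the function rises steeply, which is the source of the sample-complexity advantage discussed above.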

Relevance: 60.00%

Abstract:

We propose a method that robustly combines color and feature buffers to denoise Monte Carlo renderings. On one hand, feature buffers, such as per-pixel normals, textures, or depth, are effective in determining denoising filters because features are highly correlated with the rendered image. Filters based solely on features, however, are prone to blurring image details that are not well represented by the features. On the other hand, color buffers represent all details, but they may be less effective for determining filters because they are contaminated by the very noise that is supposed to be removed. We propose to obtain filters using a combination of color and feature buffers in an NL-means and cross-bilateral filtering framework. We determine a robust weighting of colors and features using a SURE-based error estimate. We show significant improvements in subjective and quantitative errors compared to the previous state of the art. We also demonstrate adaptive sampling and space-time filtering for animations.
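The combination of color and feature terms can be illustrated with a single per-pair filter weight in the cross-bilateral spirit described above. The sigmas here are fixed placeholders; the paper's contribution is precisely to set the color/feature balance with a SURE-based error estimate rather than hand-tuned constants.

```python
import math

def cross_bilateral_weight(color_p, color_q, feat_p, feat_q,
                           sigma_c=0.2, sigma_f=0.1):
    """Illustrative filter weight between pixels p and q: a Gaussian
    falloff on the color distance multiplied by one on the feature
    distance (e.g. normal/depth), so a pair must be similar in BOTH
    buffers to contribute strongly."""
    dc = sum((a - b) ** 2 for a, b in zip(color_p, color_q))
    df = sum((a - b) ** 2 for a, b in zip(feat_p, feat_q))
    return math.exp(-dc / (2 * sigma_c ** 2)) * math.exp(-df / (2 * sigma_f ** 2))
```

Making sigma_f small relative to sigma_c reproduces the feature-dominated regime described above; a noisy color buffer then cannot spuriously suppress weights across smooth regions, while genuine feature edges still stop the filter.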

Relevance: 60.00%

Abstract:

An aerodynamic optimization of the ICE 2 high-speed train nose in terms of front-wind action sensitivity is carried out in this paper. The nose is parametrically defined by Bézier curves, and a three-dimensional representation of the nose is obtained using thirty-one design variables. This allows a more complete parametrization, capable of representing a realistic model. A genetic algorithm (GA) is used to perform this study. Using a GA involves a large number of evaluations before the optimum is found, so we propose the use of metamodels, or surrogate models, to replace the Navier-Stokes solver and speed up the optimization process. Adaptive sampling is employed to optimize the surrogate model fit and minimize computational cost when dealing with a very large number of design parameters. The paper demonstrates the feasibility of using a GA in combination with metamodels for the optimization of a realistic high-speed train geometry.
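The surrogate-plus-adaptive-sampling loop described above can be sketched as a toy: fit a cheap model to the expensive evaluations gathered so far, let it rank many candidate designs, and spend the expensive solver only on the most promising one. Everything here is a hypothetical stand-in — a 1-D nearest-neighbour surrogate and random candidates instead of a kriging/RBF model over thirty-one variables driven by a GA.

```python
import random

def surrogate_assisted_search(expensive_f, bounds, budget=20, seed=1):
    """Toy adaptive-sampling optimisation loop: the surrogate screens 200
    cheap candidates per iteration; only the surrogate's pick is passed to
    the expensive solver, which then enriches the surrogate's data."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [lo, hi]                                  # designs evaluated so far
    Y = [expensive_f(lo), expensive_f(hi)]        # their true objective values
    def surrogate(x):                             # nearest-neighbour prediction
        i = min(range(len(X)), key=lambda j: abs(X[j] - x))
        return Y[i]
    for _ in range(budget):
        cands = [rng.uniform(lo, hi) for _ in range(200)]   # cheap to rank
        best = min(cands, key=surrogate)                    # surrogate screens
        X.append(best)
        Y.append(expensive_f(best))                         # one true solve
    i = min(range(len(X)), key=lambda j: Y[j])
    return X[i], Y[i]
```

The economics are the point: per iteration the surrogate is queried 200 times but the "Navier-Stokes solver" only once, which is what makes a GA affordable in this setting.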

Relevance: 40.00%

Abstract:

We present a Bayesian sampling algorithm called adaptive importance sampling or population Monte Carlo (PMC), whose computational workload is easily parallelizable and thus has the potential to considerably reduce the wall-clock time required for sampling, along with providing other benefits. To assess the performance of the approach for cosmological problems, we use simulated and actual data consisting of CMB anisotropies, supernovae of type Ia, and weak cosmological lensing, and provide a comparison of results to those obtained using state-of-the-art Markov chain Monte Carlo (MCMC). For both types of data sets, we find comparable parameter estimates for PMC and MCMC, with the advantage of a significantly lower wall-clock time for PMC. In the case of WMAP5 data, for example, the wall-clock time scale reduces from days for MCMC to hours using PMC on a cluster of processors. Other benefits of the PMC approach, along with potential difficulties in using the approach, are analyzed and discussed.

Relevance: 40.00%

Abstract:

A spatial sampling design that uses pair-copulas is presented that aims to reduce prediction uncertainty by selecting additional sampling locations based on both the spatial configuration of existing locations and the values of the observations at those locations. The novelty of the approach arises in the use of pair-copulas to estimate uncertainty at unsampled locations. Spatial pair-copulas are able to more accurately capture spatial dependence compared to other types of spatial copula models. Additionally, unlike traditional kriging variance, uncertainty estimates from the pair-copula account for influence from measurement values and not just the configuration of observations. This feature is beneficial, for example, for more accurate identification of soil contamination zones where high contamination measurements are located near measurements of varying contamination. The proposed design methodology is applied to a soil contamination example from the Swiss Jura region. A partial redesign of the original sampling configuration demonstrates the potential of the proposed methodology.

Relevance: 40.00%

Abstract:

Remote sensing of physiological parameters could be a cost-effective approach to improving health care, and low-power sensors are essential for remote sensing because these sensors are often energy-constrained. This paper presents a power-optimized photoplethysmographic sensor interface to sense arterial oxygen saturation, a technique to dynamically trade off SNR for power during sensor operation, and a simple algorithm to choose when to acquire samples in photoplethysmography. A prototype of the proposed pulse oximeter, built using commercial off-the-shelf (COTS) components, is tested on 10 adults. The dynamic adaptation techniques described reduce power consumption considerably compared to our reference implementation, and our approach is competitive with state-of-the-art implementations. The techniques presented in this paper may be applied to low-power sensor interface designs where acquiring samples is expensive in terms of power, as epitomized by pulse oximetry.
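The "choose when to acquire samples" idea can be sketched with a toy scheduler: after each acquired sample, try a slow interval, and fall back to a fast interval if the signal has moved more than a threshold. The intervals and threshold are illustrative, not the paper's algorithm or values, and the sketch runs over a recorded trace rather than live hardware.

```python
def adaptive_intervals(signal, fast=1, slow=8, threshold=0.05):
    """Return the indices at which samples would be acquired from a
    recorded trace: a slow cadence while the signal is flat (saving
    power), dropping to a fast cadence whenever the signal changes by
    more than `threshold` between samples."""
    t, times = 0, [0]
    while t < len(signal) - 1:
        prev = signal[t]
        nxt = min(t + slow, len(signal) - 1)       # tentatively skip ahead
        if abs(signal[nxt] - prev) > threshold:    # too much change missed?
            nxt = min(t + fast, len(signal) - 1)   # sample densely instead
        times.append(nxt)
        t = nxt
    return times
```

On a flat trace the scheduler acquires roughly one sample per `slow` ticks, while a steep ramp forces it back to every tick — the power/SNR trade-off the paper controls dynamically.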