898 results for ERROR PROPAGATION


Relevance: 20.00%

Abstract:

In this second part of our comparative study examining the (dis)similarities between “Stokes” and “Jones,” we present simulation results produced by two independent Monte Carlo programs: (i) one developed in Bern with the Jones formalism and (ii) the other implemented in Ulm with the Stokes notation. The simulated polarimetric experiments involve suspensions of polystyrene spheres of varying size. Reflection and refraction at the sample/air interfaces are also considered. Both programs yield identical results when propagating pure polarization states, yet, with unpolarized illumination, second-order statistical differences appear, highlighting the pre-averaged nature of the Stokes parameters. This study serves as a validation of both programs and dispels the misleading belief that “Jones cannot treat depolarizing effects.”
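The second-order difference can be illustrated in a few lines: a Stokes vector is already an ensemble average over intensity-like quantities, whereas a Jones description can retain the statistics of the underlying pure states. Below is a minimal, hedged sketch (generic sign conventions, not the Bern or Ulm codes) that converts a Jones vector to Stokes parameters and represents unpolarized light as an ensemble of random pure states.

import numpy as np

def jones_to_stokes(E):
    # Convert a Jones vector (Ex, Ey) into Stokes parameters (S0, S1, S2, S3).
    # The sign convention for S3 varies between texts; this is one common choice.
    Ex, Ey = E
    S0 = abs(Ex) ** 2 + abs(Ey) ** 2
    S1 = abs(Ex) ** 2 - abs(Ey) ** 2
    S2 = 2.0 * np.real(Ex * np.conj(Ey))
    S3 = -2.0 * np.imag(Ex * np.conj(Ey))
    return np.array([S0, S1, S2, S3])

rng = np.random.default_rng(0)

def random_pure_state():
    # Unpolarized light modelled as an ensemble of fully polarized Jones states
    # with random orientation and random relative phase.
    theta = rng.uniform(0.0, 2.0 * np.pi)
    delta = rng.uniform(0.0, 2.0 * np.pi)
    return np.array([np.cos(theta), np.sin(theta) * np.exp(1j * delta)])

ensemble = np.array([jones_to_stokes(random_pure_state()) for _ in range(100000)])
print(ensemble.mean(axis=0))  # first-order statistics: approx (1, 0, 0, 0), the single Stokes vector of unpolarized light
print(ensemble.var(axis=0))   # second-order statistics kept by the Jones ensemble but absent from a pre-averaged Stokes vector

Averaging the ensemble reproduces the Stokes description of unpolarized light, while the per-realization spread is exactly the second-order information a single pre-averaged Stokes vector does not carry.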

Relevance: 20.00%

Abstract:

In this work, we present a passive location monitoring system for IEEE 802.15.4 signal emitters. The system adopts software-defined radio techniques to passively overhear IEEE 802.15.4 packets and to extract power information from the baseband signals. We propose a new ranging model based on nonlinear regression. After obtaining distance information, a Weighted Centroid (WC) algorithm is adopted to locate users. In WC, each weight is inversely proportional to the n-th power of the propagation distance, where the degree n is obtained from a set of initial measurements. We evaluate our system in a 16 m × 18 m area with complex indoor propagation conditions and achieve a median error of 2.1 m with only 4 anchor nodes.
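A minimal sketch of the weighted-centroid step is shown below; the anchor coordinates, ranging estimates, and exponent n are hypothetical placeholders, and the nonlinear-regression ranging model itself is left out.

import numpy as np

def weighted_centroid(anchors, distances, n):
    # Weighted Centroid: each anchor's weight is inversely proportional to the
    # n-th power of its estimated propagation distance.
    anchors = np.asarray(anchors, dtype=float)       # (k, 2) anchor coordinates in metres
    distances = np.asarray(distances, dtype=float)   # (k,) ranging estimates in metres
    w = 1.0 / np.power(distances, n)
    return (w[:, None] * anchors).sum(axis=0) / w.sum()

# Hypothetical example: 4 anchors at the corners of the test area, n from initial measurements.
anchors = [(0.0, 0.0), (16.0, 0.0), (0.0, 18.0), (16.0, 18.0)]
distances = [5.0, 13.0, 15.0, 20.0]
print(weighted_centroid(anchors, distances, n=1.5))

The 1/d^n weighting lets nearby anchors dominate the estimate, so ranging errors at distant anchors are attenuated.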

Relevance: 20.00%

Abstract:

Approximate models (proxies) can be employed to reduce the computational cost of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. The methodology is purpose-oriented in that the error model is constructed directly for the quantity of interest rather than for the state of the system. Moreover, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
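A minimal sketch of such an error model is given below; it uses ordinary PCA on discretized response curves as a stand-in for FPCA and a plain linear regression between proxy and exact scores, so the variable names, the number of components, and the regression choice are illustrative assumptions rather than the authors' pipeline.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fit_error_model(proxy_curves, exact_curves, n_components=5):
    # Learning set: each row is one realization's response curve, sampled at
    # common points, from the proxy and the exact solver respectively.
    pca_proxy = PCA(n_components=n_components).fit(proxy_curves)
    pca_exact = PCA(n_components=n_components).fit(exact_curves)
    # Map proxy scores to exact scores with a simple regression in score space.
    reg = LinearRegression().fit(pca_proxy.transform(proxy_curves),
                                 pca_exact.transform(exact_curves))
    def predict_exact(new_proxy_curves):
        # Predict the exact response of a new realization from its proxy response alone.
        scores = reg.predict(pca_proxy.transform(new_proxy_curves))
        return pca_exact.inverse_transform(scores)
    return predict_exact

# Usage (hypothetical arrays): predict = fit_error_model(proxy_train, exact_train)
#                              exact_hat = predict(proxy_new)

Because the mapping is fitted in a low-dimensional score space, a modest learning set can suffice, which is what keeps the approach affordable relative to running the exact solver for every realization.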

Relevance: 20.00%

Abstract:

Scoping behavioral variations to dynamic extents is useful to support non-functional requirements that otherwise result in cross-cutting code. Unfortunately, such variations are difficult to achieve with traditional reflection or aspects. We show that with a modification of dynamic proxies, called delegation proxies, it becomes possible to reflectively implement variations that propagate to all objects accessed in the dynamic extent of a message send. We demonstrate our approach with examples of variations scoped to dynamic extents that help simplify code related to safety, reliability, and monitoring.
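The mechanism can be illustrated with a small, hedged Python sketch (the paper targets a reflective object-oriented setting, so the class and method names here are hypothetical): a proxy intercepts each message send, applies the variation, and wraps the objects it returns, so the variation follows the computation through the dynamic extent.

class Delegate:
    # A delegation proxy: intercepts message sends to the target and wraps every
    # object returned during the call, so the variation propagates through the
    # dynamic extent of the send.
    def __init__(self, target, before=None):
        self._target = target
        self._before = before if before is not None else (lambda name, args: None)

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if not callable(attr):
            return attr
        def send(*args, **kwargs):
            self._before(name, args)              # the scoped variation (e.g. auditing, a safety check)
            result = attr(*args, **kwargs)
            # Objects reached inside this dynamic extent are proxied as well.
            return Delegate(result, self._before) if hasattr(result, "__dict__") else result
        return send

class Account:
    def __init__(self):
        self.log = []
    def deposit(self, amount):
        self.log.append(amount)
        return self                               # returning self lets calls chain

audited = Delegate(Account(), before=lambda name, args: print("send:", name, args))
audited.deposit(10).deposit(20)                   # both sends are audited, including the one on the returned object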

Relevance: 20.00%

Abstract:

The role of the DNA phosphodiester backbone in the transfer of melting cooperativity between two helical domains was experimentally addressed with a helix-bulge-helix DNA model, in which the bulge consisted of a varying number of either conformationally flexible propanediol or conformationally constrained bicyclic anucleosidic phosphodiester backbone units. We found that structural communication between two double-helical domains is transferred along the DNA backbone over the equivalent of ca. 12-20 backbone units, depending on whether the anucleosidic units are distributed symmetrically or asymmetrically over the two strands. We observed that extending the anucleosidic units on one strand only suffices to disrupt cooperativity transfer in a similar way as extension on both strands, indicating that the length of the longest anucleosidic insert determines cooperativity transfer. Furthermore, conformational rigidity of the sugar unit increases the distance of cooperativity transfer along the phosphodiester backbone, especially when the units are asymmetrically distributed between the two strands.

Relevance: 20.00%

Abstract:

Attractive business cases in various application fields contribute to the sustained long-term interest in indoor localization and tracking within the research community. Location tracking is generally treated as a dynamic state estimation problem consisting of two steps: (i) location estimation through measurement, and (ii) location prediction. For the estimation step, one of the most efficient and low-cost solutions is Received Signal Strength (RSS)-based ranging. However, various challenges, such as unrealistic propagation models, non-line-of-sight (NLOS) conditions, and multipath propagation, are yet to be addressed. Particle filters are a popular choice for dealing with the inherent non-linearities in both location measurements and motion dynamics. While such filters have been successfully applied to accurate, time-based ranging measurements, dealing with the more error-prone RSS-based ranging is still challenging. In this work, we address the above issues with a novel weighted-likelihood bootstrap particle filter for tracking via RSS-based ranging. Our filter weights the individual likelihoods from different anchor nodes exponentially, according to the ranging estimation. We also employ an improved propagation model for more accurate RSS-based ranging, which we proposed in recent work. We implemented and tested our algorithm in a passive localization system with IEEE 802.15.4 signals, showing that the proposed solution largely outperforms a traditional bootstrap particle filter.
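A hedged sketch of one update step of such a filter is shown below; the random-walk motion model, the log-distance path-loss inversion, and the per-anchor exponents are generic placeholders rather than the paper's calibrated improved propagation model.

import numpy as np

rng = np.random.default_rng(1)

def rss_to_distance(rss, p0=-40.0, eta=2.5):
    # Invert a generic log-distance path-loss model, rss = p0 - 10*eta*log10(d).
    # Stand-in only; the paper uses an improved propagation model.
    return 10.0 ** ((p0 - rss) / (10.0 * eta))

def weighted_likelihood_update(particles, weights, rss, anchors, sigma=2.0, alphas=None):
    # One bootstrap-filter step with exponentially weighted, per-anchor range
    # likelihoods; exponent alpha_i encodes trust in anchor i's ranging.
    particles = particles + rng.normal(0.0, 0.5, particles.shape)      # random-walk motion model
    ranges = np.array([rss_to_distance(r) for r in rss])               # RSS-based ranging
    alphas = np.ones(len(anchors)) if alphas is None else np.asarray(alphas)
    loglik = np.zeros(len(particles))
    for anchor, d, a in zip(anchors, ranges, alphas):
        predicted = np.linalg.norm(particles - np.asarray(anchor), axis=1)
        loglik += a * (-0.5 * ((predicted - d) / sigma) ** 2)           # weighted Gaussian range likelihood
    weights = weights * np.exp(loglik - loglik.max())
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)    # bootstrap resampling
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Usage (hypothetical): particles = rng.uniform(0, 18, (2000, 2)); weights = np.full(2000, 1/2000)
# particles, weights = weighted_likelihood_update(particles, weights, rss=[-55, -60, -62, -58],
#                                                 anchors=[(0, 0), (16, 0), (0, 18), (16, 18)])

Raising an anchor's likelihood to a small exponent flattens it, so anchors whose ranging is judged unreliable move the particle weights less.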

Relevance: 20.00%

Abstract:

An in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of observations to obtain the most accurate orbit propagation. The accuracy of the results of an orbit determination/improvement process depends on: tracklet length, number of observations, type of orbit, astrometric error, time interval between tracklets, and observation geometry. The latter depends on the position of the object along its orbit and the location of the observing station. This covariance analysis aims to optimize the observation strategy taking into account the influence of the orbit shape, of the relative object-observer geometry, and of the interval between observations.
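How the estimated state's uncertainty grows between observations can be sketched with the usual linear covariance mapping P(t) = Phi(t, t0) P0 Phi(t, t0)^T; the 2x2 drift model and the numbers below are purely illustrative assumptions, not the full 6-dimensional orbital dynamics used in the study.

import numpy as np

def propagate_covariance(P0, Phi):
    # Map the epoch covariance of the estimated state to a later time via the
    # state transition matrix: P(t) = Phi(t, t0) @ P0 @ Phi(t, t0).T
    return Phi @ P0 @ Phi.T

# Illustrative 1-D drift (position, velocity), with assumed numbers:
dt = 600.0                                 # seconds between epochs (assumed)
Phi = np.array([[1.0, dt],
                [0.0, 1.0]])               # linearized free-drift dynamics
P0 = np.diag([100.0 ** 2, 0.05 ** 2])      # 100 m position and 5 cm/s velocity sigmas (assumed)
Pt = propagate_covariance(P0, Phi)
print(np.sqrt(Pt[0, 0]))                   # propagated position sigma in metres (~104 m)

Because the velocity uncertainty feeds into the position uncertainty over the propagation interval, the timing and geometry of the next observation directly control how well the propagated orbit is constrained.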

Relevance: 20.00%

Abstract:

The Astronomical Institute of the University of Bern (AIUB) is conducting several search campaigns for space debris using optical sensors. The debris objects are discovered during systematic survey observations. In general, the result of a discovery consists of only a short observation arc, or tracklet, which is used to perform a first orbit determination in order to be able to observe the object again in subsequent follow-up observations. The additional observations are used in the orbit improvement process to obtain accurate orbits to be included in a catalogue. In order to obtain the most accurate orbit within the time available it is necessary to optimize the follow-up observation strategy. In this paper an in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of follow-up observations to obtain the most accurate orbit propagation to be used for space debris catalogue maintenance. The main factors that determine the accuracy of the results of an orbit determination/improvement process are: tracklet length, number of observations, type of orbit, astrometric error of the measurements, time interval between tracklets, and the relative position of the object along its orbit with respect to the observing station. The main aim of the covariance analysis is to optimize the follow-up strategy as a function of the object-observer geometry, the interval between follow-up observations, and the shape of the orbit. This analysis can be applied to every orbital regime, but particular attention was dedicated to geostationary, Molniya, and geostationary transfer orbits. Finally, the case with more than two follow-up observations and the influence of a second observing station are also analyzed.
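In a covariance analysis of this kind, a candidate follow-up sequence is typically judged by the formal covariance of the batch least-squares fit it would produce, P = (H^T W H)^-1; the small helper below is a generic sketch of that relation, with H and the measurement sigmas as assumed inputs rather than quantities from the paper.

import numpy as np

def formal_covariance(H, sigma_obs):
    # Formal covariance of a weighted batch least-squares orbit improvement:
    # P = (H^T W H)^-1, with W = diag(1 / sigma_i^2).
    # Each row of H holds the partials of one observation with respect to the
    # estimated orbital state, so H encodes the observation geometry and spacing.
    W = np.diag(1.0 / np.square(np.asarray(sigma_obs, dtype=float)))
    return np.linalg.inv(H.T @ W @ H)

# H is built for each candidate follow-up sequence and the resulting P matrices
# are compared, with no real measurements required.

Comparing P across candidate observation geometries and spacings is what allows the follow-up strategy to be optimized before any telescope time is spent.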

Relevance: 20.00%

Abstract:

The Astronomical Institute of the University of Bern (AIUB) is conducting several search campaigns for orbital debris. The debris objects are discovered during systematic survey observations. In general, only a short observation arc, or tracklet, is available for most of these objects. From this discovery tracklet a first orbit determination is computed in order to be able to find the object again in subsequent follow-up observations. The additional observations are used in the orbit improvement process to obtain accurate orbits to be included in a catalogue. In this paper, the accuracy of the initial orbit determination is analyzed. This accuracy depends on a number of factors: tracklet length, number of observations, type of orbit, astrometric error, and observation geometry. The latter is characterized by both the position of the object along its orbit and the location of the observing station. Different positions involve different distances from the target object and a different observing angle with respect to its orbital plane and trajectory. The present analysis aims at optimizing the geometry of the discovery observations depending on the considered orbit.

Relevance: 20.00%

Abstract:

The polychaete Nereis virens burrows through muddy sediments by exerting dorsoventral forces against the walls of its tongue-depressor-shaped burrow to extend an oblate hemispheroidal crack. Stress is concentrated at the crack tip, which extends when the stress intensity factor (K_I) exceeds the critical stress intensity factor (K_Ic). Relevant forces were measured in gelatin, an analog for elastic muds, by photoelastic stress analysis, and were 0.015 ± 0.001 N (mean ± s.d.; N = 5). Measured elastic moduli (E) for gelatin and sediment were used in finite element models to convert the forces in gelatin to those required in muds to maintain the same body shapes observed in gelatin. The force increases directly with increasing sediment stiffness, and is 0.16 N for the measured sediment stiffness of E = 2.7 × 10^4 Pa. This measurement of forces exerted by burrowers is the first that explicitly considers the mechanical behavior of the sediment. Calculated stress intensity factors fall within the range of critical values for gelatin and exceed those for sediment, showing that crack propagation is a mechanically feasible mechanism of burrowing. The pharynx extends anteriorly as it everts, extending the crack tip only as far as the anterior of the worm, consistent with wedge-driven fracture and drawing obvious parallels between soft-bodied burrowers and more rigid, wedge-shaped burrowers (i.e. clams). Our results raise questions about the reputed high energetic cost of burrowing and emphasize the need for better understanding of sediment mechanics to quantify external energy expenditure during burrowing.
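For a rough sense of the scaling involved, the classical penny-shaped (circular) crack solution gives K_I = 2 p sqrt(a / pi) for a uniform internal pressure p and crack radius a; the worm's crack is oblate hemispheroidal and the study relies on finite element models, so the formula and the numbers below are only an illustrative assumption.

import numpy as np

def penny_crack_k1(pressure, radius):
    # Classical stress intensity factor of a penny-shaped (circular) crack
    # under uniform internal pressure: K_I = 2 * p * sqrt(a / pi).
    return 2.0 * pressure * np.sqrt(radius / np.pi)

# Hypothetical numbers, for scale only (not taken from the study):
force = 0.16          # N, the force in sediment quoted above
area = 1.0e-4         # m^2, assumed loaded area (~1 cm^2)
radius = 0.01         # m, assumed crack radius (~1 cm)
pressure = force / area
print(penny_crack_k1(pressure, radius))   # K_I in Pa*sqrt(m)

The square-root dependence on crack size is why even modest forces concentrated at a crack tip can be enough to propagate a burrow.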

Relevance: 20.00%

Abstract:

The purpose of this study is to investigate the effects of predictor variable correlations and patterns of missingness with dichotomous and/or continuous data in small samples when missing data are multiply imputed. Missing data of predictor variables are multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data, and the general location model for mixed dichotomous and continuous data. Subsequent to the multiple imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data, and pattern of missing data. The distributional properties of average mean, variance, and correlations among the predictor variables are assessed after the multiple imputation process.

For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 resulted in more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, which in part is due to the sparseness of the data. The correlation structure of the predictor variables is not well retained in multiply-imputed data from small samples with more than 50% missing data under this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data. With all data types, a fully observed variable included with the variables subject to missingness in the multiple imputation process and the subsequent statistical analysis yielded liberal (larger than nominal) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
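The pooling step that follows the imputations can be sketched briefly; the helper below implements Rubin's rules for a single coefficient and is a generic illustration, since the study's imputation models (multivariate normal, multinomial, general location) are not reproduced here.

import numpy as np

def rubins_rules(estimates, variances):
    # Pool one regression coefficient across M multiply-imputed datasets:
    # point estimate = mean of the M estimates; total variance = mean
    # within-imputation variance + (1 + 1/M) * between-imputation variance.
    estimates = np.asarray(estimates, dtype=float)   # coefficient from each imputed dataset
    variances = np.asarray(variances, dtype=float)   # squared standard error from each fit
    m = len(estimates)
    q_bar = estimates.mean()
    u_bar = variances.mean()
    b = estimates.var(ddof=1)
    total_var = u_bar + (1.0 + 1.0 / m) * b
    return q_bar, total_var

# Usage: fit the logistic regression separately on each imputed dataset, collect each
# coefficient and its squared standard error, pool them with rubins_rules, then test.

The (1 + 1/M) inflation of the between-imputation variance is what keeps the pooled test calibrated when the imputations disagree, which is central to the Type I error rates examined above.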