1000 results for Estimation Bayésienne


Relevance: 20.00%

Abstract:

A novel approach for real-time skin segmentation in video sequences is described. The approach enables reliable skin segmentation despite wide variation in illumination during tracking. An explicit second order Markov model is used to predict evolution of the skin color (HSV) histogram over time. Histograms are dynamically updated based on feedback from the current segmentation and based on predictions of the Markov model. The evolution of the skin color distribution at each frame is parameterized by translation, scaling and rotation in color space. Consequent changes in geometric parameterization of the distribution are propagated by warping and re-sampling the histogram. The parameters of the discrete-time dynamic Markov model are estimated using Maximum Likelihood Estimation, and also evolve over time. Quantitative evaluation of the method was conducted on labeled ground-truth video sequences taken from popular movies.
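
A minimal sketch of the prediction-and-feedback idea described above, not the authors' implementation: a second-order (AR(2)) linear model predicts the transform parameters (translation, scale, rotation) of the skin-color histogram, and the predicted histogram is blended with the one measured from the current segmentation. The coefficient matrices A1, A2 and the blending weight alpha are placeholders that would come from maximum likelihood fitting, not values from the paper.

import numpy as np

# Second-order (AR(2)) prediction of the histogram transform parameters
# theta = (translation, scale, rotation) in HSV color space.  A1 and A2 are
# assumed coefficient matrices, fitted offline by maximum likelihood.
def predict_params(theta_prev, theta_prev2, A1, A2):
    return A1 @ theta_prev + A2 @ theta_prev2

# Feedback step: blend the histogram predicted from the transform with the
# histogram measured from the current segmentation.  alpha is an assumed
# mixing weight, not a value taken from the paper.
def update_histogram(h_predicted, h_measured, alpha=0.5):
    h = alpha * h_predicted + (1.0 - alpha) * h_measured
    return h / h.sum()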

Relevance: 20.00%

Abstract:

A specialized formulation of Azarbayejani and Pentland's framework for recursive recovery of motion, structure and focal length from feature correspondences tracked through an image sequence is presented. The specialized formulation addresses the case where all tracked points lie on a plane. This planarity constraint reduces the dimension of the original state vector, and consequently the number of feature points needed to estimate the state. Experiments with synthetic data and real imagery illustrate the system performance. The experiments confirm that the specialized formulation provides improved accuracy, stability to observation noise, and rate of convergence in estimation for the case where the tracked points lie on a plane.
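
A toy calculation, independent of the paper's recursive formulation, illustrating why the planarity constraint shrinks the state vector: once a plane n·X = d is part of the state, each tracked point's 3D position is fixed by its image ray, so no per-point depth parameter is needed. A pinhole camera at the origin with focal length f is assumed.

import numpy as np

def backproject_to_plane(u, v, f, n, d):
    # Intersect the viewing ray through pixel (u, v) of a pinhole camera at
    # the origin (focal length f) with the plane n . X = d.  The point's depth
    # is determined by the plane, so it need not appear in the state vector.
    ray = np.array([u, v, f], dtype=float)   # direction of the viewing ray
    t = d / (n @ ray)                        # ray parameter at the plane
    return t * ray                           # 3D point on the plane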

Relevance: 20.00%

Abstract:

Standard structure from motion algorithms recover the 3D structure of points. If a surface representation is desired, for example a piece-wise planar representation, then a two-step procedure typically follows: in the first step the plane-membership of points is determined manually, and in a subsequent step planes are fitted to the sets of points thus determined and their parameters are recovered. This paper presents an approach for automatically segmenting planar structures from a sequence of images and simultaneously estimating their parameters. In the proposed approach the plane-membership of points is determined automatically, and the planar structure parameters are recovered directly within the algorithm rather than indirectly in a post-processing stage. Simulated and real experimental results show the efficacy of this approach.
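
A generic sketch of the alternation between plane-membership assignment and plane fitting (a k-planes style iteration over recovered 3D points), offered only as an illustration of simultaneous segmentation and parameter estimation; it is not the paper's image-based algorithm, and n_planes, n_iters and seed are illustrative parameters.

import numpy as np

def fit_plane(points):
    # Total-least-squares plane through 3D points via SVD: returns the unit
    # normal n and offset d such that n . x = d.
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    n = vt[-1]
    return n, n @ c

def segment_planes(points, n_planes, n_iters=20, seed=0):
    # Alternate between assigning each point to its nearest plane and
    # refitting each plane to its member points.
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_planes, size=len(points))
    for _ in range(n_iters):
        planes = []
        for k in range(n_planes):
            members = points[labels == k]
            planes.append(fit_plane(members if len(members) >= 3 else points))
        dists = np.stack([np.abs(points @ n - d) for n, d in planes], axis=1)
        labels = dists.argmin(axis=1)
    return labels, planes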

Relevance: 20.00%

Abstract:

A learning-based framework is proposed for estimating human body pose from a single image. Given a differentiable function that maps from pose space to image feature space, the goal is to invert the process: estimate the pose given only image features. The inversion is an ill-posed problem because the inverse mapping is one-to-many. Hence multiple solutions exist, and it is desirable to restrict the solution space to a smaller subset of feasible solutions. For example, not all human body poses are feasible due to anthropometric constraints. Since the space of feasible solutions may not admit a closed-form description, the proposed framework exploits machine learning techniques to learn an approximation that is smoothly parameterized over such a space. One such technique is Gaussian Process Latent Variable Modelling. Scaled conjugate gradient is then used to find the best matching pose in the space of feasible solutions given an input image. The formulation allows easy incorporation of various constraints, e.g. temporal consistency and anthropometric constraints. The performance of the proposed approach is evaluated on the task of upper-body pose estimation from silhouettes and compared with the Specialized Mapping Architecture. In experiments with synthetic data, the estimation accuracy of the Specialized Mapping Architecture is at least one standard deviation worse than that of the proposed approach. In experiments with real video of humans performing gestures, the proposed approach produces qualitatively better estimation results.
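
A minimal sketch of the pose search described above, with placeholder callables standing in for the learned mappings (feature_model from latent space to image features, latent_to_pose from latent space to joint angles); a generic gradient-based optimiser is used here in place of scaled conjugate gradient.

import numpy as np
from scipy.optimize import minimize

def estimate_pose(observed_features, feature_model, latent_to_pose, x0):
    # Search the learned latent (feasible-pose) space for the point whose
    # predicted image features best match the observed silhouette features.
    def objective(x):
        return np.sum((feature_model(x) - observed_features) ** 2)
    result = minimize(objective, x0, method="L-BFGS-B")
    return latent_to_pose(result.x)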

Relevance: 20.00%

Abstract:

A nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is described. Because the procedure does not make a priori assumptions about underlying probability distributions, it yields accurate estimates on a wide variety of prediction tasks. Fuzzy ARTMAP is used to perform probability estimation in two different modes. In a 'slow-learning' mode, input-output associations change slowly, with the strength of each association computing a conditional probability estimate. In 'max-nodes' mode, a fixed number of categories are coded during an initial fast-learning interval, and weights are then tuned by slow learning. Simulations illustrate system performance on tasks in which various numbers of clusters in the set of input vectors are mapped to a given class.
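
A schematic illustration of the 'slow-learning' idea, not the fuzzy ARTMAP map-field equations: each category-to-class association strength is nudged toward the observed outcome at an assumed small rate beta, so that after many presentations a weight row approximates P(class | category).

import numpy as np

class SlowLearningAssociations:
    def __init__(self, n_categories, n_classes, beta=0.01):
        # Start from uniform association strengths; beta is an assumed slow
        # learning rate, not a value from the paper.
        self.w = np.full((n_categories, n_classes), 1.0 / n_classes)
        self.beta = beta

    def update(self, category, observed_class):
        # Nudge the chosen category's associations toward the observed class.
        target = np.zeros(self.w.shape[1])
        target[observed_class] = 1.0
        self.w[category] += self.beta * (target - self.w[category])

    def conditional_probabilities(self, category):
        # Normalised association strengths serve as the probability estimate.
        row = self.w[category]
        return row / row.sum()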

Relevance: 20.00%

Abstract:

The development of ultra-high-speed (~20 Gsamples/s) analogue-to-digital converters (ADCs), and the delayed deployment of 40 Gbit/s transmission due to the economic downturn, has stimulated the investigation of digital signal processing (DSP) techniques for compensation of optical transmission impairments. In the future, DSP will offer an entire suite of tools to compensate for optical impairments and facilitate the use of advanced modulation formats. Chromatic dispersion is a very significant impairment for high-speed optical transmission. This thesis investigates a novel electronic method of dispersion compensation that allows cost-effective, accurate detection of the amplitude and phase of the optical field in the radio frequency domain. The first electronic dispersion compensation (EDC) schemes accessed only the amplitude information using square-law detection and achieved an increase in transmission distances. This thesis presents a method that uses a frequency-sensitive filter to estimate the phase of the received optical field so that, in conjunction with the amplitude information, the entire field can be digitised using ADCs. This allows DSP technologies to take the next step in optical communications without requiring complex coherent detection, which is of particular interest in metropolitan area networks. The full-field receiver investigated requires only an additional asymmetric Mach-Zehnder interferometer and a balanced photodiode to achieve a 50% increase in EDC reach compared with amplitude-only detection.
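
A standard DSP sketch of frequency-domain chromatic dispersion compensation applied to a digitised complex field, the kind of processing such full-field receivers enable; it is not the specific algorithm used in the thesis, and the sign convention and parameters (beta2 in s^2/m, fibre length in m) are illustrative.

import numpy as np

def compensate_dispersion(field, sample_rate, beta2, length):
    # Apply the inverse of the fibre's quadratic-phase dispersion response
    # H(w) = exp(j * beta2/2 * w^2 * L) to the sampled complex field.
    n = len(field)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / sample_rate)
    h_inv = np.exp(-1j * 0.5 * beta2 * omega ** 2 * length)
    return np.fft.ifft(np.fft.fft(field) * h_inv)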

Relevance: 20.00%

Abstract:

For two multinormal populations with equal covariance matrices, the likelihood ratio discriminant function, an alternative allocation rule to the sample linear discriminant function when n1 ≠ n2, is studied analytically. Under the assumption of a known covariance matrix, its distribution is derived and the expectations of its actual and apparent error rates are evaluated and compared with those of the sample linear discriminant function. This comparison indicates that the likelihood ratio allocation rule is robust to unequal sample sizes. The quadratic discriminant function is studied, its distribution reviewed, and evaluation of its probabilities of misclassification discussed. For known covariance matrices the distribution of the sample quadratic discriminant function is derived. When the known covariance matrices are proportional, exact expressions for the expectations of its actual and apparent error rates are obtained and evaluated. The effectiveness of the sample linear discriminant function for this case is also considered. Estimation of the true log-odds for two multinormal populations with equal or unequal covariance matrices is studied. The estimative, Bayesian predictive and kernel methods are compared by evaluating their biases and mean square errors, and some algebraic expressions for these quantities are derived. With equal covariance matrices the predictive method is preferable; the source of this superiority is investigated by considering its performance at various levels of fixed true log-odds. It is also shown that the predictive method is sensitive to n1 ≠ n2. For unequal but proportional covariance matrices the unbiased estimative method is preferred. Product normal kernel density estimates are used to construct a kernel estimator of the true log-odds, and the effect of correlation among the variables with product kernels is considered. With equal covariance matrices the kernel and parametric estimators are compared by simulation. For moderately correlated variables and large dimensions the product kernel method is a good estimator of the true log-odds.
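
A textbook sketch of just one of the estimators discussed above, the plug-in ('estimative') log-odds for two multinormal populations with a common covariance matrix, using the pooled sample covariance; the Bayesian predictive and product kernel estimators compared in the thesis are not reproduced here.

import numpy as np

def estimative_log_odds(x, sample1, sample2):
    # Plug-in estimate of ln[f1(x) / f2(x)] under equal covariance matrices:
    # (m1 - m2)' S^{-1} (x - (m1 + m2) / 2), with S the pooled sample covariance.
    m1, m2 = sample1.mean(axis=0), sample2.mean(axis=0)
    n1, n2 = len(sample1), len(sample2)
    s_pooled = ((n1 - 1) * np.cov(sample1, rowvar=False) +
                (n2 - 1) * np.cov(sample2, rowvar=False)) / (n1 + n2 - 2)
    return (x - 0.5 * (m1 + m2)) @ np.linalg.inv(s_pooled) @ (m1 - m2)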

Relevance: 20.00%

Abstract:

Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
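
A schematic sketch of the estimation idea, matching the score of an auxiliary (nonparametric conditional density) model under a GMM criterion; simulate, score_fn, eta_hat and weight are placeholder objects, not the paper's monetary model, auxiliary density, or weighting matrix.

import numpy as np
from scipy.optimize import minimize

def gmm_score_criterion(theta, simulate, score_fn, eta_hat, weight):
    # Simulate a long sample path from the structural model at theta, average
    # the auxiliary model's score over that path, and measure its distance
    # from zero under the GMM weighting matrix.
    simulated = simulate(theta)
    m = score_fn(simulated, eta_hat).mean(axis=0)
    return m @ weight @ m

def estimate(theta0, simulate, score_fn, eta_hat, weight):
    # Minimise the criterion over the structural parameters.
    return minimize(gmm_score_criterion, theta0,
                    args=(simulate, score_fn, eta_hat, weight),
                    method="Nelder-Mead").x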

Relevance: 20.00%

Abstract:

This paper provides a root-n consistent, asymptotically normal weighted least squares estimator of the coefficients in a truncated regression model. The distribution of the errors is unknown and permits general forms of unknown heteroskedasticity. Also provided is an instrumental variables based two-stage least squares estimator for this model, which can be used when some regressors are endogenous, mismeasured, or otherwise correlated with the errors. A simulation study indicates that the new estimators perform well in finite samples. Our limiting distribution theory includes a new asymptotic trimming result addressing the boundary bias in first-stage density estimation without knowledge of the support boundary. © 2007 Cambridge University Press.
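
A small simulation in the spirit of the setting described above, showing why a corrected estimator is needed: with heteroskedastic errors and truncation (only y > 0 observed), naive OLS on the truncated sample is inconsistent. The data-generating parameters are illustrative, not taken from the paper's simulation study, and the paper's weighted least squares estimator itself is not implemented here.

import numpy as np

rng = np.random.default_rng(1)

# Linear model with heteroskedastic errors, observed only when y > 0.
n, beta0, beta1 = 200_000, -1.0, 2.0
x = rng.uniform(0.0, 2.0, size=n)
e = (0.5 + 0.5 * x) * rng.standard_normal(n)   # error variance depends on x
y = beta0 + beta1 * x + e
keep = y > 0                                   # truncation: y <= 0 never observed
xt, yt = x[keep], y[keep]

# Naive OLS on the truncated sample is biased away from (beta0, beta1),
# which motivates a specialised (weighted least squares) estimator.
X = np.column_stack([np.ones(len(xt)), xt])
print(np.linalg.lstsq(X, yt, rcond=None)[0])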

Relevance: 20.00%

Abstract:

An analytical model was developed to describe the in-canopy vertical distribution of ammonia (NH3) sources and sinks and the vertical fluxes in a fertilized agricultural setting, using measured in-canopy mean NH3 concentration and wind speed profiles. The model was applied to quantify in-canopy air-surface exchange rates and above-canopy NH3 fluxes in a fertilized corn (Zea mays) field. Modeled air-canopy NH3 fluxes agreed well with independent above-canopy flux estimates. Based on the model results, the urea-fertilized soil surface was a consistent source of NH3 for one month following the fertilizer application, whereas the vegetation canopy was typically a net NH3 sink, with the lower portion of the canopy being a constant sink. The model results suggested that the canopy was a sink for some 70% of the estimated soil NH3 emissions. A logical conclusion is that parameterization of within-canopy processes in air quality models is necessary to explore the impact of field-level agricultural management practices on regional air quality. Moreover, there are agronomic and environmental benefits to timing liquid fertilizer applications as close to canopy closure as possible. Finally, given the large within-canopy mean NH3 concentration gradients in such agricultural settings, a discussion of the suitability of the proposed model is also presented.
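
A generic flux-gradient (K-theory) sketch of the kind of calculation involved in relating measured concentration profiles to fluxes and source/sink strengths; it is not the paper's analytical in-canopy model, and K here is an assumed eddy diffusivity profile.

import numpy as np

def flux_gradient(z, concentration, K):
    # First-order closure: vertical flux F(z) = -K(z) * dC/dz estimated from
    # a measured mean NH3 concentration profile.
    return -K * np.gradient(concentration, z)

def source_sink(z, flux):
    # Source/sink strength per unit depth from the flux divergence, S(z) = dF/dz;
    # negative values indicate a sink layer (e.g. the lower canopy).
    return np.gradient(flux, z)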