68 results for Metric


Relevance: 10.00%

Abstract:

Convergence analysis of consensus algorithms is revisited in light of the Hilbert distance. The Lyapunov function used in the early analysis by Tsitsiklis is shown to be the Hilbert distance to consensus in log coordinates. Birkhoff's theorem, which proves contraction of the Hilbert metric for any positive homogeneous monotone map, provides an early yet general convergence result for consensus algorithms. Because Birkhoff's theorem holds in arbitrary cones, we extend consensus algorithms to the cone of positive definite matrices. The proposed generalization finds applications in the convergence analysis of quantum stochastic maps, which generalize stochastic maps to non-commutative probability spaces. © 2010 IEEE.
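The contraction property invoked in this abstract can be checked numerically. The sketch below (an illustration of the general mechanism, not the paper's derivation) iterates a hypothetical row-stochastic matrix `A` and verifies that the Hilbert projective distance to the consensus ray strictly decreases, as Birkhoff's theorem predicts for a positive map:

```python
import numpy as np

def hilbert_distance(x, y):
    # Hilbert projective metric on the positive orthant:
    # d(x, y) = log(max_i x_i/y_i) - log(min_i x_i/y_i)
    r = x / y
    return np.log(r.max()) - np.log(r.min())

# A positive row-stochastic matrix: one step of a consensus iteration.
A = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

x = np.array([5.0, 1.0, 2.0])   # initial positive state
ones = np.ones(3)               # the consensus ray

d_prev = hilbert_distance(x, ones)
for _ in range(5):
    x = A @ x
    d = hilbert_distance(x, ones)
    assert d < d_prev           # Birkhoff contraction: strict decrease
    d_prev = d
```

Since `A` is row-stochastic, the consensus ray is a fixed point (`A @ ones == ones`), so the distance to consensus is exactly the quantity the contraction bounds.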

Relevance: 10.00%

Abstract:

This paper addresses the design of mobile sensor networks for optimal data collection. The development is strongly motivated by the application to adaptive ocean sampling for an autonomous ocean observing and prediction system. A performance metric, used to derive optimal paths for the network of mobile sensors, defines the optimal data set as one which minimizes error in a model estimate of the sampled field. Feedback control laws are presented that stably coordinate sensors on structured tracks that have been optimized over a minimal set of parameters. Optimal, closed-loop solutions are computed in a number of low-dimensional cases to illustrate the methodology. Robustness of the performance to the influence of a steady flow field on relatively slow-moving mobile sensors is also explored. © 2006 IEEE.

Relevance: 10.00%

Abstract:

We study the problem of finding a local minimum of a multilinear function E over the discrete set {0,1}^n. The search is achieved by a gradient-like system in [0,1]^n with cost function E. Under mild restrictions on the metric, the stable attractors of the gradient-like system are shown to produce solutions of the problem, even when they are not in the vicinity of the discrete set {0,1}^n. Moreover, the gradient-like system connects with interior point methods for linear programming and with the analog neural network studied in the same context by Vidyasagar (IEEE Trans. Automat. Control 40 (8) (1995) 1359). © 2004 Elsevier B.V. All rights reserved.
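To make the setting concrete, here is a minimal two-variable sketch, assuming the plain Euclidean metric and projected gradient descent on the box [0,1]^2 (the paper's gradient-like system allows more general metrics; the cost E and starting points below are our own illustrative choices). The continuous trajectories land on vertices of the cube, each of which is a local minimum of E over {0,1}^2:

```python
import numpy as np

def descend(x, grad, steps=200, lr=0.1):
    # Projected gradient descent on the box [0,1]^n.
    for _ in range(steps):
        x = np.clip(x - lr * grad(x), 0.0, 1.0)
    return x

# E(x) = x1 + x2 - 3*x1*x2 is multilinear; over {0,1}^2 its global
# minimum is at (1,1) with E = -1, and (0,0) is a local minimum.
grad = lambda x: np.array([1 - 3 * x[1], 1 - 3 * x[0]])

print(descend(np.array([0.6, 0.6]), grad))  # -> [1. 1.]
print(descend(np.array([0.2, 0.2]), grad))  # -> [0. 0.]  (a local minimum)
```

Different interior starting points flow to different vertices, matching the abstract's point that the stable attractors produce solutions of the discrete problem.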

Relevance: 10.00%

Abstract:

We give simple formulas for the canonical metric, gradient, Lie derivative, Riemannian connection, parallel translation, geodesics and distance on the Grassmann manifold of p-planes in ℝ^n. In these formulas, p-planes are represented as the column space of n × p matrices. The Newton method on abstract Riemannian manifolds proposed by Smith is made explicit on the Grassmann manifold. Two applications - computing an invariant subspace of a matrix and the mean of subspaces - are worked out.
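The distance formula in this representation can be sketched directly (helper names are ours): with orthonormal bases Qx, Qy of two p-planes, the singular values of Qx^T Qy are the cosines of the principal angles, and the geodesic distance is the 2-norm of the angle vector.

```python
import numpy as np

def grassmann_distance(X, Y):
    # Geodesic distance between the column spans of n x p matrices X, Y:
    # sqrt(sum of squared principal angles).
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    theta = np.arccos(np.clip(s, -1.0, 1.0))
    return np.sqrt(np.sum(theta ** 2))

# Two 2-planes in R^4, given as 4 x 2 matrices.
X = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0], [0.0, 0.0]])  # span{e1, e2}
Y = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e3}

print(grassmann_distance(X, X))  # 0.0: the same subspace
print(grassmann_distance(X, Y))  # pi/2: angles are 0 (shared e1) and pi/2
```

Note the distance depends only on the column spans, not on the particular n × p matrices chosen to represent them.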

Relevance: 10.00%

Abstract:

The notion of coupling within a design, particularly within the context of Multidisciplinary Design Optimization (MDO), is much used but ill-defined. There are many different ways of measuring design coupling, but these measures vary in both their conceptions of what design coupling is and how such coupling may be calculated. Within the differential geometry framework which we have previously developed for MDO systems, we put forth our own design coupling metric for consideration. Our metric is not commensurate with similar types of coupling metrics, but we show that it both provides a helpful geometric interpretation of coupling (and uncoupledness in particular) and exhibits greater generality and potential for analysis than those similar metrics. Furthermore, we discuss how the metric might be profitably extended to time-varying problems and show how the metric's measure of coupling can be applied to multi-objective optimization problems (in unconstrained optimization and in MDO). © 2013 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.

Relevance: 10.00%

Abstract:

Statistical analysis of diffusion tensor imaging (DTI) data requires a computational framework that is both numerically tractable (to account for the high dimensional nature of the data) and geometric (to account for the nonlinear nature of diffusion tensors). Building upon earlier studies exploiting a Riemannian framework to address these challenges, the present paper proposes a novel metric and an accompanying computational framework for DTI data processing. The proposed approach grounds the signal processing operations in interpolating curves. Well-chosen interpolating curves are shown to provide a computational framework that is at once tractable and informative for DTI processing. In addition, and in contrast to earlier methods, it provides an interpolation method which preserves anisotropy, a central piece of information carried by diffusion tensor data. © 2013 Springer Science+Business Media New York.
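To see the anisotropy issue this abstract targets, the sketch below interpolates two rotated copies of the same anisotropic tensor under the standard log-Euclidean metric (a common Riemannian baseline, not the paper's proposed metric): the midpoint collapses to an isotropic tensor, which is exactly the behavior an anisotropy-preserving interpolation is designed to avoid.

```python
import numpy as np

def spd_log(S):
    # Matrix logarithm of a symmetric positive definite matrix via eigh.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(S):
    # Matrix exponential of a symmetric matrix via eigh.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def log_euclidean_interp(S1, S2, t):
    # Log-Euclidean geodesic: exp((1-t) log S1 + t log S2), t in [0, 1].
    return spd_exp((1 - t) * spd_log(S1) + t * spd_log(S2))

S1 = np.diag([10.0, 1.0])                # strongly anisotropic tensor
R = np.array([[0.0, -1.0], [1.0, 0.0]])  # 90-degree rotation
S2 = R @ S1 @ R.T                        # same shape, rotated

mid = log_euclidean_interp(S1, S2, 0.5)
print(np.linalg.eigvalsh(mid))           # both eigenvalues equal sqrt(10):
                                         # the midpoint has become isotropic
```

Both endpoints have eigenvalue ratio 10:1, yet the interpolated midpoint has ratio 1:1, illustrating why anisotropy preservation is a nontrivial property of an interpolation scheme.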

Relevance: 10.00%

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
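As one concrete instance of the "optimization program over matrix manifolds" viewpoint, PCA is the program max tr(W^T C W) over n × p matrices with W^T W = I, solved in closed form by the top-p eigenvectors of the covariance C. The sketch below (our own illustration, not code from the survey) checks that no random orthonormal W beats the eigenvector solution:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
X = X - X.mean(axis=0)
C = X.T @ X / len(X)                 # sample covariance

def objective(W, C):
    # The PCA objective: captured variance tr(W^T C W).
    return np.trace(W.T @ C @ W)

# Closed-form optimum over the orthonormal constraint set: top-p eigenvectors.
p = 2
w, V = np.linalg.eigh(C)             # eigenvalues in ascending order
W = V[:, ::-1][:, :p]                # top-p eigenvectors of C

# Any other orthonormal W scores no higher (Ky Fan's theorem).
Q, _ = np.linalg.qr(rng.standard_normal((5, p)))
assert objective(Q, C) <= objective(W, C) + 1e-10
```

The same constraint set with a different objective yields a different classical method, which is exactly the unification the survey builds on.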

Relevance: 10.00%

Abstract:

Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well-suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package aims in particular to lower the barrier to entry. © 2014 Nicolas Boumal.
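Independent of the toolbox's actual API, the sketch below hand-codes the basic loop such a package automates, for the textbook example of maximizing the Rayleigh quotient x^T A x on the unit sphere: compute the Euclidean gradient, project it onto the tangent space, take a step, and retract back to the manifold by renormalizing.

```python
import numpy as np

# Riemannian gradient ascent on the unit sphere S^{n-1}, maximizing
# x^T A x; the maximizer is the dominant eigenvector of A.
A = np.diag([5.0, 2.0, 1.0])

rng = np.random.default_rng(1)
x = rng.standard_normal(3)
x /= np.linalg.norm(x)                  # start on the sphere

for _ in range(200):
    egrad = 2 * A @ x                   # Euclidean gradient of x^T A x
    rgrad = egrad - (x @ egrad) * x     # project onto the tangent space at x
    x = x + 0.1 * rgrad                 # ascent step
    x /= np.linalg.norm(x)              # retraction: back onto the sphere

print(x @ A @ x)                        # converges to 5, the top eigenvalue
```

The projection and retraction steps are precisely the differential-geometric bookkeeping the abstract says the toolbox handles internally; a user of such a package typically supplies only the cost and its Euclidean gradient.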