29 results for RIEMANNIAN MANIFOLDS
in the Cambridge University Engineering Department Publications Database
Abstract:
We give simple formulas for the canonical metric, gradient, Lie derivative, Riemannian connection, parallel translation, geodesics and distance on the Grassmann manifold of p-planes in ℝⁿ. In these formulas, p-planes are represented as the column space of n × p matrices. The Newton method on abstract Riemannian manifolds proposed by Smith is made explicit on the Grassmann manifold. Two applications, computing an invariant subspace of a matrix and computing the mean of subspaces, are worked out.
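The closed-form geodesic in such formulas is straightforward to implement. A minimal NumPy sketch, assuming the standard expression of the geodesic through the thin SVD of the tangent direction (the function name is illustrative):

```python
import numpy as np

def grassmann_geodesic(Y, H, t):
    """Point at time t on the Grassmann geodesic starting at span(Y).

    Y : (n, p) orthonormal basis of the starting p-plane (Y.T @ Y = I).
    H : (n, p) tangent direction at Y, horizontal in the sense Y.T @ H = 0.
    Uses the thin SVD H = U diag(s) Vt; the geodesic is
    Y(t) = Y V cos(t diag(s)) V^T + U sin(t diag(s)) V^T.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return Y @ (Vt.T * np.cos(t * s)) @ Vt + (U * np.sin(t * s)) @ Vt
```

The returned matrix again has orthonormal columns, so it is a valid representative of the p-plane reached at time t.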
Abstract:
The classical Rayleigh Quotient Iteration (RQI) computes a 1-dimensional invariant subspace of a symmetric matrix A with cubic convergence. We propose a generalization of the RQI which computes a p-dimensional invariant subspace of A. The geometry of the algorithm on the Grassmann manifold Gr(p,n) is developed to show cubic convergence and to draw connections with recently proposed Newton algorithms on Riemannian manifolds.
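One common way to organize such a block RQI is around a Sylvester equation: with R = YᵀAY the block Rayleigh quotient, solve AZ − ZR = Y for Z and re-orthonormalize. A sketch of a single iteration under that generic formulation (not necessarily the paper's exact update):

```python
import numpy as np
from scipy.linalg import solve_sylvester

def block_rqi_step(A, Y):
    """One step of a block Rayleigh quotient iteration (illustrative sketch).

    A : (n, n) symmetric matrix.
    Y : (n, p) orthonormal basis of the current subspace iterate.
    """
    R = Y.T @ A @ Y                    # block Rayleigh quotient
    Z = solve_sylvester(A, -R, Y)      # solves A Z + Z (-R) = Y, i.e. A Z - Z R = Y
    Q, _ = np.linalg.qr(Z)             # re-orthonormalize the new basis
    return Q
```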
Abstract:
Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package aims in particular to lower the barrier to entry. © 2014 Nicolas Boumal.
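Manopt itself is a MATLAB toolbox. To keep all sketches here in one language, the following shows the analogous workflow in its Python sibling, pymanopt (assuming the pymanopt 2.x API; the test matrix and sizes are made up for illustration):

```python
import autograd.numpy as anp
import pymanopt
from pymanopt.manifolds import Grassmann
from pymanopt.optimizers import TrustRegions

n, p = 100, 3
A = anp.random.randn(n, n)
A = A + A.T                                  # symmetric test matrix

manifold = Grassmann(n, p)

@pymanopt.function.autograd(manifold)
def cost(X):
    # Minimizing -tr(X^T A X) over Grassmann yields a dominant invariant subspace.
    return -anp.trace(X.T @ A @ X)

problem = pymanopt.Problem(manifold, cost)
result = TrustRegions().run(problem)         # Riemannian trust-region solver
X_opt = result.point                         # (n, p) orthonormal basis
```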
Abstract:
Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix. © 2013 Springer-Verlag Berlin Heidelberg.
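The paper's algorithms exploit quotient geometry; as a much cruder baseline that still illustrates the retraction-based template on the set of fixed-rank matrices, one can take a Euclidean gradient step and retract by truncated SVD (a generic sketch, not the paper's method):

```python
import numpy as np

def retract_rank_r(X, r):
    """Retract a matrix onto the set of rank-r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def gradient_descent_fixed_rank(grad_f, X0, r, step=0.1, iters=100):
    """Illustrative sketch: Euclidean gradient step followed by rank-r retraction.

    grad_f : callable returning the Euclidean gradient of the cost at X.
    X0     : initial rank-r matrix.
    """
    X = X0
    for _ in range(iters):
        X = retract_rank_r(X - step * grad_f(X), r)
    return X
```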
Abstract:
In this paper, we tackle the problem of learning a linear regression model whose parameter is a fixed-rank matrix. We study the Riemannian manifold geometry of the set of fixed-rank matrices and develop efficient line-search algorithms. The proposed algorithms have many applications, scale to high-dimensional problems, enjoy local convergence properties and confer a geometric basis to recent contributions on learning fixed-rank matrices. Numerical experiments on benchmarks suggest that the proposed algorithms compete with the state-of-the-art, and that manifold optimization offers a versatile framework for the design of rank-constrained machine learning algorithms. Copyright 2011 by the author(s)/owner(s).
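A typical ingredient of such line-search algorithms is Armijo backtracking along the negative gradient, composed with a retraction onto the manifold. A minimal sketch, assuming a user-supplied cost f, gradient, and retraction:

```python
def armijo_linesearch_step(f, X, grad, retract, step0=1.0, beta=0.5, sigma=1e-4):
    """Backtracking Armijo step along -grad, mapped back by a retraction (sketch).

    Shrinks the step until the sufficient-decrease condition
    f(retract(X - step * grad)) <= f(X) - sigma * step * ||grad||^2 holds.
    """
    fx = f(X)
    grad_sq = float((grad * grad).sum())
    step = step0
    while f(retract(X - step * grad)) > fx - sigma * step * grad_sq:
        step *= beta
        if step < 1e-12:                 # give up on pathological steps
            break
    return retract(X - step * grad)
```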
Abstract:
The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent algorithms adapted to the Riemannian geometry that underlies the set of fixed-rank positive semidefinite matrices. In contrast with previous contributions in the literature, no restrictions are imposed on the range space of the learned matrix. The resulting algorithms maintain a linear complexity in the problem size and enjoy important invariance properties. We apply the proposed algorithms to the problem of learning a distance function parameterized by a positive semidefinite matrix. Good performance is observed on classical benchmarks. © 2011 Gilles Meyer, Silvere Bonnabel and Rodolphe Sepulchre.
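In the factored parameterization W = G Gᵀ (G of size n × r), the learned distance and a plain gradient update take a particularly simple form. A flat-metric sketch for orientation (the paper's Riemannian updates are more refined):

```python
import numpy as np

def learned_distance(G, x, y):
    """Squared distance d(x, y) = (x - y)^T W (x - y) with W = G @ G.T."""
    d = G.T @ (x - y)
    return float(d @ d)

def factored_gradient_step(G, grad_W, step):
    """Gradient step on G for a loss L(W) with W = G G^T.

    Chain rule: dL/dG = (grad_W + grad_W.T) @ G. Updating G directly keeps
    W = G G^T positive semidefinite of rank at most r by construction.
    """
    return G - step * (grad_W + grad_W.T) @ G
```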
Abstract:
This paper introduces a new metric and mean on the set of positive semidefinite matrices of fixed rank. The proposed metric is derived from a well-chosen Riemannian quotient geometry that generalizes the reductive geometry of the positive cone and the associated natural metric. The resulting Riemannian space has strong geometrical properties: it is geodesically complete, and the metric is invariant with respect to all transformations that preserve angles (orthogonal transformations, scalings, and pseudoinversion). A meaningful approximation of the associated Riemannian distance is proposed, which can be computed efficiently via a simple SVD-based algorithm. The induced mean preserves the rank, possesses the most desirable characteristics of a geometric mean, and is easy to compute. © 2009 Society for Industrial and Applied Mathematics.
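For orientation, in the full-rank case the natural metric on the positive cone gives the classical affine-invariant distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, which the paper's construction generalizes to fixed rank. A small NumPy sketch of that full-rank distance:

```python
import numpy as np

def spd_inv_sqrt(A):
    """A^{-1/2} for a symmetric positive definite matrix, via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** -0.5) @ V.T

def affine_invariant_distance(A, B):
    """d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F on the SPD cone."""
    M = spd_inv_sqrt(A) @ B @ spd_inv_sqrt(A)     # SPD by congruence
    return float(np.sqrt((np.log(np.linalg.eigvalsh(M)) ** 2).sum()))
```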
Abstract:
The present paper considers distributed consensus algorithms for agents evolving on a connected compact homogeneous (CCH) manifold. The agents track no external reference and communicate their relative state according to an interconnection graph. The paper first formalizes the consensus problem for synchronization (i.e., maximizing consensus) and balancing (i.e., minimizing consensus); it thereby introduces the induced arithmetic mean, an easily computable mean position on CCH manifolds. Then it proposes and analyzes various consensus algorithms on manifolds: natural gradient algorithms, which reach local consensus equilibria; an adaptation using auxiliary variables for almost-global synchronization or balancing; and a stochastic gossip setting for global synchronization. It closes by investigating the dependence of synchronization properties on the attraction function between interacting agents on the circle. The theory is also illustrated on SO(n) and on Grassmann manifolds. © 2009 IEEE.
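On the circle, the natural gradient algorithm reduces to Kuramoto-type phase dynamics. A discrete-time sketch, assuming an undirected 0/1 adjacency matrix:

```python
import numpy as np

def circle_consensus_step(theta, adj, step=0.1):
    """One gradient step of synchronization on the circle.

    theta : (N,) agent phases in radians.
    adj   : (N, N) symmetric 0/1 adjacency matrix of the interconnection graph.
    Ascends sum_ij adj[i, j] * cos(theta[j] - theta[i]); flipping the sign of
    `step` descends instead, which drives the agents toward balancing.
    """
    diff = theta[None, :] - theta[:, None]          # diff[i, j] = theta_j - theta_i
    return theta + step * (adj * np.sin(diff)).sum(axis=1)
```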
Abstract:
The present paper considers distributed consensus algorithms that involve N agents evolving on a connected compact homogeneous manifold. The agents track no external reference and communicate their relative state according to a communication graph. The consensus problem is formulated in terms of the extrema of a cost function. This leads to efficient gradient algorithms to synchronize (i.e., maximizing the consensus) or balance (i.e., minimizing the consensus) the agents; a convenient adaptation of the gradient algorithms is used when the communication graph is directed and time-varying. The cost function is linked to a specific centroid definition on manifolds, introduced here as the induced arithmetic mean, that is easily computable in closed form and may be of independent interest for a number of manifolds. The special orthogonal group SO(n) and the Grassmann manifold Grass(p, n) are treated as original examples. A link is also drawn with the many existing results on the circle. © 2009 Society for Industrial and Applied Mathematics.
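The induced arithmetic mean is obtained by projecting the Euclidean average of the embedded points back onto the manifold. On SO(n) this projection has a closed form via the SVD; a sketch assuming the rotations are stacked in an array:

```python
import numpy as np

def induced_arithmetic_mean_son(Rs):
    """Induced arithmetic mean of rotations: the point of SO(n) closest
    (in Frobenius norm) to the ordinary Euclidean average of the inputs.

    Rs : (k, n, n) array of rotation matrices.
    """
    M = Rs.mean(axis=0)                      # Euclidean centroid (not on SO(n))
    U, _, Vt = np.linalg.svd(M)
    D = np.eye(M.shape[1])
    D[-1, -1] = np.linalg.det(U @ Vt)        # force determinant +1
    return U @ D @ Vt
```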