28 results for principal component analysis


Relevance: 100.00%

Abstract:

We present in this paper a new multivariate probabilistic approach to Acoustic Pulse Recognition (APR) for tangible interface applications. This model uses Principal Component Analysis (PCA) in a probabilistic framework to classify tapping pulses with a high degree of variability. This model was found to achieve higher robustness to pulse variability than simpler template-matching methods, particularly when trained on data containing high variability. © 2011 IEEE.
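
The abstract does not spell out the model, but the general recipe of PCA used in a probabilistic framework for classification can be sketched: fit one probabilistic PCA density per pulse class and label a new pulse by maximum likelihood. A minimal sketch using scikit-learn; the function names and feature layout are hypothetical, not the paper's own:

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_class_models(pulses_by_class, n_components=5):
    """Fit one probabilistic PCA density per pulse class.
    pulses_by_class maps a class label to an (n_pulses, n_features)
    array of pulse feature vectors."""
    models = {}
    for label, X in pulses_by_class.items():
        models[label] = PCA(n_components=n_components).fit(X)
    return models

def classify(pulse, models):
    """Label a pulse by maximum likelihood; PCA.score_samples returns
    the log-likelihood under the Tipping & Bishop probabilistic PCA model."""
    scores = {label: m.score_samples(pulse.reshape(1, -1))[0]
              for label, m in models.items()}
    return max(scores, key=scores.get)
```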

Relevance: 90.00%

Abstract:

DNA microarrays produce such a huge amount of data that unsupervised methods are required to reduce the dimensionality of the data set and to extract meaningful biological information. This work shows that Independent Component Analysis (ICA) is a promising approach for the analysis of genome-wide transcriptomic data. The paper first presents an overview of the most popular algorithms for performing ICA. These algorithms are then applied to a microarray breast-cancer data set. Some issues concerning the application of ICA and the evaluation of the biological relevance of the results are discussed. This study indicates that ICA significantly outperforms Principal Component Analysis (PCA).
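
As a rough illustration of the workflow described here (not the paper's own pipeline), one can run FastICA and a PCA baseline side by side on an expression matrix with scikit-learn; the data below is synthetic:

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

# Hypothetical expression matrix: 2000 genes x 60 tumour samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 60))

# ICA decomposes X into statistically independent components; genes
# with large weights in one component form a candidate expression module.
ica = FastICA(n_components=10, random_state=0)
S = ica.fit_transform(X)      # (2000, 10) independent gene-space components

# PCA baseline: its components are merely uncorrelated, not independent.
pca = PCA(n_components=10)
P = pca.fit_transform(X)
```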

Relevance: 90.00%

Abstract:

The brain extracts useful features from a maelstrom of sensory information, and a fundamental goal of theoretical neuroscience is to work out how it does so. One proposed feature-extraction strategy is motivated by the observation that the meaning of sensory data, such as the identity of a moving visual object, is often more persistent than the activation of any single sensory receptor. This notion is embodied in the slow feature analysis (SFA) algorithm, which uses "slowness" as a heuristic by which to extract semantic information from multidimensional time series. Here, we develop a probabilistic interpretation of this algorithm, showing that inference and learning in the limiting case of a suitable probabilistic model yield exactly the results of SFA. Similar equivalences have proved useful in interpreting and extending comparable algorithms such as independent component analysis. For SFA, we use the equivalent probabilistic model as a conceptual springboard with which to motivate several novel extensions to the algorithm.
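
The abstract concerns a probabilistic reinterpretation of SFA; for readers unfamiliar with the underlying algorithm, linear SFA can be written directly from its definition: whiten the signal, then take the directions in which the time derivative has the least variance. A minimal sketch, assuming a full-rank covariance:

```python
import numpy as np

def linear_sfa(X, n_slow=2):
    """Linear slow feature analysis on a (time, features) signal X:
    whiten the data, then return the unit-variance directions whose
    discrete time derivative has minimal variance."""
    X = X - X.mean(axis=0)
    d, E = np.linalg.eigh(np.cov(X, rowvar=False))
    Z = X @ (E / np.sqrt(d))              # whitened signal
    dd, U = np.linalg.eigh(np.cov(np.diff(Z, axis=0), rowvar=False))
    return Z @ U[:, :n_slow]              # eigh sorts ascending: slowest first
```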

Relevance: 90.00%

Abstract:

This paper presents a novel platform for the formation of cost-effective PCB-integrated optical waveguide sensors. The sensor design relies on multimode polymer waveguides that can be formed directly on standard PCBs, combined with commercially available chemical dyes, enabling the integration of all essential sensor components (electronic, photonic, chemical) on low-cost substrates. Moreover, it enables the detection of multiple analytes from a single device by employing waveguide arrays functionalised with different chemical dyes. The devices can be manufactured with conventional PCB-industry methods, such as solder-reflow processes and pick-and-place assembly techniques. As a proof of principle, a PCB-integrated ammonia gas sensor is fabricated on an FR4 substrate. The sensor operation relies on the change of the optical transmission characteristics of chemically functionalised optical waveguides in the presence of ammonia molecules. The fabrication and assembly of the sensor unit, as well as fundamental simulation and characterisation studies, are presented. The device achieves a sensitivity of approximately 30 ppm and a linear response up to 600 ppm at room temperature. Finally, the potential to detect multiple analytes from a single device is demonstrated using principal component analysis. © 1983-2012 IEEE.
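
The multi-analyte readout idea amounts to running PCA on the joint response of the waveguide array: each analyte perturbs the functionalised channels in a different pattern, so exposures cluster by analyte in the score space. A toy sketch with synthetic channel responses (all names and shapes are hypothetical):

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: rows are exposure events, columns are transmission
# changes of 8 waveguides, each functionalised with a different dye.
rng = np.random.default_rng(1)
pattern_a = rng.normal(size=8)           # response signature of analyte A
pattern_b = rng.normal(size=8)           # response signature of analyte B
readings = np.vstack([pattern_a + 0.1 * rng.normal(size=(25, 8)),
                      pattern_b + 0.1 * rng.normal(size=(25, 8))])

# In the 2-D PCA score space the two analytes form distinct clusters,
# which is how a single device can discriminate multiple gases.
scores = PCA(n_components=2).fit_transform(readings)
```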

Relevance: 80.00%

Abstract:

We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization X = YYᵀ, where the number of columns of Y fixes an upper bound on the rank of the positive semidefinite matrix X. It is thus very effective for solving problems that have a low-rank solution. The factorization X = YYᵀ leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method with guaranteed quadratic convergence. It furthermore provides some conditions on the rank of the factorization to ensure equivalence with the original problem. In contrast to existing methods, the proposed algorithm converges monotonically to the sought solution. Its numerical efficiency is evaluated on two applications: the maximal cut of a graph and the problem of sparse principal component analysis. © 2010 Society for Industrial and Applied Mathematics.
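
The factorization idea can be illustrated on the max-cut relaxation mentioned in the abstract: with X = YYᵀ and diag(X) = 1, the rows of Y live on unit spheres, and the SDP becomes an unconstrained problem on that manifold. The sketch below uses plain first-order Riemannian gradient ascent rather than the paper's second-order method, so it shows only the reformulation, not the convergence guarantees:

```python
import numpy as np

def maxcut_lowrank(L, p=3, iters=500, step=0.01):
    """Maximize trace(L Y Yᵀ)/4 over matrices Y with unit-norm rows,
    i.e. the max-cut SDP relaxation under the factorization X = Y Yᵀ.
    Plain Riemannian gradient ascent on a product of spheres;
    the step size is problem-dependent."""
    n = L.shape[0]
    rng = np.random.default_rng(0)
    Y = rng.normal(size=(n, p))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    for _ in range(iters):
        G = L @ Y / 2                                    # Euclidean gradient
        radial = np.sum(G * Y, axis=1, keepdims=True) * Y
        Y += step * (G - radial)                         # tangent-space step
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)    # retraction
    return Y @ Y.T
```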

Relevance: 80.00%

Abstract:

This work considers the problem of fitting data on a Lie group by a coset of a compact subgroup. This problem can be seen as an extension of the problem of fitting affine subspaces in ℝⁿ to data, which can be solved using principal component analysis. We show how the fitting problem can be reduced, for bi-invariant distances, to a generalized mean calculation on a homogeneous space. For bi-invariant Riemannian distances we provide an algorithm based on the Karcher mean gradient algorithm. We illustrate our approach by some examples on SO(n). © 2010 Springer-Verlag Berlin Heidelberg.
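
The generalized mean calculation at the heart of the reduction can be illustrated with the standard Karcher mean iteration on SO(n); this sketch uses SciPy's matrix exponential and logarithm and is not the paper's full coset-fitting algorithm:

```python
import numpy as np
from scipy.linalg import expm, logm

def karcher_mean(rotations, iters=20):
    """Karcher (Riemannian) mean of rotation matrices under the
    bi-invariant metric, via the standard fixed-point iteration
        mu <- mu @ expm( mean_k logm(mu.T @ R_k) )."""
    mu = rotations[0]
    for _ in range(iters):
        # Lift the samples to the tangent space at mu (skew-symmetric
        # matrices); logm can return a tiny imaginary part, hence .real.
        delta = np.mean([np.real(logm(mu.T @ R)) for R in rotations], axis=0)
        mu = mu @ expm(delta)
    return mu
```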

Relevance: 80.00%

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input a data set and an objective to be optimized, and returns as output an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
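
As a concrete instance of the survey's framing, PCA can be posed as a trace-maximization program over the Stiefel manifold and solved with gradient ascent plus a QR retraction. A minimal sketch (in practice an eigendecomposition gives the same subspace directly):

```python
import numpy as np

def pca_on_stiefel(X, k, iters=200, step=1e-3):
    """PCA as the program  max_M trace(Mᵀ Σ M)  over the Stiefel
    manifold {M : MᵀM = I}, solved by gradient ascent with a QR
    retraction."""
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / len(Xc)
    rng = np.random.default_rng(0)
    M, _ = np.linalg.qr(rng.normal(size=(Sigma.shape[0], k)))
    for _ in range(iters):
        G = 2 * Sigma @ M                  # Euclidean gradient of the trace
        M, _ = np.linalg.qr(M + step * G)  # retract back onto the manifold
    return Xc @ M                          # low-dimensional projection
```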

Relevance: 80.00%

Abstract:

Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well-suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package particularly aims to lower the barrier to entry. © 2014 Nicolas Boumal.
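
Manopt itself is a MATLAB toolbox; for consistency with the other examples here, the sketch below assumes its Python port pymanopt (and its current manifold/cost/optimizer API) to show the same workflow on a toy problem: the dominant eigenvector of a symmetric matrix as optimization on the sphere.

```python
import autograd.numpy as anp
import pymanopt
from pymanopt.manifolds import Sphere
from pymanopt.optimizers import TrustRegions

A = anp.diag(anp.array([3.0, 2.0, 1.0]))
manifold = Sphere(3)

@pymanopt.function.autograd(manifold)
def cost(x):
    # Minimizing -xᵀAx on the unit sphere maximizes the Rayleigh quotient.
    return -x @ A @ x

problem = pymanopt.Problem(manifold, cost)
result = TrustRegions().run(problem)
print(result.point)   # ≈ ±(1, 0, 0), the dominant eigenvector of A
```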