945 results for Parasympathetic component


Relevance:

20.00%

Publisher:

Abstract:

Conventional hidden Markov models (HMMs) generally consist of a Markov chain observed through a linear map corrupted by additive noise. This general class of model has enjoyed a huge and diverse range of applications, for example, speech processing, biomedical signal processing and, more recently, quantitative finance. However, a lesser-known extension of this general class of model is the so-called Factorial Hidden Markov Model (FHMM). FHMMs also have diverse applications, notably in machine learning, artificial intelligence and speech recognition [13, 17]. FHMMs extend the usual class of HMMs by supposing that the partially observed state process is a finite collection of distinct Markov chains, either statistically independent or dependent. There is also considerable current activity in applying collections of partially observed Markov chains to complex action recognition problems; see, for example, [6]. In this article we consider the Maximum Likelihood (ML) parameter estimation problem for FHMMs. Much of the extant literature on this problem presents parameter estimation schemes based on full-data log-likelihood EM algorithms. This approach can be slow to converge and often imposes heavy demands on computer memory; the latter point is particularly relevant for FHMMs, whose state-space dimensions are relatively large. The contribution of this article is to develop new recursive formulae for a filter-based EM algorithm that can be implemented online. Our new formulae are equivalent ML estimators; however, they are purely recursive and so significantly reduce numerical complexity and memory requirements. A computer simulation is included to demonstrate the performance of our results. © Taylor & Francis Group, LLC.
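As a rough illustration of the model class this abstract works with (not of the authors' filter-based EM estimator), the sketch below simulates a factorial HMM in which several independent two-state Markov chains drive a scalar observation through a linear map corrupted by additive Gaussian noise. The transition matrix, output weights and noise level are illustrative assumptions.

```python
# Minimal factorial HMM simulation sketch; all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

M, T = 3, 200                      # number of chains, number of time steps
n_states = 2                       # states per chain
# One illustrative transition matrix shared by all chains (assumption).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
# Linear output map: chain m in state s contributes W[m, s] to the observation.
W = rng.normal(size=(M, n_states))
sigma = 0.3                        # additive observation noise level (assumption)

x = np.zeros((M, T), dtype=int)    # hidden states of the M chains
y = np.zeros(T)                    # observations
for t in range(T):
    for m in range(M):
        if t == 0:
            x[m, t] = rng.integers(n_states)
        else:
            x[m, t] = rng.choice(n_states, p=A[x[m, t - 1]])
    # Observation: linear map of the joint hidden state, corrupted by noise.
    y[t] = sum(W[m, x[m, t]] for m in range(M)) + sigma * rng.normal()

print(y[:5])
```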

Relevance:

20.00%

Publisher:

Abstract:

In this paper we develop a new approach to sparse principal component analysis (sparse PCA). We propose two single-unit and two block optimization formulations of the sparse PCA problem, aimed at extracting a single sparse dominant principal component of a data matrix or several components at once, respectively. While the initial formulations involve nonconvex functions and are therefore computationally intractable, we rewrite them as optimization programs involving the maximization of a convex function on a compact set. The dimension of the search space is reduced enormously when the data matrix has many more columns (variables) than rows. We then propose and analyze a simple gradient method suited to the task. It appears that our algorithm has the best convergence properties when either the objective function or the feasible set is strongly convex, which is the case for our single-unit formulations and can be enforced in the block case. Finally, we demonstrate numerically, on a set of random and gene-expression test problems, that our approach outperforms existing algorithms in both the quality of the obtained solution and computational speed. © 2010 Michel Journée, Yurii Nesterov, Peter Richtárik and Rodolphe Sepulchre.
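The following sketch is only a guess at the flavour of the single-unit scheme described above, reconstructed from the abstract alone: maximise a thresholded convex objective over the unit sphere in the (small) row space by repeatedly stepping to the normalised gradient, then read off a sparse loading vector. The random data matrix, the threshold `gamma` and the stopping rule are assumptions, not the paper's reference implementation.

```python
# Sketch of a single-unit sparse-PCA gradient scheme; data and gamma are assumptions.
import numpy as np

rng = np.random.default_rng(1)
p, n = 40, 500                          # few rows (samples), many columns (variables)
A = rng.normal(size=(p, n))             # illustrative random data matrix
gamma = 1.5                             # sparsity-inducing threshold (assumption)

# The search is over the unit sphere in R^p, which is small because p << n.
x = A[:, 0] / np.linalg.norm(A[:, 0])
for _ in range(200):
    s = A.T @ x                                                # a_i^T x for every column i
    g = A @ (np.sign(s) * np.maximum(np.abs(s) - gamma, 0.0))  # (sub)gradient direction
    norm_g = np.linalg.norm(g)
    if norm_g == 0:                                            # gamma too large: all loadings vanish
        break
    x_new = g / norm_g                                         # step to the normalised gradient
    if np.linalg.norm(x_new - x) < 1e-8:
        break
    x = x_new

# Recover a sparse loading vector from the thresholded correlations.
s = A.T @ x
z = np.sign(s) * np.maximum(np.abs(s) - gamma, 0.0)
nz = np.linalg.norm(z)
if nz > 0:
    z = z / nz
print("non-zero loadings:", np.count_nonzero(z), "of", n)
```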

Relevance:

20.00%

Publisher:

Abstract:

This paper derives a new algorithm that performs independent component analysis (ICA) by optimizing the contrast function of the RADICAL algorithm. The core idea of the proposed optimization method is to combine a global search for a good initial condition with a gradient-descent algorithm. The new ICA algorithm runs faster than the RADICAL algorithm (which is based on Jacobi rotations) while still preserving, and even enhancing, the strong robustness properties that result from its contrast. © Springer-Verlag Berlin Heidelberg 2007.
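A toy two-dimensional sketch of the strategy described in the abstract is given below: the RADICAL-style contrast (a sum of m-spacing marginal-entropy estimates) is evaluated on a coarse grid of rotation angles to obtain an initial condition, which is then refined locally. The simulated data, the spacing parameter and the use of a finite-difference step in place of an analytic gradient are all assumptions made for brevity.

```python
# 2-D sketch: grid initialisation + local descent on an m-spacing entropy contrast.
import numpy as np

rng = np.random.default_rng(2)

def spacing_entropy(x, m=None):
    """Vasicek-style m-spacing entropy estimate of a 1-D sample."""
    n = len(x)
    m = m or int(np.sqrt(n))
    xs = np.sort(x)
    spacings = xs[m:] - xs[:-m]
    return np.mean(np.log((n + 1) / m * np.maximum(spacings, 1e-12)))

def contrast(theta, Z):
    """Sum of marginal entropy estimates after rotating whitened data by theta."""
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ Z
    return spacing_entropy(Y[0]) + spacing_entropy(Y[1])

# Two independent non-Gaussian sources, mixed linearly, then whitened.
S = rng.uniform(-1, 1, size=(2, 2000))
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ X      # whitened observations

# Global search: coarse grid over rotation angles gives the initial condition.
grid = np.linspace(0, np.pi / 2, 30, endpoint=False)
theta = min(grid, key=lambda t: contrast(t, Z))

# Local refinement: finite-difference descent on the contrast (the paper's analytic
# gradient is replaced here by a numerical one; step sizes are assumptions).
step, eps = 0.01, 1e-3
for _ in range(100):
    g = (contrast(theta + eps, Z) - contrast(theta - eps, Z)) / (2 * eps)
    theta -= step * g

print("estimated unmixing angle:", theta)
```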

Relevance:

20.00%

Publisher:

Abstract:

DNA microarrays provide a huge amount of data and therefore require dimensionality-reduction methods to extract meaningful biological information. Independent Component Analysis (ICA) has been proposed by several authors as an interesting means of doing so. Unfortunately, experimental data are usually of poor quality because of noise, outliers and a lack of samples. Robustness to these hurdles is thus a key feature for an ICA algorithm. This paper identifies a robust contrast function and proposes a new ICA algorithm. © 2007 IEEE.
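The abstract does not specify the robust contrast, so the snippet below only illustrates the underlying motivation: an unbounded fourth-moment (kurtosis) contrast is badly distorted by a handful of gross outliers, whereas a bounded-growth nonlinearity such as log-cosh (familiar from FastICA) is far less affected. The simulated data and outlier magnitudes are assumptions.

```python
# Illustration of contrast robustness to outliers; not the paper's contrast function.
import numpy as np

rng = np.random.default_rng(3)

def kurtosis_contrast(y):
    """Unbounded fourth-moment contrast: sensitive to a few large outliers."""
    y = (y - y.mean()) / y.std()
    return np.mean(y ** 4) - 3.0

def logcosh_contrast(y):
    """Bounded-growth contrast: each sample's influence grows only like |y|."""
    y = (y - y.mean()) / y.std()
    return np.mean(np.log(np.cosh(y)))

clean = rng.laplace(size=5000)
dirty = clean.copy()
dirty[:5] = 50.0                      # a handful of gross outliers (assumption)

for name, f in [("kurtosis", kurtosis_contrast), ("log-cosh", logcosh_contrast)]:
    print(f"{name:9s} clean={f(clean):7.3f}  with outliers={f(dirty):7.3f}")
```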

Relevance:

20.00%

Publisher:

Abstract:

Approximately 40% of annual worldwide demand for steel is used to replace products that have failed. With this percentage set to rise, extending the lifespan of steel in products presents a significant opportunity to reduce demand and thus decrease carbon dioxide emissions from steel production. This article presents a new, simplified framework with which to analyse product failure. When applied to the products that dominate steel use, this framework reveals that they are often replaced because a component or sub-assembly becomes degraded, inferior, unsuitable or worthless. In light of this, four products, representative of high-steel-content products in general, are analysed at the component level, determining steel mass and cost profiles over the lifespan of each product. The results show that the majority of the steel components are underexploited, still functioning when the product is discarded; in particular, the potential lifespan of the steel-rich structure is typically much greater than its actual lifespan. Twelve case studies in which product or component life has been extended are then presented. The resulting evidence is used to tailor life-extension strategies to each reason for product failure and to identify the economic motivations for implementing these strategies. The results suggest that product life extension is encouraged by a product template in which the long-lived structure accounts for a relatively high share of costs while short-lived components can be easily replaced, offering profit to the producer and enhanced utility to owners. © 2013 The Author.
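As a purely hypothetical illustration of the component-level accounting described above (none of the figures come from the article), the short calculation below takes made-up mass, cost-share and lifespan numbers for a product's components and estimates the share of steel that is still functional when the product as a whole is discarded.

```python
# Hypothetical component-level steel accounting; all numbers are invented for illustration.
components = [
    # name,            steel mass (kg), cost share, potential life (years)
    ("structure",              650.0,        0.55,                    30),
    ("drive unit",             120.0,        0.30,                    12),
    ("panels/fittings",         80.0,        0.15,                    10),
]
product_life = 12  # years: the product is discarded when a key component fails

total_mass = sum(mass for _, mass, _, _ in components)
underexploited = sum(mass for _, mass, _, life in components if life > product_life)

print(f"steel still functional at discard: {underexploited / total_mass:.0%}")
for name, _, _, life in components:
    print(f"{name:16s} potential/actual life ratio: {life / product_life:.1f}x")
```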