949 results for k-Error linear complexity
Abstract:
We consider the problem of robust performance analysis of linear discrete time-varying systems on a bounded time interval. The system is represented in state-space form and is driven by a random input disturbance with imprecisely known probability distribution; this distributional uncertainty is described in terms of entropy. The worst-case performance of the system is quantified by its a-anisotropic norm. Computing the anisotropic norm is reduced to solving a set of difference Riccati and Lyapunov equations together with an equation of a special form.
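As a hedged sketch of the quantity involved (the notation below follows the standard anisotropy-based performance analysis literature and is an assumption, not quoted from this abstract), the a-anisotropic norm of a system F is the worst-case root-mean-square gain over disturbances w whose anisotropy, a relative-entropy measure of how far the disturbance is from Gaussian white noise, does not exceed the level a:

    $$ \|F\|_a \;=\; \sup\left\{ \frac{\|Fw\|}{\|w\|} \;:\; \bar{A}(w) \le a \right\}. $$

Roughly speaking, at a = 0 the norm reduces (up to scaling) to an H2-type performance index, while for large a it approaches an H-infinity-type bound, so a single parameter interpolates between the two classical measures.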
Abstract:
There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying 'causes' which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, training such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm that retains the general non-linear capabilities of previous models but uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
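A minimal numpy sketch of one such model and its EM training loop, written here as a GTM-style construction (a regular grid of latent points mapped through radial basis functions into data space); this is an illustrative assumption rather than the paper's exact formulation, and the function name, grid sizes and regularisation constant are arbitrary choices:

    import numpy as np

    def fit_latent_grid_em(T, n_latent=20, n_rbf=5, rbf_width=0.3, n_iter=50, seed=0):
        """EM fit of a GTM-style latent variable density model to data T (N x D).

        A 1-D grid of latent points is mapped through radial basis functions and
        a linear weight matrix into data space; each mapped grid point is the
        centre of an isotropic Gaussian with common inverse variance beta.
        """
        rng = np.random.default_rng(seed)
        T = np.asarray(T, dtype=float)
        N, D = T.shape
        X = np.linspace(-1.0, 1.0, n_latent)[:, None]        # latent grid (K x 1)
        C = np.linspace(-1.0, 1.0, n_rbf)[:, None]            # RBF centres
        Phi = np.exp(-0.5 * (X - C.T) ** 2 / rbf_width**2)    # K x M basis matrix
        Phi = np.hstack([Phi, np.ones((n_latent, 1))])        # bias column
        W = rng.normal(scale=0.1, size=(Phi.shape[1], D))     # mapping weights
        beta = 1.0                                            # inverse noise variance
        for _ in range(n_iter):
            # E-step: responsibility of each latent grid point for each data point.
            Y = Phi @ W                                       # K x D mapped centres
            d2 = ((Y[:, None, :] - T[None, :, :]) ** 2).sum(-1)   # K x N distances
            log_r = -0.5 * beta * d2
            log_r -= log_r.max(axis=0, keepdims=True)         # numerical stabilisation
            R = np.exp(log_r)
            R /= R.sum(axis=0, keepdims=True)
            # M-step: regularised weighted least squares for W, then update beta.
            G = np.diag(R.sum(axis=1))
            W = np.linalg.solve(Phi.T @ G @ Phi + 1e-3 * np.eye(Phi.shape[1]),
                                Phi.T @ R @ T)
            d2 = (((Phi @ W)[:, None, :] - T[None, :, :]) ** 2).sum(-1)
            beta = N * D / (R * d2).sum()
        return W, beta

Each iteration re-estimates the responsibilities of the latent grid points for every data point, then solves a single regularised weighted least-squares problem for the mapping and updates the shared noise variance, which is what keeps the EM-based training cheap relative to general gradient-based fitting of non-linear latent variable models.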
Abstract:
Purpose. A clinical evaluation of the Shin-Nippon NVision-K 5001 (also branded as the Grand Seiko WR-5100K) autorefractor (Japan) was performed to examine validity and repeatability compared with subjective refraction and Javal-Schiotz keratometry. Methods. Measurements of refractive error were performed on 198 eyes of 99 subjects (aged 23.2 ± 7.4 years) subjectively (noncycloplegic) by one masked optometrist and objectively with the NVision-K autorefractor by a second optometrist. Keratometry measurements using the NVision-K were compared with the Javal-Schiotz keratometer. Intrasession repeatability of the NVision-K was also assessed on all 99 subjects together with intersession repeatability on a second visit 7 to 14 days later. Results. Refractive error as measured by the NVision-K was found to be similar (p = 0.67) to subjective refraction (difference, 0.14 ± 0.35 D). It was both accurate and repeatable over a wide prescription range (-8.25 to +7.25 D). Keratometry as measured by the NVision-K was found to be similar (p > 0.50) to the Javal-Schiotz technique in both the horizontal and vertical meridians (horizontal: difference, 0.02 ± 0.09 mm; vertical: difference, 0.01 ± 0.14 mm). There was minimal bias, and the results were repeatable (horizontal: intersession difference, 0.00 ± 0.09 mm; vertical: intersession difference, -0.01 ± 0.12 mm). Conclusion. The open-view arrangement of the Shin-Nippon NVision-K 5001 facilitates the measurement of static refractive error and the accommodative response to real-world stimuli. Coupled with its accuracy, repeatability, and capability to measure corneal curvature, it is a valuable addition to objective instrumentation currently available to the optometrist and researcher.
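For readers who want to reproduce this style of agreement analysis on their own paired data, a short Python sketch of the statistics reported above (bias as the mean difference, its standard deviation, a paired test, and 95% Bland-Altman limits of agreement); the function name, variable names and the use of scipy are illustrative assumptions, not details from the paper:

    import numpy as np
    from scipy import stats

    def agreement_summary(method_a, method_b):
        """Summarise agreement between two sets of paired measurements,
        e.g. autorefractor readings versus subjective refraction (in D)."""
        a = np.asarray(method_a, dtype=float)
        b = np.asarray(method_b, dtype=float)
        diff = a - b
        bias = diff.mean()                          # mean difference, e.g. 0.14 D
        sd = diff.std(ddof=1)                       # spread of differences, e.g. 0.35 D
        t_stat, p_value = stats.ttest_rel(a, b)     # paired t-test for systematic bias
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
        return {"bias": bias, "sd": sd, "p": p_value, "limits_of_agreement": loa}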
Abstract:
The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean and its computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite-dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
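For concreteness, the quantities referred to above can be written out in standard Gaussian-process regression notation (the symbols below are assumed, not quoted from the paper). With training inputs x_1, ..., x_n, noisy targets y, covariance function k and noise variance sigma^2, the posterior mean at a test point x_* is

    $$ \bar{f}(x_*) = k_*^{\top} (K + \sigma^2 I)^{-1} y, \qquad K_{ij} = k(x_i, x_j), \qquad (k_*)_i = k(x_*, x_i), $$

and the factor (K + sigma^2 I)^{-1} is what makes exact computation scale as O(n^3). The optimal m-dimensional linear models mentioned in the abstract are spanned by the leading eigenfunctions of the covariance operator,

    $$ \int k(x, x')\, \phi_i(x')\, p(x')\, dx' = \lambda_i\, \phi_i(x), \qquad \lambda_1 \ge \lambda_2 \ge \cdots, $$

which is the infinite-dimensional counterpart of keeping the top m principal components.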
Abstract:
The main aim of this paper is to provide a tutorial on regression with Gaussian processes. We start from Bayesian linear regression and show how, by a change of viewpoint, one can see this method as a Gaussian process predictor based on priors over functions rather than on priors over parameters. This leads into a more general discussion of Gaussian processes in Section 4. Section 5 deals with further issues, including hierarchical modelling and the setting of the parameters that control the Gaussian process, the covariance functions for neural network models, and the use of Gaussian processes in classification problems.
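Since the tutorial's starting point is the predictive equations themselves, here is a self-contained numpy sketch of the basic Gaussian process regression predictor it builds up to (zero-mean prior, squared-exponential covariance); the kernel parameters and function names are illustrative choices, not the tutorial's:

    import numpy as np

    def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
        """Squared-exponential covariance between two sets of 1-D inputs."""
        d2 = (a[:, None] - b[None, :]) ** 2
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    def gp_predict(x_train, y_train, x_test, noise=1e-2, **kernel_args):
        """Posterior mean and variance of a zero-mean GP at the test inputs."""
        K = rbf_kernel(x_train, x_train, **kernel_args) + noise * np.eye(len(x_train))
        K_s = rbf_kernel(x_train, x_test, **kernel_args)
        K_ss = rbf_kernel(x_test, x_test, **kernel_args)
        L = np.linalg.cholesky(K)                           # the O(n^3) step
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = K_s.T @ alpha                                # posterior mean
        v = np.linalg.solve(L, K_s)
        var = np.diag(K_ss) - np.sum(v**2, axis=0) + noise  # predictive variance
        return mean, var

    # Example: noisy samples of a sine wave, predicted on a finer grid.
    x = np.linspace(-3, 3, 25)
    y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
    mu, var = gp_predict(x, y, np.linspace(-3, 3, 200))

The prior over functions is encoded entirely in the covariance function, which is the change of viewpoint the abstract describes: instead of placing a prior on regression weights, one places it directly on function values.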