953 results for Time series. Transfer function. Recursive Estimation. Plunger lift. Gas flow.
Abstract:
EXTRACT (SEE PDF FOR FULL ABSTRACT): Our objective is to combine terrestrial and oceanic records for reconstructing West Coast climate. Tree rings and marine laminated sediments provide high-resolution, accurately dated proxy data on the variability of climate and on the productivity of the ocean and have been used to reconstruct precipitation, temperature, sea level pressure, primary productivity, and other large-scale parameters. We present here the latest Santa Barbara basin varve chronology for the twentieth century as well as a newly developed tree-ring chronology for Torrey pine.
Abstract:
Optical motion capture systems suffer from marker occlusions that result in the loss of useful information. This paper addresses the problem of real-time joint localisation of legged skeletons in the presence of such missing data. The data are assumed to be labelled 3D marker positions from a motion capture system. An integrated framework is presented which predicts the occluded marker positions using a Variable Turn Model within an Unscented Kalman filter. Inferred information from neighbouring markers is used as observation states; these constraints are efficient, simple, and implementable in real time. This work also takes advantage of the common case in which missing markers are still visible to a single camera, by combining predictions with the under-determined positions, resulting in more accurate predictions. An Inverse Kinematics technique is then applied to ensure that the bone lengths remain constant over time; the system can thereby maintain a continuous data flow. The marker and Centre of Rotation (CoR) positions can be calculated with high accuracy even in cases where markers are occluded for a long period of time. Our methodology is tested against some of the most popular methods for marker prediction, and the results confirm that our approach outperforms these methods in estimating both marker and CoR positions. © 2012 Springer-Verlag.
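The paper's Variable Turn Model and Unscented Kalman filter are not reproduced here; as a minimal sketch of the underlying idea (predict through occluded frames, update only when a marker is observed), a constant-velocity linear Kalman filter on one marker coordinate can look as follows. All noise values and the occlusion encoding (`None` entries) are illustrative assumptions, not the paper's method:

```python
import numpy as np

def kalman_track(observations, dt=1.0, q=1e-3, r=1e-2):
    """Track one marker coordinate with a constant-velocity Kalman
    filter; observations that are None are treated as occluded frames
    and bridged by the prediction step alone."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise
    R = np.array([[r]])                     # measurement noise
    x = np.zeros(2)                         # state: [position, velocity]
    P = np.eye(2)
    estimates = []
    for z in observations:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:                   # update only when visible
            y = z - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates

# a marker moving at unit speed, occluded during frames 3-5
obs = [0.0, 1.0, 2.0, None, None, None, 6.0]
est = kalman_track(obs)
```

During the occlusion the filter coasts on its learned velocity, so the gap is bridged close to the true positions 3, 4, 5.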
Abstract:
Variational methods are a key component of the approximate inference and learning toolbox. These methods fill an important middle ground: unlike maximum a posteriori (MAP) methods, they retain distributional information about uncertainty in latent variables, yet they generally require less computational time than Markov chain Monte Carlo (MCMC) methods. In particular, the variational Expectation Maximisation (vEM) and variational Bayes algorithms, both involving variational optimisation of a free energy, are widely used in time-series modelling. Here, we investigate the success of vEM in simple probabilistic time-series models. First we consider the inference step of vEM, and show that a consequence of the well-known compactness property of variational inference is a failure to propagate uncertainty in time, thus limiting the usefulness of the retained distributional information. In particular, the uncertainty may appear to be smallest precisely when the approximation is poorest. Second, we consider parameter learning and analytically reveal systematic biases in the parameters found by vEM. Surprisingly, simpler variational approximations (such as mean-field) can lead to less bias than more complicated structured approximations.
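The "compactness" property mentioned above is easy to demonstrate numerically outside of any time-series model: fitting a single Gaussian q by minimising KL(q || p) against a well-separated bimodal p makes q lock onto one mode and report far less uncertainty than p actually carries. The mixture below and the grid search are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# p is a symmetric two-component Gaussian mixture with modes at +/-4
x = np.linspace(-15.0, 15.0, 3001)
dx = x[1] - x[0]

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

p = 0.5 * gauss(x, -4.0, 1.0) + 0.5 * gauss(x, 4.0, 1.0)
true_var = np.sum(x ** 2 * p) * dx          # ~17 (the mixture mean is 0)

# brute-force minimisation of KL(q || p) over Gaussian q = N(mu, var)
best = (np.inf, 0.0, 1.0)
for mu in np.linspace(-6.0, 6.0, 121):
    for var in np.linspace(0.5, 20.0, 40):
        q = gauss(x, mu, var)
        kl = np.sum(q * (np.log(q) - np.log(p))) * dx
        if kl < best[0]:
            best = (kl, mu, var)

_, mu_star, var_star = best
# q collapses onto a single mode: |mu*| is near 4 and var* is near 1,
# although the true variance of p is roughly 17
```

The variational posterior is confident exactly where it is most wrong, which is the behaviour the abstract warns about when this compactness is compounded over time.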
Abstract:
We live in an era of abundant data. This has necessitated the development of new and innovative statistical algorithms to get the most from experimental data. For example, faster algorithms make practical the analysis of larger genomic data sets, allowing us to extend the utility of cutting-edge statistical methods. We present a randomised algorithm that accelerates the clustering of time series data using the Bayesian Hierarchical Clustering (BHC) statistical method. BHC is a general method for clustering any discretely sampled time series data. In this paper we focus on a particular application to microarray gene expression data. We define and analyse the randomised algorithm, before presenting results on both synthetic and real biological data sets. We show that the randomised algorithm leads to substantial gains in speed with minimal loss in clustering quality. The randomised time series BHC algorithm is available as part of the R package BHC, which is available for download from Bioconductor (version 2.10 and above) via http://bioconductor.org/packages/2.10/bioc/html/BHC.html. We have also made available a set of R scripts which can be used to reproduce the analyses carried out in this paper. These are available from https://sites.google.com/site/randomisedbhc/.
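The speed-up strategy behind such randomised algorithms, clustering only a random subset exactly and then folding the remaining series into the resulting clusters, can be sketched generically. This is not the BHC algorithm itself (which is Bayesian and hierarchical); the toy data, the farthest-point seeding, and all names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "expression profile" time series: two templates plus noise
t = np.linspace(0, 2 * np.pi, 12)
n_per = 50
data = np.vstack([np.sin(t) + 0.2 * rng.standard_normal((n_per, t.size)),
                  np.cos(t) + 0.2 * rng.standard_normal((n_per, t.size))])

def randomised_cluster(X, k, m, rng):
    """Cluster an m-row random subset exactly (here via greedy
    farthest-point seeding), then assign every remaining series to the
    nearest exemplar -- the subsample-then-fold-in idea."""
    idx = rng.choice(len(X), size=m, replace=False)
    S = X[idx]
    centers = [S[0]]                       # greedy farthest-point seeding
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(S - c, axis=1) for c in centers], axis=0)
        centers.append(S[np.argmax(d)])
    centers = np.array(centers)
    # fold every series into its nearest exemplar's cluster
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

labels = randomised_cluster(data, k=2, m=20, rng=rng)
```

Only the m-row subset is processed by the expensive exact step, so the cost of the fold-in pass scales linearly in the number of remaining series.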
Abstract:
The accurate prediction of time-changing covariances is an important problem in the modeling of multivariate financial data. However, some of the most popular models suffer from a) overfitting problems and multiple local optima, b) failure to capture shifts in market conditions, and c) large computational costs. To address these problems we introduce a novel dynamic model for time-changing covariances. Overfitting and local optima are avoided by following a Bayesian approach instead of computing point estimates. Changes in market conditions are captured by assuming a diffusion process in parameter values, and finally, computationally efficient and scalable inference is performed using particle filters. Experiments with financial data show excellent performance of the proposed method with respect to current standard models.
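The particle-filtering machinery referred to above can be illustrated on the simplest related problem: a univariate stochastic-volatility model in which a latent log-variance follows a random walk (a discrete analogue of the "diffusion in parameter values"). The bootstrap filter below is a generic sketch under assumed noise scales, not the paper's multivariate model:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate: latent log-variance h_t random-walks, returns y_t ~ N(0, exp(h_t))
T, sigma_h = 300, 0.05
h = np.cumsum(sigma_h * rng.standard_normal(T)) + np.log(0.04)
y = np.exp(h / 2) * rng.standard_normal(T)

def bootstrap_filter(y, n=2000, sigma_h=0.05):
    """Bootstrap particle filter tracking the latent log-variance."""
    particles = np.log(0.04) + 0.5 * rng.standard_normal(n)
    est = np.empty(len(y))
    for t, yt in enumerate(y):
        particles = particles + sigma_h * rng.standard_normal(n)  # propagate
        var = np.exp(particles)
        logw = -0.5 * (np.log(2 * np.pi * var) + yt ** 2 / var)   # likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est[t] = np.sum(w * particles)      # posterior-mean estimate
        # multinomial resampling to avoid weight degeneracy
        particles = particles[rng.choice(n, size=n, p=w)]
    return est

h_est = bootstrap_filter(y)
rmse = np.sqrt(np.mean((h_est - h) ** 2))
```

Each step costs O(number of particles), which is what makes this family of methods scalable compared with exact inference in nonlinear non-Gaussian state-space models.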
Abstract:
We propose a novel neuron with a variable nonlinear transfer function, which may also be called a subsection (piecewise) transfer function neuron. By virtue of multiple thresholds, the neuron switches among different nonlinear excitation states, each governed by a different transfer function component. We compare its output transfer characteristics with those of a single-thresholded neuron, and present practical experiments on bi-level logic operations. Finally, a brief comparison with conventional BP, RBF, and DBF neural networks indicates the development prospects of the variable neuron. Because the proposed neuron can implement arbitrary nonlinear mappings between the input layer and the output layer, it has potentially much wider application in research areas such as function approximation, pattern recognition, and data compression.
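One simple way to realise a multi-thresholded transfer function of this kind is as a sum of sigmoids, one per threshold, so the neuron moves through several excitation plateaus as its input crosses each threshold. The thresholds, steepness, and normalisation below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def multi_threshold_transfer(x, thresholds=(-1.0, 1.0), k=8.0):
    """Multi-thresholded 'subsection' transfer function: one sigmoid
    per threshold, giving plateaus at 0, 1/2, 1 as the activation x
    crosses each threshold in turn."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    for t in thresholds:
        out += 1.0 / (1.0 + np.exp(-k * (x - t)))   # one switching stage
    return out / len(thresholds)

# three excitation states: ~0 below both thresholds, ~0.5 between, ~1 above
vals = multi_threshold_transfer(np.array([-3.0, 0.0, 3.0]))
```

The plateau structure is what makes such a neuron natural for multi-level logic operations: each plateau encodes one logic level.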
Abstract:
In modern process industry, it is often difficult to analyze a manufacturing process because of its numerous time-series data. Analysts wish not only to interpret the evolution of data over time within a working procedure, but also to examine changes across the whole production process through time. To meet these analytic requirements, we have developed ProcessLine, an interactive visualization tool for large amounts of time-series data in process industry. The data are displayed in a fisheye timeline. ProcessLine provides a good overview of the whole production process together with details of the focused working procedure. A preliminary user study using beer production data has shown that the tool is effective.
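A fisheye timeline rests on a one-dimensional distortion that magnifies positions near a focus point while compressing the rest and keeping the endpoints fixed. The sketch below uses the classic Sarkar-Brown graphical fisheye function as an assumed stand-in for whatever distortion ProcessLine actually uses:

```python
def fisheye(x, focus, d=3.0):
    """Map a normalised timeline position x in [0, 1] so that the
    region around `focus` is magnified; d controls the distortion
    strength (d = 0 is the identity)."""
    def g(u):                                # Sarkar-Brown distortion on [0, 1]
        return (d + 1) * u / (d * u + 1)
    if x >= focus:
        span = 1.0 - focus
        return x if span == 0 else focus + span * g((x - focus) / span)
    return focus - focus * g((focus - x) / focus)
```

Points just after the focus, e.g. x = 0.6 with focus = 0.5, map to 0.75: intervals near the focus occupy more screen space, while the timeline's ends stay pinned at 0 and 1.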
Abstract:
In a previous Letter [Opt. Lett. 33, 1171 (2008)], we proposed an improved logarithmic phase mask by modifying the original one designed by Sherif. However, further studies in another paper [Appl. Opt. 49, 229 (2010)] show that even when the Sherif mask and the improved one are optimized, their corresponding defocused modulation transfer functions (MTFs) are still not stable with respect to focus errors. By further modifying their phase profiles, we therefore design another two logarithmic phase masks that exhibit a more stable defocused MTF. However, when the defocus-induced phase effect is considered, we find that the performance of the two masks proposed in this Letter is better than that of the Sherif mask, but worse than that of our previously proposed phase mask, according to the Hilbert space angle. (C) 2010 Optical Society of America
Abstract:
Wavefront coding is a powerful system-level technique that can be used to extend the depth of field of incoherent imaging systems. To assess the performance of a wavefront-coded imaging system, the defocused optical transfer function (OTF) is the most frequently used metric. Unfortunately, to the best of our knowledge, among all types of phase masks it is usually difficult to obtain the analytical OTF, except for the cubic mask. Although numerical computation seems good enough for performance evaluation, an approximate analytical OTF is still indispensable because it reflects the relationship between mask parameters and system frequency response more clearly. Thus, a method is proposed to derive the approximate analytical OTF for two-dimensional rectangularly separable phase masks. The analytical results agree well with direct numerical computations, but the proposed method is acceptable only from an engineering point of view and needs rigorous proof in the future. (c) 2010 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3485759]
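The numerical computation that such analytical approximations are checked against is the normalised autocorrelation of the generalised pupil function. A minimal 1-D sketch for the cubic mask, with the mask strength and defocus values chosen purely for illustration, not taken from the paper:

```python
import numpy as np

def defocused_mtf(alpha, psi, n=1024):
    """Numerical 1-D defocused MTF: modulus of the normalised
    autocorrelation of the generalised pupil
    P(x) = exp(j*(alpha*x**3 + psi*x**2)), where alpha is the cubic
    mask strength and psi the defocus parameter (radians at the edge)."""
    x = np.linspace(-1.0, 1.0, n)
    pupil = np.exp(1j * (alpha * x ** 3 + psi * x ** 2))
    # OTF(u) is proportional to the correlation of the pupil with itself
    corr = np.correlate(pupil, pupil, mode="full")
    return np.abs(corr) / np.abs(corr[n - 1])        # normalise at u = 0

# the cubic mask trades peak MTF for stability against defocus
band = slice(1024 - 1 + 102, 1024 - 1 + 819)          # mid positive freqs
cubic_shift = np.max(np.abs(defocused_mtf(90.0, 0.0)[band]
                            - defocused_mtf(90.0, 15.0)[band]))
clear_shift = np.max(np.abs(defocused_mtf(0.0, 0.0)[band]
                            - defocused_mtf(0.0, 15.0)[band]))
```

Comparing `cubic_shift` against `clear_shift` reproduces the qualitative wavefront-coding result: under the same defocus, the clear aperture's MTF collapses while the cubic mask's MTF barely moves.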