72 results for second derivative
Abstract:
Recently there has been interest in combining generative and discriminative classifiers. In these classifiers, features for the discriminative models are derived from generative kernels. One advantage of using generative kernels is that systematic approaches exist to introduce complex dependencies into the feature-space. Furthermore, as the features are based on generative models, standard model-based compensation and adaptation techniques can be applied to make the discriminative models robust to noise and speaker conditions. This paper extends previous work in this framework in several directions. First, it introduces derivative kernels based on context-dependent generative models. Second, it describes how derivative kernels can be incorporated in structured discriminative models. Third, it addresses the issues associated with the large numbers of classes and parameters that arise when context-dependent models and the high-dimensional feature-spaces of derivative kernels are used. The approach is evaluated on two noise-corrupted tasks: the small vocabulary AURORA 2 task and the medium-to-large vocabulary AURORA 4 task. © 2011 IEEE.
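The derivative kernels described above map each observation through the gradient of a generative model's log-likelihood with respect to its parameters. As a minimal sketch of the idea (using a single 1-D Gaussian in place of the paper's context-dependent HMMs, which is purely an illustrative assumption):

```python
import numpy as np

def fisher_score(x, mu, var):
    """Derivative ("Fisher score") features from a 1-D Gaussian generative
    model: the gradient of log N(x; mu, var) w.r.t. (mu, var).
    Illustrative stand-in for the HMM-based derivative kernels in the paper.
    """
    d_mu = (x - mu) / var                                  # d/d mu
    d_var = 0.5 * ((x - mu) ** 2 / var ** 2 - 1.0 / var)   # d/d var
    return np.array([d_mu, d_var])

# Map an observation into the derivative feature space; a discriminative
# classifier would then be trained on these features.
phi = fisher_score(1.5, mu=0.0, var=1.0)
```

The feature dimension grows with the number of generative-model parameters, which is the source of the high-dimensionality issue the abstract mentions.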
Abstract:
An expression for the probability density function of the second order response of a general FPSO in spreading seas is derived by using the Kac-Siegert approach. Various approximations of the second order force transfer functions are investigated for a ship-shaped FPSO. It is found that, when expressed in non-dimensional form, the probability density function of the response is not particularly sensitive to wave spreading, although the mean squared response and the resulting dimensional extreme values can be sensitive. The analysis is then applied to a Sevan FPSO, which is a large cylindrical buoy-like structure. The second order force transfer functions are derived by using an efficient semi-analytical hydrodynamic approach, and these are then employed to yield the extreme response. However, a significant effect of wave spreading on the statistics for the Sevan FPSO is found even in non-dimensional form. This implies that the exact statistics of a general ship-shaped FPSO may be sensitive to the wave direction, which needs to be verified in future work. It is also pointed out that Newman's approximation regarding the frequency dependency of the force transfer function is acceptable even for spreading seas. An improvement in the results may be attained by treating the angular dependency exactly. Copyright © 2009 by ASME.
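In the Kac-Siegert approach, the second order response is rewritten as a linear-plus-quadratic form in independent standard normal variables, with coefficients obtained from an eigendecomposition of the (discretised) quadratic transfer function; the response statistics then follow from that form. A minimal Monte Carlo sketch, with illustrative coefficients rather than values from the paper:

```python
import numpy as np

def kac_siegert_samples(beta, lam, n=100_000, seed=0):
    """Sample the Kac-Siegert representation of a second order response,
    R = sum_j (beta_j * Z_j + lam_j * Z_j**2),  Z_j ~ iid N(0, 1),
    where lam_j are eigenvalues of the discretised quadratic transfer
    function and beta_j the linear coefficients.  Inputs here are
    illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal((n, len(lam)))
    return Z @ np.asarray(beta) + (Z ** 2) @ np.asarray(lam)

# Two-term example; the theoretical mean of R is sum(lam) = 0.15.
r = kac_siegert_samples([1.0, 0.5], [0.1, 0.05])
mean = r.mean()
```

In practice the probability density function and extreme values are obtained from this representation analytically or numerically; the Monte Carlo version above just makes the structure of the representation concrete.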
Abstract:
Sequential Monte Carlo methods, also known as particle methods, are a widely used set of computational tools for inference in non-linear non-Gaussian state-space models. In many applications it may be necessary to compute the sensitivity, or derivative, of the optimal filter with respect to the static parameters of the state-space model; for instance, in order to obtain maximum likelihood estimates of model parameters of interest, or to compute the optimal controller in an optimal control problem. In Poyiadjis et al. [2011] an original particle algorithm to compute the filter derivative was proposed, and it was shown using numerical examples that the particle estimate was numerically stable in the sense that it did not deteriorate over time. In this paper we substantiate this claim with a detailed theoretical study. Lp bounds and a central limit theorem for this particle approximation of the filter derivative are presented. It is further shown that under mixing conditions these Lp bounds and the asymptotic variance characterized by the central limit theorem are uniformly bounded with respect to the time index. We demonstrate the performance predicted by theory with several numerical examples. We also use the particle approximation of the filter derivative to perform online maximum likelihood parameter estimation for a stochastic volatility model.
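The filter derivative can be approximated within a particle filter by attaching to each particle a running estimate of the gradient of its path log-density. The sketch below is the simple O(N) path-space variant (the paper analyses the more stable Poyiadjis et al. algorithm), and the linear-Gaussian model and all parameter values are illustrative assumptions:

```python
import numpy as np

def pf_score(y, phi, sx=1.0, sy=1.0, N=500, seed=0):
    """Bootstrap particle filter carrying, per particle, an estimate of the
    derivative of the log-density w.r.t. phi (Fisher's identity).
    Illustrative model:  x_t = phi * x_{t-1} + sx * v_t,  y_t = x_t + sy * w_t,
    with v_t, w_t ~ iid N(0, 1).  Returns the score estimate at the final time."""
    rng = np.random.default_rng(seed)
    x = sx * rng.standard_normal(N)     # particles at t = 0
    a = np.zeros(N)                     # per-particle d/dphi of path log-density
    grad = 0.0
    for t, yt in enumerate(y):
        if t > 0:
            xprev = x
            x = phi * xprev + sx * rng.standard_normal(N)
            # gradient of log N(x; phi * xprev, sx^2) w.r.t. phi
            a = a + (x - phi * xprev) * xprev / sx ** 2
        logw = -0.5 * ((yt - x) / sy) ** 2          # observation log-weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        grad = float(np.sum(w * a))                 # score estimate at time t
        idx = rng.choice(N, size=N, p=w)            # multinomial resampling
        x, a = x[idx], a[idx]
    return grad

g = pf_score([0.3, -0.1, 0.4], phi=0.8)
```

Such a score estimate can drive online maximum likelihood estimation via stochastic gradient ascent in phi; the variance behaviour of this path-space variant over long time horizons is exactly the issue the paper's theory addresses.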