279 results for Metric Linear Combinations

in the Cambridge University Engineering Department Publications Database


Relevance: 80.00%

Abstract:

Unbiased location- and scale-invariant `elemental' estimators for the GPD tail parameter are constructed. Each involves three log-spacings. The estimators are unbiased for finite sample sizes, even as small as N=3. It is shown that the elementals form a complete basis for unbiased location- and scale-invariant estimators constructed from linear combinations of log-spacings. Preliminary numerical evidence is presented which suggests that elemental combinations can be constructed which are consistent estimators of the tail parameter for samples drawn from the pure GPD family.
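The abstract does not give the explicit form of the elemental estimators, but its ingredients can be illustrated. The sketch below (an assumption-laden toy, not the paper's construction) draws an N=3 sample from the pure GPD via its inverse CDF and forms the three pairwise log-spacings available from three order statistics; differences of log-spacings are location- and scale-invariant, which is the invariance property the abstract highlights.

```python
import math
import random

def gpd_sample(n, xi, sigma=1.0, rng=random):
    """Draw n values from a Generalized Pareto distribution by inverting
    its CDF: Q(p) = sigma * ((1 - p)**(-xi) - 1) / xi  (for xi != 0)."""
    return [sigma * ((1.0 - rng.random()) ** (-xi) - 1.0) / xi
            for _ in range(n)]

random.seed(0)
x1, x2, x3 = sorted(gpd_sample(3, xi=0.5))

# The three pairwise log-spacings available from N = 3 order statistics
# (an illustrative choice -- the paper's exact combination is not stated
# in the abstract):
s21 = math.log(x2 - x1)
s32 = math.log(x3 - x2)
s31 = math.log(x3 - x1)

# Differences of log-spacings are location- and scale-invariant:
# shifting the sample leaves each spacing unchanged, and rescaling by c
# adds log(c) to every log-spacing, which cancels in the difference.
invariant = s31 - s21
```

Because x3 - x1 exceeds both x2 - x1 and x3 - x2, the log-spacing s31 is always the largest of the three for any sample.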

Relevance: 80.00%

Abstract:

In a companion paper (McRobie (2013), arxiv:1304.3918), a simple set of `elemental' estimators was presented for the Generalized Pareto tail parameter. Each elemental estimator: involves only three log-spacings; is absolutely unbiased for all values of the tail parameter; is location- and scale-invariant; and is valid for all sample sizes $N$, even as small as $N=3$. It was suggested that linear combinations of such elementals could then be used to construct efficient unbiased estimators. In this paper, the analogous mathematical approach is taken to the Generalised Extreme Value (GEV) distribution. The resulting elemental estimators, although not absolutely unbiased, are found to have very small bias, and may thus provide a useful basis for the construction of efficient estimators.
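The GEV case can be set up in the same way as the GPD one. The following sketch (illustrative only; the elemental combinations themselves are not given in the abstract) samples from a GEV distribution by inverting its CDF, yielding the order statistics from which log-spacings would be formed.

```python
import math
import random

def gev_sample(n, xi, mu=0.0, sigma=1.0, rng=random):
    """Draw n values from a Generalised Extreme Value distribution by
    inverting its CDF:
        F(x) = exp(-(1 + xi*(x - mu)/sigma)**(-1/xi)),
    giving Q(p) = mu + sigma * ((-log p)**(-xi) - 1) / xi  (xi != 0)."""
    return [mu + sigma * ((-math.log(rng.random())) ** (-xi) - 1.0) / xi
            for _ in range(n)]

random.seed(1)
xs = sorted(gev_sample(3, xi=0.2))  # the N = 3 order statistics

# As in the GPD case, three order statistics supply three pairwise
# spacings whose logarithms are the raw material for elemental-style
# estimators (exact combination not specified in the abstract).
log_spacings = [math.log(xs[j] - xs[i])
                for i in range(3) for j in range(i + 1, 3)]
```

The only structural difference from the GPD sketch is the quantile function; the invariance argument for differences of log-spacings carries over unchanged.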

Relevance: 30.00%

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input both data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
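The "optimization program over a matrix manifold" viewpoint can be shown in its simplest instance: one-component PCA is the maximization of the Rayleigh quotient w^T C w over the unit sphere (a matrix manifold with a single column). The sketch below, a minimal illustration rather than the survey's generic solver, uses power iteration on an assumed 2x2 covariance matrix; the optimum it reaches is the top eigenvector, with objective value equal to the largest eigenvalue.

```python
import math

# Assumed sample covariance matrix, for illustration only.
C = [[2.0, 0.6],
     [0.6, 1.0]]

def matvec(A, v):
    """Matrix-vector product for a list-of-rows matrix."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def normalize(v):
    """Project back onto the unit sphere (the manifold constraint)."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Power iteration: each step moves toward increasing w^T C w, then
# retracts onto the sphere; it converges to the leading eigenvector.
w = normalize([1.0, 1.0])
for _ in range(100):
    w = normalize(matvec(C, w))

# Objective value at the optimum: w^T C w = largest eigenvalue of C.
variance = sum(wi, *[]) if False else sum(
    wi * ci for wi, ci in zip(w, matvec(C, w)))
```

For this C, the eigenvalues are (3 ± sqrt(2.44))/2, so the maximized variance is about 2.281; richer objectives (LDA, CCA, etc.) replace the Rayleigh quotient while keeping the same manifold-constrained structure.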