Abstract:
Wall distances, though surprisingly expensive to compute, are still used in a range of key turbulence and peripheral physics models. Potentially economical, accuracy-improving distance algorithms based on differential equations are considered. These involve the elliptic Poisson equation and the hyperbolic-natured Eikonal equation. Numerical issues relating to the solution of the latter on non-orthogonal curvilinear grids are addressed. Extension of the Eikonal to a Hamilton-Jacobi (HJ) equation is discussed. Use of this extension to improve turbulence model accuracy and, along with the Eikonal, to enhance Detached Eddy Simulation (DES) techniques is considered. Application of the distance approaches is studied for various geometries. These include a plane channel flow with a wire at the centre, a wing-flap system, a jet with co-flow and a supersonic double-delta configuration. Although less accurate than Eikonal-based solutions, flow solutions based on the Poisson method are extremely close to those using a search procedure. For a moving-grid case the Poisson method is found especially efficient. Results show the Eikonal equation can be solved on highly stretched, non-orthogonal, curvilinear grids. A key accuracy aspect is that metrics must be upwinded in the direction of the propagating front. The HJ equation is found to have qualitative turbulence-model-improving properties. © 2003 by P. G. Tucker.
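The Poisson distance approach mentioned above can be illustrated in one dimension. The following is a minimal sketch, not the paper's implementation: solve the Poisson equation φ'' = −1 with φ = 0 at both walls, then recover the wall distance from the standard formula d = −|∇φ| + √(|∇φ|² + 2φ). The grid size `N` and channel width `L` are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1D sketch of a Poisson-based wall distance:
# solve phi'' = -1 with phi = 0 at both walls (Dirichlet),
# then d = -|grad phi| + sqrt(|grad phi|^2 + 2*phi).
N, L = 101, 1.0                          # grid points, channel width (assumed)
x = np.linspace(0.0, L, N)
h = x[1] - x[0]

# Second-order finite-difference Laplacian on interior points.
A = (np.diag(-2.0 * np.ones(N - 2))
     + np.diag(np.ones(N - 3), 1)
     + np.diag(np.ones(N - 3), -1)) / h**2
phi = np.zeros(N)
phi[1:-1] = np.linalg.solve(A, -np.ones(N - 2))

grad = np.gradient(phi, x)               # central-difference gradient
d = -np.abs(grad) + np.sqrt(grad**2 + 2.0 * phi)

# In 1D the formula recovers the exact distance min(x, L - x).
exact = np.minimum(x, L - x)
print(np.max(np.abs(d - exact)))
```

In this 1D channel the Poisson solution is the quadratic φ = x(L − x)/2, for which the recovered d matches the true wall distance exactly; in multiple dimensions the formula is only a near-wall approximation, which is consistent with the abstract's observation that the approach is less accurate than the Eikonal yet very close in practice.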
Abstract:
© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
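The "optimization over matrix manifolds" framing can be made concrete with the simplest member of the family. The sketch below (illustrative, not the paper's generic solver) casts PCA as maximizing tr(MᵀCM) subject to MᵀM = I, i.e. an optimization over the Stiefel manifold, whose closed-form optimum is the top-r eigenvectors of the covariance C. The data and the names `X`, `M`, `r` are assumptions for the example.

```python
import numpy as np

# PCA as a matrix-manifold optimization program (illustrative sketch):
#   maximize tr(M^T C M)  subject to  M^T M = I  (Stiefel manifold),
# solved in closed form by the top-r eigenvectors of C.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
X[:, 0] *= 5.0                        # inflate variance along one axis
X -= X.mean(axis=0)                   # center the data

C = X.T @ X / (len(X) - 1)            # sample covariance
evals, evecs = np.linalg.eigh(C)      # eigenvalues in ascending order
r = 2
M = evecs[:, ::-1][:, :r]             # top-r eigenvectors: the optimal M

Y = X @ M                             # low-dimensional projection
print(Y.shape)                        # (500, 2)
```

The survey's point is that swapping the trace objective for another cost (correlation, autocorrelation, class margin, ...) while keeping the same manifold constraint recovers the other methods in the list, which is what makes a single objective-agnostic solver possible.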