6 results for Radial Diffuser
at Massachusetts Institute of Technology
Abstract:
The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold so as to minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering and the weights are found using error backpropagation. We consider three machines, namely a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the US Postal Service database of handwritten digits, the SV machine achieves the highest test accuracy, followed by the hybrid approach. The SV approach is thus not only theoretically well-founded, but also superior in a practical application.
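The comparison is easy to reproduce in miniature. A minimal sketch follows, assuming scikit-learn and its small built-in digits set in place of the US Postal Service database, with a logistic readout standing in for error backpropagation; none of the settings below are the paper's.

    # Hedged sketch: classical RBF network (k-means centers + linear readout)
    # versus an SV machine with Gaussian kernel, on sklearn's small digits set.
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    # Classical approach: centers by k-means, Gaussian features, linear weights.
    km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(Xtr)
    gamma = 1.0 / X.shape[1]
    def rbf_features(A):
        d2 = ((A[:, None, :] - km.cluster_centers_[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    clf_rbf = LogisticRegression(max_iter=1000).fit(rbf_features(Xtr), ytr)

    # SV machine: centers (support vectors) and weights chosen by the SV algorithm.
    clf_svm = SVC(kernel="rbf", gamma=gamma).fit(Xtr, ytr)

    print("classical RBF :", clf_rbf.score(rbf_features(Xte), yte))
    print("SVM (Gaussian):", clf_svm.score(Xte, yte))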
Abstract:
We previously showed that regularization principles lead to approximation schemes, such as Radial Basis Functions, that are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions, and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal types.
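To make the regularization connection concrete, the standard formulation behind these networks can be written out; this is a sketch in assumed notation, not a quotation from the paper. One minimizes a functional of the form

    H[f] = \sum_{i=1}^{N} (y_i - f(x_i))^2 + \lambda \, \phi[f],

where \phi[f] is a smoothness stabilizer and \lambda > 0 trades data fit against smoothness. Under broad conditions the minimizer is a one-hidden-layer network,

    f(x) = \sum_{i=1}^{N} c_i \, G(x, x_i) + p(x),

where p lies in the (finite-dimensional) null space of the stabilizer and the basis function G is determined by \phi: a radially symmetric stabilizer gives G(x, x_i) = G(\|x - x_i\|), i.e., a Radial Basis Function expansion, while stabilizers that act coordinate-wise lead to the additive models mentioned above.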
Abstract:
This thesis attempts to quantify the amount of information needed to learn certain tasks. The tasks chosen vary from learning functions in a Sobolev space using radial basis function networks to learning grammars in the principles and parameters framework of modern linguistic theory. These problems are analyzed from the perspective of computational learning theory and certain unifying perspectives emerge.
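The flavor of the function-learning results can be indicated by a hedged sketch of the kind of trade-off involved; the form below follows related work by Niyogi and Girosi on radial basis function networks, and the precise constants and conditions are those of the thesis, not this sketch. With n radial basis units, l examples, and input dimension d, the generalization error splits into an approximation part and an estimation part,

    E[(f_0 - \hat{f}_{n,l})^2] \le O\left(\frac{1}{n}\right) + O\left(\sqrt{\frac{n d \ln(n l)}{l}}\right),

so the network size must grow with the sample size, but not too fast, for the two terms to balance.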
Abstract:
We propose a nonparametric method for estimating derivative financial asset pricing formulae using learning networks. To demonstrate feasibility, we first simulate Black-Scholes option prices and show that learning networks can recover the Black-Scholes formula from a two-year training set of daily options prices, and that the resulting network formula can be used successfully to both price and delta-hedge options out-of-sample. For comparison, we estimate models using four popular methods: ordinary least squares, radial basis functions, multilayer perceptrons, and projection pursuit. To illustrate practical relevance, we also apply our approach to S&P 500 futures options data from 1987 to 1991.
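The feasibility test is straightforward to sketch. The snippet below simulates Black-Scholes call prices and fits a small learning network to the normalized pricing map; the network type, sample sizes, and parameters are illustrative stand-ins rather than the paper's choices.

    # Hedged sketch: learn the Black-Scholes pricing map (S/K, T) -> C/K
    # from simulated prices, then price out-of-sample.
    import numpy as np
    from scipy.stats import norm
    from sklearn.neural_network import MLPRegressor

    def bs_call(S, K, T, r, sigma):
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    rng = np.random.default_rng(0)
    S = rng.uniform(30.0, 70.0, 4000)      # simulated underlier prices
    T = rng.uniform(0.05, 1.0, 4000)       # times to maturity in years
    K, r, sigma = 50.0, 0.05, 0.2          # illustrative contract/market terms
    X = np.column_stack([S / K, T])        # moneyness and maturity inputs
    y = bs_call(S, K, T, r, sigma) / K     # normalized call prices

    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000,
                       random_state=0).fit(X, y)
    S_new, T_new = np.array([45.0, 55.0]), np.array([0.5, 0.25])
    print("network:", K * net.predict(np.column_stack([S_new / K, T_new])))
    print("exact  :", bs_call(S_new, K, T_new, r, sigma))

A delta-hedge check in the same spirit would differentiate the fitted network with respect to S, e.g., by finite differences, and compare against the Black-Scholes delta.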
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function, and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints, in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem is very challenging because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Training problems arising in some real applications with large data sets therefore cannot be loaded into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to generate improved iterates and to establish stopping criteria for the algorithm. We review previous approaches and present results and important details of our implementation of the algorithm, using a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
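The decomposition idea can be illustrated with a minimal working-set sketch. The version below is SMO-style, i.e., a decomposition with working sets of size two selected by KKT violation; the paper's algorithm instead solves larger sub-problems with a second-order variant of the Reduced Gradient Method, and all names and tolerances here are illustrative.

    # Minimal SMO-style decomposition for the SVM dual (working sets of size 2).
    # The full kernel matrix is formed here for brevity; a real decomposition
    # computes only the rows it needs, which is the memory point made above.
    import numpy as np

    def train_svm(X, y, C=1.0, gamma=0.5, tol=1e-3, max_passes=20, seed=0):
        rng = np.random.default_rng(seed)
        n = len(y)                                   # labels y in {-1, +1}
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-gamma * d2)                      # Gaussian kernel matrix
        a, b, passes = np.zeros(n), 0.0, 0
        while passes < max_passes:
            changed = 0
            for i in range(n):
                Ei = (a * y) @ K[i] + b - y[i]       # KKT violation check
                if (y[i] * Ei < -tol and a[i] < C) or (y[i] * Ei > tol and a[i] > 0):
                    j = int(rng.integers(n - 1)); j += j >= i   # random j != i
                    Ej = (a * y) @ K[j] + b - y[j]
                    ai, aj = a[i], a[j]
                    if y[i] != y[j]:                 # box bounds for a[j]
                        L, H = max(0.0, aj - ai), min(C, C + aj - ai)
                    else:
                        L, H = max(0.0, ai + aj - C), min(C, ai + aj)
                    eta = 2 * K[i, j] - K[i, i] - K[j, j]
                    if L == H or eta >= 0:
                        continue
                    a[j] = np.clip(aj - y[j] * (Ei - Ej) / eta, L, H)
                    if abs(a[j] - aj) < 1e-7:
                        continue
                    a[i] = ai + y[i] * y[j] * (aj - a[j])
                    b1 = b - Ei - y[i]*(a[i]-ai)*K[i, i] - y[j]*(a[j]-aj)*K[i, j]
                    b2 = b - Ej - y[i]*(a[i]-ai)*K[i, j] - y[j]*(a[j]-aj)*K[j, j]
                    b = b1 if 0 < a[i] < C else b2 if 0 < a[j] < C else (b1 + b2) / 2
                    changed += 1
            passes = passes + 1 if changed == 0 else 0
        return a, b                                  # dual coefficients and bias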
Abstract:
We present an experimental study of the behavior of bubbles captured in a Taylor vortex. The gap between a rotating inner cylinder and a stationary outer cylinder is filled with a Newtonian mineral oil. Beyond a critical rotation speed (ω_c), Taylor vortices appear in this system. Small air bubbles are introduced into the gap through a needle connected to a syringe pump. These are then captured in the cores of the vortices (core bubbles) and in the outflow regions along the inner cylinder (wall bubbles). The flow field is measured with a two-dimensional particle image velocimetry (PIV) system, and the motion of the bubbles is monitored with a high-speed video camera. We find that, if the core bubbles are all of the same size, a bubble ring forms at the center of the vortex such that the bubbles are azimuthally uniformly distributed. There is a saturation number (N_s) of bubbles in the ring, such that adding one more bubble eventually leads to coalescence and a subsequent complicated evolution. N_s increases with increasing rotation speed and decreasing bubble size. For bubbles of non-uniform size, small bubbles and large bubbles in nearly the same orbit can be observed to cross due to their different circulating speeds. The wall bubbles, however, do not become uniformly distributed, but instead form short bubble chains that may eventually evolve into large bubbles. The motion of droplets and particles in a Taylor vortex was also investigated. As with bubbles, droplets and particles align into a ring structure at low rotation speeds, but the saturation number is much smaller. Moreover, at high rotation speeds, droplets and particles exhibit a characteristic periodic oscillation in the axial, radial, and tangential directions due to their inertia. In addition, experiments with non-spherical particles show that they behave rather similarly. This study provides a better understanding of particulate behavior in vortex flow structures.
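For orientation, the critical speed ω_c can be estimated from a linear stability criterion. A back-of-the-envelope sketch follows, assuming the narrow-gap form of the Taylor number, Ta = 2 ω² R₁ d³ / ν², with critical value Ta_c ≈ 1708; the apparatus values below are made up for illustration and are not those of this study.

    # Rough estimate of the rotation speed at which Taylor vortices appear.
    # Narrow-gap criterion: Ta = 2 * w**2 * R1 * d**3 / nu**2 exceeds ~1708.
    # R1, d, and nu are hypothetical, not the apparatus used in the study.
    import math

    R1 = 0.04       # inner cylinder radius [m] (hypothetical)
    d = 0.01        # gap width R2 - R1 [m] (hypothetical)
    nu = 2.0e-4     # kinematic viscosity of a mineral oil [m^2/s] (hypothetical)
    Ta_c = 1708.0   # critical Taylor number, narrow-gap limit

    w_c = math.sqrt(Ta_c * nu**2 / (2.0 * R1 * d**3))
    print(f"estimated onset speed: {w_c:.1f} rad/s "
          f"(~{w_c / (2 * math.pi):.1f} rev/s)")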