973 results for Vector spaces -- Problems, exercises, etc.
Abstract:
The overall objective of the study is to examine the problems and prospects of the tea plantation industry in Kerala. The specific objectives are to trace the historical evolution of the tea plantation industry in India, with special reference to Kerala, and to study the performance of the tea plantation industry in Kerala. In order to analyse the growth performance of the tea plantation industry in Kerala in a comparative perspective, growth rates for the neighbouring states of Karnataka and Tamil Nadu are estimated along with the national, South Indian and North Indian estimates. The tea plantation industry is a labour-intensive activity. Productivity has been low, primarily because of over-aging. In all the factories visited, only black tea is produced. Outmoded machines installed years ago are still used in the factories, which increases the cost of production. The major problems are the high cost of production and low price realization. The workers were found to be dissatisfied with their working conditions: the long journey to the workplace, the absence of resting places and latrine facilities, and the problems arising from dust in the factory. At a macro level, the tea plantation industry has been facing the adverse impacts of globalisation and trade liberalization. The only solution to this problem is to improve competitiveness in the production of raw leaf and the manufacturing of tea. The Government has a very important role to play in the specification of strict quality controls.
Abstract:
The present study was undertaken to evaluate the performance of Coir Vyavasaya Co-operative societies (CVCs) in Kerala. It was also intended to examine the extent of fulfillment of the objectives of the Co-operativisation Scheme and the socio-economic betterment of worker members. Further, the study was directed to find out the level of participation of members in the affairs of CVCs and to identify the major problems confronting the CVCs and the future prospects of the industry. The objectives of this study are to evaluate the performance of CVCs in Kerala with reference to the objectives of co-operativisation, the socio-economic background of the worker members of the CVCs in the state, the extent of members' participation, major problems, etc. The major findings of the study show that 84% of the CVCs surveyed were incurring losses, the long-term solvency position of the CVCs was very poor, ratio analysis shows an unhealthy state of affairs with respect to the short-term solvency position, and the operating efficiency of all categories of CVCs was found to be extremely poor. If CVCs are enabled to increase their quantity of production and thereby their volume of business, their losses can be reduced. If this is so, the societies can provide more days of employment to their worker members, which will help them to earn more wages and thereby improve their economic and social conditions.
Abstract:
In this study we combine the notions of fuzzy order and the fuzzy topology of Chang and define the fuzzy ordered fuzzy topological space. Its various properties are analysed. Products, quotients, unions and intersections of fuzzy orders are introduced. In addition, fuzzy order-preserving maps and various notions of fuzzy completeness are investigated. Finally, an attempt is made to study the notion of a generalized fuzzy ordered fuzzy topological space by considering a fuzzy order defined on a fuzzy subset.
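For orientation, the two ingredients combined in this study are usually stated as follows. These are standard definitions (Chang's fuzzy topology and one common formulation of a fuzzy order, following Zadeh), restated here in our own notation rather than quoted from the thesis; the thesis may use a variant formulation.

```latex
% Standard definitions, restated for orientation (notation is ours).
% A fuzzy set on X is a membership function A : X -> [0,1].
\[
  \text{Chang: a family } \tau \text{ of fuzzy sets on } X \text{ is a fuzzy topology if }
  0,\,1 \in \tau, \quad
  A \wedge B \in \tau \ \text{for } A, B \in \tau, \quad
  \bigvee_{i \in I} A_i \in \tau \ \text{for } \{A_i\}_{i \in I} \subseteq \tau .
\]
\[
  \text{Zadeh: a fuzzy relation } R : X \times X \to [0,1] \text{ is a fuzzy order if }
  R(x,x) = 1, \quad
  x \neq y \ \Rightarrow\ \min\!\bigl(R(x,y),\, R(y,x)\bigr) = 0, \quad
  R(x,z) \ \ge\ \sup_{y \in X} \min\!\bigl(R(x,y),\, R(y,z)\bigr) .
\]
```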
Abstract:
Department of Mathematics, Cochin University of Science and Technology.
Abstract:
In this thesis we investigate some problems in set-theoretic topology related to the concepts of the group of homeomorphisms and order. Many of the problems considered are directly or indirectly related to the group of homeomorphisms of a topological space onto itself. Order-theoretic methods are used extensively. Chapter 1 deals with the group of homeomorphisms. This concept has been investigated by several authors for many years and from different angles. It was observed that non-homeomorphic topological spaces can have isomorphic groups of homeomorphisms. Many problems relating the topological properties of a space to the algebraic properties of its group of homeomorphisms have been investigated. The groups of isomorphisms of several algebraic, geometric, order-theoretic and topological structures have also been investigated. The related concept of the semigroup of continuous functions of a topological space has also received attention.
Abstract:
Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples -- in particular the regression problem of approximating a multivariate function from sparse data. We present both formulations in a unified framework, namely in the context of Vapnik's theory of statistical learning which provides a general foundation for the learning problem, combining functional analysis and statistics.
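A minimal sketch of the two formulations side by side, assuming scikit-learn and NumPy are available (the toy data, parameter values and the use of KernelRidge/SVR are our choices, not the paper's): kernel ridge regression plays the role of a regularization network with the square loss, while SVR is Support Vector regression with the epsilon-insensitive loss; both approximate a function in an RBF-kernel RKHS from sparse data.

```python
# Sketch: fitting the same sparse, noisy 1-D regression data with
# (a) a regularization network (kernel ridge regression, square loss) and
# (b) Support Vector regression (epsilon-insensitive loss).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(30, 1))                 # sparse sample locations
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(30)   # noisy targets

# (a) Regularization network: minimize sum_i (y_i - f(x_i))^2 + lambda ||f||_K^2
rn = KernelRidge(kernel="rbf", gamma=1.0, alpha=0.1).fit(X, y)

# (b) SVM regression: minimize C sum_i |y_i - f(x_i)|_eps + 1/2 ||f||_K^2
svm = SVR(kernel="rbf", gamma=1.0, C=10.0, epsilon=0.05).fit(X, y)

x0 = np.array([[0.0]])
print("regularization network at x=0:", rn.predict(x0))
print("SVM regression at x=0:        ", svm.predict(x0))
```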
Abstract:
This paper presents a computation of the $V_\gamma$ dimension for regression in bounded subspaces of Reproducing Kernel Hilbert Spaces (RKHS) for the Support Vector Machine (SVM) regression $\epsilon$-insensitive loss function, and for general $L_p$ loss functions. Finiteness of the $V_\gamma$ dimension is shown, which also proves uniform convergence in probability for regression machines in RKHS subspaces that use the $L_\epsilon$ or general $L_p$ loss functions. The paper also presents a novel proof of this result for the case that a bias is added to the functions in the RKHS.
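For reference, the loss functions referred to in the abstract are usually written as below; this is the standard formulation, in our notation, not text from the paper.

```latex
% epsilon-insensitive (SVM regression) loss and general L_p loss (standard form).
\[
  L_\epsilon\bigl(y, f(x)\bigr) \;=\; \lvert y - f(x) \rvert_\epsilon
  \;=\; \max\bigl(0,\ \lvert y - f(x) \rvert - \epsilon\bigr),
  \qquad
  L_p\bigl(y, f(x)\bigr) \;=\; \lvert y - f(x) \rvert^{p}.
\]
```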
Abstract:
In the first part of this paper we show a similarity between the Structural Risk Minimization (SRM) principle (Vapnik, 1982) and the idea of Sparse Approximation, as defined in Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
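To make the comparison concrete, the two objective functions involved can be written in the following generic form; this is a standard presentation in our own notation, with the kernel expansion $f(x) = \sum_j a_j K(x, x_j)$ assumed, and the paper's exact conditions for equivalence are not reproduced here.

```latex
% SVM regression (epsilon-insensitive loss, RKHS norm penalty) versus
% Basis Pursuit De-Noising (quadratic data term, L^1 penalty on coefficients).
\[
  \text{SVM:}\quad
  \min_{f \in \mathcal{H}_K}\;
    C \sum_{i=1}^{\ell} \lvert y_i - f(x_i) \rvert_\epsilon
    + \tfrac{1}{2}\,\lVert f \rVert_{K}^{2},
  \qquad
  \text{BPDN:}\quad
  \min_{a \in \mathbb{R}^{\ell}}\;
    \tfrac{1}{2} \sum_{i=1}^{\ell}
      \Bigl( y_i - \sum_{j=1}^{\ell} a_j K(x_i, x_j) \Bigr)^{2}
    + \lambda \lVert a \rVert_{1}.
\]
```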
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and the geometrical insight they provide are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints, in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand the problem is very challenging, because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Therefore, training problems arising in some real applications with large data sets are impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to generate improved iterates and to establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm, which uses a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
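As an illustration of the statement that training an SVM amounts to a quadratic program with box and linear constraints (one variable per data point), here is a small sketch that solves the dual directly with a general-purpose solver on a toy problem; the data, the use of SciPy's SLSQP method and all names are our own choices and not the decomposition algorithm of the paper.

```python
# Sketch: train a linear SVM on a toy 2-D data set by solving its dual QP.
# Variables: one alpha_i per point; constraints: 0 <= alpha_i <= C and
# sum_i alpha_i y_i = 0. Real large-scale training would use decomposition
# instead of forming the dense n x n matrix Q.
import numpy as np
from scipy.optimize import minimize

X = np.array([[2.0, 2.0], [2.5, 1.5], [3.0, 3.0],
              [0.0, 0.0], [0.5, -0.5], [-1.0, 0.5]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
C = 10.0
n = len(y)

K = X @ X.T                                   # linear kernel (dense)
Q = (y[:, None] * y[None, :]) * K

def neg_dual(alpha):                          # negative dual objective to minimize
    return 0.5 * alpha @ Q @ alpha - alpha.sum()

def neg_dual_grad(alpha):
    return Q @ alpha - np.ones(n)

res = minimize(neg_dual, x0=np.zeros(n), jac=neg_dual_grad,
               bounds=[(0.0, C)] * n,
               constraints=[{"type": "eq", "fun": lambda a: a @ y}],
               method="SLSQP")

alpha = res.x
w = (alpha * y) @ X                           # primal weight vector
non_bound = (alpha > 1e-6) & (alpha < C - 1e-6)
b = np.mean(y[non_bound] - X[non_bound] @ w)  # threshold from non-bound multipliers
print("alpha:", np.round(alpha, 3))
print("predictions:", np.sign(X @ w + b))
```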
Abstract:
When training Support Vector Machines (SVMs) over non-separable data sets, one sets the threshold $b$ using any dual cost coefficient that is strictly between the bounds of $0$ and $C$. We show that there exist SVM training problems with dual optimal solutions with all coefficients at bounds, but that all such problems are degenerate in the sense that the "optimal separating hyperplane" is given by $\mathbf{w} = \mathbf{0}$, and the resulting (degenerate) SVM will classify all future points identically (to the class that supplies more training data). We also derive necessary and sufficient conditions on the input data for this to occur. Finally, we show that an SVM training problem can always be made degenerate by the addition of a single data point belonging to a certain unbounded polyhedron, which we characterize in terms of its extreme points and rays.
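For context, the dual problem and the usual rule for obtaining the threshold $b$ from a coefficient strictly between the bounds are recalled below; this is the standard soft-margin formulation in our notation, not material from the paper.

```latex
% Soft-margin SVM dual and the threshold rule used when some 0 < alpha_i < C.
\[
  \max_{\alpha}\ \sum_{i=1}^{\ell} \alpha_i
    - \tfrac{1}{2} \sum_{i,j=1}^{\ell} \alpha_i \alpha_j\, y_i y_j\, K(x_i, x_j)
  \quad \text{s.t.} \quad
  0 \le \alpha_i \le C, \qquad \sum_{i=1}^{\ell} \alpha_i y_i = 0 .
\]
\[
  \text{For any } i \text{ with } 0 < \alpha_i < C:\qquad
  b \;=\; y_i - \sum_{j=1}^{\ell} \alpha_j y_j\, K(x_j, x_i),
  \qquad
  \mathbf{w} \;=\; \sum_{j=1}^{\ell} \alpha_j y_j\, x_j \ \ \text{(linear kernel)} .
\]
```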
Abstract:
Exercises and solutions in PDF
Abstract:
Exercises and solutions in PDF