821 results for Electronic Gaming Machines


Relevance:

20.00%

Publisher:

Abstract:

Regularization Networks and Support Vector Machines are techniques for solving certain problems of learning from examples -- in particular, the regression problem of approximating a multivariate function from sparse data. We present both formulations in a unified framework, namely in the context of Vapnik's theory of statistical learning, which provides a general foundation for the learning problem, combining functional analysis and statistics.
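For orientation, the unified framework the abstract refers to is usually presented as the minimization of a regularization functional; the display below is a standard-notation sketch (with $V$ a loss function, $K$ a kernel, and $\lambda > 0$ a regularization parameter), not an equation quoted from the paper:

$$
\min_{f} \; \frac{1}{\ell} \sum_{i=1}^{\ell} V\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \, \|f\|_K^2
$$

Choosing the square loss $V(y, f(x)) = (y - f(x))^2$ yields Regularization Networks, while the $\epsilon$-insensitive loss $V(y, f(x)) = |y - f(x)|_\epsilon$ yields Support Vector regression.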

Relevance:

20.00%

Publisher:

Abstract:

In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined in Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation, which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
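To make the claimed equivalence concrete: Basis Pursuit De-Noising, in notation along the lines of Chen, Donoho and Saunders (1995), solves

$$
\min_{a} \; \tfrac{1}{2} \, \| y - \Phi a \|_2^2 \;+\; \lambda \, \| a \|_1 ,
$$

where $\Phi$ is a dictionary of basis functions, while SVM regression leads to a quadratic program over dual coefficients bounded by the regularization parameter. (The display is a standard-notation sketch for orientation, not a formula taken from this paper; the precise conditions under which the two quadratic programs coincide are the subject of the paper itself.)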

Relevance:

20.00%

Publisher:

Abstract:

The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints, in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand the problem is very challenging, because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Training problems arising in some real applications with large data sets are therefore impossible to load into memory, and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to generate improved iterates and to establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm, which uses a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
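The decomposition idea (iterate: pick a small working set, solve the corresponding sub-problem exactly, and use the optimality conditions of the full problem to pick the next set and decide when to stop) can be illustrated in a few lines of code. The sketch below is an SMO-style stand-in with working sets of size two solved in closed form, not the paper's algorithm, which solves larger sub-problems with a Reduced Gradient solver; C, tol, max_passes and the random pair selection are all illustrative choices.

```python
import numpy as np

def train_svm(X, y, C=1.0, tol=1e-3, max_passes=50):
    """Train a linear soft-margin SVM; y must take values in {-1, +1}."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n = len(y)
    K = X @ X.T                      # dense Gram matrix: the memory bottleneck
    alpha, b, passes = np.zeros(n), 0.0, 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]          # error on point i
            # Work on i only if it violates the KKT optimality conditions.
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai, aj = alpha[i], alpha[j]
                # Bounds on alpha[j] implied by 0 <= alpha <= C, sum(alpha*y) = 0.
                if y[i] != y[j]:
                    L, H = max(0.0, aj - ai), min(C, C + aj - ai)
                else:
                    L, H = max(0.0, ai + aj - C), min(C, ai + aj)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(aj - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj) < 1e-12:
                    continue
                alpha[i] = ai + y[i] * y[j] * (aj - alpha[j])
                # Recover the threshold b from the KKT conditions of the pair.
                b1 = b - Ei - y[i] * (alpha[i] - ai) * K[i, i] - y[j] * (alpha[j] - aj) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai) * K[i, j] - y[j] * (alpha[j] - aj) * K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b                  # w = (alpha * y) @ X for the linear kernel
```

Even this toy version exposes the memory issue the abstract raises: the Gram matrix K grows with the square of the number of data points, which is exactly what practical decomposition methods avoid by never materializing it in full.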

Relevance:

20.00%

Publisher:

Abstract:

When training Support Vector Machines (SVMs) over non-separable data sets, one sets the threshold $b$ using any dual cost coefficient that is strictly between the bounds of $0$ and $C$. We show that there exist SVM training problems with dual optimal solutions with all coefficients at bounds, but that all such problems are degenerate in the sense that the "optimal separating hyperplane" is given by $\mathbf{w} = \mathbf{0}$, and the resulting (degenerate) SVM will classify all future points identically (to the class that supplies more training data). We also derive necessary and sufficient conditions on the input data for this to occur. Finally, we show that an SVM training problem can always be made degenerate by the addition of a single data point belonging to a certain unbounded polyhedron, which we characterize in terms of its extreme points and rays.
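For orientation, in the standard soft-margin dual (written here in common notation, not quoted from the paper) the training problem is

$$
\max_{\alpha} \; \sum_{i} \alpha_i \;-\; \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j \, y_i y_j \, x_i^{\top} x_j
\qquad \text{s.t.} \quad \sum_i \alpha_i y_i = 0, \quad 0 \le \alpha_i \le C ,
$$

with $\mathbf{w} = \sum_i \alpha_i y_i x_i$. The degenerate case discussed above is the one in which every $\alpha_i$ sits at a bound ($0$ or $C$) and the weighted sum collapses to $\mathbf{w} = \mathbf{0}$, so no coefficient strictly inside $(0, C)$ is available to fix $b$.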

Relevance:

20.00%

Publisher:

Abstract:

The electronics industry is encountering thermal challenges and opportunities at length scales comparable to, or much less than, one micrometer. Examples include nanoscale phonon hotspots in transistors and the increasing temperature rise in on-chip interconnects. Millimeter-scale hotspots on microprocessors, resulting from varying rates of power consumption, are being addressed using two-phase microchannel heat sinks. Nanoscale thermal data storage technology has also received much attention recently. This paper provides an overview of these topics, with a focus on related research at Stanford University.

Relevance:

20.00%

Publisher:

Abstract:

Each player in the financial industry (each bank, stock exchange, government agency, or insurance company) operates its own financial information system or systems. By its very nature, financial information, like the money that it represents, changes hands. The interoperation of financial information systems is therefore the cornerstone of the financial services they support. E-services frameworks such as web services are an unprecedented opportunity for the flexible interoperation of financial systems. Naturally, the critical economic role and the complexity of financial information have led to the development of various standards. Yet standards alone are not a panacea: different groups of players use different standards, or different interpretations of the same standard. We believe that the solution lies in the convergence of flexible E-services such as web services with the semantically rich meta-data promised by the semantic Web; a mediation architecture can then be used for the documentation, identification, and resolution of semantic conflicts arising from the interoperation of heterogeneous financial services. In this paper we illustrate the nature of the problem in the Electronic Bill Presentment and Payment (EBPP) industry and the viability of the solution we propose. We describe and analyze the integration of services using four different formats: the IFX, OFX and SWIFT standards, and an example proprietary format. To accomplish this integration we use the COntext INterchange (COIN) framework. The COIN architecture leverages a model of sources' and receivers' contexts in reference to a rich domain model, or ontology, for the description and resolution of semantic heterogeneity.
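To make the mediation idea concrete, here is a deliberately tiny sketch of context-based resolution of a semantic conflict on a single amount field. Everything in it (the contexts, scale factors, exchange rates, and function names) is invented for illustration; it is not the COIN implementation and does not reflect the actual IFX, OFX or SWIFT schemas.

```python
# Hypothetical context mediation: each source declares its context (here just
# currency and scale factor for a bill's "amount" field), and a mediator
# converts values into the receiver's context via a canonical form.

RATES_TO_USD = {"USD": 1.0, "EUR": 1.1, "GBP": 1.3}   # illustrative rates

def mediate_amount(value, source_ctx, receiver_ctx):
    """Resolve currency and scale conflicts on a numeric amount field."""
    # Lift to a canonical form: ones-scale, USD.
    canonical = value * source_ctx["scale"] * RATES_TO_USD[source_ctx["currency"]]
    # Lower into the receiver's context.
    return canonical / RATES_TO_USD[receiver_ctx["currency"]] / receiver_ctx["scale"]

ifx_like_ctx   = {"currency": "USD", "scale": 1}      # amounts in ones
swift_like_ctx = {"currency": "EUR", "scale": 1000}   # amounts in thousands

print(mediate_amount(2.5, swift_like_ctx, ifx_like_ctx))  # 2.5k EUR -> 2750.0 USD
```

The point of the mediation architecture is that neither source nor receiver is rewritten: each declares its context once, and conversions are derived from the shared domain model.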

Relevance:

20.00%

Publisher:

Abstract:

In the context of digital business ecosystems, small organizations cooperate with one another in order to achieve common goals or to offer new services that expand their markets. There are different approaches to such cooperation models, such as virtual enterprises, virtual organizations, or dynamic electronic institutions, whose lifecycles share a common dissolution phase. However, this phase has not been studied in depth in the current literature and it lacks formalization. In this paper a first approach for achieving and managing the dissolution phase is proposed, together with a CBR process to support it in a multi-agent system.
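As a rough illustration of how a CBR process could support the dissolution phase, the sketch below runs a bare retrieve-reuse-retain cycle over past dissolution cases (the revise step is omitted). The case representation and similarity measure are invented for illustration; the abstract does not specify the paper's actual ones.

```python
from dataclasses import dataclass, field

@dataclass
class DissolutionCase:
    features: dict                               # e.g. {"members": 5, "shared_assets": 3}
    plan: list = field(default_factory=list)     # ordered dissolution steps

def similarity(a: dict, b: dict) -> float:
    """Fraction of shared feature keys with matching values (toy measure)."""
    keys = set(a) & set(b)
    return sum(1.0 for k in keys if a[k] == b[k]) / max(len(keys), 1)

def retrieve(case_base, query):     # Retrieve: closest past dissolution
    return max(case_base, key=lambda c: similarity(c.features, query))

def reuse(case, query):             # Reuse: adapt the retrieved plan
    return list(case.plan)          # (trivial adaptation in this sketch)

def retain(case_base, query, plan): # Retain: store the newly solved case
    case_base.append(DissolutionCase(query, plan))

case_base = [DissolutionCase({"members": 3, "shared_assets": 1},
                             ["settle accounts", "notify registry"])]
query = {"members": 3, "shared_assets": 2}
plan = reuse(retrieve(case_base, query), query)
retain(case_base, query, plan)
```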

Relevance:

20.00%

Publisher:

Abstract:

The TechDis Accessibility Essentials Guide for Reading has been divided into the following three sections: Font colours and styles; Enlarging text; and Navigating documents. These guides have been designed to give practical, step-by-step information to enable anyone reading electronic material to amend its look and feel into a style which suits them, their audience, or the context in which it is used.

Relevance:

20.00%

Publisher:

Abstract:

In 2001, before the term "e-portfolio" was common parlance, there was a perceived need to enable teachers in training to save, store, present, and archive their electronically-based work so that it could be assessed by tutors. A Teacher Training Agency (TTA) grant was used to design and implement an online system, called the Electronic Portfolio System (EPS), for trainee teachers to save evidence of their activities. Over the years its use and value have changed and, through the support of the Training and Development Agency for Schools (TDA), the system will continue to develop. The features of the system have grown organically, both through technology changes and through tutors and mentors identifying affordances. This presentation will identify these principles and how they guide the development process. It will also give an opportunity to elucidate the next stages in e-portfolio developments.

Relevance:

20.00%

Publisher:

Abstract:

Virginia Tech has been depositing e-theses for over a decade and is a leader in the field. The University of Southampton introduced e-thesis deposit in the academic session 2008-09.

Relevance:

20.00%

Publisher:

Abstract:

Exercises and solutions for a second-year engineering maths course. Diagrams for the questions are collected in the support.zip file as .eps files.

Relevance:

20.00%

Publisher:

Abstract:

Has a mixture of factual information (conventions on how a report should be structured) and motivational information on improving writing and communication skills.

Relevance:

20.00%

Publisher:

Abstract:

Slides for a class which explores professional, ethical, and legal issues surrounding the use, storage, and transmission of electronic communications and data. Follows on from a previous class which looked in greater detail at the Data Protection Act.

Relevance:

20.00%

Publisher:

Abstract:

The use of electronic documents is constantly growing, and an ad-hoc eCertificate that manages access to private information has become necessary. This paper presents a protocol for the management of electronic identities (eIDs), meant as a substitute for paper-based IDs, in a mobile environment with a user-centric approach. Mobile devices have been chosen because they provide mobility, personal use, and substantial computational capability. The inherent user-centricity also allows the user to personally manage the ID information and to display only what is required. The chosen path to develop the protocol is to migrate the existing eCert technologies implemented by the Learning Societies Laboratory in Southampton. By comparing this protocol with an analysis of the eID problem domain, a new solution has been derived which is compatible with both systems without loss of features.
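The "display only what is required" behaviour described above amounts to selective disclosure of eID attributes. The sketch below is a hypothetical illustration of that flow on the holder's device: a verifier requests specific fields, the user approves or declines each, and only the approved subset is released. The field names and the hash-based binding are stand-ins, not the eCert protocol.

```python
import hashlib, json

# Invented eID record; a real eID would be issued and signed by an authority.
EID_RECORD = {"name": "A. Holder", "date_of_birth": "1990-01-01",
              "nationality": "UK", "address": "elided"}

def disclosure_request(fields):
    """A verifier asks for a specific subset of attributes."""
    return {"requested": fields}

def disclose(record, request, user_approves):
    """Release only the requested fields the user explicitly approves."""
    released = {f: record[f] for f in request["requested"] if user_approves(f)}
    # Stand-in for a signature binding the released subset to the full eID.
    digest = hashlib.sha256(json.dumps(released, sort_keys=True).encode()).hexdigest()
    return {"attributes": released, "binding": digest}

response = disclose(EID_RECORD, disclosure_request(["name", "nationality"]),
                    user_approves=lambda f: f != "address")
```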