906 results for New Space Vector Modulation
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and the geometric insight they provide are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem becomes very challenging, because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Therefore, training problems arising in some real applications with large data sets are impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to generate improved iterates and to establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm using a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
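For reference, the quadratic programme mentioned above can be sketched in its standard dual form (a textbook formulation, not quoted from the abstract; the paper's exact variant may differ), with one variable per data point, a single linear equality constraint, and box constraints:

    \max_{\alpha} \; \sum_{i=1}^{\ell} \alpha_i - \frac{1}{2} \sum_{i=1}^{\ell} \sum_{j=1}^{\ell} \alpha_i \alpha_j y_i y_j K(x_i, x_j)
    \quad \text{subject to} \quad \sum_{i=1}^{\ell} \alpha_i y_i = 0, \qquad 0 \le \alpha_i \le C, \; i = 1, \ldots, \ell.

The \ell x \ell kernel matrix K(x_i, x_j) is dense, which is why memory grows quadratically with the number of data points and why decomposition methods repeatedly solve the problem restricted to a small working set of variables while the remaining variables are held fixed.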
Abstract:
The use of orthonormal coordinates in the simplex and, particularly, balance coordinates has suggested the use of a dendrogram for the exploratory analysis of compositional data. The dendrogram is based on a sequential binary partition of a compositional vector into groups of parts. At each step of a partition, one group of parts is divided into two new groups, and a balancing axis in the simplex between the two groups is defined. The set of balancing axes constitutes an orthonormal basis, and the projections of the sample on them are orthogonal coordinates. They can be represented in a dendrogram-like graph showing: (a) the way of grouping parts of the compositional vector; (b) the explanatory role of each subcomposition generated in the partition process; (c) the decomposition of the total variance into balance components associated with each binary partition; (d) a box-plot of each balance. This representation helps to interpret balance coordinates, to identify the most explanatory coordinates, and to describe the whole sample in a single diagram independently of the number of parts of the sample.
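For reference (the standard balance definition from compositional data analysis, not quoted from the abstract), the coordinate associated with one step of the sequential binary partition, which splits a group of parts into a subgroup R with r parts and a subgroup S with s parts, is the normalised log-ratio of the geometric means of the two groups:

    b = \sqrt{\frac{rs}{r+s}} \, \ln \frac{\left( \prod_{i \in R} x_i \right)^{1/r}}{\left( \prod_{j \in S} x_j \right)^{1/s}} .

The balances obtained over all steps of the partition form the orthonormal coordinates displayed in the dendrogram.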
Abstract:
The use of the perturbation and power transformation operations permits linear processes in the simplex to be investigated as in a vector space. When the investigated geochemical processes can be constrained by the use of a well-known starting point, the eigenvectors of the covariance matrix of a non-centred principal component analysis allow compositional changes to be modelled relative to a reference point. The results obtained for the chemistry of water collected in the River Arno (central-northern Italy) have opened new perspectives for considering relative changes of the analysed variables and for hypothesising the relative effect of the different physical-chemical processes at work, thus laying the basis for quantitative modelling.
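For reference (standard Aitchison geometry on the simplex, not quoted from the abstract), the two operations rely on the closure operation C, which rescales a positive vector to a constant sum; perturbation plays the role of vector addition and powering that of scalar multiplication:

    x \oplus y = \mathcal{C}(x_1 y_1, \ldots, x_D y_D), \qquad \alpha \odot x = \mathcal{C}(x_1^{\alpha}, \ldots, x_D^{\alpha}).

A linear process in this geometry is a compositional line x(t) = x_0 \oplus (t \odot p) through the reference composition x_0, which is what makes a non-centred principal component analysis about a known starting point meaningful.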
Abstract:
The objective of Traffic Engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization in MPLS networks, few of them have focused on LSR label space reduction. This letter studies Asymmetric Merged Tunneling (AMT) as a new method for reducing the label space in MPLS networks. The proposed method may be regarded as a combination of label merging (proposed in the MPLS architecture) and asymmetric tunneling (proposed recently in our previous works). Finally, simulation results comparing AMT with both of its ancestors are presented. They show a great improvement in the label space reduction factor.
Abstract:
This is a selection of University of Southampton logos in both vector (SVG) and raster (PNG) formats. These are suitable for use on the web or in small documents and posters. You can open the SVG files using Inkscape (http://inkscape.org/download/?lang=en) and edit them directly. The University logo should not be modified, and attention should be paid to the branding guidelines found here: http://www.edshare.soton.ac.uk/10481 You must always leave a space the width of a capital O in Southampton on all 4 edges of the logo. The negative space makes it appear more prominently on the page.
Abstract:
These are a range of logos created in the same way as those of Mr Patrick McSweeney (http://www.edshare.soton.ac.uk/11157). The logo has been extracted from PDF documents and is smoother and more accurate to the original logo design. Many thanks to McSweeney for publishing the logo in SVG originally; I struggled to find it anywhere else. Files are in Inkscape SVG, PDF and PNG formats. From Mr Patrick McSweeney: "This is a selection of University of Southampton logos in both vector (SVG) and raster (PNG) formats. These are suitable for use on the web or in small documents and posters. You can open the SVG files using Inkscape (http://inkscape.org/download/?lang=en) and edit them directly. The University logo should not be modified, and attention should be paid to the branding guidelines found here: http://www.edshare.soton.ac.uk/10481 You must always leave a space the width of a capital O in Southampton on all 4 edges of the logo. The negative space makes it appear more prominently on the page."
Abstract:
The article aims to make visible some nuances of the 17th century in Spain and New Granada, with emphasis on the articulations and tensions that made up this cultural and social space, through an analysis of the letrados and their position in the Hispanic cultural field of the 16th and 17th centuries. This article also discusses the traditional thesis about cultural isolation and obscurantism in the American colonies before the eighteenth century through an analysis of the circulation of books and knowledge between mainland Spain and its colonies, and of the heterogeneous character of the lawyers who affected the symbolic monopoly of the Catholic Church.
Abstract:
A conceptually new approach is introduced for the decomposition of the molecular energy calculated at the density functional theory (DFT) level into a sum of one- and two-atomic energy components, and it is realized in the "fuzzy atoms" framework. (Fuzzy atoms means that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components, and gives values unexpectedly close to those calculated with exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.
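For context (the generic form of fuzzy-atom energy decompositions, not quoted from the abstract), the division of space is encoded by non-negative atomic weight functions w_A(\mathbf{r}) that sum to one at every point, and the molecular energy is written as a sum of one- and two-atomic terms obtained by inserting these weights into the energy integrals:

    \sum_{A} w_A(\mathbf{r}) = 1 \quad \forall\, \mathbf{r}, \qquad E = \sum_{A} E_A + \sum_{A < B} E_{AB}.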
Abstract:
The front speed of the Neolithic (farmer) spread in Europe decreased as it reached northern latitudes, where the Mesolithic (hunter-gatherer) population density was higher. Here, we describe a reaction-diffusion model with (i) an anisotropic dispersion kernel depending on the Mesolithic population density gradient and (ii) a modified population growth equation. Both effects are related to the space available for the Neolithic population. The model is able to explain the slowdown of the Neolithic front as observed from archaeological data.
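For context (a textbook baseline, not a result from the abstract), the simplest reaction-diffusion description of such a range expansion, Fisher's equation with diffusion coefficient D and initial growth rate a, predicts a constant front speed

    v = 2 \sqrt{a D},

so reproducing a latitude-dependent slowdown requires letting dispersal and/or growth depend on the local Mesolithic density, which is what the anisotropic kernel and the modified growth equation above introduce.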
Abstract:
As part of its Data User Element programme, the European Space Agency funded the GlobMODEL project, which aimed at investigating the scientific, technical, and organizational issues associated with the use and exploitation of remotely sensed observations, particularly from new sounders. A pilot study was performed as a "demonstrator" of the GlobMODEL idea, based on the use of new data, with a strong European heritage, not yet assimilated operationally. Two parallel assimilation experiments were performed, using either total column ozone or ozone profiles retrieved at the Royal Netherlands Meteorological Institute (KNMI) from the Ozone Monitoring Instrument (OMI). In both cases, the impact of assimilating OMI data in addition to the total ozone columns from the SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) on the European Centre for Medium-Range Weather Forecasts (ECMWF) ozone analyses was assessed by means of independent measurements. We found that the impact of OMI total columns is mainly limited to the region between 20 and 80 hPa, and is particularly important at high latitudes in the Southern Hemisphere, where stratospheric ozone transport and chemical depletion are generally difficult to model with accuracy. Furthermore, the assimilation experiments carried out in this work suggest that OMI DOAS (Differential Optical Absorption Spectroscopy) total ozone columns are on average larger than SCIAMACHY total columns by up to 3 DU, while OMI total columns derived from OMI ozone profiles are on average about 8 DU larger than SCIAMACHY total columns. At the same time, the demonstrator brought to light a number of issues related to the assimilation of atmospheric composition profiles, such as the shortcomings that arise when the vertical resolution of the instrument is not properly accounted for in the assimilation. The GlobMODEL demonstrator accelerated the scientific and operational utilization of new observations, and its results prompted ECMWF to start the operational assimilation of OMI total column ozone data.
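One standard way to state the vertical-resolution issue mentioned above (a general remark from retrieval theory, not quoted from the abstract) is through the averaging kernel: a retrieved profile is a smoothed view of the true atmosphere,

    \hat{x} = x_a + \mathbf{A}\,(x_{\mathrm{true}} - x_a) + \epsilon,

with prior profile x_a, averaging kernel matrix \mathbf{A} and retrieval noise \epsilon, so assimilating \hat{x} as if it were a set of independent point measurements overstates the vertical information the instrument actually provides.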
Abstract:
Urban regeneration programmes in the UK over the past 20 years have increasingly focused on attracting investors, middle-class shoppers and visitors by transforming places and creating new consumption spaces. Ensuring that places are safe and are seen to be safe has taken on greater salience as these flows of income are easily disrupted by changing perceptions of fear and the threat of crime. At the same time, new technologies and policing strategies and tactics have been adopted in a number of regeneration areas which seek to establish control over these new urban spaces. Policing space is increasingly about controlling human actions through design, surveillance technologies and codes of conduct and enforcement. Regeneration agencies and the police now work in partnerships to develop their strategies. At its most extreme, this can lead to the creation of zero-tolerance, or what Smith terms 'revanchist', measures aimed at particular social groups in an effort to sanitise space in the interests of capital accumulation. This paper, drawing on an examination of regeneration practices and processes in one of the UK's fastest-growing urban areas, Reading in Berkshire, assesses policing strategies and tactics in the wake of a major regeneration programme. It documents and discusses the discourses of regeneration that have developed in the town and the ways in which new urban spaces have been secured. It argues that, whilst security concerns have become embedded in institutional discourses and practices, the implementation of security measures has been mediated, in part, by the local socio-political relations in and through which they have been developed.
Abstract:
Much of the writing on urban regeneration in the UK has been focused on the types of urban spaces that have been created in city centres. Less has been written about the issue of when the benefits of regeneration could and should be delivered to a range of different interests, and the different time frames that exist in any development area. Different perceptions of time have been reflected in dominant development philosophies in the UK and elsewhere. The trickle-down agendas of the 1980s, for example, were criticised for their focus on the short-term time frames and needs of developers, often at the expense of those of local communities. The recent emergence of sustainability discourses, however, ostensibly changes the time focus of development and promotes a broader concern with new imagined futures. This paper draws on the example of development in Salford Quays, in the North West of England, to argue that more attention needs to be given to the politics of space-time in urban development processes. It begins by discussing the importance and relevance of this approach before turning to the case study and the ways in which the local politics of space-time has influenced development agendas and outcomes. The paper argues that such an approach harbours the potential for more progressive, far-reaching, and sustainable development agendas to be developed and implemented.
Abstract:
Chess endgame tables should provide efficiently the value and depth of any required position during play. The indexing of an endgame’s positions is crucial to meeting this objective. This paper updates Heinz’ previous review of approaches to indexing and describes the latest approach by the first and third authors. Heinz’ and Nalimov’s endgame tables (EGTs) encompass the en passant rule and have the most compact index schemes to date. Nalimov’s EGTs, to the Distance-to-Mate (DTM) metric, require only 30.6 × 10^9 elements in total for all the 3-to-5-man endgames and are individually more compact than previous tables. His new index scheme has proved itself while generating the tables and in the 1999 World Computer Chess Championship where many of the top programs used the new suite of EGTs.
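As an illustration only (a naive scheme written here for exposition; it is not Heinz's or Nalimov's actual index), a pawnless 3-man endgame can be indexed by packing the squares of the three men into one integer, while real schemes are far more compact because they exploit the 8-fold board symmetry and exclude broken positions:

    # Naive index for a pawnless 3-man endgame (e.g. KQK), squares 0..63.
    # Purely illustrative: production index schemes fix the stronger king
    # to a 10-square triangle under the 8-fold symmetry and skip positions
    # where two men occupy the same square.

    def naive_index(wk: int, bk: int, piece: int) -> int:
        """Map (white king, black king, extra piece) squares to 0..64**3 - 1."""
        assert all(0 <= sq < 64 for sq in (wk, bk, piece))
        return (wk * 64 + bk) * 64 + piece

    def naive_unindex(index: int) -> tuple:
        """Inverse of naive_index."""
        piece = index % 64
        index //= 64
        return index // 64, index % 64, piece

    # During play, the value/depth entry for a position is a single array
    # access at this offset.
    idx = naive_index(wk=0, bk=18, piece=27)
    assert naive_unindex(idx) == (0, 18, 27)

Exploiting symmetry already cuts the first factor from 64 to 10 for pawnless endgames, and further gains come from enumerating only legal configurations, which is how the EGTs described above stay compact.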