890 results for Voting-machines.
Abstract:
The problem of computing the storage capacity of a feed-forward network with L hidden layers, N inputs, and K units in the first hidden layer is analyzed using techniques from statistical mechanics. We find that the storage capacity depends strongly on the network architecture, α_c ∼ (log K)^{1−1/2L}, and that the number of units K limits the number of possible hidden layers L through the relationship 2L − 1 < 2 log K. © 2014 IOP Publishing Ltd.
Abstract:
In recent years, learning word vector representations has attracted much interest in Natural Language Processing. Word representations or embeddings learned using unsupervised methods help address the problem of traditional bag-of-words approaches, which fail to capture contextual semantics. In this paper we go beyond vector representations at the word level and propose a novel framework that learns higher-level feature representations of n-grams, phrases and sentences using a deep neural network built from stacked Convolutional Restricted Boltzmann Machines (CRBMs). These representations have been shown to map syntactically and semantically related n-grams to nearby locations in the hidden feature space. We additionally incorporate these higher-level features into supervised classifier training for two sentiment analysis tasks: subjectivity classification and sentiment classification. Our results demonstrate the success of the proposed framework, with a 4% improvement in accuracy for subjectivity classification and improved results for sentiment classification over models trained without our higher-level features.
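A minimal stand-in sketch of the idea, not the paper's implementation: scikit-learn has no Convolutional RBM, so ordinary BernoulliRBMs are stacked over binary n-gram features before a supervised sentiment classifier; the toy corpus, labels and hyperparameters are all hypothetical.

```python
# Hedged sketch: stacked RBMs learn higher-level n-gram features that feed a
# supervised sentiment classifier (plain BernoulliRBMs stand in for the paper's CRBMs).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

docs = ["a moving and beautiful film", "dull plot and wooden acting"]  # toy corpus
labels = [1, 0]  # 1 = positive, 0 = negative (hypothetical)

model = Pipeline([
    ("ngrams", CountVectorizer(ngram_range=(1, 3), binary=True)),  # n-gram presence features
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),  # supervised classifier on learned features
])
model.fit(docs, labels)
print(model.predict(["beautiful acting"]))
```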
Abstract:
In high-precision industry, the measurement of geometry is often performed using coordinate measuring machines (CMMs). Measurements on CMMs can occur at many places within a long, global supply chain. In this context it is a challenge to control consistency, so that measurements are applied with appropriate levels of rigour and achieve comparable results wherever and whenever they are performed. In this paper, a framework is outlined in which consistency is controlled through measurement strategy, such as the number and location of measurement points. The framework is put into action in a case study, demonstrating the usefulness of the approach and highlighting the dangers of imposing rigid measurement strategies across the supply chain, even if linked to standardised manufacturing processes. Potential mitigations, and the requirements for future research, are outlined.
Abstract:
Measuring and compensating the pivot points of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for automatic measurement and compensation of pivot-point positional errors on five-axis machine tools. Machine rotary-axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel-table configuration. Results show that up to 99% of the positional errors of the rotary axis can be compensated using this approach.
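A loose illustration only, not the paper's measurement routine: assuming the circular test yields a cloud of probed points around the rotary axis, the pivot offset can be estimated with a least-squares circle fit and fed back as a compensation value; all numbers below are hypothetical.

```python
# Hedged sketch: estimate a rotary-axis pivot offset from circular-test data by an
# algebraic least-squares circle fit (Kasa method). Data are simulated, not measured.
import numpy as np

def fit_circle(points):
    """Return (xc, yc, r) of the least-squares circle through 2-D points."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + xc**2 + yc**2)
    return xc, yc, r

# Simulated circular test: probed points around a pivot offset from its nominal position.
true_pivot = np.array([0.012, -0.008])  # mm, hypothetical positional error
angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
measured = true_pivot + 50.0 * np.column_stack([np.cos(angles), np.sin(angles)])

xc, yc, _ = fit_circle(measured)
compensation = -np.array([xc, yc])      # offset to apply in the controller
print("estimated pivot error (mm):", xc, yc)
print("compensation to apply (mm):", compensation)
```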
Abstract:
This paper draws upon part of the findings of an ethnographic study in which two seventeen-year-old girls were employed to interview their peers about engineering as a study and career choice. It argues that whilst girls do view engineering as being generally masculine in nature, other factors such as a lack of female role models and an emphasis on physics and maths act as barriers to young women entering the discipline. The paper concludes by noting that engineering has much to offer young women; the problem is, they simply don't know this is the case! Copyright © 2013 Jane Andrews & Robin Clark.
Abstract:
This thesis studies survival analysis techniques that deal with censoring to produce predictive tools for the risk of endovascular aortic aneurysm repair (EVAR) re-intervention. Censoring indicates that some patients do not continue follow-up, so their outcome class is unknown. Existing methods for dealing with censoring have drawbacks and cannot handle the high censoring of the two EVAR datasets collected. This thesis therefore presents a new solution to high censoring by modifying an approach that was previously incapable of differentiating between risk groups of aortic complications. Feature selection (FS) becomes complicated with censoring. Most survival FS methods depend on Cox's model, whereas machine learning classifiers (MLCs) are preferred here; the few methods that adopt MLCs for survival FS cannot be used with high censoring. This thesis proposes two FS methods that use MLCs to evaluate features and apply the new solution to deal with censoring. Both combine factor analysis with a greedy stepwise FS search that allows eliminated features to re-enter the FS process. The first FS method searches for the best neural network configuration and subset of features. The second combines support vector machines, neural networks, and K-nearest-neighbour classifiers using simple and weighted majority voting to construct a multiple classifier system (MCS) that improves on the performance of the individual classifiers. It introduces a new hybrid FS process that uses the MCS as a wrapper method and merges it with an iterated feature-ranking filter method to further reduce the features. The proposed techniques outperformed FS methods based on Cox's model, such as the Akaike and Bayesian information criteria and the least absolute shrinkage and selection operator, in the log-rank test's p-values, sensitivity, and concordance. This shows that the proposed techniques are more powerful in correctly predicting the risk of re-intervention and consequently enable doctors to set appropriate future observation plans for patients.
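A minimal sketch of the multiple-classifier-system component only, using scikit-learn stand-ins for the three base classifiers; the EVAR data, censoring handling and feature-selection wrapper are not reproduced, and the synthetic data and voting weights are placeholders.

```python
# Hedged sketch: simple and weighted majority voting over SVM, neural network and
# K-nearest-neighbour classifiers, the MCS idea described in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=12, random_state=0)  # placeholder data

svm = make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))
nn = make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))

simple_vote = VotingClassifier([("svm", svm), ("nn", nn), ("knn", knn)], voting="hard")
weighted_vote = VotingClassifier([("svm", svm), ("nn", nn), ("knn", knn)],
                                 voting="soft", weights=[2, 2, 1])  # weights are illustrative

for name, clf in [("simple majority", simple_vote), ("weighted majority", weighted_vote)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```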
Abstract:
Traditional voting games are special cooperative games with transferable utility, so-called simple games, in which the players are parties and the value of each coalition is 1 or 0, depending on whether that coalition is strong enough to pass a given piece of legislation. In this paper the concept of generalized weighted voting games is introduced, in which the parties' numbers of seats are random variables. Examples from Hungary illustrate the use of this approach.
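A minimal sketch of the concept, assuming hypothetical seat-count distributions and quota: when the parties' seat counts are random variables, a coalition's value becomes the probability that its total seats reach the quota, estimated here by Monte Carlo simulation.

```python
# Hedged sketch of a generalized weighted voting game: coalition value = probability
# that the coalition's (random) total seat count reaches the quota.
import numpy as np

rng = np.random.default_rng(0)
QUOTA = 100  # seats needed to pass a law (hypothetical)

# Hypothetical seat-count distributions (mean, std) for four parties.
parties = {"A": (90, 8), "B": (55, 6), "C": (30, 5), "D": (24, 4)}

def coalition_value(members, n_draws=100_000):
    """Estimated probability that the coalition's seats reach the quota."""
    total = sum(rng.normal(mu, sigma, n_draws)
                for mu, sigma in (parties[p] for p in members))
    return np.mean(total >= QUOTA)

print("v({A})     =", coalition_value(["A"]))
print("v({A,C})   =", coalition_value(["A", "C"]))
print("v({B,C,D}) =", coalition_value(["B", "C", "D"]))
```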
Abstract:
One of the aims of the new electoral law of Hungary is to apportion voters to voting districts more fairly than before. This is ensured by rules similar to, though somewhat more permissive than, those put forward in the Code of Good Practice in Electoral Matters issued by the Venice Commission. These rules fix the number of districts and require that districts neither split smaller towns and villages nor cross county borders. This article shows that, under these rules, such an apportionment is mathematically impossible. We propose an optimal solution to the problem on a principled basis, study the properties of the method, and then, using an efficient algorithm we formulate, determine the best allocation of districts among counties from the data of the 2010 parliamentary elections. Finally, we examine the expected effects of demographic changes and make several recommendations for adhering to the constraints over the long term: increase the number of voting districts to about 130, allow the number of voting districts to change at each revision, and organise the districts by region rather than by county.
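A hedged sketch, not the authors' algorithm: a simple largest-remainder apportionment of a fixed number of districts among counties in proportion to their voters, with hypothetical data, reporting each county's deviation from the average district size, the quantity the fairness rules constrain.

```python
# Hedged sketch: largest-remainder apportionment of voting districts among counties.
# County populations and the district count are hypothetical.
voters = {"County1": 412_000, "County2": 198_000, "County3": 655_000, "County4": 241_000}
N_DISTRICTS = 20  # hypothetical; the article suggests about 130 nationwide

total = sum(voters.values())
quotas = {c: v * N_DISTRICTS / total for c, v in voters.items()}
seats = {c: int(q) for c, q in quotas.items()}               # lower quotas
remaining = N_DISTRICTS - sum(seats.values())
for c in sorted(quotas, key=lambda k: quotas[k] - seats[k], reverse=True)[:remaining]:
    seats[c] += 1                                            # largest remainders get the rest

avg = total / N_DISTRICTS
for c in voters:
    size = voters[c] / seats[c]
    print(f"{c}: {seats[c]} districts, avg size {size:,.0f}, "
          f"deviation {100 * (size - avg) / avg:+.1f}%")
```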
Abstract:
In our study we apply a data mining procedure known as the support vector machine (SVM) to the database of the first Hungarian bankruptcy model. The models constructed are then contrasted with the results of earlier bankruptcy models in terms of classification accuracy and the area under the ROC curve. In using the SVM technique, in addition to conventional kernel functions, we also examine the possibility of applying the ANOVA kernel function and take a detailed look at the data preparation tasks recommended when using the SVM method (handling of outliers). The results of the assembled models suggest that a significant improvement in classification accuracy can be achieved on the database of the first Hungarian bankruptcy model when using the SVM method as opposed to neural networks.
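A minimal sketch assuming one common form of the ANOVA kernel, k(x, z) = Σ_i exp(−σ(x_i − z_i)²)^d, supplied to scikit-learn's SVC as a kernel callable; the synthetic data merely stands in for the Hungarian bankruptcy database.

```python
# Hedged sketch: SVM with a custom ANOVA kernel (scikit-learn has no built-in one),
# evaluated by cross-validated classification accuracy on synthetic stand-in data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def anova_kernel(X, Z, sigma=1.0, d=2):
    """ANOVA RBF kernel matrix between row-sample matrices X and Z."""
    diff = X[:, None, :] - Z[None, :, :]          # diff[i, j, k] = X[i, k] - Z[j, k]
    return np.sum(np.exp(-sigma * diff ** 2) ** d, axis=2)

X, y = make_classification(n_samples=200, n_features=8, random_state=1)
X = StandardScaler().fit_transform(X)             # scaling/outlier handling stand-in

clf = SVC(kernel=anova_kernel, C=1.0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```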
Abstract:
This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically at a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches based on solution quality and run time. The decomposition approach improved the lower bounds (i.e. the linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement as their run time is very short.
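A hedged sketch of the batching problem only, not the dissertation's formulation or metaheuristics: a simple greedy heuristic packs ready-sorted PCBs into capacity-limited batches and schedules them on identical chambers to estimate the makespan; all job data are hypothetical.

```python
# Hedged sketch: greedy batching of PCBs on identical ESS chambers. A batch's
# processing time is its longest job; it starts once its last PCB has arrived and
# a chamber is free. Jobs, capacity and chamber count are hypothetical.
import heapq

# (ready_time, processing_time, size)
jobs = [(0, 5, 3), (1, 4, 4), (2, 7, 2), (6, 3, 5), (7, 6, 3), (9, 2, 4)]
CAPACITY = 8
N_CHAMBERS = 2

jobs.sort(key=lambda j: j[0])                 # earliest-ready first
batches, current, used = [], [], 0
for job in jobs:
    if used + job[2] > CAPACITY:              # chamber would overflow: close the batch
        batches.append(current)
        current, used = [], 0
    current.append(job)
    used += job[2]
if current:
    batches.append(current)

chambers = [0.0] * N_CHAMBERS                 # time each chamber becomes free
makespan = 0.0
for batch in batches:
    ready = max(j[0] for j in batch)          # batch ready when its last PCB arrives
    proc = max(j[1] for j in batch)           # longest processing time in the batch
    free = heapq.heappop(chambers)            # earliest-available chamber
    finish = max(free, ready) + proc
    heapq.heappush(chambers, finish)
    makespan = max(makespan, finish)

print("batches:", batches)
print("makespan:", makespan)
```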
Abstract:
With the advantages and popularity of Permanent Magnet (PM) motors, due to their high power density, there is an increasing incentive to use them in a variety of applications, including electric actuation. These applications have strict noise emission standards. The generation of audible noise and associated vibration modes is characteristic of all electric motors, but it is especially problematic in low-speed sensorless rotary actuation applications that use the high-frequency voltage injection technique. This dissertation is aimed at optimizing the sensorless control algorithm for low noise and vibration while achieving at least 12-bit absolute accuracy for speed and position control. The low-speed sensorless algorithm is simulated using an improved Phase Variable Model, developed and implemented in a hardware-in-the-loop prototyping environment. Two experimental testbeds were developed and built to test and verify the algorithm in real time. A neural-network-based modeling approach was used to predict the audible noise due to the high-frequency injected carrier signal. This model was created from noise measurements in a purpose-built chamber. The developed noise model is then integrated into the high-frequency-injection-based sensorless control scheme so that appropriate tradeoffs and mitigation techniques can be devised, improving the position estimation and control performance while keeping the noise below a certain level. Genetic algorithms were used to incorporate the noise optimization parameters into the developed control algorithm. A novel wavelet-based filtering approach is proposed for the sensorless control algorithm at low speed. This filter is capable of extracting the position information at low values of injection voltage where conventional filters fail, and it can be used in practice to reduce the injected voltage in the sensorless control algorithm, resulting in a significant reduction of noise and vibration. Online optimization of the sensorless position estimation algorithm was performed to reduce vibration and improve position estimation performance. The results obtained represent original contributions that can help in choosing optimal parameters for sensorless control algorithms in many practical applications.
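A loose sketch of the wavelet-filtering idea, not the dissertation's filter design: PyWavelets decomposes a weak, noisy demodulated position signal and reconstructs it from the approximation coefficients only; the signal model and all parameters below are hypothetical.

```python
# Hedged sketch: wavelet-based extraction of a slowly varying rotor-position signal
# buried in carrier residue and noise, by suppressing the detail (high-frequency) bands.
import numpy as np
import pywt

fs = 10_000                                    # sample rate, Hz (hypothetical)
t = np.arange(0, 0.5, 1 / fs)
position = np.sin(2 * np.pi * 5 * t)           # slowly varying position signal
carrier_residue = 0.4 * np.sin(2 * np.pi * 1_000 * t)
noisy = position + carrier_residue + 0.2 * np.random.default_rng(0).normal(size=t.size)

coeffs = pywt.wavedec(noisy, "db8", level=6)   # multilevel wavelet decomposition
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # drop all detail bands
estimate = pywt.waverec(coeffs, "db8")[: t.size]     # keep only the approximation

print("RMS error vs true position:", np.sqrt(np.mean((estimate - position) ** 2)))
```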