966 results for Helmholtz Machines
Abstract:
This research studies a hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted FF: batch1,sj:Cmax and is formulated as a mixed-integer linear program. The commercial solver, AMPL/CPLEX, is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for a small problem, e.g., a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computation-time problem encountered with the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules them one by one. The heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete-processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. A designed experiment is conducted to evaluate the effectiveness of the proposed BFD, which is further evaluated against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature.
A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average within negligible time when the problem size is less than 50 jobs.
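The dispatching rules used as baselines above are typically simple greedy policies. As an illustrative sketch only (the paper's BFD heuristic and its sub-problem solvers are more elaborate), the classic longest-processing-time (LPT) list-scheduling rule for parallel identical machines can be written as:

```python
import heapq

def list_schedule_makespan(proc_times, n_machines):
    """Greedy LPT list scheduling: assign each job (longest first) to the
    machine that currently finishes earliest; returns the makespan.
    A classic dispatching-rule baseline, not the paper's BFD heuristic."""
    loads = [0.0] * n_machines          # current finish time of each machine
    heapq.heapify(loads)                # min-heap: least-loaded machine on top
    for p in sorted(proc_times, reverse=True):   # LPT order
        earliest = heapq.heappop(loads)
        heapq.heappush(loads, earliest + p)
    return max(loads)

# Example: 6 jobs on 2 identical machines (total work 27, so makespan >= 14)
print(list_schedule_makespan([2, 3, 4, 5, 6, 7], 2))  # -> 14.0
```

Here LPT actually attains the lower bound of 14; in general it is only guaranteed to be within 4/3 of the optimal makespan.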
Abstract:
The proliferation of legalized gaming has significantly changed the nature of the hospitality industry. While several aspects of gaming have flourished, none has become more popular, profitable, or technologically advanced than the slot machine. Although more than half of all casino gambling, and earnings, is generated by slot machines, little has been written about the technology integral to these devices. The author describes the workings of computer-controlled slot machines and exposes some of the popular operating myths.
Abstract:
This paper discusses the food and beverage vending machines located at Memorial University's Grenfell Campus and endeavors to assess how much those vending machines are being used and how they affect sustainability initiatives on campus. A survey was conducted to gauge the use of vending machines, their content and what is purchased; participants who did not purchase from these machines were also asked why not. This survey produced many other questions that are directly linked to vending machines. Water quality on campus was heavily discussed, along with the use of bottled water and the implications of drinking only from bottles that are thrown away. The study concludes with a discussion of the alternative choices that could be implemented to replace vending machines.
Abstract:
This paper draws upon part of the findings of an ethnographic study in which two seventeen-year-old girls were employed to interview their peers about engineering as a study and career choice. It argues that whilst girls do view engineering as being generally masculine in nature, other factors such as a lack of female role models and an emphasis on physics and maths act as barriers to young women entering the discipline. The paper concludes by noting that engineering has much to offer young women; the problem is, they simply don't know this is the case!
Abstract:
This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables of the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels while avoiding the need for Lagrange multipliers and duality theory. A fast iterative solution algorithm based on Cholesky decomposition with permutation of the support vectors is suggested as a solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived purely in the primal context of RLS) are demonstrated using a set of public benchmarking problems for both linear and nonlinear SVMs.
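The primal RLS view can be sketched in a few lines. The following is a minimal illustration only: it solves the plain regularized least-squares system (X^T X + λI)w = X^T y by Cholesky factorization and triangular solves, without the paper's indicator variables, support-vector partitioning, or permutation steps; the data and λ are made up.

```python
def cholesky(A):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (A[i][i] - s) ** 0.5 if i == j else (A[i][j] - s) / L[j][j]
    return L

def solve_rls(X, y, lam):
    """Solve (X^T X + lam*I) w = X^T y via Cholesky + two triangular solves."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(r[i] * t for r, t in zip(X, y)) for i in range(n)]
    L = cholesky(A)
    z = []                                   # forward substitution: L z = b
    for i in range(n):
        z.append((b[i] - sum(L[i][k] * z[k] for k in range(i))) / L[i][i])
    w = [0.0] * n                            # back substitution: L^T w = z
    for i in reversed(range(n)):
        w[i] = (z[i] - sum(L[k][i] * w[k] for k in range(i + 1, n))) / L[i][i]
    return w

# Toy separable data with labels +/-1; the constant last feature acts as the bias.
X = [[1.0, 1.0], [2.0, 1.0], [-1.0, 1.0], [-2.0, 1.0]]
y = [1.0, 1.0, -1.0, -1.0]
w = solve_rls(X, y, lam=0.1)
print(all((w[0] * x[0] + w[1]) * t > 0 for x, t in zip(X, y)))  # -> True
```

With a kernel, the same machinery is applied to the Gram matrix instead of X^T X, which is how the primal formulation extends to the nonlinear case.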
Abstract:
The astonishing development of diverse hardware platforms is twofold: on one side, the challenge of exascale performance for big data processing and management; on the other, mobile and embedded devices for data collection and human-machine interaction. This has driven a highly hierarchical evolution of programming models. GVirtuS is a general virtualization system developed in 2009 and first introduced in 2010, enabling a completely transparent layer between GPUs and VMs. This paper shows the latest achievements and developments of GVirtuS, now supporting CUDA 6.5, memory management and scheduling. Thanks to the new and improved remoting capabilities, GVirtuS now enables GPU sharing among physical and virtual machines based on x86 and ARM CPUs on local workstations, computing clusters and distributed cloud appliances.
Abstract:
Using data obtained by the high-resolution CRisp Imaging SpectroPolarimeter instrument on the Swedish 1 m Solar Telescope, we investigate the dynamics and stability of quiet-Sun chromospheric jets observed at the disk center. Small-scale features, such as rapid redshifted and blueshifted excursions, appearing as high-speed jets in the wings of the Hα line, are characterized by short lifetimes and rapid fading without any descending behavior. To study the theoretical aspects of their stability without considering their formation mechanism, we model chromospheric jets as twisted magnetic flux tubes moving along their axis, and use the ideal linear incompressible magnetohydrodynamic approximation to derive the governing dispersion equation. Analytical solutions of the dispersion equation indicate that this type of jet is unstable to Kelvin–Helmholtz instability (KHI), with a very short (few seconds) instability growth time at high upflow speeds. The generated vortices and unresolved turbulent flows associated with the KHI could be observed as a broadening of chromospheric spectral lines. Analysis of the Hα line profiles shows that the detected structures have enhanced line widths with respect to the background. We also investigate the stability of a larger-scale Hα jet that was ejected along the line of sight. Vortex-like features, rapidly developing around the jet's boundary, are considered as evidence of the KHI. The analysis of the energy equation in the partially ionized plasma shows that ion–neutral collisions may lead to fast heating of the KH vortices over timescales comparable to the lifetime of chromospheric jets.
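For orientation, the textbook incompressible-MHD result for a plane tangential discontinuity (a simpler geometry than the twisted flux tube analyzed in the paper, so only an illustrative analogue of its dispersion relation) gives the KHI growth rate as

```latex
\gamma^{2} \;=\; \frac{\rho_{1}\rho_{2}\,\bigl[\mathbf{k}\cdot(\mathbf{U}_{1}-\mathbf{U}_{2})\bigr]^{2}}{(\rho_{1}+\rho_{2})^{2}}
\;-\; \frac{(\mathbf{k}\cdot\mathbf{B}_{1})^{2}+(\mathbf{k}\cdot\mathbf{B}_{2})^{2}}{\mu_{0}\,(\rho_{1}+\rho_{2})},
```

with instability when γ² > 0, i.e., when the velocity shear along k overcomes the magnetic tension term. This makes plausible why high upflow speeds give the very short (few-second) growth times reported.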
Abstract:
In the mid-1990s, when I worked for a telecommunications giant, I struggled to gain access to basic geodemographic data. It cost hundreds of thousands of dollars at the time to simply purchase a tile of satellite imagery from Marconi, and it was often cheaper to create my own maps using a digitizer and A0 paper maps. Everything from granular administrative boundaries to rights-of-way to points of interest and geocoding capabilities was either unavailable for the places I was working in throughout Asia or very limited. The control of this data lay either with a government's census and statistical bureau or with a handful of forward-thinking corporations. Twenty years on we find ourselves inundated with data (location and other) that we are challenged to amalgamate, and much of it still "dirty" in nature. Open data initiatives such as ODI give us great hope for how we might be able to share information together and capitalize not only on crowdsourcing behavior but on the implications for positive usage for the environment and for the advancement of humanity. We are already gathering and amassing a great deal of data and insight through excellent citizen science participatory projects across the globe. In early 2015, I delivered a keynote at the Data Made Me Do It conference at UC Berkeley, and in the preceding year an invited talk at the inaugural QSymposium. In gathering research for these presentations, I began to ponder the effect that social machines (in effect, autonomous data collection subjects and objects) might have on social behaviors. I focused on studying the problem of data from various veillance perspectives, with an emphasis on the shortcomings of uberveillance, which included the potential for misinformation, misinterpretation, and information manipulation when context was entirely missing.
As we build advanced systems that rely almost entirely on social machines, we need to ponder the risks associated with following a purely technocratic approach where machines devoid of intelligence may one day dictate what humans do at the fundamental praxis level. What might be the fallout of uberveillance? Bio: Dr Katina Michael is a professor in the School of Computing and Information Technology at the University of Wollongong. She presently holds the position of Associate Dean – International in the Faculty of Engineering and Information Sciences. Katina is the IEEE Technology and Society Magazine editor-in-chief, and IEEE Consumer Electronics Magazine senior editor. Since 2008 she has been a board member of the Australian Privacy Foundation, and until recently was the Vice-Chair. Michael researches the socio-ethical implications of emerging technologies with an emphasis on an all-hazards approach to national security. She has written and edited six books, and guest edited numerous special issue journals on themes related to radio-frequency identification (RFID) tags, supply chain management, location-based services, innovation and surveillance/uberveillance for Proceedings of the IEEE, Computer and IEEE Potentials. Prior to academia, Katina worked for Nortel Networks as a senior network engineer in Asia, and also in information systems for OTIS and Andersen Consulting. She holds cross-disciplinary qualifications in technology and law.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
This work presents a fast, high-order model capable of representing a rotor configuration with a full cage or a grid, reproducing the bar currents, and accounting for space harmonics. The model uses a combined finite-element and coupled-circuit approach: the inductances are computed with finite elements, which gives the model high accuracy. For transient simulations, this method offers a substantial saving in computation time over finite elements. Two simulation tools are developed: one in the time domain for dynamic solutions, and another in the phasor domain, with an application to standstill frequency response (SSFR) tests also presented. The model-building method is described in detail, as is the procedure for modeling the rotor cage. The model is validated through the study of synchronous machines: a 5.4 kVA laboratory machine and a large 109 MVA alternator, whose experimental measurements are compared with the model's simulation results for tests such as no-load tests, three-phase and two-phase short circuits, and a load test.
Abstract:
A correct understanding of how computers run code is mandatory in order to effectively learn to program. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivist learning theory objects to students' passiveness during lessons, and to traditional quantitative methods for evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched lectures of a Programming II (CS2) course by combining conceptual contraposition with program memory tracing, then evaluated students' understanding of programming concepts through colloquies. Results revealed that these techniques applied to the lecture are insufficient to help students develop satisfactory mental models of the C++ notional machine, and that colloquies behaved as the most comprehensive of the traditional evaluations conducted in the course.
Abstract:
Second order matrix equations arise in the description of real dynamical systems. Traditional modal control approaches utilise the eigenvectors of the undamped system to diagonalise the system matrices. A regrettable consequence of this approach is the discarding of residual off-diagonal terms in the modal damping matrix. This has particular importance for systems containing skew-symmetry in the damping matrix, which is entirely discarded in the modal damping matrix. In this paper a method is proposed to apply modal control using the decoupled second order matrix equations involving non-classical damping. An example of modal control successfully applied to a rotating system is presented in which the system damping matrix contains skew-symmetric components.
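The claim that classical modal damping entirely discards a skew-symmetric damping matrix can be checked in a few lines. This is an illustrative sketch with made-up matrices, not the paper's controller: for a skew-symmetric C, every diagonal entry of the modal damping matrix Φᵀ C Φ vanishes, since xᵀ C x = 0 for any vector x.

```python
def modal_damping_diagonal(C, Phi):
    """Diagonal of the modal damping matrix Phi^T C Phi -- the only part
    retained when the modal equations are forcibly decoupled."""
    n = len(C)
    Cm = [[sum(Phi[k][i] * C[k][l] * Phi[l][j]
               for k in range(n) for l in range(n))
           for j in range(n)] for i in range(n)]
    return [Cm[i][i] for i in range(n)]

# 2-DOF system with M = I and diagonal K: the mass-normalized undamped
# mode shapes are simply the identity matrix.
Phi = [[1.0, 0.0], [0.0, 1.0]]
# Skew-symmetric damping/gyroscopic matrix, e.g. arising from rotation.
C_skew = [[0.0, 0.5], [-0.5, 0.0]]
print(modal_damping_diagonal(C_skew, Phi))  # -> [0.0, 0.0]: entirely discarded
```

Any controller built only on the decoupled modal equations therefore never sees C_skew at all, which motivates retaining the coupled non-classical damping terms as the paper proposes.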
Abstract:
In the context of active control of rotating machines, standard optimal controller methods enable a trade-off to be made between (weighted) mean-square vibrations and (weighted) mean-square currents injected into magnetic bearings. One shortcoming of such controllers is that no attention is paid to the voltages required. In practice, the available voltage imposes a strict limitation on the maximum possible rate of change of control force (force slew rate). This paper removes this shortcoming of traditional optimal control.
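As a rough illustration of why the supply voltage caps the force slew rate (a standard single-coil magnetic-bearing relation stated here for context, not taken from the paper; k, g, L and R are illustrative actuator constants): with force F ≈ k i²/g² and coil dynamics V = L di/dt + R i,

```latex
\frac{dF}{dt} \;=\; \frac{2\,k\,i}{g^{2}}\,\frac{di}{dt},
\qquad
\left|\frac{di}{dt}\right| \;\le\; \frac{V_{\max} - R\,i}{L},
```

so the achievable rate of change of control force is bounded by the maximum voltage V_max, which is the limitation the paper's controller accounts for explicitly.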
Abstract:
Current acoustic treatments for aircraft walls perform well at high frequencies but need improvement at low frequencies. Indeed, with conventional materials this requires a large thickness, and the treatments therefore become very heavy. Solutions outside this framework must thus be developed. The goal of this master's project is to create an acoustic treatment based on Helmholtz resonators embedded in a porous material, in order to reflect low-frequency acoustic waves while absorbing over a wide band at high frequencies. The principle rests on the design of a meta-composite, optimized numerically and validated experimentally in an impedance tube and in transmission chambers. The performance of the concept will also be studied on a mock-up of the Ariane 5 launcher fairing using a statistical energy analysis (SEA) model. To this end, the work builds on previous studies of Helmholtz resonators, metamaterials, meta-composites, and transfer-matrix modeling. The optimization is carried out with a transfer-matrix-based model placed inside an optimization loop.
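The tuning frequency of each embedded resonator follows the textbook Helmholtz formula (standard acoustics, stated here for context rather than taken from the project itself):

```latex
f_{0} \;=\; \frac{c}{2\pi}\sqrt{\frac{S}{V\,L_{\mathrm{eff}}}},
```

where c is the speed of sound, S the neck cross-sectional area, V the cavity volume, and L_eff the neck length including end corrections. Lowering f_0 requires a larger cavity volume, which is precisely why purely passive low-frequency treatments made of conventional porous materials become thick and heavy.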