Abstract:
To study the complex formation of the group 5 elements (Nb, Ta, Ha, and the pseudo-analog Pa) in aqueous HCl solutions of medium and high concentration, the electronic structures of the anionic complexes [MCl_6]^-, [MOCl_4]^-, [M(OH)_2Cl_4]^-, and [MOCl_5]^2- of these elements have been calculated using the relativistic Dirac-Slater Discrete-Variational method. The charge-density distribution analysis has shown that tantalum occupies a special position in the group and has the highest tendency to form the pure halide complex [TaCl_6]^-. This fact, along with the high covalency of this complex, explains its good extractability into aliphatic amines. Niobium shows equal tendencies to form the pure halide [NbCl_6]^- and the oxyhalide [NbOCl_5]^2- species at medium and high acid concentrations. Protactinium under these conditions has a slight preference for the [PaOCl_5]^2- form or for pure halide complexes with coordination numbers higher than 6. Element 105 at high HCl concentrations will prefer to form the oxyhalide anionic complex [HaOCl_5]^2- rather than [HaCl_6]^-. For this series of anionic oxychloride complexes, their partition between the organic and aqueous phases in extraction by aliphatic amines has been estimated, which gives the following order of partition coefficients: P_Nb < P_Ha < P_Pa.
Abstract:
The electronic structures of the group 6 oxyanions [MO_4]^2-, where M = Cr, Mo, W, and element 106, have been calculated using the Dirac-Slater Discrete-Variational method. The results of the calculations show a relative decrease in the metal-oxygen bond strength for the [E106O_4]^2- ion in the solid state compared to that for the [WO_4]^2- anion. The calculated energies of the electronic charge-transfer transitions indicate possible strong luminescence of [E106O_4]^2- in the blue-violet region. In solution, [E106O_4]^2- will be the most stable ion of the entire series. The estimated reduction potential E^0([E106O_4]^2-/[E106O_4]^3-) of -1.60 V shows only a slightly increased stability of the +6 oxidation state for element 106 in comparison with W.
Abstract:
Students in psychology, medicine, or education frequently need to carry out a systematic literature search, whether to learn the current state of research in a particular field or to work their way into a new topic, for example for a final thesis. Literature searches are a central component of all scholarly work. The references and sources found form the building blocks on which the presentation of knowledge in term papers, Magister and Diplom theses, and later doctoral dissertations is based. Citations, whether direct or indirect quotations or paraphrases, make the reliance on existing literature transparent. Often a glance at a reference list is enough to see how the literature was searched for and compiled. Scholarly work differs from artistic work in its systematic approach and intersubjective verifiability. This systematic approach should begin with the literature search and be reflected, at the end of the work, in the reference list. Even though a Magister or Diplom thesis, and even less a term paper, uses only a very small selection of the literature found, it is worthwhile to become clear about the available search strategies and to work systematically through the mountain of literature (records) that a search produces toward a relevant, well-founded selection. The rapidly growing body of knowledge poses a particular challenge to students and researchers here. Searching the card catalogue in the library has been replaced by online searching. Scholarly literature today is primarily searched for, found, and managed digitally. A variety of search engines is available for literature searches.
But which one is the right one? And how do I search systematically? How do I document my search? How do I obtain the literature, and how do I manage it? We asked ourselves these and other questions, and have written this guide for all students of psychology, psychoanalysis, medicine, and education. It is intended as an aid to the practical execution of a literature search.
Abstract:
We present a novel method for adapting human-machine interfaces to individual operators. By applying abstractions of evolutionary mechanisms such as selection, recombination, and mutation, the EOGUI methodology (Evolutionary Optimization of Graphical User Interfaces) provides a computer-based implementation of the method for graphical user interfaces, in particular for industrial processes. The evolutionary optimization incorporates both objective, i.e. measurable, quantities such as selection frequencies and selection times, and the operators' subjective impressions captured by online questionnaires. In this way the visualization of systems is adapted to the needs and preferences of individual operators. In this work, the operator can choose, from four user interfaces with different levels of abstraction for the example process MIPS (MIschungsProzess-Simulation, a mixing-process simulation), the objects that best support him or her in running the process. The EOGUI algorithm selects these objects, modifies them where appropriate, and combines them into a new graphical user interface tailored to the operator. Using the MIPS process, experiments with the EOGUI methodology were carried out to examine the applicability, acceptance, and effectiveness of the method for the operation of industrial processes. The investigations show, to a large extent, that the developed methodology for the evolutionary optimization of human-machine interfaces does in fact adapt industrial process visualizations to the individual operator and improves process operation.
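The evolutionary loop described above (selection, recombination, and mutation, driven by a fitness that mixes objective measurements with subjective ratings) can be sketched as follows. This is a minimal, hypothetical stand-in, not the EOGUI implementation: GUI configurations are reduced to parameter vectors, and both the "ideal" layout TARGET and the noise term standing in for questionnaire scores are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a GUI configuration: a vector of display parameters.
# The "ideal" layout for a given operator is a hypothetical target vector;
# fitness mixes an objective term (distance to target, standing in for
# measured selection times) and a noisy subjective term (standing in for
# online questionnaire scores).
TARGET = np.array([0.8, 0.2, 0.5, 0.9])

def fitness(layout):
    objective = -np.sum((layout - TARGET) ** 2)   # measured performance
    subjective = -0.05 * rng.normal() ** 2        # noisy questionnaire score
    return objective + subjective

def evolve(pop_size=20, generations=40, mutation=0.1):
    pop = rng.random((pop_size, TARGET.size))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]        # selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(TARGET.size) < 0.5     # uniform recombination
            child = np.where(mask, a, b)
            child = child + mutation * rng.normal(size=TARGET.size)  # mutation
            children.append(np.clip(child, 0.0, 1.0))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(p) for p in pop])
    return pop[np.argmax(scores)]

best = evolve()
```

Over the generations the population drifts toward layouts the simulated operator scores highly, mirroring how EOGUI iteratively recombines the interface objects an operator prefers.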
Abstract:
Among organic materials, spirobifluorene derivatives represent a very attractive class of materials for electronic devices. These compounds have high melting points, high glass transition temperatures, and good morphological stability, which makes them suitable for organic electronic applications. In addition, some spirobifluorenes can form porous supramolecular assemblies with significant volumes available for the inclusion of guests. These molecular assemblies based on spirobifluorenes are noteworthy because they are purely molecular analogues of zeolites and other microporous solids, with potential applications in separation, catalysis, sensing, and other areas.
Abstract:
Many ultrafast structural phenomena in solids at high fluences are related to the hardening or softening of particular lattice vibrations at lower fluences. In this paper we relate femtosecond-laser-induced phonon frequency changes to changes in the electronic density of states, which needs to be evaluated only in the electronic ground state, following the phonon displacement patterns. We illustrate this relationship for a particular lattice vibration of magnesium, for which we find, surprisingly, both softening and hardening as a function of the femtosecond-laser fluence. Using our theory, we explain these behaviours as arising from Van Hove singularities: we show that at low excitation densities Van Hove singularities near the Fermi level dominate the change of the phonon frequency, while at higher excitations Van Hove singularities further away in energy also become important. We expect that our theory can shed light on the effects of laser excitation in other materials as well.
Abstract:
Interatomic Coulombic decay (ICD), a radiationless transition in weakly bound systems such as solutes or van der Waals aggregates, is an efficient source of low-kinetic-energy electrons. So far, ICD processes could be probed only in ultra-high vacuum, using electron and/or ion spectroscopy. Here we show that resonant ICD processes can also be detected by measuring the subsequently emitted characteristic fluorescence radiation, which makes their study in dense media possible.
Abstract:
The structural, electronic, and magnetic properties of one-dimensional 3d transition-metal (TM) monoatomic chains with linear, zigzag, and ladder geometries are investigated in the framework of first-principles density-functional theory. The stability of long-range magnetic order along the nanowires is determined by computing the corresponding frozen-magnon dispersion relations as a function of the 'spin-wave' vector q. First, we show that the ground-state magnetic orders of V, Mn, and Fe linear chains at the equilibrium interatomic distances are non-collinear (NC) spin-density waves (SDWs) with characteristic equilibrium wave vectors q that depend on the composition and interatomic distance. The electronic and magnetic properties of these novel spin-spiral structures are discussed from a local perspective by analyzing the spin-polarized electronic densities of states, the local magnetic moments, and the spin-density distributions for representative values of q. Second, we investigate the stability of NC spin arrangements in Fe zigzag chains and ladders. We find that the non-collinear SDWs are remarkably stable in the biatomic chains (square ladders), whereas ferromagnetic order (q = 0) dominates in the zigzag chains (triangular ladders). The different magnetic structures are interpreted in terms of the corresponding effective exchange interactions J(ij) between the local magnetic moments μ(i) and μ(j) at atoms i and j. The effective couplings are derived by fitting a classical Heisenberg model to the ab initio magnon dispersion relations. In addition, they are analyzed in the framework of general magnetic phase diagrams with arbitrary first, second, and third nearest-neighbor (NN) interactions J(ij). The effect of external electric fields (EFs) on the stability of NC magnetic order has been quantified for representative monoatomic free-standing and deposited chains.
We find that an external EF applied perpendicular to the chains favors non-collinear order in V chains, whereas it stabilizes the ferromagnetic (FM) order in Fe chains. Moreover, our calculations reveal a change in the magnetic order of V chains deposited on the Cu(110) surface in the presence of external EFs. In this case the NC spiral order, which is unstable in the absence of an EF, becomes the most favorable one when perpendicular fields of the order of 0.1 V/Å are applied. As a final application of the theory we study the magnetic interactions within monoatomic TM chains deposited on graphene sheets. One observes that even weak chain-substrate hybridizations can modify the magnetic order. Mn and Fe chains show incommensurate NC spin configurations. Remarkably, V chains show a transition from spiral magnetic order in the free-standing geometry to FM order when they are deposited on a graphene sheet. Some TM-terminated zigzag graphene nanoribbons, for example V- and Fe-terminated nanoribbons, also show NC spin configurations. Finally, the magnetic anisotropy energies (MAEs) of TM chains on graphene are investigated. It is shown that Co and Fe chains exhibit significant MAEs and orbital magnetic moments, with an in-plane easy magnetization axis. The remarkable changes in the magnetic properties of chains on graphene are correlated with charge transfer from the TMs to NN carbon atoms. The goals and limitations of this study and the resulting perspectives for future investigations are discussed.
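The fitting step mentioned above, deriving effective couplings from the ab initio magnon dispersion via a classical Heisenberg model, can be sketched for a monoatomic chain. Assuming a planar spin spiral with wave vector q and couplings J_n to the n-th nearest neighbor, the classical energy per atom is E(q) = -sum_n J_n cos(n q a), which is linear in the couplings and can therefore be fitted by least squares. The values of J_1..J_3 below are invented, and the "ab initio" dispersion is generated from the model itself rather than from a first-principles calculation.

```python
import numpy as np

a = 1.0                                  # lattice constant (arbitrary units)
J_true = np.array([10.0, -4.0, 1.5])     # hypothetical J1, J2, J3 (meV)

def spiral_energy(q, J):
    # Classical Heisenberg energy per atom of a planar spin spiral with
    # wave vector q in a chain: E(q) = -sum_n J_n cos(n * q * a)
    n = np.arange(1, len(J) + 1)
    return -(np.cos(np.outer(q, n) * a) @ J)

# Stand-in for the ab initio magnon dispersion (here generated from the model)
q = np.linspace(0.0, np.pi / a, 50)
E = spiral_energy(q, J_true)

# Linear least-squares fit of J1..J3 to the dispersion
n = np.arange(1, 4)
A = -np.cos(np.outer(q, n) * a)
J_fit, *_ = np.linalg.lstsq(A, E, rcond=None)

# The equilibrium wave vector of the spin-density wave minimizes E(q)
q_min = q[np.argmin(spiral_energy(q, J_fit))]
```

Since the model is linear in the J_n, the same fit extends directly to more neighbor shells or to ladder geometries by adding columns to A.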
Abstract:
Fine-grained parallel machines have the potential for very high-speed computation. To program massively concurrent MIMD machines, programmers need tools for managing complexity, and these tools should not restrict program concurrency. Concurrent Aggregates (CA) provides multiple-access data abstraction tools, aggregates, which can be used to implement abstractions with virtually unlimited potential for concurrency. Such tools allow programmers to modularize programs without reducing concurrency. I describe the design, motivation, implementation, and evaluation of Concurrent Aggregates. CA has been used to construct a number of application programs, and multiple-access data abstractions have proved useful in constructing highly concurrent programs.
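The idea of a multiple-access data abstraction, with state distributed over several representatives so that concurrent requests need not serialize on a single object, can be illustrated with a toy counter. This is a conceptual sketch in Python, not CA itself (CA is a language with compiler support, not a library), and the striping scheme is an invented analogy.

```python
import threading
import random

class CounterAggregate:
    """A toy multiple-access counter: state is spread over several
    representatives so that concurrent increments rarely contend on
    the same lock, instead of serializing on one shared counter."""

    def __init__(self, n_reps=8):
        self._counts = [0] * n_reps
        self._locks = [threading.Lock() for _ in range(n_reps)]

    def increment(self):
        i = random.randrange(len(self._counts))   # pick any representative
        with self._locks[i]:
            self._counts[i] += 1

    def value(self):
        # Combining step: lock all representatives, then sum their counts
        for lock in self._locks:
            lock.acquire()
        try:
            return sum(self._counts)
        finally:
            for lock in self._locks:
                lock.release()

counter = CounterAggregate()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The interface stays that of a single counter, so the abstraction is modular, yet increments proceed mostly in parallel.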
Abstract:
The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines the centers, weights, and threshold so as to minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach in which the centers are determined by k-means clustering and the weights are found using error backpropagation. We consider three machines: a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the US Postal Service database of handwritten digits, the SV machine achieves the highest test accuracy, followed by the hybrid approach. The SV approach is thus not only theoretically well-founded, but also superior in a practical application.
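A small-scale version of this comparison can be sketched with scikit-learn, substituting synthetic two-class data for the USPS digits and logistic regression for error backpropagation when training the classical RBF machine's weights; the data, class counts, and parameters below are illustrative choices, not those of the original experiments.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two well-separated Gaussian blobs as a stand-in for the digit data
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
idx = rng.permutation(len(y))
Xtr, Xte = X[idx[:300]], X[idx[300:]]
ytr, yte = y[idx[:300]], y[idx[300:]]

# SV machine with Gaussian kernel: the centers (support vectors), weights,
# and threshold all come out of one optimization.
svm = SVC(kernel="rbf", gamma=0.5).fit(Xtr, ytr)

# Classical RBF machine: centers from k-means, weights trained separately
centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(Xtr).cluster_centers_

def rbf_features(X, centers, gamma=0.5):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

clf = LogisticRegression(max_iter=1000).fit(rbf_features(Xtr, centers), ytr)

acc_svm = svm.score(Xte, yte)
acc_rbf = clf.score(rbf_features(Xte, centers), yte)
```

On such easy data both machines do well; the abstract's point is that on the harder USPS task the jointly optimized SV machine came out ahead of the two-stage classical pipeline.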
Abstract:
This paper presents an adaptive learning model for market-making under the reinforcement learning framework. Reinforcement learning is a learning technique in which an agent aims to maximize its long-term accumulated reward. No knowledge of the market environment, such as the order arrival or price process, is assumed. Instead, the agent learns from real-time market experience and develops explicit market-making strategies, achieving multiple objectives, including the maximization of profits and the minimization of the bid-ask spread. The simulation results show initial success in bringing learning techniques to the building of market-making algorithms.
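A minimal sketch of such an agent, assuming a drastically simplified market in which a wider quoted spread earns more per fill but is filled less often (fill probability exp(-spread), an invented dynamic, not the paper's environment), could use tabular Q-learning with epsilon-greedy exploration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Candidate half-spreads the market-maker may quote around a fixed mid price
SPREADS = np.array([1.0, 2.0, 3.0])

def step(action):
    # Invented dynamics: a quote at spread s is filled with probability
    # exp(-s); the reward on a fill is the spread earned, otherwise zero.
    spread = SPREADS[action]
    filled = rng.random() < np.exp(-spread)
    return spread if filled else 0.0

# Tabular Q-learning (stateless here, discount 0): Q[a] tracks the average
# reward of quoting each spread, learned purely from simulated experience.
Q = np.zeros(len(SPREADS))
N = np.zeros(len(SPREADS))
eps = 0.1                                   # epsilon-greedy exploration
for _ in range(30000):
    a = rng.integers(len(SPREADS)) if rng.random() < eps else int(np.argmax(Q))
    r = step(a)
    N[a] += 1
    Q[a] += (r - Q[a]) / N[a]               # incremental mean update

best_spread = SPREADS[int(np.argmax(Q))]
```

No model of order arrival is used: the agent discovers the profit-maximizing spread from reward feedback alone, which is the core of the approach the abstract describes (the paper's agents additionally track market state and balance profit against quoted spread).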
Abstract:
Support Vector Machines (SVMs) perform pattern recognition between two point classes by finding a decision surface determined by certain points of the training set, termed Support Vectors (SVs). This surface, which in some feature space of possibly infinite dimension can be regarded as a hyperplane, is obtained from the solution of a quadratic programming problem that depends on a regularization parameter. In this paper we study some mathematical properties of support vectors and show that the decision surface can be written as the sum of two orthogonal terms, the first depending only on the margin vectors (which are SVs lying on the margin), the second proportional to the regularization parameter. For almost all values of the parameter, this enables us to predict how the decision surface varies for small parameter changes. In the special but important case of a feature space of finite dimension m, we also show that there are at most m+1 margin vectors and observe that m+1 SVs are usually sufficient to fully determine the decision surface. For relatively small m, this latter result leads to a considerable reduction in the number of SVs.
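The distinction between margin vectors and support vectors at the bound can be checked numerically. The sketch below, assuming scikit-learn's SVC as the solver and synthetic overlapping classes (both are illustrative choices, not the paper's setup), identifies margin vectors as SVs whose multiplier is strictly below the regularization parameter C and verifies that they lie exactly on the margin, |w·x + b| = 1.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Two overlapping 2-D classes, so that some SVs violate the margin
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)

C = 1.0
svc = SVC(kernel="linear", C=C).fit(X, y)

alpha = np.abs(svc.dual_coef_[0])   # |y_i * alpha_i| = alpha_i
sv = svc.support_vectors_

# Margin vectors: SVs with multiplier strictly below C. For a feature
# space of dimension m (here m = 2 plus threshold) the paper shows there
# are generically at most m+1 of them, and they satisfy |w.x + b| = 1.
margin = sv[alpha < C - 1e-6]
f = margin @ svc.coef_[0] + svc.intercept_[0]
```

SVs with alpha_i = C sit inside or beyond the margin; only the margin vectors pin down the first, parameter-independent term of the decision surface decomposition.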
Abstract:
We derive a new representation of a function as a linear combination of local correlation kernels at optimal sparse locations and discuss its relation to PCA, regularization, sparsity principles, and Support Vector Machines. We first review previous results for the approximation of a function from discrete data (Girosi, 1998) in the context of Vapnik's feature space and dual representation (Vapnik, 1995). We apply them to show 1) that a standard regularization functional with a stabilizer defined in terms of the correlation function induces a regression function in the span of the feature space of classical Principal Components and 2) that there exists a dual representation of the regression function in terms of a regularization network with a kernel equal to a generalized correlation function. We then describe the main observation of the paper: the dual representation in terms of the correlation function can be sparsified using the Support Vector Machines (Vapnik, 1982) technique, and this operation is equivalent to sparsifying a large dictionary of basis functions adapted to the task, using a variation of Basis Pursuit De-Noising (Chen, Donoho and Saunders, 1995; see also related work by Donahue and Geiger, 1994; Olshausen and Field, 1995; Lewicki and Sejnowski, 1998). In addition to extending the close relations between regularization, Support Vector Machines, and sparsity, our work also illuminates and formalizes the LFA concept of Penev and Atick (1996). We discuss the relation between our results, which concern regression, and the different problem of pattern classification.
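The sparsification step can be sketched in its Basis Pursuit De-Noising form, replacing the paper's generalized correlation function with a generic Gaussian kernel dictionary and using the Lasso (the Lagrangian form of BPDN) from scikit-learn; the kernel width and penalty below are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Discrete data from a smooth target function
x = np.linspace(-1, 1, 80)
y = np.sin(3 * x) + 0.05 * rng.normal(size=x.size)

# Dictionary of correlation-like kernels: one Gaussian bump per data point
# (a generic stand-in for the paper's generalized correlation function)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.2**2))

# Basis Pursuit De-Noising in its Lagrangian (Lasso) form:
#   min (1/2n) ||y - K c||^2 + alpha ||c||_1  ->  sparse coefficients c
lasso = Lasso(alpha=0.01, max_iter=50000).fit(K, y)
c = lasso.coef_

n_terms = int(np.sum(np.abs(c) > 1e-8))          # kernels actually used
y_hat = K @ c + lasso.intercept_
err = float(np.sqrt(np.mean((y_hat - y) ** 2)))
```

The l1 penalty drives most coefficients to zero, so the function is represented by kernels at a sparse set of locations, the same effect the paper obtains via the SVM technique.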
Abstract:
We study the relation between support vector machines (SVMs) for regression (SVMR) and SVMs for classification (SVMC). We show that for a given SVMC solution there exists an SVMR solution that is equivalent for a certain choice of the parameters. In particular, our result is that for epsilon sufficiently close to one, the optimal hyperplane and threshold for the SVMC problem with regularization parameter C_c are equal to (1 - epsilon)^{-1} times the optimal hyperplane and threshold for SVMR with regularization parameter C_r = (1 - epsilon)C_c. A direct consequence of this result is that SVMC can be seen as a special case of SVMR.