987 results for Birkhoff and Von Neumann ergodic theorems
Abstract:
"National Socialism": 1. Announcement of a lecture series, November/December 1941, by: Herbert Marcuse, A.R.L. Gurland, Franz Neumann, Otto Kirchheimer, Frederick Pollock. a) duplicated as typescript, 1 sheet, b) typescript, 1 sheet; 2. Letters in reply to invitations to the lecture series, from Neilson, William A.; Packelis, Alexander H.; Michael, Jerome; McClung Lee, Alfred; Youtz, R.P.; Ginsburg, Isidor; Ganey, G.; Nunhauer, Arthur. 8 sheets; "Autoritarian doctrines and modern European institutions" (1924): 1. Lecture announcement, typescript, 2 sheets; 2. Announcements of the lectures by Neumann, Franz L.: "Stratification and Dominance in Germany"; "Bureaucracy as a Social and Political Institution", typescript, 2 sheets; 3. Evans, Austin P.: 1 letter (copy) to Frederick Pollock, New York, 26.2.1924; "Eclipse of Reason", five lectures 1943/44: 1. I. Lecture. a) typescript with autograph corrections, 38 sheets, b) typescript, 29 sheets, c) typescript with autograph and handwritten corrections, 31 sheets, d) fragment, typescript with autograph corrections, 2 sheets, e) drafts, typescript with autograph corrections, 6 sheets; 2. II. Lecture. a) typescript with autograph corrections, 27 sheets, b) typescript with handwritten corrections, 37 sheets; 3. III. Lecture. Typescript with autograph corrections, 27 sheets; 4. IV. Lecture. Typescript with autograph corrections, 23 sheets; 5. V. Lecture. a) typescript with autograph corrections, 25 sheets, b) fragments, typescript with autograph and handwritten corrections, 3 sheets;
Abstract:
Three long-term temperature data series measured in Portugal were studied to detect and correct non-climatic homogeneity breaks and are now available for future studies of climate variability. Series of monthly minimum (Tmin) and maximum (Tmax) temperatures measured at the three Portuguese meteorological stations of Lisbon (1856 to 2008), Coimbra (1865 to 2005) and Porto (1888 to 2001), together with the monthly series of average temperature (Taver) and temperature range (DTR) derived from them, were tested for homogeneity breaks using, first, metadata, second, a visual analysis and, third, four widely used homogeneity tests: the von Neumann ratio test, the Buishand test, the standard normal homogeneity test and the Pettitt test. The homogeneity tests were applied in absolute mode (to the temperature series themselves) and in relative mode (using as reference series either sea-surface temperature anomaly series from HadISST2 close to the Portuguese coast or already corrected temperature series). The Tmin, Tmax and DTR series were considered the most informative for the detection of homogeneity breaks, because Tmin and Tmax may respond differently to a change in the position of a thermometer or to other changes in the instrument's environment; the Taver series were used mainly as a control. The homogeneity tests show strong inhomogeneity of the original data series, which could have both internal climatic and non-climatic origins. Homogeneity breaks identified by the last three tests were compared with the available metadata recording instrument changes, changes in station location and environment, observing procedures, etc. Significant homogeneity breaks (at the 95% level or higher) that coincide with known dates of instrumental changes were corrected using standard procedures. It was also noted that some significant breaks that could not be connected to any known change in instrumentation or in station location and environment could have been caused by large volcanic eruptions. The corrected series were again tested for homogeneity and were considered free of non-climatic breaks when, for most of the monthly series, the tests showed no significant (95% or higher) breaks coinciding with known dates of instrument changes. The corrected series are now available within the ERA-CLIM FP7 project for future studies of climate variability.
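As an illustration of the first of the absolute tests listed above, the following sketch (hypothetical Python, not the authors' code; the series and the size of the artificial break are invented for the example) computes the classical von Neumann ratio, whose expected value is 2 for a homogeneous random series and which drops markedly below 2 in the presence of a break or trend.

```python
import numpy as np

def von_neumann_ratio(y):
    """Von Neumann ratio N of a 1-D series.

    N = sum of squared successive differences / sum of squared deviations
    from the mean. For a homogeneous random series E[N] = 2; values well
    below 2 point to a break or trend.
    """
    y = np.asarray(y, dtype=float)
    num = np.sum(np.diff(y) ** 2)
    den = np.sum((y - y.mean()) ** 2)
    return num / den

# Example: synthetic annual anomalies with an artificial +0.8 degC shift after year 80.
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.5, 150)
series[80:] += 0.8
print(von_neumann_ratio(series))   # noticeably below 2 -> suspected inhomogeneity
```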
Abstract:
We introduce a new class of generalized isotropic Lipkin–Meshkov–Glick models with su(m+1) spin and long-range non-constant interactions, whose non-degenerate ground state is a Dicke state of su(m+1) type. We evaluate in closed form the reduced density matrix of a block of L spins when the whole system is in its ground state, and study the corresponding von Neumann and Rényi entanglement entropies in the thermodynamic limit. We show that both of these entropies scale as a log L when L tends to infinity, where the coefficient a is equal to (m − k)/2 in the ground-state phase with k vanishing magnon densities. In particular, our results show that none of these generalized Lipkin–Meshkov–Glick models are critical, since when L → ∞ their Rényi entropy R_q becomes independent of the parameter q. We have also computed the Tsallis entanglement entropy of the ground state of these generalized su(m+1) Lipkin–Meshkov–Glick models, finding that it can be made extensive by an appropriate choice of its parameter only when m − k ≥ 3. Finally, in the su(3) case we construct in detail the phase diagram of the ground state in parameter space, showing that it is determined in a simple way by the weights of the fundamental representation of su(3). This is also true in the su(m+1) case; for instance, we prove that the region for which all the magnon densities are non-vanishing is an (m + 1)-simplex in R^m whose vertices are the weights of the fundamental representation of su(m+1).
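For reference, the scaling stated in the abstract can be written compactly using the standard definitions of the von Neumann and Rényi entropies of the reduced density matrix ρ_L (the definitions are textbook ones, not quoted from the paper; the value of the coefficient is the one reported above):

```latex
S(\rho_L)   = -\operatorname{Tr}\rho_L\log\rho_L ,\qquad
R_q(\rho_L) = \frac{1}{1-q}\,\log\operatorname{Tr}\rho_L^{\,q}\quad (q>0,\ q\neq 1),
\qquad
S(\rho_L)\;\simeq\;R_q(\rho_L)\;\simeq\;\frac{m-k}{2}\,\log L \qquad (L\to\infty)
```

in the ground-state phase with k vanishing magnon densities; the q-independence of the large-L limit is what rules out criticality in these models.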
Resumo:
"UIUCDCS-R-74-630"
Abstract:
We investigate boundary critical phenomena from a quantum-information perspective. Bipartite entanglement in the ground state of one-dimensional quantum systems is quantified using the Rényi entropy S_α, which includes the von Neumann entropy (α → 1) and the single-copy entanglement (α → ∞) as special cases. We identify the contribution of the boundaries to the Rényi entropy, and show that there is an entanglement loss along boundary renormalization group (RG) flows. This property, which is intimately related to the Affleck–Ludwig g-theorem, is a consequence of majorization relations between the spectra of the reduced density matrix along the boundary RG flows. We also point out that the bulk contribution to the single-copy entanglement is half of that to the von Neumann entropy, whereas the boundary contribution is the same.
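A compact reminder of the two limits invoked above (standard definitions, with λ_max the largest eigenvalue of the reduced density matrix ρ; not taken from the paper):

```latex
S_\alpha(\rho) = \frac{1}{1-\alpha}\,\log\operatorname{Tr}\rho^{\alpha},\qquad
\lim_{\alpha\to 1} S_\alpha(\rho) = -\operatorname{Tr}\rho\log\rho \ \ (\text{von Neumann entropy}),\qquad
\lim_{\alpha\to\infty} S_\alpha(\rho) = -\log\lambda_{\max}\ \ (\text{single-copy entanglement}).
```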
Abstract:
2000 Mathematics Subject Classification: Primary 13A99; Secondary 13A15, 13B02, 13E05.
Abstract:
The paper investigates the potential theoretical and methodological sources of inspiration of the von Neumann model, in view of the fact that both the neoclassical and the neo-Ricardian economists claim it as part of their heritage. In the course of this the author assesses the main differences between the classical and the neoclassical conceptions of the economy, between the ex post and the ex ante modelling approaches, and the methodological change of revolutionary significance that led to the emergence of modern mathematical economics, which can in many respects be justly criticized. He also confronts the von Neumann model with the Austrian school's theory of price imputation, with the Walras–Cassel and the Schlesinger–Wald models, and with models worked out in the classical tradition à la Ricardo, Marx, Dmitriev and Leontief. He concludes that the von Neumann model is, in fact, a recasting of the age-old idea of a "just and reasonable economy" in the form of a mathematical model borrowed from the modern physics of its time.
Abstract:
The paper places the classical von Neumann growth model on a new footing. The original von Neumann model contains no explicit firms, only technologies or processes. The paper examines a Neumann-type model in which a firm is assigned to each technology and asks whether, under this assumption, such an economy admits equilibrium prices and quantities at which the firms maximize their profits. In the course of this analysis the classical assumption of non-positive profits, the duality-based cornerstone of classical mathematical economics embodied in the von Neumann model, has to be reexamined, and firms are allowed to earn a non-negative profit.
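For orientation, the "non-positive profit" duality condition that the paper revisits is usually written as follows in textbook treatments of the von Neumann model (a sketch in assumed standard notation, not the paper's own): with input matrix A, output matrix B, price vector p ≥ 0, intensity vector x ≥ 0 and interest factor β,

```latex
p\,B \;\le\; \beta\, p\,A            % no process earns a positive profit at the interest factor \beta
(\,p\,B - \beta\, p\,A\,)\,x \;=\; 0 % processes that would run at a loss are operated at zero intensity
```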
Abstract:
The paper compares the basic assumptions and methodology of the von Neumann model, developed for purely abstract theoretical purposes, with those of the Leontief model, designed originally for practical applications. The similar mathematical structure of the von Neumann model and of the closed, stationary Leontief model with a production period of unit length often leads to the false conclusion that the latter is just a special case of the former. It is argued that the economic assumptions of the two models are quite different, which makes such an assertion unfounded. Technological choice and joint production are indispensable features of the von Neumann model, and the assumption of a production period of unit length excludes the possibility of taking service flows explicitly into account; all of these features are completely alien to the Leontief model. It is shown that the two models are in fact special cases of a more general stock-flow stationary model, reduced to forms containing only flow variables (see the sketch below).
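To make the structural comparison concrete, here is a minimal side-by-side sketch in assumed textbook notation (not the author's own formulation): A is the input-coefficient matrix, B the output matrix, x the intensity vector and α the growth factor.

```latex
\text{Closed, stationary Leontief model:}\qquad x = A\,x \;\Longleftrightarrow\; (I-A)\,x = 0
\qquad\qquad
\text{Von Neumann model (quantity side):}\qquad B\,x \;\ge\; \alpha\, A\,x,\qquad x \ge 0,\ x \ne 0
```

Formally setting B = I and α = 1 makes the two conditions look alike, which is the resemblance that suggests the Leontief model is a special case of the von Neumann model; the paper's point is that this formal resemblance hides incompatible economic assumptions (joint production, technological choice, the length of the production period).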
Abstract:
In this thesis we introduce nuclear dimension and compare it with a stronger form of the completely positive approximation property. We show that the approximations forming this stronger characterisation of the completely positive approximation property witness finite nuclear dimension if and only if the underlying C*-algebra is approximately finite dimensional. We also extend this result to nuclear dimension at most ω. We review interactions between separably acting injective von Neumann algebras and separable nuclear C*-algebras. In particular, we discuss aspects of Connes' work and how some of his strategies have been used by C*-algebraists to estimate the nuclear dimension of certain classes of C*-algebras. We introduce a notion of coloured isomorphism between separable unital C*-algebras. Under these coloured isomorphisms, ideal lattices, trace spaces, commutativity, nuclearity, finite nuclear dimension and weak pure infiniteness are preserved. We show that these coloured isomorphisms induce isomorphisms on the classes of finite-dimensional and commutative C*-algebras. We prove that any two Kirchberg algebras are 2-coloured isomorphic, and that any two separable, simple, unital, finite, nuclear, Z-stable C*-algebras with a unique trace which satisfy the UCT are also 2-coloured isomorphic.
Abstract:
Analog In-Memory Computing (AIMC) has been proposed in the context of beyond-von Neumann architectures as a valid strategy to reduce the energy consumption and latency of internal data transfers and to improve compute efficiency. The aim of AIMC is to perform computations within the memory unit, typically by leveraging the physical features of memory devices. Among resistive Non-Volatile Memories (NVMs), Phase-Change Memory (PCM) has become a promising technology due to its intrinsic capability to store multilevel data. Hence, PCM technology is currently being investigated to enhance the possibilities and applications of AIMC. This thesis explores the potential of new PCM-based architectures as in-memory computational accelerators. As a first step, a preliminary experimental characterization of PCM devices was carried out from an AIMC perspective. PCM cell non-idealities, such as time drift, noise and non-linearity, were studied in order to develop a dedicated multilevel programming algorithm. Measurement-based simulations were then employed to evaluate the feasibility of PCM-based operations in the fields of Deep Neural Networks (DNNs) and Structural Health Monitoring (SHM). Moreover, a first test chip was designed and tested to evaluate the hardware implementation of Multiply-and-Accumulate (MAC) operations employing PCM cells. This prototype experimentally demonstrates that a 95% MAC accuracy can be reached with circuit-level compensation of cell time drift and non-linearity. Finally, empirical circuit behavior models were included in the simulations to assess the use of this technology in specific DNN applications and to gauge the potential of this innovative computing approach.
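As a rough illustration of the in-memory MAC operation described above, the sketch below (hypothetical Python/NumPy with invented drift and noise parameters; it is not the thesis' device model or code) maps a weight matrix onto PCM-like conductances, applies a simple power-law drift and read-noise model, and compares the analog dot products with the ideal ones.

```python
import numpy as np

rng = np.random.default_rng(42)

# Ideal weight matrix mapped onto programmed conductances (arbitrary units).
W = rng.uniform(0.0, 1.0, size=(16, 64))   # 16 outputs, 64 inputs
G0 = W.copy()                              # conductances programmed at t0 = 1 s

# Illustrative non-idealities (assumed values, not measured ones).
NU = 0.05          # drift exponent: G(t) = G0 * (t / t0) ** (-NU)
READ_NOISE = 0.02  # relative standard deviation of read noise

def pcm_mac(x, t=100.0):
    """Analog multiply-and-accumulate on the simulated crossbar at time t (s)."""
    G = G0 * (t / 1.0) ** (-NU)                                 # conductance drift
    G = G * (1.0 + READ_NOISE * rng.standard_normal(G.shape))   # read noise
    return G @ x                                                # column currents = dot products

x = rng.uniform(0.0, 1.0, 64)              # normalized input voltages
ideal = W @ x
analog = pcm_mac(x, t=100.0)
print("mean relative MAC error:", np.mean(np.abs(analog - ideal) / np.abs(ideal)))
```

The power-law drift used here is only a commonly assumed first-order description; a compensation scheme such as the circuit-level one mentioned in the abstract would act on exactly this kind of error.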
Abstract:
Nowadays the concept of information has become crucial in physics; therefore, since the best theory we have for making predictions about the universe is quantum mechanics, the development of a quantum version of information theory takes on particular importance. This centrality is confirmed by the fact that black holes have entropy. For this reason, this work presents elements of quantum information theory and quantum communication, some of which are illustrated with reference to highly idealized quantum models of black-hole mechanics. In particular, the first chapter provides all the quantum-mechanical tools needed for quantum information and communication theory. Quantum information theory is then addressed, and the Bekenstein bound on the amount of information that can be enclosed within any spatial region is derived. This question is treated using an idealized quantum model of black-hole mechanics supported by thermodynamics. In the last chapter, the problem of finding an achievable rate for quantum communication is examined, again making use of an idealized quantum model of a black hole, in order to illustrate elements of the theory. Finally, a brief summary of black-hole physics is provided in the appendix.
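For context, the Bekenstein bound referred to above and the black-hole (Bekenstein–Hawking) entropy are usually written as follows (standard forms, not quoted from the thesis): for a system of total energy E enclosed within a sphere of radius R, and a black hole of horizon area A,

```latex
S \;\le\; \frac{2\pi\, k_{B}\, R\, E}{\hbar\, c},
\qquad\qquad
S_{\mathrm{BH}} \;=\; \frac{k_{B}\, c^{3}\, A}{4\, G\, \hbar}.
```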
Abstract:
Pain assessment in animals requires the use of evaluation scales, which depend on interpretation by observers. The aim of the present study was to evaluate the correlation between the visual analogue scale (VAS), the Melbourne pain scale and von Frey filaments in the assessment of postoperative pain in 42 healthy adult bitches undergoing ovariosalpingohysterectomy (OSH). Postoperative pain was assessed by two observers blinded to the analgesic treatments, at one-hour intervals, using the VAS, the Melbourne scale and von Frey filaments applied around the surgical incision. A score of 50 mm on the VAS or 13 points on the Melbourne scale was adopted as the criterion for rescue analgesia. The VAS proved to be the most sensitive scale, since 100% of the animals received rescue analgesia according to this method. The values obtained with the VAS and the Melbourne scale showed good correlation (r = 0.74), which did not occur with the von Frey filaments (r = -0.18). The correlation between the Melbourne scale and the von Frey filaments was -0.37. Although the VAS and the Melbourne scale showed good correlation, it is suggested that a lower score on the Melbourne scale be considered as the criterion for administering rescue analgesia.
Abstract:
In a quantum critical chain, the scaling regime of the energy and momentum of the ground state and low-lying excitations is described by conformal field theory (CFT). The same holds true for the von Neumann and Rényi entropies of the ground state, which display a universal logarithmic behavior depending on the central charge. In this Letter we generalize this result to those excited states of the chain that correspond to primary fields in CFT. It is shown that the n-th Rényi entropy is related to a 2n-point correlator of primary fields. We verify this statement for the critical XX and XXZ chains. This result uncovers a new link between quantum information theory and CFT.
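For reference, the universal logarithmic ground-state behavior mentioned above is the standard Calabrese–Cardy scaling (standard form, not quoted from the Letter): for a block of length ℓ in an infinite critical chain with central charge c,

```latex
S_n(\ell) \;\simeq\; \frac{c}{6}\left(1+\frac{1}{n}\right)\ln\ell \;+\; c'_n,
\qquad\qquad
S_1(\ell) \;\simeq\; \frac{c}{3}\,\ln\ell \;+\; c'_1 .
```

The Letter's result extends this kind of universal behavior to excited states associated with primary fields, where the n-th Rényi entropy is expressed through a 2n-point correlator of those fields.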