852 results for Birkhoff and Von Neumann ergodic theorems
Abstract:
"National Socialism": 1. Announcement of a lecture series, November/December 1941, by Herbert Marcuse, A.R.L. Gurland, Franz Neumann, Otto Kirchheimer, Frederick Pollock. a) mimeographed typescript, 1 sheet; b) typescript, 1 sheet; 2. Replies to invitations to the lecture series, from Neilson, William A.; Packelis, Alexander H.; Michael, Jerome; McClung Lee, Alfred; Youtz, R.P.; Ginsburg, Isidor; Ganey, G.; Nunhauer, Arthur. 8 sheets; "Autoritarian doctrines and modern European institutions" (1924): 1. Lecture announcement, typescript, 2 sheets; 2. Announcements of the lectures by Neumann, Franz L.: "Stratification and Dominance in Germany"; "Bureaucracy as a Social and Political Institution", typescript, 2 sheets; 3. Evans, Austin P.: 1 letter (copy) to Frederick Pollock, New York, 26 February 1924; "Eclipse of Reason", five lectures, 1943/44: 1. Lecture I. a) typescript with autograph corrections, 38 sheets; b) typescript, 29 sheets; c) typescript with autograph and handwritten corrections, 31 sheets; d) fragment, typescript with autograph corrections, 2 sheets; e) drafts, typescript with autograph corrections, 6 sheets; 2. Lecture II. a) typescript with autograph corrections, 27 sheets; b) typescript with handwritten corrections, 37 sheets; 3. Lecture III. Typescript with autograph corrections, 27 sheets; 4. Lecture IV. Typescript with autograph corrections, 23 sheets; 5. Lecture V. a) typescript with autograph corrections, 25 sheets; b) fragments, typescript with autograph and handwritten corrections, 3 sheets.
Abstract:
Three long-term temperature data series measured in Portugal were studied to detect and correct non-climatic homogeneity breaks and are now available for future studies of climate variability. Series of monthly minimum (Tmin) and maximum (Tmax) temperatures measured at the three Portuguese meteorological stations of Lisbon (1856 to 2008), Coimbra (1865 to 2005) and Porto (1888 to 2001) were studied to detect and correct non-climatic homogeneity breaks. These series, together with the monthly series of average temperature (Taver) and temperature range (DTR) derived from them, were tested for homogeneity breaks using, first, metadata, second, visual analysis and, third, four widely used homogeneity tests: the von Neumann ratio test, the Buishand test, the standard normal homogeneity test and the Pettitt test. The homogeneity tests were applied in both absolute mode (using the temperature series themselves) and relative mode (using as reference series either sea-surface temperature anomaly series obtained from HadISST2 close to the Portuguese coast or already corrected temperature series). We considered the Tmin, Tmax and DTR series the most informative for the detection of homogeneity breaks, because Tmin and Tmax can respond differently to a change in the position of a thermometer or to other changes in the instrument's environment; the Taver series were used mainly as a control. The homogeneity tests show strong inhomogeneity of the original data series, which could have both internal climatic and non-climatic origins. Homogeneity breaks identified by the last three tests were compared with the available metadata on instrument changes, changes in station location and environment, observing procedures, etc. Significant homogeneity breaks (at the 95% level or higher) that coincide with known dates of instrumental changes were corrected using standard procedures. It was also noted that some significant breaks that could not be connected with any known change of instruments, station location or environment could have been caused by large volcanic eruptions. The corrected series were then tested for homogeneity again, and were considered free of non-climatic breaks when, for most of the monthly series, the tests showed no significant (95% or higher) homogeneity breaks coinciding with dates of known instrument changes. The corrected series are now available within the ERA-CLIM FP7 project for future studies of climate variability.
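Of the four tests named above, the von Neumann ratio test is the simplest to state: it compares the mean squared successive difference of a series with its variance about the mean. As a generic illustration (not the authors' code; the toy series and the function name are made up), a minimal sketch in Python:

```python
import numpy as np

def von_neumann_ratio(x):
    """Von Neumann ratio N of a 1-D series.

    For a homogeneous, serially uncorrelated series E[N] is close to 2;
    values well below 2 point to a possible break or trend.
    """
    x = np.asarray(x, dtype=float)
    num = np.sum(np.diff(x) ** 2)        # sum of squared successive differences
    den = np.sum((x - x.mean()) ** 2)    # sum of squared deviations from the mean
    return num / den

# Toy temperature-like series with a simulated 2-degree step
# (e.g. an instrument change) halfway through.
rng = np.random.default_rng(0)
homogeneous = rng.normal(15.0, 1.0, 200)
broken = homogeneous.copy()
broken[100:] += 2.0

print(von_neumann_ratio(homogeneous))    # close to 2
print(von_neumann_ratio(broken))         # clearly below 2
```

A step change inflates the variance about the overall mean without much affecting the successive differences, which is what pulls the ratio below 2.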
Abstract:
We introduce a new class of generalized isotropic Lipkin–Meshkov–Glick models with su(m+1) spin and long-range non-constant interactions, whose non-degenerate ground state is a Dicke state of su(m+1) type. We evaluate in closed form the reduced density matrix of a block of L spins when the whole system is in its ground state, and study the corresponding von Neumann and Rényi entanglement entropies in the thermodynamic limit. We show that both of these entropies scale as a log L when L tends to infinity, where the coefficient a equals (m − k)/2 in the ground-state phase with k vanishing magnon densities. In particular, our results show that none of these generalized Lipkin–Meshkov–Glick models is critical, since when L → ∞ their Rényi entropy R_q becomes independent of the parameter q. We have also computed the Tsallis entanglement entropy of the ground state of these generalized su(m+1) Lipkin–Meshkov–Glick models, finding that it can be made extensive by an appropriate choice of its parameter only when m − k ≥ 3. Finally, in the su(3) case we construct in detail the phase diagram of the ground state in parameter space, showing that it is determined in a simple way by the weights of the fundamental representation of su(3). This is also true in the su(m+1) case; for instance, we prove that the region in which all the magnon densities are non-vanishing is an (m + 1)-simplex in R^m whose vertices are the weights of the fundamental representation of su(m+1).
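For orientation, writing ρ_L for the reduced density matrix of the block of L spins, the two entanglement measures referred to above are the standard ones, and the stated asymptotics take the form (a sketch in the usual conventions):

```latex
\begin{aligned}
S(\rho_L)   &= -\operatorname{Tr}\bigl(\rho_L \log \rho_L\bigr), &
R_q(\rho_L) &= \frac{1}{1-q}\,\log \operatorname{Tr}\rho_L^{\,q},\\
S(\rho_L)   &\sim a \log L, &
R_q(\rho_L) &\sim a \log L, \qquad a = \frac{m-k}{2}, \quad L \to \infty .
\end{aligned}
```

The q-independence of the coefficient in the second line is what signals the absence of criticality; for comparison, in a critical chain described by a CFT of central charge c, the Rényi coefficient for a single block retains an explicit q-dependence, (c/6)(1 + 1/q).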
Abstract:
"UIUCDCS-R-74-630"
Abstract:
We investigate boundary critical phenomena from a quantum-information perspective. Bipartite entanglement in the ground state of one-dimensional quantum systems is quantified using the Rényi entropy S_α, which includes the von Neumann entropy (α → 1) and the single-copy entanglement (α → ∞) as special cases. We identify the contribution of the boundaries to the Rényi entropy and show that there is an entanglement loss along boundary renormalization group (RG) flows. This property, which is intimately related to the Affleck–Ludwig g-theorem, is a consequence of majorization relations between the spectra of the reduced density matrix along the boundary RG flows. We also point out that the bulk contribution to the single-copy entanglement is half of that to the von Neumann entropy, whereas the boundary contribution is the same.
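With ρ the reduced density matrix of the subsystem and λ_max its largest eigenvalue, the two special cases mentioned above are (standard definitions, given here for reference):

```latex
S_\alpha(\rho) = \frac{1}{1-\alpha}\,\log \operatorname{Tr}\rho^{\alpha},
\qquad
\lim_{\alpha \to 1} S_\alpha(\rho) = -\operatorname{Tr}\bigl(\rho \log \rho\bigr),
\qquad
\lim_{\alpha \to \infty} S_\alpha(\rho) = -\log \lambda_{\max} .
```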
Abstract:
2000 Mathematics Subject Classification: Primary 13A99; Secondary 13A15, 13B02, 13E05.
Abstract:
We analyse the possible theoretical and methodological affinities of the von Neumann model, in view of the fact that both the neoclassical economists and the neo-Ricardians, who revived the classical tradition, claim it as their own. In the course of this we examine the differences between the classical and the neoclassical conceptions of the economy and between ex post and ex ante modeling approaches, as well as the methodological change of revolutionary significance that led to the emergence of modern mathematical economics, which can justifiably be criticized in many respects. We compare von Neumann's model with the Austrian school's theory of price imputation, with the Walras–Cassel and Schlesinger–Wald models, and with the results of the classical line marked by the names of Ricardo, Marx, Dmitriev and Leontief. We point out that von Neumann in fact cast the age-old ideal of a "just and reasonable economy" in the form of a mathematical model at home in the modern physics of his day.
Abstract:
The paper puts the classical von Neumann growth model on a new footing. The original von Neumann model contains no explicit firms, only technologies or processes. We study a von Neumann-type model in which each technology is assigned to a firm, and examine whether, under this assumption, such an economy admits solutions in which the firms maximize their profits. The analysis shows that in this case the assumption of non-positive profits made in the classical von Neumann model, a duality-based cornerstone of classical mathematical economics, must be revisited: firms must be allowed a non-negative profit.
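As a numerical illustration of the classical model that this paper and the next one build on (a sketch, not taken from either paper; the two-process technology matrices, the function name and the bisection bracket are all illustrative assumptions), the quantity side of the von Neumann model asks for the largest growth factor α such that outputs cover α-scaled inputs at some non-negative intensity vector:

```python
import numpy as np
from scipy.optimize import linprog

def von_neumann_growth_factor(A, B, tol=1e-9):
    """Largest alpha with B x >= alpha * A x for some intensity
    vector x >= 0, x != 0 (quantity side of the von Neumann model).

    A[i, j]: input of good i used by process j at unit intensity.
    B[i, j]: output of good i from process j at unit intensity.
    Bisection on alpha; each step is an LP feasibility check.
    """
    def feasible(alpha):
        # Look for x >= 0 with sum(x) = 1 and (B - alpha * A) x >= 0.
        res = linprog(c=np.zeros(A.shape[1]),
                      A_ub=-(B - alpha * A), b_ub=np.zeros(A.shape[0]),
                      A_eq=np.ones((1, A.shape[1])), b_eq=[1.0],
                      bounds=(0, None), method="highs")
        return res.success

    lo, hi = 0.0, 10.0              # assumes the true alpha lies below 10
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if feasible(mid) else (lo, mid)
    return lo

# Toy technology: process 1 turns good 1 into good 2, process 2 the reverse.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
B = np.array([[0.0, 2.0],
              [1.5, 0.0]])
print(von_neumann_growth_factor(A, B))   # sqrt(2 * 1.5) ~= 1.732
```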
Abstract:
The author compares the von Neumann model, developed for purely theoretical purposes, with the Leontief model, developed for practical applications. The similar mathematical structure of the von Neumann model and of the closed, stationary Leontief model assuming an annual production period suggests that the latter can be interpreted as a special case of the former. Unfolding in detail the economic content and assumptions of the two models and comparing them, the author shows that this conclusion is misleading: they are two fundamentally different models, and neither can be derived from the other. Joint production and the possibility of technological choice are indispensable assumptions of the von Neumann model, while the assumption of an annual production period excludes taking service flows explicitly into account; all of these assumptions are alien to the Leontief model. The two models are in fact special cases of a more general closed, stationary stock-flow model, namely its forms reduced to flow variables.
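In standard notation the formal resemblance mentioned above can be sketched as follows (a schematic, not the paper's own formulation): the von Neumann model pairs a quantity system with a dual price system, while the closed stationary Leontief system is what remains when the output matrix reduces to the identity and the growth and interest factors are set to one.

```latex
\begin{aligned}
&\text{von Neumann:} & B x &\ge \alpha A x,\ x \ge 0, &
p B &\le \beta p A,\ p \ge 0, & \alpha &= \beta \ \text{at equilibrium};\\
&\text{closed Leontief:} & x &= A x, & p &= p A
& &(\text{formally } B = I,\ \alpha = \beta = 1).
\end{aligned}
```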
Abstract:
In this thesis we introduce nuclear dimension and compare it with a stronger form of the completely positive approximation property. We show that the approximations forming this stronger characterisation of the completely positive approximation property witness finite nuclear dimension if and only if the underlying C*-algebra is approximately finite-dimensional. We also extend this result to nuclear dimension at most ω. We review interactions between separably acting injective von Neumann algebras and separable nuclear C*-algebras. In particular, we discuss aspects of Connes' work and how some of his strategies have been used by C*-algebraists to estimate the nuclear dimension of certain classes of C*-algebras. We introduce a notion of coloured isomorphism between separable unital C*-algebras. Coloured isomorphisms preserve ideal lattices, trace spaces, commutativity, nuclearity, finite nuclear dimension and weak pure infiniteness, and they induce isomorphisms on the classes of finite-dimensional and commutative C*-algebras. We prove that any pair of Kirchberg algebras are 2-coloured isomorphic, and that any pair of separable, simple, unital, finite, nuclear and Z-stable C*-algebras with unique trace which satisfy the UCT are also 2-coloured isomorphic.
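For orientation, the notion the thesis starts from is the Winter–Zacharias definition of nuclear dimension, which refines the completely positive approximation property by colouring the approximations (standard definition, sketched here for reference):

```latex
\dim_{\mathrm{nuc}}(A) \le n \;\Longleftrightarrow\;
\text{there are nets }\;
A \xrightarrow{\;\psi_\lambda\;}
F_\lambda = F_\lambda^{(0)} \oplus \cdots \oplus F_\lambda^{(n)}
\xrightarrow{\;\varphi_\lambda\;} A
```

with each F_λ finite-dimensional, ψ_λ completely positive and contractive, each restriction φ_λ|_{F_λ^{(i)}} completely positive, contractive and order zero, and φ_λ(ψ_λ(a)) → a for every a ∈ A.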
Abstract:
Key topics: Since the birth of the open source movement in the mid-1980s, open source software has become more and more widespread. Among others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users additional rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance underlying such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature and motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, existing articles show that a commercial activity based on open source software is possible, describing different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activity: providers of packaged open source solutions, IT services and software engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and of IT services and software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on the business models of open source software publishers, as the issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature to date identifies and describes only two generic types of business model for open source software publishers: the "bundling" business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual-licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to describe an additional business model for open source software publishers that can be used in a different context.
To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It depicts IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, the study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual-licensing business model and the mutualisation business model), as well as the conditions under which each can be successfully implemented (regarding the type of product developed and the competencies of the firm). The paper also goes beyond the traditional concept of business model used by scholars in the open source literature. Here, a business model is not only considered a way of generating income (a "revenue model" (Amit and Zott, 2001)), but rather the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, the paper analyses business models from the point of view of these two components.
Abstract:
Nanoindentation is a useful technique for probing the mechanical properties of bone, and finite element (FE) modeling of the indentation allows inverse determination of elasto-plastic constitutive properties. However, FE simulations to date have assumed frictionless contact between indenter and bone. The aim of this study was to explore the effect of friction in simulations of bone nanoindentation. Two-dimensional axisymmetric FE simulations were performed using a spheroconical indenter of tip radius 0.6 μm and angle 90°. The coefficient of friction between indenter and bone was varied between 0.0 (frictionless) and 0.3. Isotropic linear elasticity was used in all simulations, with bone elastic modulus E = 13.56 GPa and Poisson's ratio ν = 0.3. Plasticity was incorporated using both Drucker–Prager and von Mises yield surfaces. Friction had a modest effect on the predicted force-indentation curve for both von Mises and Drucker–Prager plasticity, reducing the maximum indenter displacement by 10% and 20%, respectively, as the friction coefficient was increased from zero to 0.3 (at a maximum indenter force of 5 mN). However, friction has a much greater effect on predicted pile-up after indentation, reducing predicted pile-up from 0.27 μm to 0.11 μm with the von Mises model, and from 0.09 μm to 0.02 μm with Drucker–Prager plasticity. We conclude that it is important to include friction in nanoindentation simulations of bone.
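For reference, the two yield surfaces named above, in standard notation (I_1 the first stress invariant, J_2 the second deviatoric stress invariant, σ_y the uniaxial yield stress, and α, k the Drucker–Prager material parameters), with yielding at f = 0:

```latex
f_{\mathrm{vM}}(\boldsymbol{\sigma}) = \sqrt{3 J_2} - \sigma_y ,
\qquad
f_{\mathrm{DP}}(\boldsymbol{\sigma}) = \sqrt{J_2} + \alpha I_1 - k .
```

Unlike the pressure-insensitive von Mises surface, the Drucker–Prager surface depends on the hydrostatic stress through I_1, which is one reason the two plasticity models can respond differently to the confined stress state under the indenter.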
Abstract:
We consider a time and space-symmetric fractional diffusion equation (TSS-FDE) under homogeneous Dirichlet conditions and homogeneous Neumann conditions. The TSS-FDE is obtained from the standard diffusion equation by replacing the first-order time derivative by a Caputo fractional derivative, and the second-order space derivative by a symmetric fractional derivative. First, a method of separating variables expresses the analytical solution of the TSS-FDE in terms of the Mittag–Leffler function. Second, we propose two numerical methods to approximate the Caputo time fractional derivative: the finite difference method and the Laplace transform method. The symmetric space fractional derivative is approximated using the matrix transform method. Finally, numerical results demonstrate the effectiveness of the numerical methods and confirm the theoretical claims.
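As a generic illustration of the time discretisation (a sketch of the standard L1 finite-difference scheme for a Caputo derivative of order 0 < γ < 1, which need not coincide with the authors' exact method; the test function and step size are arbitrary choices):

```python
import numpy as np
from math import gamma

def caputo_l1(u, tau, gam):
    """L1 finite-difference approximation of the Caputo derivative
    D_t^gam u (0 < gam < 1) on the uniform grid t_n = n * tau.

    Takes samples u at t_0, ..., t_N; returns approximations at t_1, ..., t_N.
    """
    u = np.asarray(u, dtype=float)
    du = np.diff(u)                                    # u_{j+1} - u_j
    c = tau ** (-gam) / gamma(2.0 - gam)
    out = np.empty(len(du))
    for n in range(1, len(du) + 1):
        j = np.arange(n)
        w = (j + 1) ** (1.0 - gam) - j ** (1.0 - gam)  # L1 weights b_j
        out[n - 1] = c * np.sum(w * du[n - 1 - j])     # b_j pairs with u_{n-j} - u_{n-j-1}
    return out

# Sanity check on u(t) = t^2, whose exact Caputo derivative is
# D_t^gam t^2 = 2 t^(2 - gam) / Gamma(3 - gam).
tau, gam = 0.01, 0.5
t = np.arange(0.0, 1.0 + tau, tau)
approx = caputo_l1(t ** 2, tau, gam)
exact = 2.0 * t[1:] ** (2.0 - gam) / gamma(3.0 - gam)
print(np.max(np.abs(approx - exact)))                  # O(tau^(2 - gam)) error
```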
Abstract:
We consider a time and space-symmetric fractional diffusion equation (TSS-FDE) under homogeneous Dirichlet conditions and homogeneous Neumann conditions. The TSS-FDE is obtained from the standard diffusion equation by replacing the first-order time derivative by the Caputo fractional derivative and the second-order space derivative by the symmetric fractional derivative. Firstly, a method of separating variables is used to express the analytical solution of the TSS-FDE in terms of the Mittag–Leffler function. Secondly, we propose two numerical methods to approximate the Caputo time fractional derivative, namely, the finite difference method and the Laplace transform method. The symmetric space fractional derivative is approximated using the matrix transform method. Finally, numerical results are presented to demonstrate the effectiveness of the numerical methods and to confirm the theoretical claims.