899 results for Nash Equilibrium
Abstract:
The theory of vapour-liquid equilibria is reviewed, as is the present status of prediction methods in this field. After a discussion of the experimental methods available, the development of a recirculating equilibrium still based on a previously successful design (the modified Raal, Code and Best still of O'Donnell and Jenkins) is described. This novel still is designed to work at pressures up to 35 bar and to measure both isothermal and isobaric vapour-liquid equilibrium data. The still was first commissioned by measuring the saturated vapour pressures of pure ethanol and cyclohexane over the temperature ranges 77-124°C and 80-142°C respectively. The data obtained were compared with available experimental values from the literature and with values derived from an extended form of the Antoine equation for which parameters were given in the literature. Commissioning continued with a study of the phase behaviour of mixtures of the two pure components, as such mixtures are strongly non-ideal, showing azeotropic behaviour; no data existed above one atmosphere pressure. Isothermal measurements were made at 83.29°C and 106.54°C, whilst isobaric measurements were made at pressures of 1 bar, 3 bar and 5 bar. The experimental vapour-liquid equilibrium data obtained are assessed by a standard literature method incorporating a thermodynamic consistency test that minimises the errors in all the measured variables. This assessment showed that reasonable x-P-T data sets had been measured, from which y-values could be deduced, but the experimental y-values indicated the need for improvements in the design of the still. The final discussion sets out the improvements required and outlines how they might be attained.
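For reference, the basic Antoine vapour-pressure correlation underlying the comparison above (the extended form with literature parameters adds further temperature-dependent terms, which the abstract does not reproduce) is

```latex
\log_{10} P = A - \frac{B}{C + T}
```

where \(P\) is the saturated vapour pressure, \(T\) the temperature, and \(A\), \(B\), \(C\) component-specific constants fitted over a stated temperature range.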
Abstract:
We study memory effects in a kinetic roughening model. For d = 1, a different dynamic scaling is uncovered in the memory-dominated phases; Kardar-Parisi-Zhang scaling is restored in the absence of noise. d_c = 2 is the critical dimension, at which memory is shown to smooth the roughening front (a = 0). Studies of a discrete atomistic model in the same universality class confirm the analytical results in the large-time limit, while a different scaling behavior shows up for t
Abstract:
A discussion of how to promote employability within the curriculum
Abstract:
We study waveguide fabrication in lithium-niobo-phosphate glass, aiming at a practical method for single-stage fabrication of nonlinear integrated-optics devices. We observed chemical transformations or material redistribution during high-repetition-rate femtosecond laser inscription. We believe that the laser-induced ultrafast heating and cooling, followed by element diffusion on a microscopic scale, opens the way toward engineering non-equilibrium states of matter and can thus further enhance refractive-index (RI) contrasts by changing the glass composition in and around the fs tracks. © 2014 Optical Society of America.
Abstract:
We study the dynamics of a growing crystalline facet where the growth mechanism is controlled by the geometry of the local curvature. A continuum model in (2+1) dimensions, developed in analogy with the Kardar-Parisi-Zhang (KPZ) model, is considered for the purpose. Following standard coarse-graining procedures, it is shown that in the large-time, long-distance limit the continuum model predicts a curvature-independent KPZ phase, thereby suppressing all explicit effects of curvature and local pinning in the system in the "perturbative" limit. A direct numerical integration of this growth equation in 1+1 dimensions supports this observation below a critical parametric range, above which generic instabilities, in the form of isolated pillared structures, lead to deviations from standard scaling behaviour. Possibilities of controlling this instability by introducing statistically "irrelevant" (in the sense of the renormalisation group) higher-order nonlinearities are also discussed.
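For context (a standard result, not specific to this paper), the KPZ equation for a height field h referenced above reads

```latex
\partial_t h(\mathbf{x},t) = \nu \nabla^2 h + \frac{\lambda}{2}\,(\nabla h)^2 + \eta(\mathbf{x},t)
```

where \(\eta\) is Gaussian white noise; in 1+1 dimensions KPZ scaling has roughness exponent \(\alpha = 1/2\), growth exponent \(\beta = 1/3\) and dynamic exponent \(z = 3/2\).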
Abstract:
In this work the authors offer a new approach to assessing the degree of influence of medications of plant origin during the restoration of a disturbed equilibrium in the human organism. To realize this approach, it is suggested to use the mathematical apparatus of.
Abstract:
We characterize the preference domains on which the Borda count satisfies Maskin monotonicity. The basic concept is the notion of a "cyclic permutation domain" which arises by fixing one particular ordering of alternatives and including all its cyclic permutations. The cyclic permutation domains are exactly the maximal domains on which the Borda count is strategy-proof when combined with every possible tie breaking rule. It turns out that the Borda count is monotonic on a larger class of domains. We show that the maximal domains on which the Borda count satisfies Maskin monotonicity are the "cyclically nested permutation domains" which are obtained from the cyclic permutation domains in an appropriately specified recursive way. ------ *We thank József Mala for posing the question of Nash implementability on restricted domains that led to this research. We are very grateful to two anonymous referees and an associate editor for their helpful comments and suggestions. The second author gratefully acknowledges financial support from the Hungarian Academy of Sciences (MTA) through the Bolyai János research fellowship.
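As a concrete illustration of the voting rule characterized above (our own sketch, not taken from the paper), the Borda count gives an alternative ranked in position i of a ballot over m alternatives m − 1 − i points (best first), and sums points over all ballots:

```python
# Minimal sketch of the Borda count; the function name and data layout
# are our own illustrative choices.

def borda_scores(ballots):
    """Return the total Borda score of each alternative.

    Each ballot is a complete ranking (best first) of the same m
    alternatives; position i earns m - 1 - i points.
    """
    scores = {}
    for ballot in ballots:
        m = len(ballot)
        for i, alt in enumerate(ballot):
            scores[alt] = scores.get(alt, 0) + (m - 1 - i)
    return scores

ballots = [("a", "b", "c"), ("b", "c", "a"), ("a", "c", "b")]
print(borda_scores(ballots))  # {'a': 4, 'b': 3, 'c': 2}
```

The strategy-proofness and monotonicity results in the abstract concern which preference domains make this scoring rule immune to misreported ballots, possibly combined with a tie-breaking rule.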
Abstract:
A new correlation scheme (leading to a special equilibrium called "soft" correlated equilibrium) is introduced for finite games. After randomization over the outcome space, players can either follow the recommendation of an umpire blindly or freely choose some other action except the one suggested. This scheme can lead to Pareto-better outcomes than the simple extension introduced by [Moulin, H., Vial, J.-P., 1978. Strategically zero-sum games: the class of games whose completely mixed equilibria cannot be improved upon. International Journal of Game Theory 7, 201–221]. The informational and interpretational aspects of soft correlated equilibria are also discussed in detail. The power of the generalization is illustrated in the prisoners' dilemma and a congestion game.
Abstract:
A correlation scheme (leading to a special equilibrium called "soft" correlated equilibrium) is applied to two-person finite games in extensive form with perfect information. Randomization by an umpire takes place over the leaves of the game tree. At every decision point, players can either follow the recommendation of the umpire blindly or freely choose any other action except the one suggested. This scheme can lead to outcomes that Pareto-improve on other correlated equilibria. Computational issues of maximizing a linear function over the set of soft correlated equilibria are considered, and a linear-time algorithm in the number of edges of the game tree is given for a special procedure called "subgame perfect optimization".
Abstract:
In the Marx-Neumann version of the Neumann model introduced by Morishima, the use of commodities is split between production and consumption, and wages are determined as the cost of necessary consumption. In such a version it may occur that the equilibrium prices of all goods necessary for consumption are zero, so that the equilibrium wage rate becomes zero too. In fact, such a paradoxical case will always arise when the economy is decomposable and the equilibrium is not unique in terms of growth and interest rate. It can be shown that a zero equilibrium wage rate will appear in all equilibrium solutions where growth and interest rate are less than maximal.
This is another proof of Neumann's genius and intuition, for he arrived at the uniqueness of equilibrium via an assumption that implied that the economy was indecomposable, a condition relaxed later by Kemeny, Morgenstern and Thompson. This situation occurs also in similar models based on Leontief technology, and such versions of the Marx-Neumann model make the roots of the problem more apparent. Analysis of them also yields an interesting corollary to Ricardo's corn rate of profit: the real cause of the awkwardness is bad specification of the model: luxury commodities are introduced without there being any final demand for them, and their production becomes a waste of resources. Bad model specification shows up as a consumption coefficient incompatible with the given technology in the more general model with joint production and technological choice. The paradoxical situation implies that the level of consumption could be raised and/or the intensity of labour diminished without lowering the equilibrium rate of growth and interest. This entails wasteful use of resources and indicates again that the equilibrium conditions are improperly specified. It is shown that the conditions for equilibrium can and should be redefined for the Marx-Neumann model without assuming an indecomposable economy, in a way that ensures the existence of an equilibrium unique in terms of the growth and interest rate, coupled with a positive wage rate, thus confirming Neumann's intuition. The proposed solution relates closely to findings of Bromek in a paper correcting Morishima's generalization of wage/profit and consumption/investment frontiers.
Abstract:
The author sums up briefly the main aspects and problems of the pricing of derivative products. The theory of derivative pricing uses the redundancy among products on the market to arrive at relative product prices. But this can be done only on a complete market, so only with a complete market does it become possible to omit the concept of utility functions from the theory and the practice built upon it; for that reason the principle of risk-neutral pricing is misleading. To put it another way, the theory of derivative products can free itself from the concept of utility functions only at the price of imposing restrictions on the market structure that do not hold in reality. It is essential to emphasize this both in market practice and in teaching.
Abstract:
The “Nash program” initiated by Nash (Econometrica 21:128–140, 1953) is a research agenda aiming at representing every axiomatically determined cooperative solution to a game as a Nash outcome of a reasonable noncooperative bargaining game. The L-Nash solution first defined by Forgó (Interactive Decisions. Lecture Notes in Economics and Mathematical Systems, vol 229. Springer, Berlin, pp 1–15, 1983) is obtained as the limiting point of the Nash bargaining solution when the disagreement point goes to negative infinity in a fixed direction. In Forgó and Szidarovszky (Eur J Oper Res 147:108–116, 2003), the L-Nash solution was related to the solution of multicriteria decision making, and two different axiomatizations of the L-Nash solution were also given in this context. In this paper, finite bounds are established for the penalty of disagreement in certain special two-person bargaining problems, making it possible to apply all the implementation models designed for Nash bargaining problems with a finite disagreement point to obtain the L-Nash solution as well. For another set of problems where this method does not work, a version of Rubinstein’s alternative offer game (Econometrica 50:97–109, 1982) is shown to asymptotically implement the L-Nash solution. If the penalty is internalized as a decision variable of one of the players, then a modification of Howard’s game (J Econ Theory 56:142–159, 1992) also implements the L-Nash solution.
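For reference (standard bargaining theory, not specific to this paper), the Nash bargaining solution with feasible set \(S\) and disagreement point \(d = (d_1, d_2)\) solves

```latex
\max_{(u_1,u_2)\in S,\; u_1\ge d_1,\; u_2\ge d_2} \;(u_1-d_1)(u_2-d_2)
```

and, as the abstract states, the L-Nash solution is the limit of this maximizer as \(d\) goes to negative infinity in a fixed direction.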
Abstract:
The author surveys the changes that have taken place in the field of multi-sector modelling, from linear programming models to computable general equilibrium models. After a brief historical retrospection he presents the common and differing characteristics of general equilibrium models by comparing them with national-economy-level models based on the methods of linear programming. He also makes clear how general equilibrium models can be used for analysing the consistency of economic policy targets, for investigating trade-off possibilities among the targets and, in general, for sensitivity analyses of economic policy. The discussion of theoretical and methodological questions is illustrated with the aid of a computable general equilibrium model.
Abstract:
This essay attempts to understand János Kornai’s works from a political economy perspective. It argues that Kornai has significantly contributed to the formation of a new paradigm of political economy. The main endeavor of Kornai has been the combination of analytical concepts of economics with the empirical description of real economies. After a certain period of theoretical experimentation János Kornai formulated his research program that can be called the shortage economy explanation of the socialist system. The Economics of Shortage and The Socialist System have created a new theoretical paradigm in a framework in which it has become possible to establish a connection between the analytical and empirical, universal and historical aspects of the theory studying the socialist system as a real economic entity. János Kornai has built his analysis of the socialist system on the primary role of politics in the creation of economic institutions. In his present work on capitalism he has extended this thesis to the capitalist system. This seems to be an important contribution of his to a new political economy paradigm that is just in the process of formation.