940 results for: real von Neumann measurement


Relevance: 100.00%

Abstract:

We analyze von Neumann-like quantum measurements in terms of simultaneous virtual paths constructed for two noncommuting variables. The approach is applied to measurements of operator functions of conjugate variables and to the joint measurements of such variables. The limits of applicability of the restricted phase space path integral are studied. We demonstrate that, for a simple joint measurement, using entangled meter states allows one to manipulate the order in which the measurements are conducted. The effects of "weakening" a measurement by choosing unsharp meter states are also discussed.

Relevance: 100.00%

Abstract:

The state disturbance induced by locally measuring a quantum system yields a signature of nonclassical correlations beyond entanglement. Here, we present a detailed study of such correlations for two-qubit mixed states. To overcome the asymmetry of quantum discord and the unfaithfulness of measurement-induced disturbance (which severely overestimates quantum correlations), we propose an ameliorated measurement-induced disturbance as a nonclassicality indicator, optimized over joint local measurements, and we derive its closed expression for relevant two-qubit states. We study its analytical relation with discord, and characterize the maximally quantum-correlated mixed states, which simultaneously extremize both quantifiers at a given von Neumann entropy: among all two-qubit states, these states possess the most robust quantum correlations against noise.
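The von Neumann entropy against which the quantifiers above are extremized can be computed directly from the eigenvalues of the density matrix. A minimal numpy sketch for a two-qubit mixed state (the Werner-state family here is an illustrative choice, not the states studied in the article):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log2 rho), via eigenvalues; zero eigenvalues contribute 0
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Werner state: rho = p |Phi+><Phi+| + (1 - p) I/4, an illustrative
# one-parameter family of two-qubit mixed states
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
p = 0.5
rho = p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4
```

For p = 0.5 the entropy falls strictly between 0 (pure state) and 2 (maximally mixed two-qubit state), as expected.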

Relevance: 100.00%

Abstract:

A method is described for measuring the temperature of the fast electrons produced in ultraintense laser-plasma interactions by inducing photonuclear reactions, in particular (gamma,n) and (gamma,3n) reactions in tantalum. Analysis of the gamma rays emitted by the daughter nuclei of these reactions using a germanium counter enables a relatively straightforward, near real-time temperature measurement to be made. This is especially important for high-temperature plasmas, where alternative diagnostic techniques are usually difficult and time consuming. The technique can be used while other experiments are being conducted. (C) 2002 American Institute of Physics.

Relevance: 100.00%

Abstract:

The thesis is divided into two main parts. The first part comprises Chapters 2 and 3; the second part comprises Chapters 4 and 5. The first part concerns the sampling of non-uniform continuous distributions with a fixed, guaranteed level of accuracy. In 1976, Knuth and Yao showed how to sample exactly from any discrete distribution using only a source of independent, identically distributed unbiased bits. The first part of this thesis generalizes, in a sense, the theory of Knuth and Yao to non-uniform continuous distributions once the accuracy is fixed. A lower bound, as well as upper bounds for generic algorithms such as inversion and discretization, are among the results of this first part. In addition, a new simple proof of the main result of Knuth and Yao's original article is among the contributions of this thesis. The second part concerns the solution of a problem in communication complexity theory, a problem that arose with the advent of quantum computing. Given a discrete distribution parameterized by a real vector of dimension N, and a network of N computers with access to a source of independent, identically distributed unbiased bits, where each computer holds exactly one of the N parameters, a distributed protocol is established to sample exactly from that distribution.
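The Knuth-Yao result mentioned above can be pictured as an interval-halving procedure: each unbiased bit halves a dyadic interval, and the sampler stops once the interval lies entirely inside one cell of the cumulative distribution. A minimal illustration of this idea (not the thesis's algorithms; it terminates with probability one and is exact when the probabilities are representable as binary fractions):

```python
import random
from fractions import Fraction

def knuth_yao_sample(probs, coin=lambda: random.getrandbits(1)):
    # Sample index i with probability probs[i] (exact Fractions summing to 1)
    # using only fair coin flips: halve a dyadic interval until it fits
    # entirely inside one cell of the CDF partition (interval-halving view
    # of the Knuth-Yao discrete distribution generating tree).
    cdf, acc = [], Fraction(0)
    for p in probs:
        acc += Fraction(p)
        cdf.append(acc)
    lo, hi = Fraction(0), Fraction(1)
    while True:
        mid = (lo + hi) / 2
        if coin():
            lo = mid
        else:
            hi = mid
        prev = Fraction(0)
        for i, c in enumerate(cdf):
            if prev <= lo and hi <= c:  # interval fits inside cell i
                return i
            prev = c
```

On average the number of coin flips consumed stays within a small additive constant of the distribution's entropy, which is the heart of the Knuth-Yao result.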

Relevance: 100.00%

Abstract:

The main objective of this article is to test the hypothesis that utility preferences incorporating asymmetric reactions to gains and losses generate better results than the classic von Neumann-Morgenstern utility functions in the Brazilian market. The asymmetric behavior can be captured by introducing a disappointment (or loss) aversion coefficient into the classical expected utility function, which increases the impact of losses relative to gains. The results generated by both the traditional and the loss-aversion utility functions are compared with real data from the Brazilian market on stock market participation in the investment portfolios of pension funds and individual investors.
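A piecewise-linear loss-averse utility of the kind described, in which a disappointment (loss) aversion coefficient amplifies outcomes below a reference point, can be sketched as follows (the coefficient value 2.25 is a common illustrative choice from the loss-aversion literature, not a figure from the article):

```python
def loss_averse_utility(outcomes, probs, loss_aversion=2.25, ref=0.0):
    # Expected utility with asymmetric treatment of gains and losses:
    # outcomes below the reference point `ref` are amplified by the
    # disappointment/loss-aversion coefficient.
    total = 0.0
    for x, p in zip(outcomes, probs):
        gain = x - ref
        total += p * (gain if gain >= 0 else loss_aversion * gain)
    return total

# A symmetric 50/50 gamble: the classic (symmetric) evaluation is 0,
# but the loss-averse investor evaluates the same gamble negatively.
u_classic = loss_averse_utility([1.0, -1.0], [0.5, 0.5], loss_aversion=1.0)
u_averse = loss_averse_utility([1.0, -1.0], [0.5, 0.5], loss_aversion=2.25)
```

Setting the coefficient to 1 recovers the symmetric linear case; values above 1 penalize losses more heavily, which is the mechanism the article tests against portfolio data.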

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

In this article we propose an exact, efficient simulation algorithm for the generalized von Mises circular distribution of order two. It is an acceptance-rejection algorithm with a piecewise linear envelope based on the local extrema and the inflexion points of the generalized von Mises density of order two. We show that these points can be obtained from the roots of polynomials of degrees four and eight, which can be easily found by the methods of Ferrari and Weierstrass. A comparative study with the von Neumann acceptance-rejection, the ratio-of-uniforms and Markov chain Monte Carlo algorithms shows that this new method is generally the most efficient.
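The von Neumann acceptance-rejection scheme used as a baseline in the comparison can be illustrated with a flat envelope for the ordinary von Mises density (a deliberate simplification: the article's method uses a piecewise linear envelope for the generalized order-two density, which is much tighter):

```python
import math
import random

def sample_von_mises_rejection(mu=0.0, kappa=1.0, rng=random.random):
    # Von Neumann acceptance-rejection for the von Mises density
    # f(x) proportional to exp(kappa * cos(x - mu)) on [-pi, pi),
    # with a flat envelope at the density's maximum exp(kappa).
    while True:
        x = -math.pi + 2 * math.pi * rng()  # proposal: uniform on [-pi, pi)
        u = rng()
        # accept with probability f(x) / max f
        if u <= math.exp(kappa * (math.cos(x - mu) - 1.0)):
            return x
```

The flat envelope wastes proposals when kappa is large (acceptance rate decays roughly like 1/sqrt(kappa)), which is precisely the inefficiency a piecewise linear envelope built from the extrema and inflexion points avoids.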

Relevance: 100.00%

Abstract:

The paper investigates the potential theoretical and methodological sources of inspiration of the von Neumann model, in view of the fact that both the neoclassical economists and the neo-Ricardians, who revived the classical tradition, claim heritage to it. In the course of this, the author assesses the main differences between the classical and neoclassical conceptions of the economy and between the ex post and ex ante modelling approaches, as well as the methodological change of revolutionary significance that led to the emergence of modern mathematical economics, which can be criticized on many counts with some justification. The von Neumann model is confronted with the price-imputation theory of the Austrian school, with the Walras-Cassel and the Schlesinger-Wald models, and with the results of the classical line marked by the names of Ricardo, Marx, Dmitriev and Leontief. The author concludes that von Neumann in fact recast the age-old ideal of a "just and reasonable economy" in the form of a mathematical model native to the modern physics of his day.

Relevance: 100.00%

Abstract:

The paper places the classical von Neumann growth model on a new footing. The original von Neumann model contains no explicit firms, only technologies or processes. The paper examines a von Neumann-type model in which each technology is identified with a firm, and asks whether, under this assumption, such an economy admits solutions, equilibrium prices and quantities, at which the firms maximize their profits. The analysis leads to the conclusion that, for this case, the nonpositive-profit condition assumed by the classical von Neumann model, a basic duality-based assumption of classical mathematical economics, has to be revisited: a nonnegative profit of firms must be allowed.
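The duality condition the paper relaxes can be stated in the standard formulation of the von Neumann model (standard notation assumed here, not taken from the paper: A is the input matrix, B the output matrix, x the intensity vector, p the price vector):

```latex
\max \alpha \quad \text{s.t.}\quad Bx \ge \alpha A x,\quad x \ge 0,\; x \ne 0;
\qquad
\min \beta \quad \text{s.t.}\quad pB \le \beta p A,\quad p \ge 0,\; p \ne 0.
```

At equilibrium $\alpha^* = \beta^*$, and complementary slackness $p(B - \alpha A)x = 0$ holds; the dual constraint $pB \le \beta pA$ is exactly the classical requirement that no process earns a positive profit, the assumption the paper reexamines once technologies are reinterpreted as profit-maximizing firms.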

Relevance: 100.00%

Abstract:

The author compares the von Neumann model, developed for purely theoretical purposes, with the Leontief model, designed originally for practical applications. The similar mathematical structure of the von Neumann model and of the closed, stationary version of the Leontief model assuming an annual production period suggests that the latter can be interpreted as a special case of the former. Unfolding the economic content and assumptions of the two models in detail and comparing them, the author shows that this conclusion is misleading: the two models are fundamentally different, and neither can be derived from the other. The possibility of joint production and technological choice is an indispensable assumption of the von Neumann model, and the assumption of an annual production period rules out taking flow-type outputs (service flows) explicitly into account; all these assumptions are alien to the Leontief model. The two models are in fact special cases of a more general closed, stationary stock-flow model, namely its forms reduced to flow variables alone.

Relevance: 100.00%

Abstract:

In the study of complex networks, vertex centrality measures are used to identify the most important vertices within a graph. A related problem is that of measuring the centrality of an edge. In this paper, we propose a novel edge centrality index rooted in quantum information. More specifically, we measure the importance of an edge in terms of the contribution it makes to the Von Neumann entropy of the graph. We show that this can be computed in terms of the Holevo quantity, a well-known quantum information theoretical measure. While computing the Von Neumann entropy, and hence the Holevo quantity, requires computing the spectrum of the graph Laplacian, we show how to obtain a simplified measure through a quadratic approximation of the Shannon entropy. This in turn shows that the proposed centrality measure is strongly correlated with the negative degree centrality on the line graph. We evaluate our centrality measure through an extensive set of experiments on real-world as well as synthetic networks, and we compare it against commonly used alternative measures.
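The quantities above can be sketched directly: the Von Neumann entropy of a graph comes from the spectrum of its trace-normalized Laplacian, and the quadratic approximation -x ln x ≈ x(1 - x) replaces the eigendecomposition with a trace of ρ². As a simplified stand-in for the Holevo-based index (not the article's measure), an edge's importance can be illustrated as the entropy change on deleting it; the triangle graph is just an example:

```python
import numpy as np

def graph_density_matrix(adj):
    # rho = L / Tr(L), with L = D - A the combinatorial graph Laplacian
    L = np.diag(adj.sum(axis=1)) - adj
    return L / np.trace(L)

def von_neumann_entropy(rho):
    # S(rho) = -sum_i lambda_i log2(lambda_i) over nonzero eigenvalues
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def quadratic_entropy(rho):
    # Quadratic approximation -x ln x ~ x(1 - x): S ~ 1 - Tr(rho^2),
    # which avoids the eigendecomposition entirely
    return float(1.0 - np.trace(rho @ rho))

# Illustration: entropy drop when one edge of a triangle graph is deleted
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
rho = graph_density_matrix(adj)
adj_cut = adj.copy()
adj_cut[0, 1] = adj_cut[1, 0] = 0.0
importance = von_neumann_entropy(rho) - von_neumann_entropy(graph_density_matrix(adj_cut))
```

For the triangle, the Laplacian spectrum {0, 3, 3} gives ρ eigenvalues {0, 1/2, 1/2}, so the exact entropy is 1 bit while the quadratic surrogate gives 0.5; the surrogate preserves orderings, which is what a centrality ranking needs.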

Relevance: 100.00%

Abstract:

Productivity is basic statistical information for many international comparisons and country performance assessments. This study estimates the construction labour productivity of 79 selected economies. The real (purchasing-power-parity converted) and nominal construction expenditure from the Report of the 2005 International Comparison Programme published by the World Bank, and construction employment from the database of labour statistics (LABORSTA) operated by the Bureau of Statistics of the International Labour Organization, were used in the estimation. The inferential statistics indicate that nominal construction labour productivity does not descend in order from high-income to low-income economies. The average construction labour productivity of low-income economies is higher than that of middle-income economies when the productivity calculation uses purchasing-power-parity converted data. Malaysia ranked 50th and 63rd among the 79 selected economies on the real and nominal measurements, respectively.

Relevance: 100.00%

Abstract:

We study the regret of optimal strategies for online convex optimization games. Using von Neumann's minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional (the minimizer, over the player's actions, of the expected loss) defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary.
Peter L. Bartlett, Alexander Rakhlin
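The minimax characterization described above can be written schematically as follows (notation assumed here for illustration: $\ell$ the loss, $\mathcal{F}$ the player's action set, $P$ a joint distribution over the adversary's sequence $x_1,\dots,x_T$):

```latex
\mathcal{R}_T \;=\; \sup_{P}\; \mathbb{E}_{x_{1:T} \sim P}
\left[\;\sum_{t=1}^{T} \inf_{f \in \mathcal{F}} \mathbb{E}\big[\ell(f, x_t) \,\big|\, x_{1:t-1}\big]
\;-\; \inf_{f \in \mathcal{F}} \sum_{t=1}^{T} \ell(f, x_t)\;\right]
```

The first term inside the expectation is the sum of minimal expected losses under the conditional distributions, and the second is the minimal empirical loss; their gap, maximized over adversarial distributions, is the quantity the abstract identifies with the optimal regret.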

Relevance: 100.00%

Abstract:

Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, very few attempts have been made to explore structural damage with noise-polluted data, an unavoidable effect in the real world. Measurement data are contaminated by noise from the test environment as well as from electronic devices, and this noise tends to produce erroneous results in structural damage identification methods. It is therefore important to investigate a method that performs better with noise-polluted data. This paper introduces a new damage index using principal component analysis (PCA) for damage detection of building structures, able to accept noise-polluted frequency response functions (FRFs) as input. The FRF data are obtained from the function datagen of the MATLAB program available on the web site of the IASC-ASCE (International Association for Structural Control - American Society of Civil Engineers) Structural Health Monitoring (SHM) Task Group. The proposed method involves a five-stage process: calculation of the FRFs, calculation of the damage index values using the proposed algorithm, development of the artificial neural networks, introduction of the damage indices as input parameters, and damage detection of the structure. The paper briefly describes the methodology and the results obtained in detecting damage in all six cases of the benchmark study under different noise levels. The proposed method is applied to a benchmark problem sponsored by the IASC-ASCE Task Group on Structural Health Monitoring, which was developed to facilitate the comparison of various damage identification methods. The results show that the PCA-based algorithm is effective for structural health monitoring with noise-polluted FRFs, a common occurrence when dealing with industrial structures.
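The general idea of a PCA-based damage index built from FRFs can be sketched with a hypothetical subspace-residual index (this is an illustrative stand-in, not the article's algorithm): learn a principal subspace from baseline FRFs, then treat the energy of a test FRF outside that subspace as an indicator of change.

```python
import numpy as np

def pca_damage_index(frf_baseline, frf_test, n_components=2):
    # Hypothetical PCA-based index: project test FRFs onto the principal
    # subspace learned from baseline (healthy) FRFs and report the relative
    # residual energy. Rows = measurements, columns = frequency bins.
    mean = frf_baseline.mean(axis=0)
    X = frf_baseline - mean
    # principal directions = right singular vectors of the centred data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:n_components]
    R = frf_test - mean
    residual = R - (R @ basis.T) @ basis
    return float(np.linalg.norm(residual) / np.linalg.norm(R))
```

FRFs consistent with the baseline subspace yield an index near zero, while responses containing new spectral content score higher; in the article's pipeline such indices are then fed to artificial neural networks for classification.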

Relevance: 100.00%

Abstract:

Awareness of the need to avoid losses and casualties due to rain-induced landslides is increasing in regions that routinely experience heavy rainfall. Improvements in early warning systems against rain-induced landslides, such as prediction modelling using rainfall records, are urgently needed in vulnerable regions. Existing warning systems have been applied using stability-chart development and real-time displacement measurement on slope surfaces. However, they still have drawbacks: they ignore the mechanism of rain-induced instability, they can mislead because their predictions are probabilistic, and they leave little time for evacuation. In this research, a real-time predictive method was proposed to alleviate these drawbacks. A case-study soil slope in Indonesia that failed during rainfall in 2010 was used to verify the proposed method. Using the results of field and laboratory characterization, numerical analyses were applied to develop a model of an unsaturated residual-soil slope with deep cracks subject to rainwater infiltration. Real-time rainfall measurement on the slope and prediction of future rainfall are needed. By coupling transient seepage and stability analyses, the variation of the factor of safety (FOS) of the slope with time was obtained as a basis for a method for real-time prediction of rain-induced slope instability. The study shows that the proposed method has the potential to be used in an early warning system against landslide hazard, since the FOS value and the timing given by the prediction can be provided before the actual failure of the case-study slope.
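The mechanism driving the FOS drop during infiltration can be illustrated with the textbook infinite-slope formula (a simplification, not the article's coupled transient seepage and stability model): rising pore-water pressure reduces the effective normal stress and hence the shear resistance.

```python
import math

def infinite_slope_fos(c_eff, phi_deg, gamma, depth, beta_deg, pore_pressure):
    # Infinite-slope factor of safety with pore pressure u:
    #   FOS = [c' + (gamma*z*cos^2(beta) - u) * tan(phi')]
    #         / (gamma*z*sin(beta)*cos(beta))
    # Units assumed: c' and u in kPa, gamma in kN/m^3, depth z in m.
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal = gamma * depth * math.cos(beta) ** 2
    resisting = c_eff + (normal - pore_pressure) * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Illustrative soil parameters (not the case-study slope's values):
# as rainfall infiltration raises pore pressure, the FOS falls below 1.
fos_dry = infinite_slope_fos(10.0, 30.0, 18.0, 3.0, 35.0, 0.0)
fos_wet = infinite_slope_fos(10.0, 30.0, 18.0, 3.0, 35.0, 15.0)
```

Tracking such an FOS value as measured and forecast rainfall updates the pore-pressure field is the basic logic behind issuing a warning before the FOS crosses unity.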