155 results for Boolean Functions, Equivalence Class
Abstract:
We exhibit and characterize an entire class of simple adaptive strategies, in the repeated play of a game, having the Hannan-consistency property: in the long run, the player is guaranteed an average payoff as large as the best-reply payoff to the empirical distribution of play of the other players; i.e., there is no "regret." Smooth fictitious play (Fudenberg and Levine [1995]) and regret-matching (Hart and Mas-Colell [1998]) are particular cases. The motivation and application of this work come from the study of procedures whose empirical distribution of play is, in the long run, (almost) a correlated equilibrium. The basic tool for the analysis is a generalization of Blackwell's [1956a] approachability strategy for games with vector payoffs.
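A minimal sketch of unconditional regret matching, one of the simple adaptive strategies of the kind described in the abstract above: each period the player mixes over actions in proportion to their positive cumulative regrets. The 2x2 payoff matrix and the i.i.d. opponent below are illustrative assumptions, not taken from the paper.

```python
import random

PAYOFF = [[3.0, 0.0],      # PAYOFF[my_action][opponent_action]
          [1.0, 2.0]]

def regret_matching(T=50000, seed=0):
    rng = random.Random(seed)
    regret = [0.0, 0.0]    # cumulative regret for always playing 0 or 1
    action = 0
    total = 0.0
    for _ in range(T):
        opp = 0 if rng.random() < 0.3 else 1   # i.i.d. opponent (assumption)
        total += PAYOFF[action][opp]
        for a in (0, 1):   # regret of having always switched to action a
            regret[a] += PAYOFF[a][opp] - PAYOFF[action][opp]
        pos = [max(r, 0.0) for r in regret]
        if sum(pos) > 0:   # next action: probability proportional to regret+
            action = 0 if rng.random() * sum(pos) < pos[0] else 1
    return total / T       # approaches the best-reply payoff (here 1.7, action 1)

print(regret_matching())
```

Hannan consistency shows up as the long-run average payoff approaching the best reply against the opponent's empirical frequencies (0.3 * 1 + 0.7 * 2 = 1.7 in this toy instance).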
Abstract:
We investigate identifiability issues in DSGE models and their consequences for parameter estimation and model evaluation when the objective function measures the distance between estimated and model impulse responses. We show that observational equivalence, partial and weak identification problems are widespread, that they lead to biased estimates and unreliable t-statistics, and that they may induce investigators to select false models. We examine whether different objective functions affect identification and study how small samples interact with parameter and shock identification. We provide diagnostics and tests to detect identification failures and apply them to a state-of-the-art model.
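A minimal sketch of the kind of impulse-response-matching objective studied above, Q(theta) = (g_hat - g(theta))' W (g_hat - g(theta)), with a toy AR(1)-style mapping from parameters to responses standing in for the DSGE model (an assumption). A flat objective over the grid would be the weak-identification symptom the paper diagnoses.

```python
import numpy as np

def model_irf(theta, horizons=10):
    rho, sigma = theta
    return sigma * rho ** np.arange(horizons)   # toy stand-in for model IRFs

def irf_distance(theta, g_hat, W):
    diff = g_hat - model_irf(theta, len(g_hat))
    return diff @ W @ diff                      # quadratic-form distance

rng = np.random.default_rng(0)
true_theta = (0.9, 1.0)
g_hat = model_irf(true_theta) + 0.05 * rng.standard_normal(10)  # "estimated" IRFs
W = np.eye(10)                                  # weighting matrix (assumption)

# Crude grid search for the distance-minimising parameters.
grid = [(r, s) for r in np.linspace(0.5, 0.99, 50) for s in np.linspace(0.5, 1.5, 50)]
best = min(grid, key=lambda th: irf_distance(th, g_hat, W))
print(best)
```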
Abstract:
Equivalence classes of normal form games are defined using the geometry of correspondences of standard equilibrium concepts like correlated, Nash, and robust equilibrium, or risk dominance and rationalizability. The resulting equivalence classes are fully characterized and compared across different equilibrium concepts for 2 x 2 games. It is argued that the procedure can lead to broad and game-theoretically meaningful distinctions of games, as well as to alternative ways of viewing and testing equilibrium concepts. Larger games are also briefly considered.
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
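A minimal sketch of the weighted log-ratio analysis described above, as a best-effort reading of the method: log-transform the relative frequencies, double-centre with the row and column margins as weights, and take a weighted SVD. The toy matrix is an illustrative assumption, not the paper's data.

```python
import numpy as np

def weighted_logratio(N):
    P = N / N.sum()
    r = P.sum(axis=1)            # row weights (margins)
    c = P.sum(axis=0)            # column weights (margins)
    L = np.log(P)
    # Double-centre L with respect to the weights r and c.
    L = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c
    # Weighted SVD via square-root weight scaling.
    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U / np.sqrt(r)[:, None] * sv      # principal row coordinates
    cols = Vt.T / np.sqrt(c)[:, None]        # standard column coordinates
    return rows, cols, sv

N = np.array([[10., 20., 5.], [3., 8., 12.], [6., 4., 9.]])
rows, cols, sv = weighted_logratio(N)
print(sv)
```

Because only log-ratios of the data enter after the double-centring, the display is subcompositionally coherent, and the margin weights are what restore distributional equivalence.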
Abstract:
Kahneman and Tversky asserted a fundamental asymmetry between gains and losses, namely a reflection effect, which occurs when an individual prefers a sure gain of $pz to an uncertain gain of $z with probability p, while preferring an uncertain loss of $z with probability p to a certain loss of $pz. We focus on this class of choices (actuarially fair), and explore the extent to which the reflection effect, understood as occurring at a range of wealth levels, is compatible with single-self preferences. We decompose the reflection effect into two components: a probability switch effect, which is compatible with single-self preferences, and a translation effect, which is not. To argue the first point, we analyze two classes of single-self, non-expected utility preferences, which we label homothetic and weakly homothetic. In both cases, we characterize the switch effect as well as the dependence of risk attitudes on wealth. We also discuss two types of utility functions of a form reminiscent of expected utility but with distorted probabilities. Type I always distorts the probability of the worst outcome downwards, yielding attraction to small risks for all probabilities. Type II distorts low probabilities upwards and high probabilities downwards, implying risk aversion when the probability of the worst outcome is low. By combining homothetic or weakly homothetic preferences with Type I or Type II distortion functions, we present four explicit examples: all four display a switch effect and, hence, a form of reflection effect consistent with single-self preferences.
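A hedged numeric illustration of the reflection effect defined above, using the Tversky-Kahneman (1992) value and weighting functions as stand-ins rather than the paper's homothetic families or its Type I/II distortions.

```python
# Tversky-Kahneman (1992) parameter estimates (assumption, for illustration).
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def w(p):                     # inverse-S probability weighting function
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA) ** (1 / GAMMA)

def v(x):                     # value function, kinked at zero
    return x**ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def prefers_sure(z, p):
    """True if the sure amount p*z is preferred to z with probability p."""
    return v(p * z) > w(p) * v(z)

z, p = 100.0, 0.5
print(prefers_sure(z, p))     # gains: sure gain preferred        -> True
print(prefers_sure(-z, p))    # losses: sure loss NOT preferred   -> False
```

The two prints together exhibit exactly the actuarially fair reflection pattern the abstract describes: risk aversion over gains, risk seeking over losses.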
Abstract:
In 1952, F. Riesz and B. Sz.-Nagy published an example of a monotonic continuous function whose derivative is zero almost everywhere, that is to say, a singular function. Moreover, the function was strictly increasing. Their example was built as the limit of a sequence of deformations of the identity function. As an easy consequence of the definition, the derivative, when it existed and was finite, was found to be zero. In this paper we revisit the Riesz-Nágy family of functions and relate it to a system for real number representation which we call (t, t-1) expansions. With the help of these real number expansions we generalize the family. The singularity of the functions is proved through some metrical properties of the expansions used in their definition, which also allow us to give a more precise way of determining when the derivative is 0 or infinity.
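A minimal computational sketch of the deformation construction described above, assuming the classical scheme in which, at each step, the left dyadic half of every interval receives an unequal fraction t of the rise (t != 1/2 gives singularity; the parameter and recursion depth below are illustrative).

```python
def riesz_nagy(x, t=0.3, depth=40):
    """Approximate a Riesz-Nagy-type singular function at x in [0, 1]."""
    a, b = 0.0, 1.0          # current dyadic interval containing x
    fa, fb = 0.0, 1.0        # function values at its endpoints
    for _ in range(depth):
        m = (a + b) / 2
        fm = fa + t * (fb - fa)   # left half gets fraction t of the rise
        if x < m:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return fa

for x in (0.25, 0.5, 0.75):
    print(x, riesz_nagy(x))       # e.g. f(1/2) = t = 0.3
```

Each binary digit of x selects a factor t or 1-t, which is the mechanism the expansions mentioned in the abstract make precise: runs of digits control whether the derivative, where it exists, is 0 or infinity.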
Abstract:
We propose a model, and solution methods, for locating a fixed number of multiple-server, congestible common service centers or congestible public facilities. Locations are chosen so as to minimize consumers' congestion (or queuing) and travel costs, considering that all the demand must be served. Customers choose the facilities to which they travel in order to receive service at minimum travel and congestion cost. As a proxy for this criterion, total travel and waiting costs are minimized. The travel cost is a general function of the origin and destination of the demand, while the congestion cost is a general function of the number of customers in queue at the facilities.
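A toy sketch of the location problem described above, assuming a linear congestion cost and a three-node instance (the paper allows general travel and congestion functions): enumerate site subsets and let customers re-choose facilities until assignments settle, then keep the cheapest subset.

```python
import itertools

BETA = 2.0                                 # linear congestion coefficient (assumption)
DIST = [[0, 4, 7], [4, 0, 3], [7, 3, 0]]   # DIST[customer][site]
CUSTOMERS = range(3)
P = 2                                      # number of facilities to open

def total_cost(open_sites):
    # Best-response loop: each customer picks the site with minimum
    # travel + current congestion cost, until assignments settle.
    assign = {i: open_sites[0] for i in CUSTOMERS}
    for _ in range(20):
        load = {s: sum(1 for i in CUSTOMERS if assign[i] == s) for s in open_sites}
        new = {i: min(open_sites, key=lambda s: DIST[i][s] + BETA * load[s])
               for i in CUSTOMERS}
        if new == assign:
            break
        assign = new
    load = {s: sum(1 for i in CUSTOMERS if assign[i] == s) for s in open_sites}
    return sum(DIST[i][assign[i]] + BETA * load[assign[i]] for i in CUSTOMERS)

best = min(itertools.combinations(range(3), P), key=total_cost)
print(best, total_cost(best))
```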
Abstract:
The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected "skeleton" of the class of regression functions. The skeleton is a covering of the class based on a data-dependent metric, especially fitted for classification. A new scale-sensitive dimension is introduced which is more useful for the studied classification problem than other, previously defined, dimension measures. This fact is demonstrated by performance bounds for the skeleton estimate in terms of the new dimension.
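A minimal sketch of the skeleton idea, assuming a one-parameter logistic class and the empirical L1 metric on the sample (both illustrative, not the paper's construction): greedily build a data-dependent cover of the class, then minimise empirical 0-1 risk over the cover only.

```python
import numpy as np

def f(theta, x):
    return 1.0 / (1.0 + np.exp(-(x - theta)))   # candidate posterior functions

def emp_metric(t1, t2, X):
    return np.mean(np.abs(f(t1, X) - f(t2, X)))  # data-dependent distance

def skeleton(thetas, X, eps):
    kept = []
    for t in thetas:                             # greedy eps-cover of the class
        if all(emp_metric(t, k, X) > eps for k in kept):
            kept.append(t)
    return kept

def skeleton_estimate(X, y, thetas, eps=0.05):
    skel = skeleton(thetas, X, eps)
    # Plug-in classifier 1{f >= 1/2}; pick the empirical-risk minimiser.
    risks = [np.mean((f(t, X) >= 0.5).astype(int) != y) for t in skel]
    return skel[int(np.argmin(risks))]

rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, 200)
y = (rng.uniform(size=200) < f(0.5, X)).astype(int)   # true parameter 0.5
print(skeleton_estimate(X, y, np.linspace(-2, 2, 200)))
```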
Abstract:
This paper proposes an exploration of the methodology of utility functions that distinguishes interpretation from representation. While representation univocally assigns numbers to the entities of the domain of utility functions, interpretation relates these entities with empirically observable objects of choice. This allows us to make explicit the standard interpretation of utility functions, which assumes that two objects have the same utility if and only if the individual is indifferent between them. We explore the underlying assumptions of such a hypothesis and propose a non-standard interpretation according to which objects of choice have a well-defined utility although individuals may vary in the way they treat these objects in a specific context. We provide examples of such a methodological approach that may explain some reversals of preferences and suggest possible mathematical formulations for further research.
Abstract:
The work carried out consisted of analysing the Logic Class information system from the perspective of the need to build a system of indicators (an operational dashboard) that integrates information from the different data sources.
Abstract:
A transformed kernel estimator suitable for heavy-tailed distributions is presented. Using a transformation based on the Beta probability distribution, the choice of the bandwidth parameter becomes straightforward. An application to insurance data is presented, and it is shown how to compute the Value at Risk.
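A minimal sketch, in code, of the transformed kernel idea described above, assuming a Beta(3, 1) cdf transformation, a rule-of-thumb bandwidth, and Pareto-simulated claims (all assumptions; the paper's exact transformation and bandwidth choice may differ).

```python
import numpy as np
from scipy import stats

def transformed_kernel_var(x, level=0.99, a=3.0, b=1.0):
    scale = x.max() * 1.01
    u = stats.beta.cdf(x / scale, a, b)        # transformed sample in (0, 1)
    h = 1.06 * u.std() * len(u) ** (-0.2)      # rule-of-thumb bandwidth
    # Kernel-smoothed distribution function on the transformed scale.
    grid = np.linspace(0, 1, 2001)
    cdf_hat = np.mean(stats.norm.cdf((grid[:, None] - u[None, :]) / h), axis=1)
    idx = min(np.searchsorted(cdf_hat, level), len(grid) - 1)
    return scale * stats.beta.ppf(grid[idx], a, b)   # back-transformed VaR

rng = np.random.default_rng(0)
claims = stats.pareto.rvs(2.5, size=1000, random_state=rng)  # toy claim data
print(transformed_kernel_var(claims, 0.99))
```

The design point is the one the abstract highlights: smoothing happens on the bounded transformed scale, where a simple bandwidth rule behaves well, and the heavy tail is handled by the back-transformation.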
Abstract:
This work analyses a stochastic continuous-time model in which the decision-maker discounts instantaneous utilities and the final function at constant but different rates of time preference. In this setting one can model problems in which, as time approaches the final moment, the valuation of the final function increases relative to the instantaneous utilities. This type of asymmetry can be described neither with standard nor with variable discounting. In order to obtain time-consistent solutions, the stochastic dynamic programming equation is derived, whose solutions are Markovian equilibria. For this type of time preferences, the classical consumption and investment model (Merton, 1971) is studied for CRRA and CARA utility functions, comparing the Markovian equilibria with the time-inconsistent solutions. Finally, the introduction of a random final time is discussed.
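As a hedged formal sketch of the preference structure described above (the symbols u, F, W_T and the rates rho_1, rho_2 are generic stand-ins, not the paper's notation):

```latex
% Instantaneous utilities discounted at rate \rho_1, the final function
% at a different constant rate \rho_2.
\[
  \max_{c}\; \mathbb{E}\!\left[ \int_{0}^{T} e^{-\rho_{1} s}\, u(c_{s})\, ds
    \;+\; e^{-\rho_{2} T}\, F(W_{T}) \right],
  \qquad \rho_{1} \neq \rho_{2}.
\]
```

With rho_2 < rho_1 the final function F is discounted more lightly, so its weight relative to the flow of utilities grows as t approaches T, which is exactly the asymmetry that standard and variable discounting cannot capture.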
Abstract:
A multi-sided Böhm-Bawerk assignment game (Tejada, to appear) is a model for a multilateral market with a finite number of perfectly complementary indivisible commodities owned by different sellers, and inflexible demand and support functions. We show that for each such market game there is a unique vector of competitive prices for the commodities that is vertical syndication-proof, in the sense that, at those prices, syndication of sellers each owning a different commodity is neither beneficial nor detrimental for the buyers. Since, moreover, the benefits obtained by the agents at those prices correspond to the nucleolus of the market game, we provide a syndication-based foundation for the nucleolus as an appropriate solution concept for market games. For other solution concepts a syndicate can be disadvantageous and there is no escape from Aumann's paradox (Aumann, 1973). We further show that vertical syndication-proofness and horizontal syndication-proofness – in which sellers of the same commodity collude – are incompatible requirements under some mild assumptions. Our results build on an interesting link between multi-sided Böhm-Bawerk assignment games and bankruptcy games (O'Neill, 1982). We identify a particular subset of Böhm-Bawerk assignment games and show that it is isomorphic to the whole class of bankruptcy games. This isomorphism enables us to show the uniqueness of the vector of vertical syndication-proof prices for the whole class of Böhm-Bawerk assignment markets using well-known results on bankruptcy problems.
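Since the nucleolus of a bankruptcy game (O'Neill, 1982) coincides with the Talmud rule of Aumann and Maschler (1985), a minimal sketch of that rule gives a concrete handle on the solution concept invoked above; the claims vector and estate are illustrative.

```python
def cea(claims, estate):
    """Constrained equal awards: each i receives min(claims[i], lam)."""
    lo, hi = 0.0, max(claims)
    for _ in range(100):                      # bisection on lam
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < estate:
            lo = lam
        else:
            hi = lam
    return [min(c, lam) for c in claims]

def talmud(claims, estate):
    half = [c / 2 for c in claims]
    if estate <= sum(half):
        return cea(half, estate)              # divide the "half-claims"
    # Otherwise: half-claims plus an equal-losses split of the remainder,
    # computed as constrained equal awards on the dual problem.
    losses = cea(half, sum(claims) - estate)
    return [c - l for c, l in zip(claims, losses)]

print(talmud([100, 200, 300], 200))   # classic Talmud case -> [50, 75, 75]
```

The printed division reproduces the Talmudic estate example that motivated Aumann and Maschler's characterisation, i.e. the nucleolus allocation of the corresponding bankruptcy game.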