127 results for Equivalence-preserving
Abstract:
We characterize the Walrasian allocations correspondence by means of four axioms: consistency, replica invariance, individual rationality and Pareto optimality. It is shown that for any given class of exchange economies, any solution that satisfies the axioms is a selection from the Walrasian allocations with slack. Preferences are assumed to be smooth, but may be satiated and non-convex. A class of economies is defined as all economies whose agents' preferences belong to an arbitrary family (finite or infinite) of types. The result can be modified to characterize equal budget Walrasian allocations with slack by replacing individual rationality with individual rationality from equal division. The results are valid also for classes of economies in which core-Walras equivalence does not hold.
Abstract:
We investigate identifiability issues in DSGE models and their consequences for parameter estimation and model evaluation when the objective function measures the distance between estimated and model impulse responses. We show that observational equivalence, partial identification and weak identification problems are widespread, that they lead to biased estimates and unreliable t-statistics, and that they may induce investigators to select false models. We examine whether different objective functions affect identification and study how small samples interact with parameter and shock identification. We provide diagnostics and tests to detect identification failures and apply them to a state-of-the-art model.
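As a minimal sketch of the kind of objective the abstract refers to (not the paper's actual estimator), the snippet below stacks model-minus-data impulse responses into a quadratic distance and minimizes it over the structural parameters. The toy response function and all names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def irf_distance(theta, irf_data, model_irf, W=None):
    # Stack the gap between model and estimated impulse responses and
    # form the quadratic distance that the estimator minimizes.
    gap = (model_irf(theta) - irf_data).ravel()
    W = np.eye(gap.size) if W is None else W
    return gap @ W @ gap

# Toy response function: irf(h) = theta1 * theta2**h. At horizon 0 only
# theta1 enters, so theta2 is identified only through later horizons;
# this gives a flavour of the partial-identification issues diagnosed.
horizons = np.arange(10)
model_irf = lambda th: (th[0] * th[1] ** horizons)[:, None]
rng = np.random.default_rng(0)
irf_data = model_irf([1.0, 0.8]) + 0.01 * rng.standard_normal((10, 1))

fit = minimize(irf_distance, x0=[0.5, 0.5], args=(irf_data, model_irf))
print(fit.x)  # point estimates of (theta1, theta2)
```

A nearly flat objective in some direction of the parameter space is the practical symptom of weak identification: very different parameter values yield almost the same distance.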
Abstract:
This paper establishes a general framework for metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables while preserving all the good properties of principal-axis methods such as principal component analysis and correspondence analysis: being based on the singular-value decomposition, it retains the decomposition of variance into components along principal axes, which provides the numerical diagnostics known as contributions. The idea is inspired by the chi-square distance in correspondence analysis, which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters that are estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path in decomposing the matrix and displaying its rows and columns in biplots.
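To illustrate the weight-estimation idea (a sketch under simplifying assumptions, not the paper's procedure): when the target distances happen to be weighted Euclidean themselves, the squared distances are linear in the variable weights, so nonnegative least squares recovers them. All names below are illustrative.

```python
import numpy as np
from scipy.optimize import nnls
from scipy.spatial.distance import pdist

def fit_variable_weights(X, target_dist):
    # d_ij(w)^2 = sum_k w_k (x_ik - x_jk)^2 is linear in w, so matching
    # squared target distances is a nonnegative least-squares problem.
    n = X.shape[0]
    rows = np.array([(X[i] - X[j]) ** 2
                     for i in range(n) for j in range(i + 1, n)])
    w, _ = nnls(rows, target_dist ** 2)
    return w

# Hypothetical use: recover weights that reproduce a given distance set.
rng = np.random.default_rng(1)
X = rng.standard_normal((20, 4))
true_w = np.array([1.0, 0.5, 2.0, 0.0])
target = pdist(X * np.sqrt(true_w))  # weighted Euclidean distances
print(fit_variable_weights(X, target).round(3))  # ~ [1.0, 0.5, 2.0, 0.0]
```

For general dissimilarities the fit is no longer exact and an iterative scheme is needed; this sketch only shows why the weights are estimable from the distances at all.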
Abstract:
We present a model of price discrimination where a monopolist faces a consumer who is privately informed about the distribution of his valuation for an indivisible unit of a good but has yet to learn privately the actual valuation. The monopolist sequentially screens the consumer with a menu of contracts: the consumer self-selects once by choosing a contract and then self-selects again when he learns the actual valuation. A deterministic sequential mechanism is a menu of refund contracts, each consisting of an advance payment and a refund amount in case of no consumption, but sequential mechanisms may involve randomization. We characterize the optimal sequential mechanism when some consumer types are more eager in the sense of first-order stochastic dominance, and when some types face greater valuation uncertainty in the sense of mean-preserving spread. We show that it can be optimal to subsidize consumer types with smaller valuation uncertainty (through low refund, as in airplane ticket pricing) in order to reduce the rent to those with greater uncertainty. The size of distortion depends both on the type distribution and on how informative the consumer's initial private knowledge is about his valuation, but not on how much he initially knows about the valuation per se.
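A minimal sketch of the refund-contract payoff implied by the abstract's description, with hypothetical valuation distributions: the consumer pays an advance `a`, learns the valuation `v`, and consumes iff `v >= r`, so the ex-ante payoff is E[max(v, r)] - a. The numbers only illustrate why the type facing a mean-preserving spread earns a rent.

```python
import numpy as np

def refund_contract_payoff(a, r, valuations):
    # Pay `a` up front; after learning v, consuming yields v while
    # returning the good yields the refund r, so the consumer consumes
    # iff v >= r and the ex-ante payoff is E[max(v, r)] - a.
    return np.maximum(valuations, r).mean() - a

# Two illustrative types with the same mean valuation (1.0); the second
# is a mean-preserving spread of the first.
rng = np.random.default_rng(0)
safe  = rng.uniform(0.8, 1.2, 100_000)
risky = rng.uniform(0.0, 2.0, 100_000)
a, r = 1.0, 0.5
print(refund_contract_payoff(a, r, safe))   # ~0: refund option never binds
print(refund_contract_payoff(a, r, risky))  # >0: the spread creates a rent
```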
Abstract:
Two finite extensive-form games are empirically equivalent when the empirical distribution on action profiles generated by every behavior strategy in one can also be generated by an appropriately chosen behavior strategy in the other. This paper provides a characterization of empirical equivalence. The central idea is to relate a game's information structure to the conditional independencies in the empirical distributions it generates. We present a new analytical device, the influence opportunity diagram of a game, describe how such a diagram is constructed for a given extensive-form game, and demonstrate that it provides a complete summary of the information needed to test empirical equivalence between two games.
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association - the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else - associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Abstract:
The present paper revisits a property embedded in most dynamic macroeconomic models: the stationarity of hours worked. First, I argue that, contrary to what is often believed, there are many reasons why hours could be nonstationary in those models, while preserving the property of balanced growth. Second, I show that the postwar evidence for most industrialized economies is clearly at odds with the assumption of stationary hours per capita. Third, I examine the implications of that evidence for the role of technology as a source of economic fluctuations in the G7 countries.
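As a hedged illustration of how the stationarity of hours might be examined (not the paper's exact procedure), one could run an augmented Dickey-Fuller test on a postwar series of (log) hours per capita. The series below is a synthetic stand-in.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Hypothetical series standing in for log hours worked per capita;
# replace with an actual postwar quarterly series for a G7 country.
rng = np.random.default_rng(0)
hours = np.cumsum(rng.standard_normal(200)) * 0.01  # random walk, for illustration

stat, pvalue, *_ = adfuller(hours)
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# Failing to reject the unit root is consistent with nonstationary hours.
```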
Abstract:
The paper explores an efficiency hypothesis regarding the contractual process between large retailers, such as Wal-Mart and Carrefour, and their suppliers. The empirical evidence presented supports the idea that large retailers play a quasi-judicial role, acting as "courts of first instance" in their relationships with suppliers. In this role, large retailers adjust the terms of trade to ongoing changes and sanction performance failures, sometimes by delaying payments. Potential abuse of this position is limited by the need for re-contracting and by the retailers' interest in preserving their reputations. Suppliers renew their confidence in their retailers on a yearly basis by writing new contracts. This renewal contradicts the alternative hypothesis that suppliers are expropriated by large retailers as a consequence of specific investments.
Abstract:
We show the equivalence between the use of correspondence analysis (CA) of concatenated tables and the application of a particular version of conjoint analysis called categorical conjoint measurement (CCM). The connection is established using canonical correlation (CC). The second part introduces interaction effects into all three variants of the analysis and shows how to pass between the results of each analysis.
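A rough sketch of the canonical-correlation bridge the abstract mentions, under the assumption that both sides are indicator-coded categorical tables; the data and variable names are hypothetical, and the snippet shows only the mechanics, not the CA/CCM correspondence itself.

```python
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import CCA

# Hypothetical categorical data: two attributes and a categorical response.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "brand":  rng.choice(["A", "B", "C"], 300),
    "price":  rng.choice(["low", "high"], 300),
    "rating": rng.choice(["1", "2", "3"], 300),
})
# Indicator (dummy) coding, dropping one column per variable to avoid
# the exact collinearity of full indicator matrices.
X = pd.get_dummies(df[["brand", "price"]], drop_first=True).to_numpy(float)
Y = pd.get_dummies(df["rating"], drop_first=True).to_numpy(float)

cca = CCA(n_components=2).fit(X, Y)
U, V = cca.transform(X, Y)
# Canonical correlations between the paired variates (near zero here,
# since the toy data are independent by construction).
print([round(float(np.corrcoef(U[:, k], V[:, k])[0, 1]), 3) for k in range(2)])
```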
Abstract:
We construct a weighted Euclidean distance that approximates any distance or dissimilarity measure between individuals that is based on a rectangular cases-by-variables data matrix. In contrast to regular multidimensional scaling methods for dissimilarity data, the method leads to biplots of individuals and variables while preserving all the good properties of dimension-reduction methods that are based on the singular-value decomposition. The main benefits are the decomposition of variance into components along principal axes, which provide the numerical diagnostics known as contributions, and the estimation of nonnegative weights for each variable. The idea is inspired by the distance functions used in correspondence analysis and in principal component analysis of standardized data, where the normalizations inherent in the distances can be considered as differential weighting of the variables. In weighted Euclidean biplots we allow these weights to be unknown parameters, which are estimated from the data to maximize the fit to the chosen distances or dissimilarities. These weights are estimated using a majorization algorithm. Once this extra weight-estimation step is accomplished, the procedure follows the classical path in decomposing the matrix and displaying its rows and columns in biplots.
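Once the variable weights are available, the "classical path" the abstract describes can be sketched as follows (an illustrative sketch, not the paper's code, and the majorization step for estimating the weights is not reproduced): scale the columns by the square roots of the weights, center, take the SVD, and read off row and column coordinates for the biplot together with the variance shares behind the "contributions" diagnostics.

```python
import numpy as np

def weighted_euclidean_biplot(X, w, ndim=2):
    # Scale columns by sqrt(w) so plain Euclidean distance on Z equals
    # the weighted Euclidean distance on X, then center and decompose.
    Z = (X - X.mean(axis=0)) * np.sqrt(w)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    rows = U[:, :ndim] * s[:ndim]               # rows: principal coordinates
    cols = Vt[:ndim].T                          # columns: standard coordinates
    explained = s[:ndim] ** 2 / (s ** 2).sum()  # variance along displayed axes
    return rows, cols, explained

# Hypothetical use with equal weights:
rng = np.random.default_rng(2)
X = rng.standard_normal((30, 5))
rows, cols, expl = weighted_euclidean_biplot(X, np.ones(5))
print(expl)  # share of (weighted) variance along each displayed axis
```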
Abstract:
Starting from a number of stumbling blocks encountered in translating Oscar Wilde's An Ideal Husband into Catalan (here called 'anecdotes'), the paper observes and analyzes certain theoretical concepts of translation and some general considerations on the specificity of theatre translation (called 'categories'), such as functionality, performability, fluency, visibility, equivalence, rewriting, fidelity, adequacy, acceptability and foreignization. The consideration of these values handled by translation theory enters here into dialogue with the vicissitudes of the practice of drama translation, a dialogue that yields a reaffirmation of some of these values and, conversely, evidence of the scant productivity (at least as universal claims) of some others.
Abstract:
This article presents a geometric proposal that can be applied to bar structures. The starting point is the substitution of the usual knots in a structural web by a system of combining the bars two by two, which is achieved by twisting the bars in each knot. The tensile forces that appear and the introduction of joints in each of these knots allow the transition from a rigid or undeformable geometry to a new "flexible" one, leading to the possibility of one and the same structural web adopting different sizes while preserving its original geometric form.
Abstract:
We present a rule-based, Huet-style anti-unification algorithm for simply-typed lambda-terms in η-long β-normal form, which computes a least general higher-order pattern generalization. For a pair of arbitrary terms of the same type, such a generalization always exists and is unique modulo α-equivalence and variable renaming. The algorithm computes it in cubic time within linear space. It has been implemented and the code is freely available.
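For intuition only, here is a first-order anti-unification sketch in the Plotkin style; it is a deliberately simplified stand-in for the paper's higher-order pattern algorithm, but it shows the least-general-generalization idea, including the reuse of one variable per distinct disagreement pair.

```python
def lgg(t1, t2, subst=None):
    # Least general generalization of two first-order terms.
    # Terms: ('f', arg1, ...) for applications, strings for constants.
    subst = {} if subst is None else subst
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # Same head and arity: generalize argumentwise.
        return (t1[0],) + tuple(lgg(a, b, subst) for a, b in zip(t1[1:], t2[1:]))
    # Disagreement: reuse one variable per distinct pair of subterms,
    # which is exactly what keeps the generalization *least* general.
    if (t1, t2) not in subst:
        subst[(t1, t2)] = f"X{len(subst)}"
    return subst[(t1, t2)]

# lgg(f(a, a), f(b, b)) = f(X0, X0): the same variable is reused.
print(lgg(("f", "a", "a"), ("f", "b", "b")))  # ('f', 'X0', 'X0')
```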
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas as in Paterson and Wegman's first-order unification algorithm. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
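A small sketch of the nominal machinery the abstract builds on (not the paper's quadratic-time decision procedure): atom swappings acting on terms, and α-equivalence checked by swapping binders with an atom assumed fresh.

```python
from itertools import count

_fresh = (f"c{i}" for i in count())  # assumed not to clash with user atoms

def swap(a, b, t):
    # Apply the atom swapping (a b): the permutation action on terms.
    if t == a: return b
    if t == b: return a
    if isinstance(t, tuple):
        return tuple(swap(a, b, s) for s in t)
    return t

def alpha_eq(t1, t2):
    # Equality modulo alpha via swapping binders with a fresh atom,
    # the permutation-and-freshness view that nominal unification uses.
    if (isinstance(t1, tuple) and t1 and t1[0] == "lam"
            and isinstance(t2, tuple) and t2 and t2[0] == "lam"):
        c = next(_fresh)
        return alpha_eq(swap(t1[1], c, t1[2]), swap(t2[1], c, t2[2]))
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        return all(alpha_eq(s1, s2) for s1, s2 in zip(t1, t2))
    return t1 == t2

# lam x. f x x  is alpha-equivalent to  lam y. f y y
print(alpha_eq(("lam", "x", ("f", "x", "x")),
               ("lam", "y", ("f", "y", "y"))))  # True
```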
Abstract:
TER is a company dedicated to the design and construction of electronics projects. The need to verify the behaviour of its designs motivated a project capable of collecting significant data of different kinds, such as pressure, voltage, current, temperature, etc. On the market, the two most common ways of collecting such data are sensors that provide an equivalence between a physical parameter and a voltage range (0 to 10 V) or a current range (4 to 20 mA). These data are acquired and processed periodically by a microcontroller that stores them one by one, to be displayed later on an LCD or in a Visual Basic program capable of generating a document that saves the data in Excel. In conclusion, the objectives, both the personal ones and those proposed, were achieved, resulting in a functional prototype.
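As an illustrative sketch of the 4-20 mA equivalence described above (function name and ranges are hypothetical), the standard linear conversion from a current-loop reading to a physical quantity looks like this:

```python
def scale_reading(raw_ma, lo=4.0, hi=20.0, unit_min=0.0, unit_max=10.0):
    # Map a 4-20 mA current-loop reading linearly onto a physical range:
    # 4 mA -> unit_min and 20 mA -> unit_max (bar, volts, degrees C, ...).
    if not lo <= raw_ma <= hi:
        raise ValueError("reading outside the loop range; sensor fault?")
    return unit_min + (raw_ma - lo) * (unit_max - unit_min) / (hi - lo)

# Hypothetical pressure sensor: 4-20 mA over 0-10 bar.
print(scale_reading(12.0, unit_min=0.0, unit_max=10.0))  # 5.0 bar
```

The same formula with lo=0.0 and hi=10.0 covers the 0-10 V variant; one advantage of the 4-20 mA convention is that a reading of 0 mA unambiguously signals a broken loop rather than a zero measurement.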