122 results for STRICT EQUIVALENCE
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We propose a definition of egalitarian equivalence that extends Pazner and Schmeidler's (1978) concept to environments with incomplete information. If every feasible allocation rule can be implemented by an incentive compatible mechanism (as, for instance, in the case of non-exclusive information), then interim egalitarian equivalence and interim incentive efficiency remain compatible, as they were under complete information. When incentive constraints are more restrictive, on the other hand, the two criteria may become incompatible.
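For reference, a minimal sketch (in our notation, not the paper's) of Pazner and Schmeidler's complete-information definition that the abstract extends: an allocation is egalitarian equivalent if there is a single reference bundle that every agent finds exactly as good as her own assigned bundle,

\[
x = (x_1, \dots, x_n) \ \text{is egalitarian equivalent} \iff \exists\, e \ \text{such that} \ u_i(x_i) = u_i(e) \ \text{for all } i = 1, \dots, n.
\]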
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The network revenue management (RM) problem arises in airline, hotel, media, and other industries where the products sold use multiple resources. It can be formulated as a stochastic dynamic program, but the dynamic program is computationally intractable because of an exponentially large state space, and a number of heuristics have been proposed to approximate it. Notable amongst these, both for their revenue performance and for their theoretically sound basis, are approximate dynamic programming methods that approximate the value function by basis functions (both affine functions and piecewise-linear functions have been proposed for network RM) and decomposition methods that relax the constraints of the dynamic program to solve simpler dynamic programs (such as the Lagrangian relaxation methods). In this paper we show that these two seemingly distinct approaches coincide for the network RM dynamic program, i.e., the piecewise-linear approximation method and the Lagrangian relaxation method are one and the same.
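For illustration, a minimal brute-force sketch in Python (with made-up instance data) of the exact network RM dynamic program on a toy two-leg network; with m resources of capacity C each, the state space has (C+1)^m points, which is precisely the exponential growth the approximation and decomposition methods above are designed to avoid.

    import itertools

    # Toy network RM dynamic program, solved exactly by backward induction.
    # All instance data (capacities, fares, arrival probabilities) are
    # illustrative assumptions, not taken from the paper.
    T = 30                        # booking horizon (periods)
    cap = (3, 3)                  # capacities of the two resources (legs)
    # products: (fare, resource-consumption vector A_j)
    products = [(100.0, (1, 0)),  # local itinerary on leg 1
                (110.0, (0, 1)),  # local itinerary on leg 2
                (180.0, (1, 1))]  # connecting itinerary using both legs
    p = [0.15, 0.15, 0.10]        # per-period arrival probability per product

    states = list(itertools.product(*(range(c + 1) for c in cap)))
    V = {x: 0.0 for x in states}  # terminal value V_T = 0

    for t in reversed(range(T)):
        Vnext, V = V, {}
        for x in states:
            v = (1.0 - sum(p)) * Vnext[x]         # no request arrives
            for (fare, A), prob in zip(products, p):
                y = tuple(xi - ai for xi, ai in zip(x, A))
                if min(y) >= 0:                   # capacity available:
                    v += prob * max(fare + Vnext[y], Vnext[x])  # accept or reject
                else:                             # cannot serve this request
                    v += prob * Vnext[x]
            V[x] = v

    print("optimal expected revenue:", round(V[cap], 2))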
Abstract:
Equivalence classes of normal form games are defined using the geometry of correspondences of standard equilibrium concepts like correlated, Nash, and robust equilibrium, or risk dominance and rationalizability. The resulting equivalence classes are fully characterized and compared across different equilibrium concepts for 2 x 2 games. It is argued that the procedure can lead to broad and game-theoretically meaningful distinctions of games as well as to alternative ways of viewing and testing equilibrium concepts. Larger games are also briefly considered.
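As a toy illustration in Python (our own example, much coarser than the paper's geometric construction, which uses the full equilibrium correspondences), the sketch below groups 2 x 2 games into classes by their set of pure-strategy Nash equilibria.

    import itertools

    # Group 2 x 2 games by their set of pure-strategy Nash equilibria.
    # A[i][j] is the row player's payoff, B[i][j] the column player's,
    # when row plays i and column plays j. All payoffs are illustrative.

    def pure_nash(A, B):
        eqs = []
        for i, j in itertools.product(range(2), range(2)):
            row_ok = A[i][j] >= A[1 - i][j]   # row cannot gain by deviating
            col_ok = B[i][j] >= B[i][1 - j]   # column cannot gain by deviating
            if row_ok and col_ok:
                eqs.append((i, j))
        return frozenset(eqs)

    games = {
        "prisoners dilemma": ([[3, 0], [5, 1]], [[3, 5], [0, 1]]),
        "coordination":      ([[2, 0], [0, 1]], [[2, 0], [0, 1]]),
        "matching pennies":  ([[1, -1], [-1, 1]], [[-1, 1], [1, -1]]),
    }

    classes = {}
    for name, (A, B) in games.items():
        classes.setdefault(pure_nash(A, B), []).append(name)
    for eqs, members in classes.items():
        print(sorted(eqs), "->", members)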
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
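A minimal Python sketch of the weighted log-ratio computation, assuming the usual correspondence-analysis masses (the row and column margins) as the weights; the data matrix is illustrative only.

    import numpy as np

    # Weighted log-ratio analysis: log-transform, weighted double-centering,
    # then a weighted SVD. The matrix N is a made-up positive table.
    N = np.array([[40.0, 10.0,  5.0],
                  [20.0, 30.0, 10.0],
                  [ 5.0, 15.0, 25.0]])

    P = N / N.sum()
    r = P.sum(axis=1)                     # row masses (weights)
    c = P.sum(axis=0)                     # column masses (weights)

    L = np.log(P)
    # Weighted double-centering: subtracting weighted row and column means
    # leaves only ratio information, giving subcompositional coherence.
    Y = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c

    # Weighted SVD: scale by square roots of the masses, then ordinary SVD.
    S = np.sqrt(r)[:, None] * Y * np.sqrt(c)[None, :]
    U, s, Vt = np.linalg.svd(S, full_matrices=False)

    F = (U * s) / np.sqrt(r)[:, None]     # row principal coordinates
    G = (Vt.T * s) / np.sqrt(c)[:, None]  # column principal coordinates
    print(np.round(F[:, :2], 3))          # first two dimensions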
Abstract:
A geometrical treatment of the path integral for gauge theories with first-class constraints linear in the momenta is performed. The equivalence of reduced, Polyakov, Faddeev-Popov, and Faddeev path-integral quantization of gauge theories is established. In the process of carrying this out we find a modified version of the original Faddeev-Popov formula which is derived under much more general conditions than the usual one. Throughout this paper we emphasize the fact that we only make use of the information contained in the action for the system, and of the natural geometrical structures derived from it.
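For orientation, the schematic form of the standard Faddeev-Popov construction that these equivalence results revolve around (our notation; the paper's modified formula is derived under more general conditions):

\[
1 = \Delta_{\mathrm{FP}}[A] \int \mathcal{D}g \; \delta\big(F(A^{g})\big)
\quad\Longrightarrow\quad
Z = \int \mathcal{D}A \; \Delta_{\mathrm{FP}}[A]\, \delta\big(F(A)\big)\, e^{\,iS[A]},
\]

where F is the gauge-fixing condition and A^g denotes the gauge transform of A.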
Abstract:
The equivalence between the Lagrangian and Hamiltonian formalisms is studied for constrained systems. A procedure to construct the Lagrangian constraints from the Hamiltonian constraints is given. Those Hamiltonian constraints that are first class with respect to the Hamiltonian constraints produce Lagrangian constraints that are FL-projectable.
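Schematically, and in our notation rather than the paper's, the construction pulls a Hamiltonian constraint back along the Legendre map FL:

\[
FL\colon TQ \to T^{*}Q, \qquad (q, \dot q) \mapsto \Big(q, \ \frac{\partial L}{\partial \dot q}\Big),
\qquad
\chi(q, \dot q) := (FL^{*}\varphi)(q, \dot q) = \varphi\Big(q, \ \frac{\partial L}{\partial \dot q}(q, \dot q)\Big),
\]

so each Hamiltonian constraint \(\varphi \approx 0\) induces a Lagrangian constraint \(\chi \approx 0\) on velocity space.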
Abstract:
This paper provides a systematic approach to the problem of nondata-aided symbol-timing estimation for linear modulations. The study is performed under the unconditional maximum likelihood framework, where the carrier-frequency error is included as a nuisance parameter in the mathematical derivation. The second-order moments of the received signal are found to be the sufficient statistics for the problem at hand, and they allow the provision of a robust performance in the presence of a carrier-frequency error uncertainty. We particularly focus on the exploitation of the cyclostationary property of linear modulations. This enables us to derive simple and closed-form symbol-timing estimators which are found to be based on the well-known square timing recovery method by Oerder and Meyr (OM). Finally, we generalize the OM method to the case of linear modulations with offset formats. In this case, the square-law nonlinearity is found to provide not only the symbol timing but also the carrier-phase error.
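For illustration, a minimal Python sketch of the Oerder and Meyr square timing recovery estimator on a synthetic burst (the signal model and all numbers are our own assumptions): the square-law nonlinearity |x|^2 produces a spectral line at the symbol rate whose phase carries the timing offset.

    import numpy as np

    def oerder_meyr(x, N):
        # Normalized timing estimate tau/T from the phase of the Fourier
        # coefficient of |x|^2 at the symbol rate (N samples per symbol).
        k = np.arange(len(x))
        X = np.sum(np.abs(x) ** 2 * np.exp(-2j * np.pi * k / N))
        return -np.angle(X) / (2 * np.pi)

    rng = np.random.default_rng(0)
    N = 8                                             # samples per symbol
    syms = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, 500))  # QPSK symbols
    pulse = np.sin(np.pi * (np.arange(N) + 0.5) / N)  # half-sine pulse
    x = (syms[:, None] * pulse[None, :]).ravel()      # pulse-shaped burst

    d = 3                                             # delay by 3 samples
    shift = (oerder_meyr(np.roll(x, d), N) - oerder_meyr(x, N)) % 1
    print("estimated shift:", round(shift, 3), "expected:", d / N)  # 0.375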
Abstract:
We describe an equivalence of categories between the category of mixed Hodge structures and a category of vector bundles on the toric complex projective plane that satisfy a certain semistability condition. We then apply this correspondence to define an invariant which generalises the notion of R-split mixed Hodge structure, and we compute extensions in the category of mixed Hodge structures in terms of extensions of the corresponding vector bundles. We also give a relative version of this correspondence and apply it to define stratifications of the bases of variations of mixed Hodge structure.
Abstract:
The aim of this paper is to find normative foundations of Approval Voting. In order to show that Approval Voting is the only social choice function that satisfies anonymity, neutrality, strategy-proofness and strict monotonicity, we rely on an intermediate result which relates strategy-proofness of a social choice function to the properties of Independence of Irrelevant Alternatives and monotonicity of the corresponding social welfare function. Afterwards we characterize Approval Voting by means of strict symmetry, neutrality and strict monotonicity and relate this result to May's Theorem. Finally, we show that in the second characterization it is possible to substitute the property of strict monotonicity by that of efficiency.
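For reference, a minimal statement (our notation) of the rule being characterized: each voter i submits a set B_i of approved alternatives, and Approval Voting selects the alternatives with the largest number of approvals,

\[
AV(B_1, \dots, B_n) = \operatorname*{arg\,max}_{x \in X} \big| \{\, i : x \in B_i \,\} \big| .
\]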
Abstract:
This paper aims at assessing the importance of the initial technological endowments when firms decide to establish a technological agreement. We propose a Bertrand duopoly model where firms evaluate the advantages they can get from the agreement according to its length. Allowing them to exploit a learning process, we establish a strict connection between the starting point and the final result. Moreover, when learning is modelled as an iterative process, the set of initial conditions that lead to successful ventures switches from a continuum of values to a Cantor set.
Abstract:
Income distribution in Spain experienced a substantial improvement towards equalisation during the second half of the seventies and the eighties, a period during which most OECD countries experienced the opposite trend. In spite of the many recent papers on the Spanish income distribution, the period covered by those studies stops in 1990. The aim of this paper is to extend the analysis to 1996 employing the same methodology and the same data set (ECPF). Our results not only corroborate the trend of decreasing inequality found by others during the second half of the eighties, but also suggest that this trend extends over the first half of the nineties. We also show that our main conclusions are robust to changes in the equivalence scale, to changes in the definition of income, and to potential data contamination. Finally, we analyse some of the causes which may be driving the overall picture of income inequality using two decomposition techniques. From these analyses, three variables emerge as the main factors responsible for the observed improvement in the income distribution: education, household composition and the socioeconomic situation of the household head.
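As a concrete example of the equivalence-scale variation involved in such robustness checks, one standard parametric family (not necessarily the one used in the paper) divides household income by a power of household size,

\[
y_h = \frac{Y_h}{s_h^{\theta}}, \qquad \theta \in [0, 1],
\]

where Y_h is total household income, s_h is household size, and \(\theta\) interpolates between no adjustment (\(\theta = 0\)) and per-capita income (\(\theta = 1\)).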