105 results for distributional equivalence

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 100.00%

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
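As a rough illustration of the kind of computation this abstract describes, the sketch below performs a weighted log-ratio decomposition in Python: the log-transformed table is double-centred with the row and column masses as weights and then decomposed by an SVD. The data, function name, and exact scaling conventions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def weighted_logratio_analysis(N):
    """Weighted double-centring of log(N) followed by an SVD.

    The row/column weighting (by the table margins) is what restores
    distributional equivalence in the log-ratio framework.
    """
    N = np.asarray(N, dtype=float)
    P = N / N.sum()                    # correspondence matrix
    r = P.sum(axis=1)                  # row masses used as weights
    c = P.sum(axis=0)                  # column masses used as weights

    L = np.log(N)
    L = L - (L @ c)[:, None]           # remove weighted row means
    L = L - (r @ L)[None, :]           # remove weighted column means

    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)

    F = (U / np.sqrt(r)[:, None]) * sv     # row principal coordinates
    G = (Vt.T / np.sqrt(c)[:, None]) * sv  # column principal coordinates
    return F, G, sv

# Tiny positive 3x4 table (illustrative data only).
N = [[10, 20, 30, 40],
     [15, 25, 10, 50],
     [30, 10, 20, 40]]
F, G, sv = weighted_logratio_analysis(N)
print(sv)          # singular values
print(F[:, :2])    # first two row coordinates, for a planar map
```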

Relevance: 60.00%

Abstract:

Starting with logratio biplots for compositional data, which are based on the principle of subcompositional coherence, and then adding weights, as in correspondence analysis, we rediscover Lewi's spectral map and many connections to analyses of two-way tables of non-negative data. Thanks to the weighting, the method also achieves the property of distributional equivalence.

Relevance: 60.00%

Abstract:

We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
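For comparison, here is a short, self-contained sketch of classical correspondence analysis (an SVD of the standardized residuals) together with a numerical check of distributional equivalence: merging two columns whose conditional proportions are identical leaves the inter-row distances unchanged. The table is an illustrative assumption, not data from the paper.

```python
import numpy as np

def correspondence_analysis(N):
    """Classical CA: SVD of D_r^{-1/2} (P - r c^T) D_c^{-1/2}."""
    P = np.asarray(N, float) / np.sum(N)
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    F = (U / np.sqrt(r)[:, None]) * sv   # row principal coordinates
    return F, sv

def row_distances(F):
    return np.linalg.norm(F[:, None, :] - F[None, :, :], axis=-1)

# Columns 1 and 2 below have identical conditional proportions
# (column 2 is exactly twice column 1).
N = np.array([[10.,  4.,  8., 20.],
              [30.,  2.,  4., 10.],
              [20.,  6., 12., 15.]])
F_full, _ = correspondence_analysis(N)

# Merge the two equivalent columns and redo the analysis.
N_merged = np.column_stack([N[:, 0], N[:, 1] + N[:, 2], N[:, 3]])
F_merged, _ = correspondence_analysis(N_merged)

# Distributional equivalence: the row geometry is unchanged.
print(np.allclose(row_distances(F_full[:, :2]), row_distances(F_merged[:, :2])))
```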

Relevance: 20.00%

Abstract:

We propose a definition of egalitarian equivalence that extends Pazner and Schmeidler's (1978) concept to environments with incomplete information. If every feasible allocation rule can be implemented by an incentive compatible mechanism (as, for instance, in the case of non-exclusive information), then interim egalitarian equivalence and interim incentive efficiency remain compatible, as they were under complete information. When incentive constraints are more restrictive, on the other hand, the two criteria may become incompatible.

Relevance: 20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance: 20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt"

Relevance: 20.00%

Abstract:

Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below detection limit (non-detects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above detection limit and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
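The general workflow described here (impute the non-detects, map the compositions to real space with an isometric logratio transform, then bootstrap a parameter there) can be sketched as follows. The synthetic data, detection limit, and simple substitution rule are illustrative assumptions only; the paper itself argues for imputing from the distribution fitted to the readings above the detection limit.

```python
import numpy as np

rng = np.random.default_rng(0)

def closure(X):
    """Rescale each row of a positive array so that it sums to 1."""
    X = np.asarray(X, float)
    return X / X.sum(axis=1, keepdims=True)

def ilr(X):
    """Isometric logratio transform (pivot-coordinate basis)."""
    X = np.asarray(X, float)
    n, D = X.shape
    Z = np.empty((n, D - 1))
    for i in range(1, D):
        gmean = np.exp(np.log(X[:, :i]).mean(axis=1))  # geometric mean of first i parts
        Z[:, i - 1] = np.sqrt(i / (i + 1)) * np.log(gmean / X[:, i])
    return Z

# Synthetic 3-part compositions with a detection limit on the first part.
X = closure(rng.lognormal(mean=[0.0, 1.0, 2.0], sigma=0.5, size=(200, 3)))
detection_limit = 0.05
below = X[:, 0] < detection_limit

# Naive substitution of non-detects (0.65 * detection limit) and re-closure;
# fitting a distribution to the uncensored readings, as the paper recommends,
# would replace this step.
X_imp = X.copy()
X_imp[below, 0] = 0.65 * detection_limit
X_imp = closure(X_imp)

# Bootstrap the mean in ilr space.
Z = ilr(X_imp)
boot = np.array([Z[rng.integers(0, len(Z), len(Z))].mean(axis=0)
                 for _ in range(1000)])
print("ilr mean:        ", Z.mean(axis=0))
print("bootstrap 95% CI:", np.percentile(boot, [2.5, 97.5], axis=0))
```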

Relevance: 20.00%

Abstract:

The network revenue management (RM) problem arises in airline, hotel, media, and other industries where the products sold use multiple resources. It can be formulated as a stochastic dynamic program, but the dynamic program is computationally intractable because of an exponentially large state space, and a number of heuristics have been proposed to approximate it. Notable amongst these, both for their revenue performance and their theoretically sound basis, are approximate dynamic programming methods that approximate the value function by basis functions (both affine functions and piecewise-linear functions have been proposed for network RM) and decomposition methods that relax the constraints of the dynamic program to solve simpler dynamic programs (such as the Lagrangian relaxation methods). In this paper we show that these two seemingly distinct approaches coincide for the network RM dynamic program, i.e., the piecewise-linear approximation method and the Lagrangian relaxation method are one and the same.
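In a commonly used notation for network RM (an assumption; the paper's own notation may differ), the dynamic program and the two value-function approximations the abstract refers to look roughly as follows.

```latex
% State x: vector of remaining capacities; product j uses resources A_j,
% earns fare r_j and arrives with probability \lambda_j in period t
% (acceptance is only feasible when x \ge A_j).
V_t(x) \;=\; \sum_j \lambda_j \max\bigl\{\, r_j + V_{t+1}(x - A_j),\; V_{t+1}(x) \,\bigr\}
        \;+\; \Bigl(1 - \sum_j \lambda_j\Bigr) V_{t+1}(x)

% Affine approximation: one slope per resource i,
V_t(x) \;\approx\; \theta_t + \sum_i \bar v_{t,i}\, x_i

% Separable piecewise-linear approximation: one concave function per resource,
% which the paper shows to coincide with the Lagrangian relaxation method,
V_t(x) \;\approx\; \theta_t + \sum_i v_{t,i}(x_i)
```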

Relevance: 20.00%

Abstract:

Equivalence classes of normal form games are defined using the geometry of correspondences of standard equilibrium concepts like correlated, Nash, and robust equilibrium or risk dominance and rationalizability. The resulting equivalence classes are fully characterized and compared across different equilibrium concepts for 2 x 2 games. It is argued that the procedure can lead to broad and game-theoretically meaningful distinctions of games as well as to alternative ways of viewing and testing equilibrium concepts. Larger games are also briefly considered.
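As a toy illustration of the objects being compared (not the paper's construction), the snippet below enumerates the pure-strategy Nash equilibria of a 2 x 2 bimatrix game; the stag-hunt payoffs are an arbitrary example.

```python
import numpy as np

def pure_nash_2x2(A, B):
    """Return all pure-strategy Nash equilibria (i, j) of the bimatrix game
    with row-player payoffs A and column-player payoffs B (both 2x2)."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    eqs = []
    for i in range(2):
        for j in range(2):
            row_ok = A[i, j] >= A[1 - i, j]   # no profitable row deviation
            col_ok = B[i, j] >= B[i, 1 - j]   # no profitable column deviation
            if row_ok and col_ok:
                eqs.append((i, j))
    return eqs

# Stag hunt: two pure equilibria, (0, 0) payoff-dominant, (1, 1) risk-dominant.
A = [[4, 0],
     [3, 3]]
B = [[4, 3],
     [0, 3]]
print(pure_nash_2x2(A, B))   # [(0, 0), (1, 1)]
```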

Relevance: 20.00%

Abstract:

A geometrical treatment of the path integral for gauge theories with first-class constraints linear in the momenta is performed. The equivalence of reduced, Polyakov, Faddeev-Popov, and Faddeev path-integral quantization of gauge theories is established. In the process of carrying this out we find a modified version of the original Faddeev-Popov formula which is derived under much more general conditions than the usual one. Throughout this paper we emphasize the fact that we only make use of the information contained in the action for the system, and of the natural geometrical structures derived from it.
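For orientation, the textbook form of the Faddeev-Popov insertion reads as below; the paper derives a modified version under more general conditions, so this is only the familiar special case, in standard notation.

```latex
% G is the gauge-fixing function, A^{\alpha} the gauge transform of A by \alpha.
1 \;=\; \Delta_{\mathrm{FP}}[A] \int \mathcal{D}\alpha \;\delta\!\bigl(G(A^{\alpha})\bigr),
\qquad
\Delta_{\mathrm{FP}}[A] \;=\; \det\!\left.\left(\frac{\delta G(A^{\alpha})}{\delta \alpha}\right)\right|_{G=0}

% Inserting this identity into the naive path integral and factoring out the
% (infinite) gauge-group volume gives the gauge-fixed partition function
Z \;=\; \int \mathcal{D}A \; e^{\,i S[A]}\, \Delta_{\mathrm{FP}}[A]\; \delta\!\bigl(G(A)\bigr)
```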

Relevance: 20.00%

Abstract:

The equivalence between the Lagrangian and Hamiltonian formalism is studied for constraint systems. A procedure to construct the Lagrangian constraints from the Hamiltonian constraints is given. Those Hamiltonian constraints that are first class with respect to the Hamiltonian constraints produce Lagrangian constraints that are FL-projectable.
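In standard notation for singular Lagrangian systems (an assumption about notation only, not the paper's precise construction), the basic objects mentioned in the abstract are the Legendre map and FL-projectability:

```latex
% The Legendre map from velocity space to phase space:
FL : TQ \to T^{*}Q, \qquad (q, \dot q) \;\longmapsto\; \Bigl(q,\; p = \frac{\partial L}{\partial \dot q}\Bigr)

% For a singular Lagrangian, FL is not onto; its image is described by the
% primary Hamiltonian constraints \phi_{\mu}(q, p) = 0.

% A function f on TQ is FL-projectable when it is constant along the fibres
% of FL, i.e. when f = FL^{*} g for some function g on T^{*}Q.
```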

Relevance: 20.00%

Abstract:

This paper provides a systematic approach to the problem of non-data-aided symbol-timing estimation for linear modulations. The study is performed under the unconditional maximum likelihood framework, where the carrier-frequency error is included as a nuisance parameter in the mathematical derivation. The second-order moments of the received signal are found to be the sufficient statistics for the problem at hand, and they allow the provision of robust performance in the presence of a carrier-frequency error uncertainty. We particularly focus on the exploitation of the cyclostationary property of linear modulations. This enables us to derive simple and closed-form symbol-timing estimators which are found to be based on the well-known square timing recovery method by Oerder and Meyr (O&M). Finally, we generalize the O&M method to the case of linear modulations with offset formats. In this case, the square-law nonlinearity is found to provide not only the symbol timing but also the carrier-phase error.
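A minimal sketch of the Oerder and Meyr square timing recovery estimator that the abstract builds on is given below; the pulse shape, oversampling factor, and noise level are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def om_timing_estimate(x, sps):
    """Oerder & Meyr square timing recovery: estimate the fractional symbol-
    timing offset (in symbols, wrapped to [-0.5, 0.5)) from a baseband signal
    oversampled at sps samples per symbol (sps >= 4)."""
    n = np.arange(len(x))
    # The square-law nonlinearity produces a spectral line at the symbol rate;
    # its phase encodes the timing.
    X = np.sum(np.abs(x) ** 2 * np.exp(-2j * np.pi * n / sps))
    return -np.angle(X) / (2.0 * np.pi)

# Toy illustration: BPSK with a triangular pulse (any pulse with excess
# bandwidth works) and a known timing offset of 0.25 symbols.
rng = np.random.default_rng(1)
sps, nsym = 8, 4000
symbols = rng.choice([-1.0, 1.0], nsym)
upsampled = np.zeros(nsym * sps)
upsampled[::sps] = symbols
pulse = np.bartlett(2 * sps + 1)                # triangular pulse, 2 symbols wide
x = np.convolve(upsampled, pulse, mode="same")  # pulse peaks at the symbol instants
x = np.roll(x, int(0.25 * sps))                 # introduce the timing offset
x += 0.05 * rng.standard_normal(len(x))         # mild noise

print(om_timing_estimate(x, sps))               # approximately 0.25
```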

Relevance: 10.00%

Abstract:

We describe an equivalence of categories between the category of mixed Hodge structures and a category of vector bundles on the toric complex projective plane which satisfy a certain semistability condition. We then apply this correspondence to define an invariant which generalises the notion of an R-split mixed Hodge structure, and compute extensions in the category of mixed Hodge structures in terms of extensions of the corresponding vector bundles. We also give a relative version of this correspondence and apply it to define stratifications of the bases of variations of mixed Hodge structure.