104 results for Finite Simple Groups
Abstract:
Hierarchical clustering is a popular method for finding structure in multivariate data, resulting in a binary tree constructed on the particular objects of the study, usually sampling units. The user faces the decision of where to cut the binary tree in order to determine the number of clusters to interpret, and there are various ad hoc rules for arriving at a decision. A simple permutation test is presented that diagnoses whether non-random levels of clustering are present in the set of objects and, if so, indicates the specific level at which the tree can be cut. The test is validated against random matrices to verify the type I error probability, and a power study is performed on data sets with known clusteredness to study the type II error.
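As an illustration of the idea, here is a minimal sketch of one such permutation diagnostic (an illustrative construction with hypothetical function names, not necessarily the authors' exact procedure): each column of the data is permuted independently to destroy any multivariate structure, and the gap between successive fusion heights of the tree is compared with its permutation distribution.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

def cluster_gap_pvalues(X, k_max=6, n_perm=499, method="ward", seed=0):
    """Illustrative permutation diagnostic for non-random clustering.

    For each candidate number of clusters k, the statistic is the gap
    between the fusion height just above the k-cluster cut and the one
    just below it. Columns of X are permuted independently to generate
    the reference distribution under "no multivariate structure".
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]

    def gaps(data):
        h = linkage(data, method=method)[:, 2]          # the n-1 fusion heights
        return np.array([h[n - k] - h[n - k - 1] for k in range(2, k_max + 1)])

    obs = gaps(X)
    exceed = np.zeros_like(obs)
    for _ in range(n_perm):
        Xp = np.column_stack([rng.permutation(col) for col in X.T])
        exceed += gaps(Xp) >= obs                        # permuted gap at least as large?
    # small p-value at level k suggests an unusually clear k-cluster cut
    return {k: p for k, p in zip(range(2, k_max + 1), (exceed + 1) / (n_perm + 1))}

# Two well-separated groups: the p-value for k = 2 should be small.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(6, 1, (20, 4))])
print(cluster_gap_pvalues(X))
```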
Abstract:
We introduce several exact nonparametric tests for finite sample multivariate linear regressions, and compare their powers. This fills an important gap in the literature, where the only known nonparametric tests are either asymptotic or assume one covariate only.
Abstract:
Working Paper no longer available. Please contact the author.
Abstract:
Small sample properties are of fundamental interest when only limited data is available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses.
Abstract:
We will call a game a reachable (pure strategy) equilibria game if, starting from any strategy by any player, we are able to reach a (pure strategy) equilibrium by a sequence of best-response moves. We give a characterization of all finite strategy space duopolies with reachable equilibria. We describe some applications of the sufficient conditions of the characterization.
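A brute-force check of this reachability property for a small two-player game might look like the following sketch (my own construction and function names; it follows one fixed alternating update order, so it is only a sufficient check and not the paper's characterization):

```python
import itertools
import numpy as np

def best_response(payoff, profile, player):
    """Return a best-response action for `player` against the other's action."""
    other = profile[1 - player]
    values = payoff[0][:, other] if player == 0 else payoff[1][profile[0], :]
    return int(np.argmax(values))

def reaches_pure_equilibrium(payoff_a, payoff_b, max_steps=100):
    """Check whether alternating best-response dynamics reach a pure Nash
    equilibrium from every starting profile of a finite two-player game."""
    payoff = (np.asarray(payoff_a), np.asarray(payoff_b))
    n_a, n_b = payoff[0].shape
    for start in itertools.product(range(n_a), range(n_b)):
        profile = list(start)
        for step in range(max_steps):
            player = step % 2
            br = best_response(payoff, profile, player)
            if br == profile[player]:
                other = 1 - player
                if best_response(payoff, profile, other) == profile[other]:
                    break                     # both at best responses: equilibrium
            profile[player] = br
        else:
            return False                      # no equilibrium reached from this start
    return True

# Matching pennies has no pure equilibrium, so the dynamics cycle:
A = [[1, -1], [-1, 1]]
print(reaches_pure_equilibrium(A, -np.asarray(A)))   # False
# A coordination game is reachable from every starting profile:
C = [[2, 0], [0, 1]]
print(reaches_pure_equilibrium(C, C))                # True
```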
Abstract:
This paper explores the relationships between noncooperative bargaining games and the consistent value for non-transferable utility (NTU) cooperative games. A dynamic approach to the consistent value for NTU games is introduced: the consistent vector field. The main contribution of the paper is to show that the consistent field is intimately related to the concept of subgame perfection for finite horizon noncooperative bargaining games, as the horizon goes to infinity and the cost of delay goes to zero. The solutions of the dynamic system associated to the consistent field characterize the subgame perfect equilibrium payoffs of the noncooperative bargaining games. We show that for transferable utility, hyperplane and pure bargaining games, the dynamics of the consistent fields converge globally to the unique consistent value. However, in the general NTU case, the dynamics of the consistent field can be complex. An example is constructed where the consistent field has cyclic solutions; moreover, the finite horizon subgame perfect equilibria do not approach the consistent value.
Abstract:
We propose a simple adaptive procedure for playing a game. In this procedure, players depart from their current play with probabilities that are proportional to measures of regret for not having used other strategies (these measures are updated every period). It is shown that our adaptive procedure guarantees that, with probability one, the sample distributions of play converge to the set of correlated equilibria of the game. To compute these regret measures, a player needs to know his payoff function and the history of play. We also offer a variation where every player knows only his own realized payoff history (but not his payoff function).
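A minimal sketch of this kind of regret-based updating for a two-player game (my own notation and defaults, including the normalizing constant mu; the paper should be consulted for the exact procedure):

```python
import numpy as np

def regret_matching(payoffs, n_rounds=20000, mu=None, seed=0):
    """Illustrative regret-matching play in a two-player game.

    payoffs[i][a0, a1] is player i's payoff at the profile (a0, a1).
    Each period a player keeps its previous action or switches, with
    switch probabilities proportional to its positive average regrets;
    the empirical joint distribution of play should then approach the
    set of correlated equilibria.
    """
    rng = np.random.default_rng(seed)
    n0, n1 = payoffs[0].shape
    sizes = (n0, n1)
    mu = mu or 2 * max(float(np.ptp(p)) for p in payoffs) * max(sizes)
    cum = [np.zeros((n0, n0)), np.zeros((n1, n1))]   # summed regrets per (played, alternative)
    actions = [int(rng.integers(n0)), int(rng.integers(n1))]
    counts = np.zeros((n0, n1))

    for t in range(1, n_rounds + 1):
        a0, a1 = actions
        counts[a0, a1] += 1
        # update cumulative regrets: payoff had k been played instead of the realized action
        for i, (j, n) in enumerate(zip((a0, a1), sizes)):
            realized = payoffs[i][a0, a1]
            for k in range(n):
                alt = payoffs[i][k, a1] if i == 0 else payoffs[i][a0, k]
                cum[i][j, k] += alt - realized
        # choose next actions: switch with probability proportional to positive regret
        for i, n in enumerate(sizes):
            j = actions[i]
            probs = np.maximum(cum[i][j] / t, 0.0) / mu
            probs[j] = 0.0
            probs[j] = 1.0 - probs.sum()             # stay with the remaining probability
            actions[i] = int(rng.choice(n, p=probs))
    return counts / n_rounds                         # empirical joint distribution of play

# Example: a game of chicken; the empirical play approaches a correlated equilibrium.
chicken = [np.array([[6.0, 2.0], [7.0, 0.0]]), np.array([[6.0, 7.0], [2.0, 0.0]])]
print(regret_matching(chicken))
```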
Abstract:
This paper is an attempt to clarify the relationship between fractionalization, polarization and conflict. The literature on the measurement of ethnic diversity has taken as given that the proper measure of heterogeneity can be calculated using the fractionalization index. This index is widely used in industrial economics and, for empirical purposes, ethnolinguistic fragmentation data are readily available for regression exercises. Nevertheless, the adequacy of a synthetic index of heterogeneity depends on the intrinsic characteristics of the heterogeneous dimension to be measured. In the case of ethnic diversity there is a very strong conflictive dimension. For this reason we argue that the measure of heterogeneity should be one of the class of polarization measures. In fact, the intuition of the relationship between conflict and fractionalization does not hold for more than two groups. In contrast with the usual problem of polarization indices, which are difficult to implement empirically without making some arbitrary choice of parameters, we show that the RQ index, proposed by Reynal-Querol (2002), is the only discrete polarization measure that satisfies the basic properties of polarization. Additionally, we present a derivation of the RQ index from a simple rent-seeking model. In the empirical section we show that while ethnic polarization has a positive effect on civil wars and, indirectly, on growth, this effect is not present when we use ethnic fractionalization.
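For concreteness, the two indices can be contrasted using the forms commonly attributed to this literature, F = 1 − Σ π_i² for fractionalization and RQ = 4 Σ π_i²(1 − π_i) for discrete polarization (the exact definition should be checked against Reynal-Querol (2002)); the sketch below shows how they diverge once there are more than two groups.

```python
def fractionalization(shares):
    """F = 1 - sum(pi^2): probability two random individuals belong to different groups."""
    return 1.0 - sum(p * p for p in shares)

def rq_polarization(shares):
    """RQ = 4 * sum(pi^2 * (1 - pi)): maximal for two groups of equal size."""
    return 4.0 * sum(p * p * (1.0 - p) for p in shares)

# Two equal groups versus ten equal groups:
two = [0.5, 0.5]
ten = [0.1] * 10
print(fractionalization(two), rq_polarization(two))   # 0.5, 1.0
print(fractionalization(ten), rq_polarization(ten))   # 0.9, 0.36
# Fractionalization keeps rising as groups multiply, while polarization
# peaks at a two-group split, matching the conflict intuition in the abstract.
```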
Abstract:
The effectiveness of decision rules depends on characteristics of both rules and environments. A theoretical analysis of environments specifies the relative predictive accuracies of the lexicographic rule 'take-the-best' (TTB) and other simple strategies for binary choice. We identify three factors: how the environment weights variables; characteristics of choice sets; and error. For cases involving from three to five binary cues, TTB is effective across many environments. However, hybrids of equal weights (EW) and TTB models are more effective as environments become more compensatory. In the presence of error, TTB and similar models do not predict much better than a naïve model that exploits dominance. We emphasize psychological implications and the need for more complete theories of the environment that include the role of error.
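A minimal sketch of the two rules for binary cues (illustrative code, not the authors' simulation setup): take-the-best stops at the first cue, in validity order, that discriminates between the options, while equal weights tallies all cues.

```python
import random

def take_the_best(a, b, cue_order):
    """Choose between options a and b (tuples of 0/1 cue values).
    Cues are examined in order of validity; the first discriminating
    cue decides, otherwise guess."""
    for cue in cue_order:
        if a[cue] != b[cue]:
            return "a" if a[cue] > b[cue] else "b"
    return random.choice(["a", "b"])          # no cue discriminates

def equal_weights(a, b):
    """Pick the option with the larger unweighted sum of cues (guess on ties)."""
    if sum(a) != sum(b):
        return "a" if sum(a) > sum(b) else "b"
    return random.choice(["a", "b"])

# Three binary cues, ordered from most to least valid:
a, b = (1, 0, 0), (0, 1, 1)
print(take_the_best(a, b, cue_order=[0, 1, 2]))   # "a": the top cue decides
print(equal_weights(a, b))                        # "b": the two lower cues outvote it
```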
Abstract:
In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between the new and classical estimators. For a simple family of weights, and considering the IMSE as the global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
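As a point of reference, a Nadaraya-Watson estimator with generic per-observation weights can be sketched as follows (the particular weight family studied in the paper is not reproduced here; with w = 1 this reduces to the classical estimator):

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def weighted_nadaraya_watson(x_grid, x, y, h, w=None):
    """Nadaraya-Watson estimate on a fixed design, with optional
    per-observation weights w (w = None gives the classical estimator)."""
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    est = np.empty_like(x_grid, dtype=float)
    for i, x0 in enumerate(x_grid):
        k = gaussian_kernel((x0 - x) / h) * w      # weighted kernel contributions
        est[i] = np.sum(k * y) / np.sum(k)
    return est

# Fixed design on [0, 1] with a smooth signal plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
grid = np.linspace(0, 1, 20)
print(weighted_nadaraya_watson(grid, x, y, h=0.05))
```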
Abstract:
In this paper we attempt to describe the general reasons behind the world population explosion in the 20th century. The size of the population at the end of the century in question, deemed excessive by some, was a consequence of a dramatic improvement in life expectancies, attributable, in turn, to scientific innovation, the circulation of information and economic growth. Nevertheless, fertility is a variable that plays a crucial role in differences in demographic growth. We identify infant mortality, female education levels and racial identity as important exogenous variables affecting fertility. It is estimated that in poor countries one additional year of primary schooling for women leads to 0.614 fewer children per couple on average (worldwide). While it may be possible to identify a global tendency towards convergence in demographic trends, particular attention should be paid to the case of Africa, not only due to its different demographic patterns, but also because much of the continent's population has yet to experience the improvement in quality of life generally enjoyed across the rest of the planet.
Abstract:
This paper studies, on the one hand, theories set out around the treatment of external partners in consolidated information and, on the other hand, financial models that discuss the convenience of separating, or not, the different elements that form part of the liabilities side of companies' balance sheets. A model is proposed, the External Partners Model, which provides a financial argument for a certain presentation and processing of these items and which, in our opinion, facilitates the analysis of consolidated financial statements. This model is based on two hypotheses: (1) the economic and financial variables are not independent, and (2) the value of the company depends, among other factors, on the type of sources that constitute its capital. These two hypotheses imply that a separation between equity and liabilities should be included in the consolidated balance sheet, as they are different sources of capital, and this separation will then give relevant information to its users.
Abstract:
Departures from pure self interest in economic experiments have recently inspired models of "social preferences". We conduct experiments on simple two-person and three-person games with binary choices that test these theories more directly than the array of games conventionally considered. Our experiments show strong support for the prevalence of "quasi-maximin" preferences: People sacrifice to increase the payoffs for all recipients, but especially for the lowest-payoff recipients. People are also motivated by reciprocity: While people are reluctant to sacrifice to reciprocate good or bad behavior beyond what they would sacrifice for neutral parties, they withdraw willingness to sacrifice to achieve a fair outcome when others are themselves unwilling to sacrifice. Some participants are averse to getting different payoffs than others, but based on our experiments and reinterpretation of previous experiments we argue that behavior that has been presented as "difference aversion" in recent papers is actually a combination of reciprocal and quasi-maximin motivations. We formulate a model in which each player is willing to sacrifice to allocate the quasi-maximin allocation only to those players also believed to be pursuing the quasi-maximin allocation, and may sacrifice to punish unfair players.
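Quasi-maximin preferences of this kind are commonly parametrized (this is a standard form from this literature; the paper's own functional form may differ) as

$$U_i(\pi) = (1-\lambda)\,\pi_i + \lambda\Big[\delta \min_j \pi_j + (1-\delta)\sum_j \pi_j\Big],$$

where $\pi_j$ is player $j$'s material payoff, $\lambda$ is the weight placed on social welfare, and $\delta$ is the extra weight given to the lowest-payoff recipient.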
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, as the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.