948 results for [JEL:C70] Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - General
Abstract:
When individuals in a population can acquire traits through learning, each individual may express a certain number of distinct cultural traits. These traits may have been either invented by the individual himself or acquired from others in the population. Here, we develop a game-theoretic model for the accumulation of cultural traits through individual and social learning. We explore how the rates of innovation, decay, and transmission of cultural traits affect the evolutionarily stable (ES) levels of individual and social learning and the number of cultural traits expressed by an individual when cultural dynamics are at a steady state. We explore the evolution of these phenotypes in both panmictic and structured population settings. Our results suggest that in panmictic populations, the ES level of learning and the number of traits tend to be independent of the social transmission rate of cultural traits and are mainly affected by the innovation and decay rates. By contrast, in structured populations, where interactions occur between relatives, the ES level of learning and the number of traits per individual can be increased (relative to the panmictic case) and may then markedly depend on the transmission rate of cultural traits. This suggests that kin selection may be one additional solution to Rogers's paradox of nonadaptive culture.
Abstract:
From the beginning of historical times, mathematics has been generated in every civilization on the basis of solving practical problems. Nevertheless, from the Greek period onwards, history shows us the necessity of taking one more step: the historical evolution of mathematics places the methods of reasoning at the central axis of mathematical research. Glancing over the objectives and working methods of some fundamental authors in the history of mathematical concepts, we postulate the learning of the forms of mathematical reasoning as the central objective of mathematical education, and problem solving as the most efficient means of achieving this objective.
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
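The abstract does not specify the combination rule, but a standard way to combine simple predictors under squared loss is exponentially weighted averaging; the following is a minimal sketch (the function name, the learning rate, and the constant experts in the usage note are illustrative assumptions, not the paper's construction):

```python
import math

def ewa_predict(sequence, experts, eta=0.5):
    """Exponentially weighted average of expert forecasts under squared loss.

    `experts` are callables mapping the observed history to a real forecast.
    """
    cum_loss = [0.0] * len(experts)   # cumulative squared loss per expert
    history, predictions = [], []
    for y in sequence:
        guesses = [e(history) for e in experts]
        weights = [math.exp(-eta * loss) for loss in cum_loss]
        total = sum(weights)
        predictions.append(sum(w * g for w, g in zip(weights, guesses)) / total)
        for i, g in enumerate(guesses):
            cum_loss[i] += (g - y) ** 2
        history.append(y)
    return predictions
```

On an alternating 0/1 sequence with the two constant experts 0 and 1, the combined forecast hovers near 1/2 and its average squared error falls below the 0.5 incurred by either expert alone.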
Abstract:
Minimax lower bounds for concept learning state, for example, that for each sample size $n$ and learning rule $g_n$, there exists a distribution of the observation $X$ and a concept $C$ to be learnt such that the expected error of $g_n$ is at least a constant times $V/n$, where $V$ is the VC dimension of the concept class. However, these bounds do not tell anything about the rate of decrease of the error for a fixed distribution-concept pair. In this paper we investigate minimax lower bounds in such a stronger sense. We show that for several natural $k$-parameter concept classes, including the class of linear halfspaces, the class of balls, the class of polyhedra with a certain number of faces, and a class of neural networks, for any sequence of learning rules $\{g_n\}$, there exists a fixed distribution of $X$ and a fixed concept $C$ such that the expected error is larger than a constant times $k/n$ for infinitely many $n$. We also obtain such strong minimax lower bounds for the tail distribution of the probability of error, which extend the corresponding minimax lower bounds.
Abstract:
Sequential randomized prediction of an arbitrary binary sequence is investigated. No assumption is made on the mechanism of generating the bit sequence. The goal of the predictor is to minimize its relative loss, i.e., to make (almost) as few mistakes as the best "expert" in a fixed, possibly infinite, set of experts. We point out a surprising connection between this prediction problem and empirical process theory. First, in the special case of static (memoryless) experts, we completely characterize the minimax relative loss in terms of the maximum of an associated Rademacher process. Then we show general upper and lower bounds on the minimax relative loss in terms of the geometry of the class of experts. As main examples, we determine the exact order of magnitude of the minimax relative loss for the class of autoregressive linear predictors and for the class of Markov experts.
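For intuition about what "randomized prediction with experts" means, here is the classic randomized weighted-majority strategy; this is an illustrative baseline under assumed parameters (β, the seed, and the expert set in the usage note), not necessarily the construction analyzed in the paper:

```python
import random

def randomized_weighted_majority(bits, experts, beta=0.5, seed=0):
    """Predict each bit at random with probability equal to the weighted
    fraction of experts voting 1; shrink the weight of every wrong expert."""
    rng = random.Random(seed)
    weights = [1.0] * len(experts)
    mistakes = 0
    history = []
    for y in bits:
        votes = [e(history) for e in experts]
        p_one = sum(w for w, v in zip(weights, votes) if v == 1) / sum(weights)
        guess = 1 if rng.random() < p_one else 0
        mistakes += (guess != y)
        weights = [w * (beta if v != y else 1.0) for w, v in zip(weights, votes)]
        history.append(y)
    return mistakes
```

Against an all-zeros sequence with the two constant experts 0 and 1, the wrong expert's weight decays geometrically, so the expected number of mistakes stays bounded no matter how long the sequence is; this bounded relative loss is exactly the quantity whose minimax behaviour the paper characterizes.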
Abstract:
This paper presents an Optimised Search Heuristic that combines a tabu search method with the verification of violated valid inequalities. The solution delivered by the tabu search is partially destroyed by a randomised greedy procedure, and then the valid inequalities are used to guide the reconstruction of a complete solution. An application of the new method to the Job-Shop Scheduling problem is presented.
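The destroy-and-reconstruct loop the abstract describes can be illustrated on a toy problem. The sketch below applies it to makespan minimization on identical parallel machines rather than to Job-Shop Scheduling, and it omits the tabu search and the valid-inequality guidance of the actual method; the problem choice, function name, and parameters are illustrative assumptions:

```python
import random

def iterated_greedy(times, m, iters=200, destroy_frac=0.3, seed=0):
    """Destroy-and-reconstruct heuristic for makespan on m identical machines."""
    rng = random.Random(seed)
    n = len(times)
    # initial solution: list scheduling (each job to the least-loaded machine)
    loads = [0.0] * m
    best = {}
    for j in range(n):
        mc = loads.index(min(loads))
        best[j] = mc
        loads[mc] += times[j]
    best_span = max(loads)
    for _ in range(iters):
        # destroy: drop a random subset of jobs from the incumbent solution
        removed = set(rng.sample(range(n), int(destroy_frac * n)))
        loads = [0.0] * m
        cand = {}
        for j, mc in best.items():
            if j not in removed:
                cand[j] = mc
                loads[mc] += times[j]
        # reconstruct: greedily reinsert the removed jobs, longest first
        for j in sorted(removed, key=lambda j: -times[j]):
            mc = loads.index(min(loads))
            cand[j] = mc
            loads[mc] += times[j]
        span = max(loads)
        if span < best_span:
            best, best_span = cand, span
    return best_span
```

In the paper's method the reconstruction is guided by violated valid inequalities rather than by a pure greedy rule; the skeleton of partial destruction followed by repair is the shared idea.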
Abstract:
We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
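As the abstract notes, both methods reduce to an SVD of an appropriately transformed matrix. A minimal correspondence-analysis sketch (the function name and example table are illustrative assumptions; the ratio map itself would transform the table differently):

```python
import numpy as np

def correspondence_analysis(table):
    """Correspondence analysis as the SVD of the matrix of standardized
    residuals of a two-way contingency table."""
    P = table / table.sum()                     # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)         # row and column masses
    E = np.outer(r, c)                          # expected proportions
    S = (P - E) / np.sqrt(E)                    # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]       # principal row coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]    # principal column coordinates
    return rows, cols, sv
```

The squared singular values sum to the table's total inertia, i.e. the Pearson chi-square statistic divided by the grand total, which is why the margins appear as weights in the transformation.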
Abstract:
We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables deviate possibly from normality. The various samples to be merged can differ on the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equations models, such as LISREL, EQS, LISCOMP, among others. An illustration with Monte Carlo data is presented.
Abstract:
Theorem 1 of Euler's paper of 1737, 'Variae Observationes Circa Series Infinitas', states the astonishing result that the series of all unit fractions whose denominators are perfect powers of integers minus unity has sum one. Euler attributes the theorem to Goldbach. The proof is one of those examples of misuse of divergent series to obtain correct results, so frequent during the seventeenth and eighteenth centuries. We examine this proof closely and, with the help of some insight provided by a modern (and completely different) proof of the Goldbach-Euler theorem, we present a rational reconstruction in terms which could be considered rigorous by modern Weierstrassian standards. At the same time, with a few ideas borrowed from nonstandard analysis, we see how the same reconstruction can also be considered rigorous by modern Robinsonian standards. This last approach, though, is completely in tune with Goldbach and Euler's proof. We hope to convince the reader, then, of how a few simple ideas from nonstandard analysis vindicate Euler's work.
Abstract:
This work presents an application of multilevel analysis techniques to the study of abstention in the 2000 Spanish general election. The interest of the study is both substantive and methodological. From the substantive point of view, the article intends to explain the causes of abstention and analyze the impact of associationism on it. From the methodological point of view, it is intended to analyze the interaction between individual and context with a modelling that takes into account the hierarchical structure of the data. The multilevel study of this paper validates the one-level results obtained in previous analyses of abstention and shows that only a fraction of the differences in abstention are explained by the individual characteristics of the electors. Another important fraction of these differences is due to the political and social characteristics of the context. Regarding associationism, the data suggest that individual participation in associations decreases the probability of abstention. However, better indicators are needed in order to capture more properly the effect of associationism on electoral behaviour.
Abstract:
We establish the validity of subsampling confidence intervals for the mean of a dependent series with heavy-tailed marginal distributions. Using point process theory, we study both linear and nonlinear GARCH-like time series models. We propose a data-dependent method for the optimal block size selection and investigate its performance by means of a simulation study.
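As a rough illustration of the subsampling idea (not the paper's data-dependent block-size rule): the statistic is recomputed over blocks, and quantiles of the resulting empirical distribution are inverted into an interval. This sketch uses overlapping blocks and a √n normalization, which presumes a finite variance; in the heavy-tailed setting the paper studies, the convergence rate itself can differ. The function name and parameters are illustrative:

```python
import numpy as np

def subsampling_ci(x, b, alpha=0.05):
    """Subsampling confidence interval for the mean of a (dependent) series,
    using all overlapping blocks of length b."""
    n = len(x)
    theta = x.mean()
    block_means = np.array([x[i:i + b].mean() for i in range(n - b + 1)])
    # subsampling approximation to the law of sqrt(n) * (mean - truth)
    dist = np.sqrt(b) * (block_means - theta)
    lo_q, hi_q = np.quantile(dist, [alpha / 2, 1 - alpha / 2])
    return theta - hi_q / np.sqrt(n), theta - lo_q / np.sqrt(n)
```

The choice of b is the delicate step, which is precisely why the paper proposes a data-dependent selection method.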
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time that public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature, but that has grown in importance in recent years. Using administrative and household-level data, we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as the early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
Abstract:
We examine the relationship between institutions, culture and cyclical fluctuations for a sample of 45 European, Middle Eastern and North African countries. Better governance is associated with shorter and less severe contractions and milder expansions. Certain cultural traits, such as lack of acceptance of power distance and individualism, are also linked to business cycle features. Business cycle synchronization is tightly related to similarities in the institutional environment. Mediterranean countries conform to these general tendencies.
Abstract:
This paper studies the determinants of school choice, focusing on the role of information. We consider how parents' search efforts and their capacity to process information (i.e., to correctly assess schools) affect the quality of the schools they choose for their children. Using a novel dataset, we are able to identify parents' awareness of schools in their neighborhood and measure their capacity to rank the quality of the schools with respect to the official rankings. We find that parents' education and wealth are important factors in determining their level of school awareness and information gathering. Moreover, these search efforts have important consequences in terms of the quality of school choice.
Abstract:
Asymptotic chi-squared test statistics for testing the equality of moment vectors are developed. The test statistics proposed are generalized Wald test statistics that specialize for different settings by inserting an appropriate asymptotic variance matrix of sample moments. Scaled test statistics are also considered for dealing with situations of non-iid sampling. The specialization will be carried out for testing the equality of multinomial populations, and the equality of variance and correlation matrices for both normal and non-normal data. When testing the equality of correlation matrices, a scaled version of the normal theory chi-squared statistic is proven to be an asymptotically exact chi-squared statistic in the case of elliptical data.
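The generic shape of such a Wald statistic, stated here in standard notation as an illustration rather than the paper's exact specialization: with $\hat{m}$ a vector of sample moments, $m_0$ its value under the null hypothesis, and $\hat{\Gamma}$ a consistent estimate of the asymptotic covariance matrix of $\sqrt{n}\,\hat{m}$,

```latex
W_n \;=\; n\,\bigl(\hat{m} - m_0\bigr)^{\prime}\,\hat{\Gamma}^{-1}\,\bigl(\hat{m} - m_0\bigr)
\;\xrightarrow{\;d\;}\; \chi^2_q ,
```

where $q$ is the number of restrictions tested. The different settings in the abstract correspond to different choices of $\hat{\Gamma}$, and the scaled versions adjust $W_n$ so that the chi-squared approximation remains usable under non-iid sampling.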