17 results for Rank-size rule
Abstract:
We conduct experiments to investigate the effects of different majority requirements on bargaining outcomes in small and large groups. In particular, we use a Baron-Ferejohn protocol and investigate the effects of decision rules on delay (the number of bargaining rounds needed to reach agreement) and measures of "fairness" (inclusiveness of coalitions, equality of the distribution within a coalition). We find that larger groups and unanimity rule are associated with significantly larger decision-making costs, in the sense that first-round proposals fail more often, leading to more costly delay. The higher rate of failure under unanimity rule and in large groups results from three facts: (1) in these conditions, a larger number of individuals must agree, (2) a substantial fraction of individuals reject offers below the equal share, and (3) proposers demand more (relative to the equal share) in large groups.
Abstract:
This paper uses a structural approach based on the indirect inference principle to estimate a standard version of the New Keynesian monetary (NKM) model augmented with a term structure, using both revised and real-time data. The estimation results show that the term spread and policy inertia are both important determinants of the estimated U.S. monetary policy rule, whereas the persistence of shocks plays a small but significant role when revised and real-time data on output and inflation are both considered. More importantly, the relative importance of the term spread and persistent shocks in the policy rule, as well as the shock transmission mechanism, changes drastically once it is taken into account that real-time data are not well behaved.
Abstract:
Published as an article in: Spanish Economic Review, 2008, vol. 10, issue 4, pages 251-277.
Abstract:
Using US data for the period 1967:5-2002:4, this paper empirically investigates the performance of an augmented version of the Taylor rule (ATR) that (i) allows for the presence of switching regimes, (ii) considers the long-short term spread in addition to the typical variables, (iii) uses an alternative monthly indicator of general economic activity suggested by Stock and Watson (1999), and (iv) considers interest rate smoothing. The estimation results show the existence of switching regimes, one characterized by low volatility and the other by high volatility. Moreover, the scale of the responses of the Federal funds rate to movements in the term spread, inflation and the economic activity index depends on the regime. The estimation results also provide robust empirical evidence that the ATR was more stable during Chairman Greenspan's term of office than in the pre-Greenspan period. However, a closer look at the Greenspan period shows the existence of two alternative regimes and that the response of the Fed funds rate to inflation has not been significant during this period once the term spread is considered.
Abstract:
In this study we define a cost sharing rule for cost sharing problems. This rule is related to the serial cost-sharing rule defined by Moulin and Shenker (1992). We give some formulas and axiomatic characterizations for the new rule. The axiomatic characterizations are related to some previous ones provided by Moulin and Shenker (1994) and Albizuri (2010).
Abstract:
A manager/mechanism designer must allocate a set of money prizes ($1, $2, ..., $n) between n agents working in a team. The agents know the state, i.e., who contributed most, second most, etc. The agents' preferences over prizes are state independent. We incorporate the possibility that the manager knows the state with a tiny probability and present a simple mechanism that uniquely implements prizes that respect the true state.
Abstract:
5 p.
Abstract:
The learning of probability distributions from data is a ubiquitous problem in the fields of Statistics and Artificial Intelligence. During the last decades, several learning algorithms have been proposed to learn probability distributions based on decomposable models, due to their advantageous theoretical properties. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k, which controls the complexity of the model. Unfortunately, the problem of learning a maximum likelihood decomposable model given a maximum clique size is NP-hard for k > 2. In this work, we propose a family of algorithms which approximates this problem with a computational complexity of O(k · n^2 log n) in the worst case, where n is the number of random variables involved. The structures of the decomposable models that solve the maximum likelihood problem are called maximal k-order decomposable graphs. Our proposals, called fractal trees, construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy based on the particular features of this type of structure. Additionally, we propose a prune-and-graft procedure which transforms a maximal k-order decomposable graph into another one, increasing its likelihood. We have implemented two particular fractal tree algorithms, called parallel fractal tree and sequential fractal tree. These algorithms can be considered a natural extension of Chow and Liu's algorithm from k = 2 to arbitrary values of k. Both algorithms have been compared against other efficient approaches in artificial and real domains, and they have shown competitive performance on the maximum likelihood problem. Due to their low computational complexity, they are especially recommended for high-dimensional domains.
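As context for the k = 2 base case that the abstract says the fractal tree algorithms extend, the sketch below is a minimal Python illustration of Chow and Liu's algorithm: a maximum-likelihood tree built as a maximum spanning tree over pairwise empirical mutual information. The data layout and helper functions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information between two discrete data columns."""
    n = len(x)
    joint = {}
    for a, b in zip(x, y):
        joint[(a, b)] = joint.get((a, b), 0) + 1
    px = {a: np.mean(x == a) for a in set(x)}
    py = {b: np.mean(y == b) for b in set(y)}
    mi = 0.0
    for (a, b), count in joint.items():
        p = count / n
        mi += p * np.log(p / (px[a] * py[b]))
    return mi

def chow_liu_tree(data):
    """Maximum-likelihood tree (maximal 2-order decomposable graph):
    maximum spanning tree over pairwise mutual information (Kruskal)."""
    n_vars = data.shape[1]
    edges = sorted(
        ((mutual_information(data[:, i], data[:, j]), i, j)
         for i, j in combinations(range(n_vars), 2)),
        reverse=True)
    parent = list(range(n_vars))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # adding the edge keeps the graph acyclic
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

For example, `chow_liu_tree(np.array([[0, 0, 1], [1, 1, 0], [0, 0, 1], [1, 1, 1]]))` returns the n − 1 edges of a maximum-likelihood tree over three binary variables.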
Abstract:
Rule the World is an application for Android mobile devices. It places the player in an augmented reality in which, using their location, they must collect different objects and put them to different uses, such as carrying them equipped, using them to build other objects, or sending them to friends. This document presents the complete development of the project: how it was managed, the parts into which it was divided, the planning followed to carry out the work, the analysis made of the application together with its design, and how the development and testing were carried out. This project has served to consolidate knowledge acquired throughout the degree, such as the development of databases, security, and software architectures and algorithms. It has also served to learn new things, such as programming for a different platform and using elements rarely covered in the degree, such as geolocation and maps.
Abstract:
[EN] The objective of this study was to determine whether a short training program, using real foods, would decrease participants' portion-size estimation errors after training. Ninety student volunteers (20.18±0.44 y old) of the University of the Basque Country (Spain) were trained in observational techniques and tested in food-weight estimation during and after a 3-hour training period. The program included 57 commonly consumed foods representing a variety of forms (125 different shapes). Estimates of food weight were compared with actual weights. Effectiveness of training was determined by examining the change in the absolute percentage error, over all observers and all foods, over time. Data were analyzed using SPSS v. 13.0. The portion-size errors decreased after training for most of the foods. Additionally, the accuracy of the estimates clearly varied by food group and form. Amorphous foods were the type estimated least accurately both before and after training. Our findings suggest that future dietitians can be trained to estimate quantities by direct observation across a wide range of foods. However, this training may have been too brief for participants to fully assimilate its application.
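As a small illustration of the evaluation metric described above, the sketch below computes the absolute percentage error between an estimated and an actual portion weight; the example numbers are invented, not data from the study.

```python
def absolute_percentage_error(estimated_g, actual_g):
    """Absolute error of a weight estimate, as a percentage of the true weight."""
    return abs(estimated_g - actual_g) / actual_g * 100

# e.g. a 150 g portion estimated as 120 g gives a 20% error
print(absolute_percentage_error(120, 150))
```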
Abstract:
In this paper we introduce a new cost sharing rule, the minimal overlap cost sharing rule, which is associated with the minimal overlap rule for claims problems defined by O'Neill (1982). An axiomatic characterization is given by employing a single axiom: demand separability. Variations of this axiom enable the serial cost sharing rule (Moulin and Shenker, 1992) and the rules of a family (Albizuri, 2010) that generalize the serial cost sharing rule to be characterized. Finally, a family that includes the minimal overlap cost sharing rule is defined and obtained by means of an axiomatic characterization.
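For reference, the sketch below is a minimal Python rendering of the serial cost sharing rule of Moulin and Shenker (1992), the rule to which the new minimal overlap rule is related; it is not the minimal overlap rule itself, and the demands and cost function used are illustrative assumptions.

```python
def serial_cost_shares(demands, cost):
    """Serial cost shares: with demands sorted as q_1 <= ... <= q_n and
    s_k = q_1 + ... + q_{k-1} + (n - k + 1) * q_k, agent i pays
    x_i = sum_{k <= i} [cost(s_k) - cost(s_{k-1})] / (n - k + 1)."""
    order = sorted(range(len(demands)), key=lambda i: demands[i])
    q = [demands[i] for i in order]
    n = len(q)
    shares_sorted, running, prev_s = [], 0.0, 0.0
    for k in range(1, n + 1):
        s_k = sum(q[:k - 1]) + (n - k + 1) * q[k - 1]
        running += (cost(s_k) - cost(prev_s)) / (n - k + 1)
        shares_sorted.append(running)
        prev_s = s_k
    shares = [0.0] * n
    for pos, agent in enumerate(order):
        shares[agent] = shares_sorted[pos]
    return shares

# Quadratic cost and demands 3, 1, 2: the shares sum to cost(6) = 36
print(serial_cost_shares([3, 1, 2], cost=lambda y: y ** 2))  # [22.0, 3.0, 11.0]
```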
Abstract:
Females of different species may exert mate choice for different reasons, one of them being to avoid inbreeding. In this study I examine the role of inbreeding avoidance as a mechanism driving female mate choice in Verreaux's sifaka lemurs (Propithecus verreauxi). In this species females are dominant and appear able to choose certain males to mate with, while observations indicate that rank, body size, canine size and proportion of fights won do not influence female mate choice. I therefore hypothesized that female mate choice in Verreaux's sifaka is driven by inbreeding avoidance. Tissue and fecal samples were collected in the Kirindy Mitea National Park in western Madagascar as a source of DNA. Parentage was assigned for a sample of the population, and relatedness coefficients between dams and sires were estimated and compared to those of random female-male pairs, and to those between dams and other candidate sires within the population and within the groups where the offspring were conceived. I found no significant differences in any of the comparisons, which means that Verreaux's sifaka females do not mate preferentially with males that are more distantly related to them. I conclude that inbreeding avoidance does not appear to be the main force driving female mate choice in Verreaux's sifaka lemurs, and I discuss possible explanations for these findings. With this study I contribute to our knowledge of female mate choice in lemurs.
Abstract:
Poster presented at The Energy and Materials Research Conference - EMR2015, held in Madrid (Spain), 25-27 February 2015.
Abstract:
[EN] This study analyzes the relationship between board size and economic-financial performance in a sample of European firms that constitute the EUROSTOXX50 Index. Based on previous literature, resource dependency and agency theories, and considering the regulation developed by the OECD and the European Union on corporate governance for each country in the sample, the authors propose hypotheses of both positive linear and quadratic relationships between the researched parameters. Using ROA as a benchmark of financial performance and the number of members of the board as a measure of board size, two OLS estimations are performed. To confirm the robustness of the results, the empirical study is also tested with two other similar financial ratios, ROE and Tobin's Q. Due to the absence of significant results, an additional factor, firm size, is employed to check whether it affects firm performance. Delving further into the nature of this relationship, it is revealed that there exists a strong and negative relation between firm size and financial performance. Consequently, it can be asserted that the generic recommendation "one size fits all" cannot be applied in this case, which conforms to the Recommendations of the European Union that advise against using generic models for all countries.
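A minimal sketch, under assumed variable names and simulated placeholder data rather than the EUROSTOXX50 sample, of the kind of OLS specifications described above: ROA regressed on board size with and without a quadratic term.

```python
import numpy as np
import statsmodels.api as sm

# Simulated placeholder data: 50 firms, board sizes between 8 and 20 members
rng = np.random.default_rng(0)
board_size = rng.integers(8, 21, size=50).astype(float)
roa = 0.05 + 0.001 * board_size + rng.normal(0, 0.02, size=50)

# Linear specification: ROA on board size
linear = sm.OLS(roa, sm.add_constant(board_size)).fit()

# Quadratic specification: ROA on board size and board size squared
X_quad = sm.add_constant(np.column_stack([board_size, board_size ** 2]))
quadratic = sm.OLS(roa, X_quad).fit()

print(linear.params, quadratic.params)
```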
Abstract:
This paper analyses economic inequality in the municipalities of the Basque Country over the period 1996-2010. We have used data from the Udalmap database, mainly GDP per capita. We have drawn Lorenz curves and computed Gini indexes to analyse the evolution of inequality during this period. From this analysis, we conclude that economic inequality in the municipalities of the Basque Country increased over this period of time.
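A minimal sketch of the Gini computation described above, written in Python and using made-up GDP-per-capita values rather than the Udalmap data.

```python
import numpy as np

def gini(values):
    """Gini index via the sorted-values closed form
    G = sum_i (2i - n - 1) * x_(i) / (n^2 * mean(x))."""
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (n * n * x.mean())

# Hypothetical GDP per capita (euros) for five municipalities
print(round(gini([18000, 21000, 23000, 30000, 52000]), 3))
```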