61 results for scaling rules
Abstract:
We analyze the statistics of rain-event sizes, rain-event durations, and dry-spell durations in a network of 20 rain gauges scattered in an area situated close to the NW Mediterranean coast. Power-law distributions emerge clearly for the dry-spell durations, with an exponent around 1.50 ± 0.05, although for event sizes and durations the power-law ranges are rather limited in some cases. Deviations from power-law behavior are attributed to finite-size effects. A scaling analysis helps to elucidate the situation, providing support for the existence of scale invariance in these distributions. It is remarkable that rain data of not very high resolution yield findings in agreement with self-organized critical phenomena.
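For readers who want to see how such an exponent can be estimated, here is a minimal sketch of a continuous maximum-likelihood (Hill-type) power-law fit. The cutoff xmin, the variable names, and the synthetic dry-spell sample are illustrative assumptions, not the gauge data or the authors' actual fitting procedure.

```python
import numpy as np

def powerlaw_exponent_mle(samples, xmin):
    """Continuous maximum-likelihood (Hill-type) estimate of a power-law
    exponent for the tail x >= xmin, with its standard error."""
    x = np.asarray(samples, dtype=float)
    tail = x[x >= xmin]
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / xmin))
    stderr = (alpha - 1.0) / np.sqrt(n)
    return alpha, stderr

# Synthetic "dry-spell durations" drawn by inverse transform from a
# power law with true exponent 1.5 and xmin = 1 (purely illustrative):
rng = np.random.default_rng(0)
fake_dry_spells = 1.0 * (1.0 - rng.random(5000)) ** (-1.0 / 0.5)
print(powerlaw_exponent_mle(fake_dry_spells, xmin=1.0))  # roughly (1.5, 0.007)
```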
Abstract:
The idea of ensuring a guarantee (a minimum amount of the resources) to each agent has recently acquired great relevance, in both social and political terms. Furthermore, the notion of Solidarity has been treated frequently in redistribution problems to establish that any increment of the resources should be equally distributed taking into account some relevant characteristics. In this paper, we combine these two general concepts, guarantee and solidarity, to characterize the uniform rules in bankruptcy problems (Constrained Equal Awards and Constrained Equal Losses rules). Keywords: Constrained Equal Awards, Constrained Equal Losses, Lower bounds, Bankruptcy problems, Solidarity. JEL classification: C71, D63, D71.
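As a concrete illustration of the two uniform rules characterized here, the sketch below computes the Constrained Equal Awards and Constrained Equal Losses divisions for a small claims problem. The bisection on the common parameter lambda is one standard way to implement the rules' textbook definitions; it is not code taken from the paper.

```python
def constrained_equal_awards(claims, endowment, tol=1e-9):
    """CEA: each claimant i receives min(claim_i, lam), with lam chosen so the
    awards exhaust the endowment (assumes sum(claims) >= endowment >= 0)."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:
        lam = (lo + hi) / 2.0
        total = sum(min(c, lam) for c in claims)
        lo, hi = (lam, hi) if total < endowment else (lo, lam)
    return [min(c, (lo + hi) / 2.0) for c in claims]

def constrained_equal_losses(claims, endowment, tol=1e-9):
    """CEL: each claimant i receives max(claim_i - lam, 0), with lam chosen so
    the awards exhaust the endowment."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:
        lam = (lo + hi) / 2.0
        total = sum(max(c - lam, 0.0) for c in claims)
        lo, hi = (lo, lam) if total < endowment else (lam, hi)
    return [max(c - (lo + hi) / 2.0, 0.0) for c in claims]

# Claims (100, 200, 300) on an endowment of 300:
print(constrained_equal_awards([100, 200, 300], 300))   # approx [100, 100, 100]
print(constrained_equal_losses([100, 200, 300], 300))   # approx [0, 100, 200]
```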
Abstract:
We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in the case of Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used: on the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
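A minimal sketch of the two checks described above, assuming SciPy and arrays of rescaled inter-event times as input: a two-sample Kolmogorov-Smirnov comparison across magnitude thresholds, and a Clauset-style one-sample test against a fitted gamma distribution with a Monte Carlo p-value. The function names and the number of simulations are illustrative choices, not the authors' code.

```python
import numpy as np
from scipy import stats

def two_sample_scaling_check(rescaled_a, rescaled_b):
    """Two-sample KS test: are rescaled inter-event times from two magnitude
    thresholds compatible with a common scaling function?"""
    return stats.ks_2samp(rescaled_a, rescaled_b)

def gamma_fit_mc_pvalue(rescaled, n_sims=1000, seed=0):
    """Clauset-style check: fit a gamma to the rescaled times, then compare the
    observed KS distance with the KS distances of synthetic gamma samples that
    are refitted in the same way (Monte Carlo p-value)."""
    rng = np.random.default_rng(seed)
    a, loc, scale = stats.gamma.fit(rescaled, floc=0)
    d_obs = stats.kstest(rescaled, 'gamma', args=(a, loc, scale)).statistic
    exceed = 0
    for _ in range(n_sims):
        synth = rng.gamma(a, scale, size=len(rescaled))
        a_s, loc_s, scale_s = stats.gamma.fit(synth, floc=0)
        d_s = stats.kstest(synth, 'gamma', args=(a_s, loc_s, scale_s)).statistic
        exceed += d_s >= d_obs
    return d_obs, exceed / n_sims
```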
Abstract:
Is it important to negotiate on proportions rather than on numbers? To answer this question, we analyze the behavior of well-known bargaining solutions and the claims rules they induce when they are applied to a "proportionally transformed" bargaining set SP, the so-called bargaining-in-proportions set. The idea of applying bargaining solutions to claims problems was already developed in Dagan and Volij (1993). They apply the bargaining solutions over a bargaining set that is the one defined by the claims and the endowment. A comparison between our results and theirs is provided. Keywords: Bargaining problem, Claims problem, Proportional, Constrained Equal Awards, Constrained Equal Losses, Nash bargaining solution. JEL classification: C71, D63, D71.
Abstract:
Can rules be used to shield public resources from political interference? The Brazilian constitution and national tax code stipulate that revenue-sharing transfers to municipal governments be determined by the size of counties in terms of estimated population. In this paper I document that the population estimates which went into the transfer allocation formula for the year 1991 were manipulated, resulting in significant transfer differentials over the entire 1990s. I test whether, conditional on county characteristics that might account for the manipulation, center-local party alignment, party popularity and the extent of interparty fragmentation at the county level are correlated with estimated populations in 1991. Results suggest that revenue-sharing transfers were targeted at right-wing national deputies in electorally fragmented counties as well as aligned local executives.
Abstract:
Manipulation of government finances for the benefit of narrowly defined groups is usually thought to be limited to the part of the budget over which politicians exercise discretion in the short run, such as earmarks. Analyzing a revenue-sharing program between the central and local governments in Brazil that uses an allocation formula based on local population estimates, I document two main results: first, that the population estimates entering the formula were manipulated and second, that this manipulation was political in nature. Consistent with swing-voter targeting by the right-wing central government, I find that municipalities with roughly equal right-wing and non-right-wing vote shares benefited relative to opposition or conservative core support municipalities. These findings suggest that the exclusive focus on discretionary transfers in the extant empirical literature on special-interest politics may understate the true scope of tactical redistribution that is going on under programmatic disguise.
Abstract:
One of the assumptions of the Capacitated Facility Location Problem (CFLP) is that demand is known and fixed. Most often, this is not the case when managers take some strategic decisions such as locating facilities and assigning demand points to those facilities. In this paper we consider demand as stochastic and we model each of the facilities as an independent queue. Stochastic models of manufacturing systems and deterministic location models are put together in order to obtain a formula for the backlogging probability at a potential facility location. Several solution techniques have been proposed to solve the CFLP. One of the most recently proposed heuristics, a Reactive Greedy Adaptive Search Procedure, is implemented in order to solve the model formulated. We present some computational experiments in order to evaluate the heuristic's performance and to illustrate the use of this new formulation for the CFLP. The paper finishes with a simple simulation exercise.
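The abstract does not specify the queueing model, so the sketch below only illustrates the flavor of such a backlogging-probability formula, assuming each candidate facility behaves as an M/M/1 queue and that "backlogging" means having more than a given number of orders in the system. It is not the formula derived in the paper.

```python
def mm1_backlog_probability(arrival_rate, service_rate, max_in_system):
    """For an M/M/1 queue with utilization rho = lambda/mu < 1, the steady-state
    probability that more than `max_in_system` orders are present is
    rho ** (max_in_system + 1).  This is an illustrative stand-in for the
    paper's backlogging probability, whose exact model is not given above."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        return 1.0  # unstable queue: the backlog grows without bound
    return rho ** (max_in_system + 1)

# A candidate facility receiving 8 orders/hour, serving 10 orders/hour, and
# considered backlogged when more than 5 orders are in the system:
print(mm1_backlog_probability(8.0, 10.0, 5))  # 0.8**6, about 0.262
```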
Abstract:
We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
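A hedged sketch of the kind of forward-looking, Taylor-type reaction function with interest-rate smoothing that this literature estimates: the target rate responds to expected inflation deviations and the output gap, and the actual rate partially adjusts toward the target. Whether such a rule is stabilizing hinges on the inflation response exceeding one. The coefficients below are illustrative placeholders, not the paper's estimates.

```python
def forward_looking_rate(expected_inflation, output_gap, lagged_rate,
                         r_star=2.0, pi_star=2.0, beta=1.5, gamma=0.5, rho=0.7):
    """Taylor-type forward-looking rule with smoothing.  `r_star` is the
    equilibrium real rate, `pi_star` the inflation target, `beta` and `gamma`
    the responses to expected inflation and the output gap, and `rho` the
    degree of interest-rate smoothing.  All values are placeholders."""
    target = r_star + pi_star + beta * (expected_inflation - pi_star) + gamma * output_gap
    return rho * lagged_rate + (1.0 - rho) * target

# Expected inflation of 3%, a 1% output gap, and a lagged policy rate of 4%:
print(forward_looking_rate(3.0, 1.0, 4.0))
```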
Abstract:
Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
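The paper's specific rule is not reproduced in the abstract, so the following is only a toy example of a deterministic interval scoring rule in the spirit described: the reward is paid when the realization lands inside the reported interval and decreases with the interval's width, so uninformative wide intervals are penalized.

```python
def interval_score(lower, upper, realization, max_width=10.0):
    """A purely illustrative deterministic interval scoring rule (not the rule
    proposed in the paper): positive reward only if the realization falls in
    the reported interval, shrinking linearly with the interval's width."""
    width = upper - lower
    if lower <= realization <= upper:
        return max(0.0, 1.0 - width / max_width)
    return 0.0

# Reporting [4, 6] versus [0, 10] for a realization of 5:
print(interval_score(4.0, 6.0, 5.0))   # 0.8 (narrow interval containing the outcome)
print(interval_score(0.0, 10.0, 5.0))  # 0.0 (maximally wide interval)
```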
Abstract:
Recent research on the dynamics of moral behavior has documented two contrasting phenomena: moral consistency and moral balancing. Moral balancing refers to the phenomenon whereby behaving (un)ethically decreases the likelihood of doing so again at a later time. Moral consistency describes the opposite pattern: engaging in (un)ethical behavior increases the likelihood of doing so later on. Three studies support the hypothesis that individuals' ethical mindset (i.e., outcome-based versus rule-based) moderates the impact of an initial (un)ethical act on the likelihood of behaving ethically on a subsequent occasion. More specifically, an outcome-based mindset facilitates moral balancing and a rule-based mindset facilitates moral consistency.
Abstract:
The origins of electoral systems have received scant attention in the literature. Looking at the history of electoral rules in the advanced world in the last century, this paper shows that the existing wide variation in electoral rules across nations can be traced to the strategic decisions that the current ruling parties, anticipating the coordinating consequences of different electoral regimes, make to maximize their representation according to the following conditions. On the one hand, as long as the electoral arena does not change substantially and the current electoral regime serves the ruling parties well, the latter have no incentives to modify the electoral regime. On the other hand, as soon as the electoral arena changes (due to the entry of new voters or a change in their preferences), the ruling parties will entertain changing the electoral system, depending on two main conditions: the emergence of new parties and the coordinating capacities of the old ruling parties. Accordingly, if the new parties are strong, the old parties shift from plurality/majority rules to proportional representation (PR) only if the old parties are locked into a 'non-Duvergerian' equilibrium, i.e. if no old party enjoys a dominant position (the case of most small European states); conversely, they do not if a Duvergerian equilibrium exists (the case of Great Britain). Similarly, whenever the new entrants are weak, a non-PR system is maintained, regardless of the structure of the old party system (the case of the USA). The paper also discusses the role of trade and ethnic and religious heterogeneity in the adoption of PR rules.
Abstract:
This paper establishes a general framework for metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables as well as preserving all the good properties of principal axis methods such as principal components and correspondence analysis, based on the singular-value decomposition, including the decomposition of variance into components along principal axes which provide the numerical diagnostics known as contributions. The idea is inspired by the chi-square distance in correspondence analysis, which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters which are estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path in decomposing a matrix and displaying its rows and columns in biplots.
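As a point of reference, the sketch below implements the classical (unweighted) metric scaling step via double-centring and an eigendecomposition; WMDS, as described above, adds an extra weight-estimation step that is not implemented here.

```python
import numpy as np

def classical_mds(dist_matrix, n_dims=2):
    """Classical (Torgerson) metric scaling: double-centre the squared
    distances and use the leading eigenvectors as coordinates.  This is the
    unweighted backbone on which WMDS builds."""
    d2 = np.asarray(dist_matrix, dtype=float) ** 2
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    b = -0.5 * j @ d2 @ j                        # double-centred Gram matrix
    eigval, eigvec = np.linalg.eigh(b)
    order = np.argsort(eigval)[::-1][:n_dims]    # largest eigenvalues first
    return eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))

# Three collinear points at mutual distances 1, 1 and 2 recover coordinates
# approximately -1, 0, 1 (up to sign):
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
print(classical_mds(d, n_dims=1))
```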