800 results for pareto inefficiency
Abstract:
Protectionism enjoys surprising popular support, in spite of deadweight losses. At the same time, trade barriers appear to decline with public information about protection. This paper develops an electoral model with heterogeneously informed voters which explains both facts and predicts the pattern of trade policy across industries. In the model, each agent endogenously acquires more information about his sector of employment. As a result, voters support protectionism, because they learn more about the trade barriers that help them as producers than those that hurt them as consumers. In equilibrium, asymmetric information induces a universal protectionist bias. The structure of protection is Pareto inefficient, in contrast to existing models. The model predicts a Dracula effect: trade policy for a sector is less protectionist when there is more public information about it. Using a measure of newspaper coverage across industries, I find that cross-sector evidence from the United States bears out my theoretical predictions.
Abstract:
An extreme precipitation event occurred during the first week of the year 2000, from 1 to 5 January, in the Paraíba Valley, in the eastern part of the State of São Paulo, Brazil, causing enormous socioeconomic impact, with deaths and destruction. This work studied this event at 10 selected meteorological stations, considered to be those with more homogeneous data than the other stations in the region. A generalized Pareto distribution (GPD) model for extreme 5-day precipitation values was developed individually for each of these stations. In the GPD modelling, a non-stationary approach was adopted, with the annual cycle and the long-term trend as covariates. One conclusion of this investigation is that the precipitation amounts accumulated over the 5 days of the studied event can be classified as extremely rare for the region, with a probability of occurrence of less than 1% at most stations and less than 0.1% at three stations.
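As a rough illustration of the peaks-over-threshold modelling described above, the Python sketch below fits a stationary GPD to exceedances of simulated 5-day precipitation totals and evaluates the exceedance probability of an extreme event. The simulated data, the 95th-percentile threshold, and the 400 mm event total are illustrative assumptions; the paper's non-stationary specification with annual-cycle and trend covariates is not reproduced here.

    # Stationary peaks-over-threshold sketch with a generalized Pareto distribution.
    # Data are simulated stand-ins for station 5-day precipitation totals (mm).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    precip_5day = rng.gamma(shape=2.0, scale=30.0, size=2000)

    threshold = np.quantile(precip_5day, 0.95)              # POT threshold
    exceedances = precip_5day[precip_5day > threshold] - threshold

    # Fit the GPD to the exceedances (location fixed at zero for a POT model)
    shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

    # Probability of a hypothetical extreme 5-day total, e.g. 400 mm
    event_total = 400.0
    p_tail = stats.genpareto.sf(event_total - threshold, shape, loc=0.0, scale=scale)
    p_event = (exceedances.size / precip_5day.size) * p_tail
    print(f"Estimated exceedance probability of {event_total} mm: {p_event:.4%}")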
Abstract:
In a decentralized setting, the game-theoretical prediction is that only strong blockings are allowed to rupture the structure of a matching. This paper argues that, under indifferences, weak blockings should also be considered when these blockings come from the grand coalition. This solution concept requires stability plus Pareto optimality. A characterization of the set of Pareto-stable matchings for the roommate and the marriage models is provided in terms of individually rational matchings whose blocking pairs, if any, are formed with unmatched agents. These matchings always exist and give an economic intuition of how blocking can be done by non-trading agents, so that the transactions need not be undone as agents reach the set of stable matchings. Some properties of the Pareto-stable matchings shared by the marriage and roommate models are obtained.
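For intuition on the role of blocking pairs, the following minimal Python sketch detects blocking pairs in a one-to-one marriage matching with strict preferences. The agents, preference profiles, and function names are hypothetical; this is not the paper's characterization procedure, which additionally handles indifferences and Pareto optimality.

    # Detect pairs (m, w) who both strictly prefer each other to their current partners.
    def prefers(pref, a, b):
        """True if the agent with preference list `pref` ranks a strictly above b."""
        return pref.index(a) < pref.index(b)

    def blocking_pairs(men_prefs, women_prefs, matching):
        """Return all man-woman pairs that block the given matching."""
        pairs = []
        for m, m_pref in men_prefs.items():
            for w in m_pref:
                if w == matching.get(m):
                    continue
                w_partner = next((x for x, y in matching.items() if y == w), None)
                m_better = matching.get(m) is None or prefers(m_pref, w, matching[m])
                w_better = w_partner is None or prefers(women_prefs[w], m, w_partner)
                if m_better and w_better:
                    pairs.append((m, w))
        return pairs

    men_prefs = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
    women_prefs = {"w1": ["m1", "m2"], "w2": ["m1", "m2"]}
    matching = {"m1": "w2", "m2": "w1"}
    print(blocking_pairs(men_prefs, women_prefs, matching))   # [('m1', 'w1')]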
Abstract:
Master's degree in Radiation Applied to Health Technologies.
Abstract:
Master's degree in Electrical and Computer Engineering - Specialization Area in Systems and Industrial Planning
Abstract:
A methodology to increase the probability of delivering power to any load point through the identification of new investments in distribution network components is proposed in this paper. The method minimizes the investment cost as well as the cost of energy not supplied in the network. A DC optimization model based on mixed-integer non-linear programming is developed considering the Pareto front technique, in order to identify the adequate investments in distribution network components that allow increasing the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator, while minimizing the cost of energy not supplied. Thus, a multi-objective problem is formulated. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180-bus distribution network.
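As a simple sketch of the Pareto-front idea invoked above, the Python snippet below enumerates the non-dominated investment plans for the two objectives, investment cost and cost of energy not supplied (ENS). The candidate plans and their costs are made-up numbers, not results from the 180-bus case study or from the paper's optimization model.

    # Keep plans that are not dominated in (investment cost, ENS cost); lower is better.
    def pareto_front(plans):
        front = []
        for name, inv, ens in plans:
            dominated = any(
                (i2 <= inv and e2 <= ens) and (i2 < inv or e2 < ens)
                for _, i2, e2 in plans
            )
            if not dominated:
                front.append((name, inv, ens))
        return front

    candidate_plans = [
        ("no reinforcement",       0.0, 950.0),
        ("add tie switch",        40.0, 610.0),
        ("add feeder",           120.0, 380.0),
        ("feeder + tie switch",  150.0, 370.0),
        ("full redundancy",      400.0, 360.0),
        ("gold plating",         450.0, 365.0),   # dominated by "full redundancy"
    ]
    for plan in pareto_front(candidate_plans):
        print(plan)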
Abstract:
We show a standard model where the optimal tax reform is to cut labor taxes and leave capital taxes very high in the short and medium run. Only in the very long run would capital taxes be zero. Our model is a version of Chamley's, with heterogeneous agents, without lump-sum transfers, an upper bound on capital taxes, and a focus on Pareto-improving plans. For our calibration, labor taxes should be low for the first ten to twenty years, while capital taxes should be at their maximum. This policy ensures that all agents benefit from the tax reform and that capital grows quickly after the reform begins. Therefore, the long-run optimal tax mix is the opposite of the short- and medium-run tax mix. The initial labor tax cut is financed by deficits that lead to a positive long-run level of government debt, reversing the standard prediction that government accumulates savings in models with optimal capital taxes. If labor supply is somewhat elastic, benefits from the tax reform are high and can be shifted entirely to capitalists or workers by varying the length of the transition. With inelastic labor supply there is an increasing part of the equilibrium frontier, which means that the scope for benefiting the workers is limited and the total benefits from reforming taxes are much lower.
Abstract:
Traditionally, it is assumed that the population size of cities in a country follows a Pareto distribution. This assumption is typically supported by finding evidence of Zipf's Law. Recent studies question this finding, highlighting that, while the Pareto distribution may fit reasonably well when the data is truncated at the upper tail, i.e. for the largest cities of a country, the log-normal distribution may apply when all cities are considered. Moreover, conclusions may be sensitive to the choice of a particular truncation threshold, a yet overlooked issue in the literature. In this paper, then, we reassess the city size distribution in relation to its sensitivity to the choice of truncation point. In particular, we look at US Census data and apply a recursive-truncation approach to estimate Zipf's Law, together with a non-parametric alternative test, considering each possible truncation point of the distribution of all cities. Results confirm the sensitivity of the estimates to the truncation point. Moreover, repeating the analysis over simulated data confirms the difficulty of distinguishing a Pareto tail from the tail of a log-normal and, in turn, of identifying the city size distribution as a false or a weak Pareto law.
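A rough Python sketch of the recursive-truncation exercise is given below: the rank-size (Zipf) exponent is re-estimated as the sample is truncated to ever smaller sets of the largest cities. The city sizes are simulated from a log-normal rather than taken from US Census data, and the OLS rank-size regression stands in for the paper's estimators.

    # Re-estimate the Zipf exponent at several truncation points of simulated city sizes.
    import numpy as np

    rng = np.random.default_rng(1)
    sizes = np.sort(rng.lognormal(mean=10.0, sigma=1.2, size=5000))[::-1]  # descending

    def zipf_exponent(s):
        """OLS slope of log(rank) on log(size); Zipf's Law predicts a slope near -1."""
        ranks = np.arange(1, s.size + 1)
        slope, _ = np.polyfit(np.log(s), np.log(ranks), 1)
        return slope

    for top_n in (5000, 1000, 500, 100):
        print(f"top {top_n:>5} cities: estimated exponent {zipf_exponent(sizes[:top_n]):.3f}")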
Abstract:
This paper is a contribution to the growing literature on constrained inefficiencies in economies with financial frictions. The purpose is to present two simple examples, inspired by the stochastic models in Gersbach-Rochet (2012) and Lorenzoni (2008), of deterministic environments in which such inefficiencies arise through credit constraints. Common to both examples is a pecuniary externality, which operates through an asset price. In the second example, a simple transfer between two groups of agents can bring about a Pareto improvement. In a first-best economy, there are no pecuniary externalities because marginal productivities are equalised. But when agents face credit constraints, there is a wedge between their marginal productivities and those of the non-credit-constrained agents. The wedge is the source of the pecuniary externality: economies with these kinds of imperfections in credit markets are not second-best efficient. This is akin to the constrained inefficiency of an economy with incomplete markets, as in Geanakoplos and Polemarchakis (1986).
Abstract:
We analyse the liberal ethics of non-interference applied to social choice. A liberal principle, capturing non-interfering views of society and inspired by John Stuart Mill's conception of liberty, is examined. The principle captures the idea that society should not penalise agents after changes in their situation that do not affect others. An impossibility for liberal approaches is highlighted: every social decision rule that satisfies unanimity and a general principle of non-interference must be dictatorial. This raises some important issues for liberal approaches in social choice and political philosophy.
Abstract:
This paper estimates a translog stochastic frontier production function in the analysis of all 48 contiguous U.S. states in the period 1970-1983, to attempt to measure and explain changes in technical efficiency. The model allows technical inefficiency to vary over time, and inefficiency effects to be a function of a set of explanatory variables in which the level and composition of public capital plays an important role. Results indicated that U.S. state inefficiency levels were significantly and positively correlated with the ratio of public capital to private capital. The proportion of public capital devoted to highways is negatively correlated with technical inefficiency, suggesting that not only the level but also the composition of public capital influenced state efficiency.
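For reference, a standard translog stochastic frontier with inefficiency effects, in the spirit of the Battese-Coelli inefficiency-effects specification described above, can be written as follows; the symbols are generic and the paper's exact variable set and specification may differ:

    \ln y_{it} = \beta_0 + \sum_j \beta_j \ln x_{jit}
               + \tfrac{1}{2} \sum_j \sum_k \beta_{jk} \ln x_{jit} \ln x_{kit}
               + v_{it} - u_{it},
    \qquad v_{it} \sim N(0, \sigma_v^2), \qquad u_{it} = z_{it}\delta + w_{it} \ge 0,
    \qquad TE_{it} = \exp(-u_{it}),

where y_{it} is output, the x_{jit} are inputs, v_{it} is statistical noise, u_{it} is the time-varying inefficiency term driven by explanatory variables z_{it} (here, for example, the public-to-private capital ratio and the highway share of public capital), and TE_{it} is technical efficiency.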
Abstract:
This paper aims to estimate a translog stochastic frontier production function in the analysis of a panel of 150 mixed Catalan farms in the period 1989-1993, in order to attempt to measure and explain variation in technical inefficiency scores with a one-stage approach. The model uses gross value added as the aggregate output measure. Total employment, fixed capital, current assets, specific costs and overhead costs are introduced into the model as inputs. Stochastic frontier estimates are compared with those obtained using a linear programming method with a two-stage approach. The specification of the translog stochastic frontier model appears to be an appropriate representation of the data, technical change was rejected, and the technical inefficiency effects were statistically significant. The mean technical efficiency in the period analyzed was estimated to be 64.0%. Farm inefficiency levels were found to be significantly (at the 5% level) and positively correlated with the number of economic size units.
Abstract:
In the analysis of equilibrium policies in a differential game, if agents have different time preference rates, the cooperative (Pareto optimal) solution obtained by applying Pontryagin's Maximum Principle becomes time inconsistent. In this work we derive a set of dynamic programming equations (in discrete and continuous time) whose solutions are time-consistent equilibrium rules for N-player cooperative differential games in which agents differ in their instantaneous utility functions and also in their discount rates of time preference. The results are applied to the study of a cake-eating problem describing the management of a common-property exhaustible natural resource. The extension of the results to a simple common-property renewable natural resource model in infinite horizon is also discussed.
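A toy Python sketch of the backward-induction construction is given below for a two-agent, discrete-time cake-eating problem in which the agents share consumption equally each period but discount the future at different rates. The parameter values, the equal-split and log-utility assumptions, and the grid sizes are illustrative; this is not the paper's continuous-time derivation, only a numerical illustration of dynamic programming equations with heterogeneous discounting.

    # Backward induction for a time-consistent cooperative rule in a two-agent
    # cake-eating problem with heterogeneous discount factors and log utility.
    import numpy as np

    T = 20                                    # horizon
    beta = (0.95, 0.80)                       # agents' discount factors
    grid = np.linspace(1e-3, 1.0, 400)        # remaining cake sizes
    fracs = np.linspace(0.01, 1.0, 100)       # fraction of the cake eaten today

    def u(c):
        return np.log(c)

    # Terminal period: eat everything, split equally
    V = [u(grid / 2.0), u(grid / 2.0)]        # V[i][k] = agent i's value at grid[k]
    policy = [np.ones_like(grid)]             # consumption fractions, filled backwards

    for t in range(T - 2, -1, -1):
        V_new = [np.empty_like(grid), np.empty_like(grid)]
        rule = np.empty_like(grid)
        for k, s in enumerate(grid):
            c_tot = fracs * s
            s_next = s - c_tot
            cont = [np.interp(s_next, grid, V[i]) for i in range(2)]
            # Planner: sum of current utilities plus each agent's own-discounted continuation
            obj = 2.0 * u(c_tot / 2.0) + beta[0] * cont[0] + beta[1] * cont[1]
            best = int(np.argmax(obj))
            rule[k] = fracs[best]
            for i in range(2):
                V_new[i][k] = u(c_tot[best] / 2.0) + beta[i] * cont[i][best]
        V = V_new
        policy.insert(0, rule)

    print("share of the cake eaten at t=0 from a full cake:", policy[0][-1])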