130 results for Distribution (Economic theory)
Abstract:
While economic theory acknowledges that some features of technology (e.g., indivisibilities, economies of scale and specialization) can fundamentally violate the traditional convexity assumption, almost all empirical studies accept the convexity property on faith. In this contribution, we apply two alternative flexible production technologies to measure total factor productivity growth and test the significance of the convexity axiom using a nonparametric test of closeness between unknown distributions. Based on unique field level data on the petroleum industry, the empirical results reveal significant differences, indicating that this production technology is most likely non-convex. Furthermore, we also show the impact of convexity on answers to traditional convergence questions in the productivity growth literature.
Abstract:
Large infrastructure projects are a major responsibility of urban and regional governments, which usually lack the expertise to fully specify the demanded projects. Contractors, typically experts on such projects due to experience with similar projects, advise on the needed design as well as the cost of construction in their bids. Producing the right design is costly. We model such infrastructure projects taking into account their credence goods feature and the costly design effort they require and examine the performance of commonly used contracting methods. We show that when building costs are homogeneous and public information, simultaneous bidding involving shortlisting of two contractors and contingent compensation of both contractors on design efforts outperforms sequential search. If building costs are private information of the contractors and are revealed to them after design cost is sunk, sequential search may be superior to simultaneous bidding.
Abstract:
This paper addresses the gap between the economic theory underlying the multidimensional concept of food security and observed data by deriving a composite food security index using the latent class model. The link between poverty and food security is then examined using the new food security index, and the robustness of the link is compared with two unidimensional measures often used in the literature. Using Vietnam as a case study, it was found that a weak link exists for the rural but not for the urban composite food security index. The unidimensional measures, on the other hand, show a strong link in both the rural and urban regions. The results on the link are also different and mixed when two poverty types given by persistent and transient poverty are considered. These findings have important policy implications for a targeted approach to addressing food security.
Abstract:
Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the demands of the development industry for concessions because of the contribution housing construction makes to the economic bottom line and because there is a need for well located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, Market Design, an offshoot of Game Theory, not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals the most significant risk in residential development is settlement risk: when buyers fail to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers. At around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.
Abstract:
Remedying the mischief of phoenix activity is of practical importance. The benefits include continued confidence in our economy, law that inspires best practice among directors, and law that is articulated in a manner such that penalties act as a sufficient deterrent and the regulatory system is able to detect offenders and bring them to account. Any further reforms must accommodate and tolerate legal phoenix activity. Phoenix activity pushes tolerance of entrepreneurial activity to its absolute limits. The wisest approach would be to front-end the reforms so as to alleviate the considerable detection and enforcement burden upon regulatory bodies. There is little doubt that breach of the existing law is difficult and expensive to detect, and this is a significant burden when regulators have shrinking budgets and are rapidly losing feet on the ground. This front-end approach may need to include restrictions on access to limited liability. The more limited liability is misused, the stronger the argument to limit access to it. This paper proposes that such an approach is a legitimate next step for a robust and mature capitalist economy.
Abstract:
Using an OLG model with endogenous growth and public capital, we show that international capital tax competition leads to inefficiently low tax rates and, as a consequence, to lower welfare levels and growth rates. Each national government has an incentive to reduce its capital income tax rates in the expectation that this policy measure will increase the domestic private capital stock, domestic income and domestic economic growth. This strategy succeeds as long as only one country applies it. However, if all countries follow this path, then all of them will be made worse off in the long run.
Abstract:
Theories of search and search behavior can be used to glean insights and generate hypotheses about how people interact with retrieval systems. This paper examines three such theories: the long-standing Information Foraging Theory, along with the more recently proposed Search Economic Theory and the Interactive Probability Ranking Principle. Our goal is to develop a model for ad-hoc topic retrieval using each approach, all within a common framework, in order to (1) determine what predictions each approach makes about search behavior, and (2) show the relationships, equivalences and differences between the approaches. While each approach takes a different perspective on modeling searcher interactions, we show that under certain assumptions they lead to similar hypotheses regarding search behavior. Moreover, we show that the models are complementary to each other, but operate at different levels (i.e., sessions, patches and situations). We further show how the differences between the approaches lead to new insights into the theories and new models. This contribution will not only lead to further theoretical developments, but also enables practitioners to employ one of the three equivalent models depending on the data available.
Abstract:
We theoretically analyze the impact of changes in foreign income from tourism source countries on the growth of tourism-dependent small island economies. Using a general theoretical construct, we attempt to answer the question of how the price elasticity of demand, the income elasticity of tourists and the degree of competition in the service sector influence the economic development of small economies. One of the main results is that politicians may consider applying policies which lead to a competitive environment in the service sector to maximize growth and the consequent labor income share.
Abstract:
This paper demonstrates that under conditions of imperfect (oligopolistic) competition, a transition from separate accounting (SA) to formula apportionment (FA) does not eliminate the problem of profit shifting via transfer pricing. In particular, if affiliates of a multinational firm face oligopolistic competition, it is beneficial for the multinational to manipulate transfer prices for tax-saving as well as strategic reasons under both FA and SA. The analysis shows that a switch from SA rules to FA rules may actually strengthen profit shifting activities by multinationals.
Abstract:
The policy reform literature is primarily concerned with the construction of reforms that yield welfare gains. By contrast, this paper’s contribution is to develop a theoretical concept for which the focus is upon the sizes of welfare gains accruing from policy reforms rather than upon their signs. In undertaking this task, and by focusing on tariff reforms, we introduce the concept of a steepest ascent policy reform, which is a locally optimal reform in the sense that it achieves the highest marginal gain in utility of any feasible local reform. We argue that this reform presents itself as a natural benchmark for the evaluation of the welfare effectiveness of other popular tariff reforms such as the proportional tariff reduction and the concertina rules, since it provides the maximal welfare gain of all possible local reforms. We derive properties of the steepest ascent tariff reform, construct an index to measure the relative welfare effectiveness of any given tariff reform, determine conditions under which proportional and concertina reforms are locally optimal and provide illustrative examples.
Abstract:
We examine how a multinational's choice to centralize or decentralize its decision structure is affected by country tax differentials. Within a simple model that emphasizes the multiple conflicting roles of transfer prices in multinational enterprises (MNEs)—here, as a strategic precommitment device and a tax manipulation instrument—we show that centralization is more profitable when tax differentials are large. When tax differentials are small, decentralization can be performed in two different ways each providing the highest profits in a particular range of the tax differential. Hence, the paper emphasizes the organizational flexibility that MNEs have in pursuing tax optimization.
Abstract:
Animal Spirits is a multi-channel video portrait of key personalities involved in the Global Financial Crisis. The four-screen installation displays these twelve decapitated apostles of free-market economic theory in a tableau of droning pontification. Trapped in a purgatorial loop, they endlessly spout vague and obfuscating explanations and defenses of their ideologies and (in)actions. The work takes a creatively quotidian approach to understanding the language of economics and the financial services industry. Through its endless loop of sound, image, and spoken text, the installation examines some of the ideas, narratives and power dynamics that foster and reward hubris and greed.
Abstract:
The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems, it became apparent that additional information was contained in the backscattered signal and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers thus corresponds to a mapping of the target's range and cross-range, thus producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present there are two main categories into which radar imaging falls. The first is the case where the backscattered signal is considered to be deterministic. The second is the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function.
This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development and subsequent discussion of the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
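The delay-to-range and Doppler-to-velocity relations, and the ambiguity-function correlation, described in this abstract can be sketched numerically. The following is a minimal illustration under assumed parameters (the function names, sampling rate and random-phase test signal are ours, not from the thesis):

```python
import numpy as np

C = 3.0e8  # propagation speed (speed of light), m/s

def range_from_delay(tau):
    """Target range from the round-trip time-of-flight tau (seconds)."""
    return C * tau / 2.0

def velocity_from_doppler(f_d, f_c):
    """Radial velocity from the Doppler shift f_d (Hz) at carrier f_c (Hz)."""
    return C * f_d / (2.0 * f_c)

def ambiguity_surface(rx, tx, delays, dopplers, fs):
    """Discrete narrowband ambiguity surface |chi(tau, f)|: correlate the
    backscattered signal rx against delayed, Doppler-shifted copies of the
    transmitted signal tx. The peak locates a point scatterer in delay-Doppler."""
    n = np.arange(len(tx))
    surf = np.zeros((len(delays), len(dopplers)))
    for i, d in enumerate(delays):              # delay in samples
        shifted = np.roll(tx, d)
        for j, f in enumerate(dopplers):        # Doppler shift in Hz
            ref = shifted * np.exp(2j * np.pi * f * n / fs)
            surf[i, j] = abs(np.vdot(ref, rx))  # |<ref, rx>| correlation
    return surf

# Example: 10 us round trip, 1 kHz Doppler shift at a 10 GHz carrier
print(range_from_delay(10e-6))           # 1500.0 (metres)
print(velocity_from_doppler(1e3, 10e9))  # 15.0 (m/s)
```

Evaluating the surface for a signal that is a delayed, Doppler-shifted copy of the transmitted waveform produces a peak at the true delay-Doppler cell, which is the mapping the abstract describes; extracting it without a reference copy of `tx` at the receiver is the problem the thesis addresses via time-frequency distributions.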