942 results for Transaction Cost Economics
Abstract:
Employment subsidies, wage subsidies, unemployment, displacement
Abstract:
Magdeburg, Univ., Faculty of Economic Sciences, Diss., 2010
Abstract:
Economies are open complex adaptive systems far from thermodynamic equilibrium, and neo-classical environmental economics seems not to be the best way to describe the behaviour of such systems. Standard econometric analysis (i.e. time series) takes a deterministic and predictive approach, which encourages the search for predictive policy to ‘correct’ environmental problems. Rather, it seems that, because of the characteristics of economic systems, an ex-post analysis is more appropriate, which describes the emergence of such systems’ properties, and which sees policy as a social steering mechanism. With this background, some of the recent empirical work published in the field of ecological economics that follows the approach defended here is presented. Finally, the conclusion is reached that a predictive use of econometrics (i.e. time series analysis) in ecological economics should be limited to cases in which uncertainty decreases, which is not the normal situation when analysing the evolution of economic systems. However, that does not mean we should not use empirical analysis. On the contrary, this is to be encouraged, but from a structural and ex-post point of view.
Abstract:
Ecological economics is a recently developed field, which sees the economy as a subsystem of a larger finite global ecosystem. Ecological economists question the sustainability of the economy because of its environmental impacts and its material and energy requirements, and also because of the growth of population. Attempts at assigning money values to environmental services and losses, and attempts at correcting macroeconomic accounting, are part of ecological economics, but its main thrust is rather in developing physical indicators and indexes of sustainability. Ecological economists also work on the relations between property rights and resource management, they model the interactions between the economy and the environment, they study ecological distribution conflicts, they use management tools such as integrated environmental assessment and multi-criteria decision aids, and they propose new instruments of environmental policy.
Abstract:
We study the relation between the number of firms and price-cost margins under price competition with uncertainty about competitors' costs. We present results of an experiment in which two, three and four identical firms repeatedly interact in this environment. In line with the theoretical prediction, market prices decrease with the number of firms, but on average stay above marginal costs. Pricing is less aggressive in duopolies than in triopolies and tetrapolies. However, independently of the number of firms, pricing is more aggressive than in the theoretical equilibrium. Both the absolute and the relative surpluses increase with the number of firms. Total surplus is close to the equilibrium level, since enhanced consumer surplus through lower prices is counteracted by occasional displacements of the most efficient firm in production.
Abstract:
Description of a costing model developed by a digital production librarian to determine the cost to put an item into the Claremont Colleges Digital Library at the Claremont University Consortium. This case study includes variables such as material types and funding sources, data collection methods, and formulas and calculations for analysis. This model is useful for grant applications, cost allocations, and budgeting for digital project coordinators and digital library projects.
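The abstract describes a per-item costing model built from variables such as staff time, material type, and overhead. A minimal sketch of that kind of calculation, with entirely illustrative figures (the function name and all parameters here are assumptions, not the Claremont model's actual variables):

```python
# Sketch of a per-item digitization costing model in the spirit of the
# abstract: direct labor plus allocated equipment and materials.
# All names and figures are illustrative placeholders.

def cost_per_item(staff_minutes, hourly_wage, equipment_share, materials_cost):
    """Direct labor cost plus allocated equipment and materials per item."""
    labor = staff_minutes / 60.0 * hourly_wage
    return labor + equipment_share + materials_cost

# Example: 30 minutes of scanning at $20/h, $0.50 equipment, $0.25 materials
total = cost_per_item(30, 20.0, 0.50, 0.25)
print(round(total, 2))
```

In practice such a model would be run per material type and summed across a project, which is what makes it useful for grant budgeting.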
Abstract:
In this note we quantify to what extent indirect taxation influences and distorts prices. To do so we use the networked accounting structure of the most recent input-output table of Catalonia, an autonomous region of Spain, to model price formation. The role of indirect taxation is considered both from a classical value perspective and a more neoclassically flavoured one. We show that they would yield equivalent results under some basic premises. The neoclassical perspective, however, offers a bit more flexibility to distinguish among different tax figures and hence provides a clearer disaggregate picture of how an indirect tax ends up affecting, and by how much, the cost structure.
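The price-formation mechanism the note relies on can be illustrated with a tiny cost-push input-output model: sector prices solve p_j = Σ_i a_ij p_i + v_j + τ_j, so an indirect tax on one sector propagates through the cost structure to the others. The coefficients and tax rates below are made-up two-sector illustrations, not Catalan data:

```python
# Minimal two-sector cost-push price model: prices cover intermediate
# input costs, value added, and any indirect tax per unit of output.
# Solves (I - A^T) p = v + tax by Cramer's rule.

def io_prices(A, v, tax):
    """Solve p_j = sum_i A[i][j]*p_i + v[j] + tax[j] for two sectors."""
    a11 = 1 - A[0][0]; a12 = -A[1][0]
    a21 = -A[0][1];    a22 = 1 - A[1][1]
    b = [v[0] + tax[0], v[1] + tax[1]]
    det = a11 * a22 - a12 * a21
    p1 = (b[0] * a22 - a12 * b[1]) / det
    p2 = (a11 * b[1] - b[0] * a21) / det
    return [p1, p2]

A = [[0.20, 0.30],   # A[i][j]: input of good i per unit of good j
     [0.10, 0.25]]
value_added = [0.5, 0.4]
no_tax = io_prices(A, value_added, [0.0, 0.0])
with_tax = io_prices(A, value_added, [0.05, 0.0])  # tax on sector 1 only
# The tax levied on sector 1 raises both prices: sector 2 buys sector 1's
# output as an intermediate input, so the distortion spreads through the table.
```

This is the disaggregate propagation effect the note measures with the full Catalan input-output table.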
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
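The headline figure in this abstract is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs gained. A back-of-envelope check using only the abstract's own numbers ($3.6 billion annual net cost at $42,000/QALY) shows the implied QALY gain; the variable names are ours, not the CHD Policy Model's:

```python
# ICER = incremental cost / incremental QALYs. Reusing the abstract's
# reported figures to back out the implied annual QALY gain.

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio in dollars per QALY."""
    return delta_cost / delta_qaly

annual_net_cost = 3.6e9    # dollars per year, from the abstract
reported_icer = 42_000     # dollars per QALY, from the abstract
implied_qalys = annual_net_cost / reported_icer  # QALYs gained per year
print(round(implied_qalys))
```

This is the arithmetic behind statements like "preferred if society is willing to pay $50,000/QALY": strategies whose ICER falls below the willingness-to-pay threshold are adopted.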
Abstract:
INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women, and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost effective than "screen women at risk." For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratio of these two strategies compared with the reference was 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. 
Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on cost-effectiveness ratios of the usual screening strategies for osteoporosis.
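The Markov model behind this kind of analysis follows a cohort through annual cycles, with a per-cycle fracture probability that screening-plus-treatment reduces; the outcome measure is cost per fracture-free year gained. A minimal sketch, in which every probability, risk reduction, and cost is an invented placeholder rather than an EPIDOS/SEMOF/OFELY estimate:

```python
# Minimal Markov-cohort sketch: a cohort of women followed annually;
# each cycle a fraction suffers a hip fracture and leaves the
# "fracture-free" state. All parameters are illustrative assumptions.

def fracture_free_years(p_fracture, cycles=10):
    """Expected years without hip fracture over the time horizon."""
    fracture_free = 1.0   # share of the cohort still fracture-free
    years = 0.0
    for _ in range(cycles):
        years += fracture_free
        fracture_free *= (1 - p_fracture)
    return years

baseline = fracture_free_years(0.02)        # no screening (assumed 2%/year risk)
screened = fracture_free_years(0.02 * 0.6)  # assumed 40% relative risk reduction
extra_years = screened - baseline
strategy_cost = 250.0                       # assumed cost per woman screened/treated
icer = strategy_cost / extra_years          # cost per fracture-free year gained
```

Comparing this ratio across the "screen all", "screen at risk", and "no screening" strategies is exactly the comparison the abstract reports (4,235 and 8,290 euros per fracture-free year, respectively).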
Abstract:
Although extended secondary prophylaxis with low-molecular-weight heparin was recently shown to be more effective than warfarin for cancer-related venous thromboembolism, its cost-effectiveness compared to traditional prophylaxis with warfarin is uncertain. We built a decision analytic model to evaluate the clinical and economic outcomes of a 6-month course of low-molecular-weight heparin or warfarin therapy in 65-year-old patients with cancer-related venous thromboembolism. We used probability estimates and utilities reported in the literature and published cost data. Using a US societal perspective, we compared strategies based on quality-adjusted life-years (QALYs) and lifetime costs. The incremental cost-effectiveness ratio of low-molecular-weight heparin compared with warfarin was 149,865 dollars/QALY. Low-molecular-weight heparin yielded a quality-adjusted life expectancy of 1.097 QALYs at the cost of 15,329 dollars. Overall, 46% (7108 dollars) of the total costs associated with low-molecular-weight heparin were attributable to pharmacy costs. Although the low-molecular-weight heparin strategy achieved a higher incremental quality-adjusted life expectancy than the warfarin strategy (difference of 0.051 QALYs), this clinical benefit was offset by a substantial cost increment of 7,609 dollars. Cost-effectiveness results were sensitive to variation of the early mortality risks associated with low-molecular-weight heparin and warfarin and the pharmacy costs for low-molecular-weight heparin. Based on the best available evidence, secondary prophylaxis with low-molecular-weight heparin is more effective than warfarin for cancer-related venous thromboembolism. However, because of the substantial pharmacy costs of extended low-molecular-weight heparin prophylaxis in the US, this treatment is relatively expensive compared with warfarin.
Abstract:
The approaches and opinions of economists often dominate public policy discussion. Economists have gained this privileged position partly (or perhaps mainly) because of the obvious relevance of their subject matter, but also because of the unified methodology (neo-classical economics) that the vast majority of modern economists bring to their analysis of policy problems and proposed solutions. The idea of Pareto efficiency and its potential trade-off with equity is a central idea that is understood by all economists, and this common language provides the economics profession with a powerful voice in public affairs. The purpose of this paper is to review and reflect upon the way in which economists find themselves analysing and providing suggestions for social improvements, and how this role has changed over roughly the last 60 years. We focus on the fundamental split in the public economics tradition between those who adhere to public finance and those who adhere to public choice. A pure public finance perspective views failures in society as failures of the market. The solutions are technical, as might be enacted by a benevolent dictator. The pure public choice view accepts (sometimes grudgingly) that markets may fail, but so, it insists, does politics. This signals institutional reforms to constrain the potential for political failure. Certain policy recommendations may be viewed as compatible with both traditions, but other policy proposals will be the opposite of those proposed within the other tradition. In recent years a political economics synthesis has emerged. This accepts that institutions are very important and governments require constraints, but holds that some degree of benevolence on the part of policy makers should not be ruled out. The implications for public policy from this approach are, however, much less clear and perhaps more piecemeal.
We also discuss analyses of systematic failure, not so much on the part of markets or politicians, but of voters. Most clearly, this could lead to populism and to relaxing the assumption that voters necessarily vote in their own interests. The implications for public policy are addressed. Throughout the paper we relate the discussion to the experience of UK government policy-making.
Abstract:
We propose a theoretical analysis of democratization processes in which an elite extends the franchise to the poor when threatened with a revolution. The poor could govern without changing the political system by maintaining a continuous revolutionary threat on the elite. Revolutionary threats, however, are costly to the poor, and democracy is a superior system in which political agreement is reached through costless voting. This provides a rationale for democratic transitions that has not been discussed in the literature.
Abstract:
We consider a general equilibrium model à la Bhaskar (Review of Economic Studies 2002): there are complementarities across sectors, each of which comprises (many) heterogeneous monopolistically competitive firms. Bhaskar's model is extended in two directions: production requires capital, and labour markets are segmented. Labour market segmentation models the difficulties of labour migrating across international barriers (in a trade context) or from a poor region to a richer one (in a regional context), whilst the assumption of a single capital market means that capital flows freely between countries or regions. The model is solved analytically and a closed-form solution is provided. Adding labour market segmentation to Bhaskar's two-tier industrial structure allows us to study, inter alia, the impact of competition regulations on wages and financial flows both in the regional and international context, and the output, welfare and financial implications of relaxing immigration laws. The analytical approach adopted allows us not only to sign the effect of policies, but also to quantify their effects. Introducing capital as a factor of production improves the realism of the model and refines its empirically testable implications.