953 results for Computable general equilibrium modelling
Abstract:
This paper explores the interaction between upstream and downstream firms in a two-region general equilibrium model. In many countries, lower tariff rates are set for intermediate manufactured goods and higher tariff rates for final manufactured goods. The derived results imply that such tariff settings tend to preserve a symmetric spread of upstream and downstream firms, and that continued tariff reduction may give rise to core-periphery structures. When the circular causality between upstream and downstream firms is taken as the agglomeration force, the model can be solved in full. We find that (1) the model displays at most three interior steady states, (2) when asymmetric steady states exist they are unstable, and (3) location displays hysteresis when the transport costs of intermediate manufactured goods are sufficiently high.
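A minimal numerical sketch, not the paper's model: in new-economic-geography settings the location pattern is often summarized by the share λ of downstream firms in one region evolving as dλ/dt = λ(1−λ)Δ(λ), with steady states where Δ(λ) = 0. The differential Δ below is purely hypothetical, chosen only to reproduce the qualitative pattern the abstract reports (three interior steady states, with the asymmetric ones unstable).

```python
# Hypothetical sketch: locate interior steady states of dλ/dt = λ(1-λ)Δ(λ)
# and classify their stability by the sign of Δ'(λ) at each root.
import numpy as np
from scipy.optimize import brentq

def profit_differential(lam, b=6.0):
    # Invented differential with up to three interior roots (λ = 0.5 and two
    # asymmetric ones), mimicking the qualitative structure in the abstract.
    return (lam - 0.5) * (b * (lam - 0.5) ** 2 - 1.0)

def interior_steady_states(n_grid=1000):
    grid = np.linspace(1e-6, 1 - 1e-6, n_grid)
    vals = profit_differential(grid)
    roots = []
    for x0, x1, v0, v1 in zip(grid[:-1], grid[1:], vals[:-1], vals[1:]):
        if v0 * v1 < 0:  # a sign change brackets a root
            roots.append(brentq(profit_differential, x0, x1))
    return roots

for lam in interior_steady_states():
    eps = 1e-5
    slope = (profit_differential(lam + eps) - profit_differential(lam - eps)) / (2 * eps)
    # A steady state is locally stable when the differential is decreasing there.
    print(f"lambda* = {lam:.3f}  {'stable' if slope < 0 else 'unstable'}")
```

With this invented Δ the symmetric steady state (λ = 0.5) is stable and the two asymmetric interior ones are unstable, so small perturbations near the boundaries push the economy toward full agglomeration.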
Abstract:
Fire has always been a major concern for designers of steel and concrete structures. Designing fire-resistant structural elements is not an easy task owing to several limitations, such as the lack of fire-resistant construction materials. Concrete reinforcement cover and external insulation are the most commonly adopted systems to protect concrete and steel from overheating, while spalling of concrete is minimised by using high-performance fibre-reinforced concrete (HPFRC) instead of standard concrete. Although these methodologies work very well for low-rise concrete structures, this is not the case for high-rise and inaccessible buildings, where the fire loading lasts much longer. Fire can permanently damage costly structures; more importantly, it is unsafe and can lead to loss of life. In this research, the author proposes a new type of main reinforcement for concrete structures which can provide better fire resistance than steel or FRP re-bars. It consists of continuous braided fibre rope, generally made from fire-resistant materials such as carbon or glass fibre. These fibres have excellent tensile strengths, sometimes more than ten times that of steel. In addition to fire resistance, these ropes can produce lighter and corrosion-resistant structures. By avoiding the use of expensive resin binders, the fibres are easily bound together using braiding techniques, ensuring that tensile stress is evenly distributed throughout the reinforcement. In order to consider braided ropes as a form of reinforcement, it is first necessary to establish their mechanical performance at room temperature and investigate the pull-out resistance of both unribbed and ribbed ropes. Ribbing of the ropes was achieved by braiding the rope over a series of glass beads. Adhesion between the rope and the concrete was drastically improved by ribbing, and further improved by pre-stressing the ropes and reducing slack in the fibres. Two materials were considered for the ropes: carbon and aramid. An implicit finite element approach is proposed to model the braided fibres using a Total Lagrangian formulation based on the theory of small strains and large rotations. Modelling tows and strands as elastic, transversely isotropic materials proved a good assumption for stiff, brittle fibres such as carbon and glass. The rope-to-concrete and strand-to-strand bond interaction/adhesion was numerically simulated using newly proposed hierarchical higher-order interface elements. Elastic and linear-damage cohesive models were used effectively to simulate, respectively, the non-penetrative 'free' sliding interaction between strands and the adhesion between ropes and concrete. The numerical simulations showed de-bonding features similar to those observed in experimental pull-out tests of braided ribbed rope reinforced concrete.
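A minimal sketch (my own illustration, not the thesis's element formulation) of a bilinear, linear-damage cohesive traction-separation law of the kind used for rope-to-concrete adhesion: traction grows linearly up to a peak at the damage-onset separation, then softens linearly to zero at the failure separation.

```python
# Illustrative bilinear (linear-damage) cohesive law: traction vs. separation.
# Parameter values are hypothetical, not taken from the thesis.
import numpy as np

def cohesive_traction(delta, k0=200.0, delta0=0.05, delta_f=0.5):
    """Return traction for a scalar separation delta.

    k0      : initial (undamaged) interface stiffness
    delta0  : separation at damage onset (peak traction = k0 * delta0)
    delta_f : separation at complete de-bonding (traction = 0 beyond this)
    """
    if delta <= delta0:                      # elastic branch, no damage
        return k0 * delta
    if delta >= delta_f:                     # fully de-bonded
        return 0.0
    damage = (delta_f * (delta - delta0)) / (delta * (delta_f - delta0))
    return (1.0 - damage) * k0 * delta       # linearly softening branch

# Quick check: traction peaks at delta0 and vanishes at delta_f.
for d in np.linspace(0.0, 0.6, 7):
    print(f"delta = {d:.2f}  traction = {cohesive_traction(d):.2f}")
```

The damage variable grows from 0 at the onset separation to 1 at the failure separation, which is what produces the linear softening branch typically used to represent progressive de-bonding in pull-out simulations.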
Abstract:
This dissertation discusses the emergence of a research programme in economics concerned with the analysis of information asymmetries, its epistemological differences, and its implications for Pareto-optimal equilibrium, in contrast with the standard neoclassical approach. To that end, it was necessary to highlight the method of both paradigms; it was equally necessary to discuss the philosophy/epistemology of science involved, which serves as the basis for an approach to paradigm shifts in science. Chapter 1 discusses the epistemology of science through three authors: Popper, Kuhn and Lakatos. We define the set of hypotheses that can be associated with the method employed by the Neoclassical School, drawing on the philosophy of science proposed by Lakatos. Chapter 2 then gives an extended exposition of the neoclassical method: defining the axioms underlying well-behaved preferences, presenting the Walrasian general equilibrium algebraically, illustrating the relaxation of auxiliary hypotheses of the neoclassical model following Friedman, and finally applying the neoclassical toolkit to the relaxation of the auxiliary hypothesis of perfect information, based on the model developed by Grossman & Stiglitz (1976) and on the mathematical extension developed in this dissertation. Chapter 3 closes the dissertation by presenting the main contributions of authors such as Stiglitz, Akerlof and Arrow concerning markets permeated by asymmetric information and opportunistic behaviour. We show the consequences for the market itself, including results in which the market ceases to exist. We present the second part of the Grossman & Stiglitz model, emphasizing the imperfect nature of the price system and its inability to convey all information about goods to the set of agents, and finally we discuss various topics related to the economics of information.
Abstract:
This paper analyzes the effects of lobbying on economic growth and, primarily, on welfare. We explicitly model the interaction between policy-makers and firms in a setup where the latter make political contributions to the former in exchange for more restrictive market regulations, which induce exit and enhance the profitability of the market. In a sectoral equilibrium, despite stimulating growth, lobbying restricts the market structure and reduces welfare compared with the free-entry outcome. However, once general equilibrium considerations are taken into account, we find that lobbying may improve welfare relative to a welfare-maximizing free-entry equilibrium by means of an expansion in aggregate demand. This introduces a new paradigm in the literature on the effects of lobbying on economic performance.
Abstract:
A Master's Thesis, presented as part of the requirements for the award of a Research Master's Degree in Economics from NOVA – School of Business and Economics
Abstract:
We use a novel pricing model to imply time series of diffusive volatility and jump intensity from S&P 500 index options. These two measures capture the ex ante risk assessed by investors. Using a simple general equilibrium model, we translate the implied measures of ex ante risk into an ex ante risk premium. The average premium that compensates the investor for the ex ante risks is 70% higher than the premium for realized volatility. The equity premium implied from option prices is shown to significantly predict subsequent stock market returns.
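As a stylized illustration rather than the paper's exact pricing model: in a jump-diffusion equilibrium with constant relative risk aversion \(\gamma\), the ex ante equity premium splits into a diffusive component and a jump component,

\[
\mathrm{ERP}_t \;=\; \gamma\,\sigma_t^{2} \;+\; \lambda_t\,\mathbb{E}\!\left[(e^{J}-1)\bigl(1-e^{-\gamma J}\bigr)\right],
\]

where \(\sigma_t^2\) is the diffusive variance, \(\lambda_t\) the jump intensity and \(J\) the random jump size; the \(\sigma_t\) and \(\lambda_t\) implied from index options then map directly into an implied ex ante premium of this kind.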
Abstract:
This paper presents a general equilibrium model of money demand in which the velocity of money changes in response to endogenous fluctuations in the interest rate. The parameter space can be divided into two subsets: one where velocity is constant, as in standard cash-in-advance models, and another where velocity fluctuates, as in Baumol (1952). The model provides an explanation of why, for a sample of 79 countries, the correlation between the velocity of money and the inflation rate appears to be low, contrary to what common wisdom would suggest. The reason is the diversity of transaction technologies available in different economies.
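For reference, the classic Baumol (1952) inventory result (the paper's own transaction technology may differ): with a fixed cost \(b\) per conversion of bonds into money, nominal income \(Y\) and nominal interest rate \(i\), average money holdings and velocity are

\[
M^{*}=\sqrt{\frac{bY}{2i}}, \qquad V \equiv \frac{Y}{M^{*}}=\sqrt{\frac{2\,iY}{b}},
\]

so velocity rises with the interest rate, which is exactly the margin that fluctuates in the second region of the parameter space.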
Abstract:
Actual tax systems do not follow the normative recommendations of the theory of optimal taxation, for two reasons. First, there are the informational difficulties of knowing or estimating all relevant elasticities and parameters. Second, there are the political complexities that would arise if a new tax implementation departed too much from current systems, which are perceived as somewhat egalitarian. Hence an ex novo overhaul of the tax system might simply be non-viable. In contrast, a small marginal tax reform could be politically more palatable and economically simpler to implement. The goal of this paper is to evaluate, as a step prior to any tax reform, the marginal welfare cost of the current tax system in Spain. We do this using a computational general equilibrium model calibrated to a point-in-time micro database. The simulation results show that the Spanish tax system gives rise to a considerable marginal excess burden, of the order of 0.50 money units for each additional money unit collected through taxes.
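A minimal sketch of how a marginal excess burden of this kind is typically read off two CGE solutions (the figures below are invented, not the paper's results): compare the welfare loss, measured as an equivalent variation, with the extra revenue raised by a small tax increase.

```python
# Hypothetical illustration of a marginal excess burden (MEB) calculation from a
# baseline and a counterfactual CGE solution; all numbers are made up.

def marginal_excess_burden(ev_loss, extra_revenue):
    """MEB = welfare loss beyond the revenue transferred, per extra unit of revenue."""
    return (ev_loss - extra_revenue) / extra_revenue

# Suppose a small tax hike raises 100 extra money units of revenue while the
# equivalent variation shows households are worse off by 150 money units.
extra_revenue = 100.0
ev_loss = 150.0
print(f"MEB = {marginal_excess_burden(ev_loss, extra_revenue):.2f} per unit of revenue")
# -> 0.50: each additional unit collected costs society 1.50 units of welfare.
```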
Abstract:
Recently there has been renewed research interest in the properties of non-survey updates of input-output tables and social accounting matrices (SAM). Along with the venerable and well-known RAS scaling method, several alternative procedures related to entropy minimization and other metrics have been suggested, tested and used in the literature. Whether these procedures will eventually replace or merely complement the RAS approach is still an open question without a definite answer. The performance of many of the updating procedures has been tested using some kind of proximity or closeness measure to a reference input-output table or SAM. The first goal of this paper, in contrast, is to propose checking the operational performance of updating mechanisms by comparing the simulation results that follow from adopting alternative databases for the calibration of a reference applied general equilibrium model. The second goal is to introduce a new updating procedure based on information retrieval principles. The performance of this new procedure is then compared with two well-known updating approaches: RAS and cross-entropy. The rationale for the suggested cross-validation is that the driving force for having more up-to-date databases is to be able to conduct more current, and hopefully more credible, policy analyses.
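For readers unfamiliar with it, a minimal sketch of the RAS (biproportional scaling) update of a prior matrix to new row and column totals; this is a generic illustration of the method, not the paper's code.

```python
# Minimal RAS sketch: alternately rescale rows and columns of a prior matrix A0
# until its row and column sums match target totals u and v (same grand total).
import numpy as np

def ras_update(A0, u, v, tol=1e-10, max_iter=1000):
    A = A0.astype(float).copy()
    for _ in range(max_iter):
        r = u / A.sum(axis=1)          # row scaling factors
        A = A * r[:, None]
        s = v / A.sum(axis=0)          # column scaling factors
        A = A * s[None, :]
        if np.allclose(A.sum(axis=1), u, atol=tol) and np.allclose(A.sum(axis=0), v, atol=tol):
            break
    return A

# Toy example with made-up numbers.
A0 = np.array([[4.0, 6.0], [5.0, 5.0]])
u = np.array([12.0, 8.0])              # target row totals
v = np.array([9.0, 11.0])              # target column totals
print(ras_update(A0, u, v))
```

Cross-entropy updating solves a related minimization problem explicitly; the paper's proposal is to judge such procedures by the policy simulations they ultimately support rather than only by matrix-proximity measures.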
Abstract:
The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by the monetary authority has both positive and normative implications for economic performance. We re-examine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by issuing currency. When we evaluate the performance of the two monetary instruments in terms of the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of the two targeting procedures displays unambiguously higher welfare levels.
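Schematically (notation mine, not necessarily the paper's), the financing arrangement can be written as a government budget constraint in which productive spending \(G_t\) is covered partly by taxes and partly by seigniorage,

\[
G_t \;=\; \tau_t Y_t \;+\; \frac{M_t - M_{t-1}}{P_t},
\]

so targeting the money growth rate \(\mu_t = M_t/M_{t-1}-1\) or the nominal interest rate pins down different stochastic behaviour for the seigniorage term and, through it, for inflation.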
Abstract:
This paper investigates the role of variable capacity utilization as a source of asymmetries in the relationship between monetary policy and economic activity within a dynamic stochastic general equilibrium framework. The source of the asymmetry is directly linked to the bottlenecks and stock-outs that emerge from the existence of capacity constraints on the real side of the economy. Money has real effects due to the presence of rigidities in households' portfolio decisions in the form of a Lucas-Fuerst 'limited participation' constraint. The model features variable capacity utilization rates across firms due to demand uncertainty. A monopolistically competitive structure provides additional effects through optimal mark-up changes. The overall message of this paper for monetary policy is that the same actions may have different effects depending on the capacity utilization rate of the economy.
Abstract:
We prove the non-emptiness of the core of an NTU game satisfying a condition of payoff-dependent balancedness, based on transfer rate mappings. We also define a new equilibrium condition on transfer rates and prove the existence of core payoff vectors satisfying this condition. The additional requirement of transfer rate equilibrium refines the core concept and allows the selection of specific core payoff vectors. Lastly, the class of parametrized cooperative games is introduced. This new setting and its associated equilibrium-core solution extend the usual cooperative game framework and core solution to situations depending on an exogenous environment. A non-emptiness result for the equilibrium-core is also provided in the context of a parametrized cooperative game. Our proofs borrow mathematical tools and geometric constructions from general equilibrium theory with non-convexities. Applications to existing results from game theory and economic theory are given.
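For context, the standard core definition for an NTU game (not specific to this paper): with player set \(N\) and \(V(S)\) the set of payoff vectors attainable by coalition \(S\),

\[
C(V) \;=\; \bigl\{\, x \in V(N) \;:\; \text{there is no } S \subseteq N \text{ and } y \in V(S) \text{ with } y_i > x_i \text{ for all } i \in S \,\bigr\},
\]

and balancedness-type conditions, such as the payoff-dependent balancedness used here, are what guarantee that this set is non-empty.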
Abstract:
This paper studies the quantitative implications of changes in the composition of taxes for long-run growth and expected lifetime utility in the UK economy over 1970-2005. Our setup is a dynamic stochastic general equilibrium model incorporating a detailed fiscal policy structure, in which the engine of endogenous growth is human capital accumulation. The government's spending instruments include public consumption, investment and education spending. On the revenue side, labour, capital and consumption taxes are employed. Our results suggest that if the goal of tax policy is to promote long-run growth by altering relative tax rates, then it should reduce labour taxes while simultaneously increasing capital or consumption taxes to make up for the loss in labour tax revenue. In contrast, a welfare-promoting policy would be to cut capital taxes, while concurrently increasing labour or consumption taxes to make up for the loss in capital tax revenue.
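A toy illustration of the revenue-neutral reshuffling the abstract describes (all bases and rates below are invented, not UK data, and behavioural responses are ignored): cut the labour tax and raise the consumption tax just enough to keep revenue unchanged.

```python
# Hypothetical revenue-neutral tax shift; figures are invented for illustration
# and ignore the behavioural responses a DSGE model would capture.

labour_base, capital_base, consumption_base = 800.0, 300.0, 700.0   # tax bases
tau_l, tau_k, tau_c = 0.30, 0.25, 0.15                               # initial rates

revenue = tau_l * labour_base + tau_k * capital_base + tau_c * consumption_base

# Cut the labour tax by 5 percentage points ...
new_tau_l = tau_l - 0.05
lost = (tau_l - new_tau_l) * labour_base

# ... and raise the consumption tax to recover exactly the lost revenue.
new_tau_c = tau_c + lost / consumption_base

check = new_tau_l * labour_base + tau_k * capital_base + new_tau_c * consumption_base
print(f"consumption tax: {tau_c:.3f} -> {new_tau_c:.3f}, revenue {revenue:.1f} -> {check:.1f}")
```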
Abstract:
In this paper, we quantitatively assess the welfare implications of alternative public education spending rules. To this end, we employ a dynamic stochastic general equilibrium model in which human capital externalities and public education expenditures, financed by distorting taxes, enhance the productivity of private education choices. We allow public education spending, as a share of output, to respond to various aggregate indicators in an attempt to minimize the market imperfection due to human capital externalities. We also expose the economy to varying degrees of uncertainty via changes in the variance of total factor productivity shocks. Our results indicate that, in the face of increasing aggregate uncertainty, active policy can significantly outperform passive policy (i.e. maintaining a constant public-education-to-output ratio), but only when the policy instrument is successful in smoothing the growth rate of human capital.
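One schematic way to read the 'active policy' experiments (my own illustration, not the paper's exact rule): let the public-education-spending share of output respond to deviations of human capital growth from its long-run value,

\[
\frac{G^{e}_t}{Y_t} \;=\; \bar g^{\,e} \;+\; \phi\left(\Delta \ln h_t - \overline{\Delta \ln h}\right),
\]

where \(\phi = 0\) recovers the passive constant-ratio policy and \(\phi \neq 0\) an active rule; the abstract's finding is that the active rule pays off only when it actually succeeds in smoothing the growth rate of human capital \(h_t\).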