940 results for: revised Aleph Account
Resumo:
The literature on the welfare costs of inflation universally assumes that the many-person household can be treated as a single economic agent. This paper explores what the heterogeneity of the agents in a household might imply for such welfare analyses. First, we show that allowing for a one-person or for a many-person transacting technology impacts the money demand function and, therefore, the welfare costs of inflation. Second, more importantly, we derive sufficient conditions under which welfare assessments which depart directly from the knowledge of the money demand function (as in Lucas (2000)) are robust (invariant) under the number of persons considered in the household. Third, we show that Bailey's (1956) partial-equilibrium measure of the welfare costs of inflation can be obtained as a first-order approximation of the general-equilibrium welfare measure derived in this paper using a many-person transacting technology.
Resumo:
Using national accounts data for the revenue-GDP and expenditure-GDP ratios from 1947 to 1992, we examine three central issues in public finance. First, was the path of public debt sustainable during this period? Second, if debt is sustainable, how has the government historically balanced the budget after shocks to either revenues or expenditures? Third, are expenditures exogenous? The results show that (i) the public deficit is stationary (bounded asymptotic variance), with the budget in Brazil being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance. Expenditures are weakly exogenous, but tax revenues are not; (ii) the behavior of a rational Brazilian consumer may be consistent with Ricardian Equivalence; (iii) seigniorage revenues are critical to restore intertemporal budget equilibrium, since, when we exclude them from total revenues, debt is not sustainable in econometric tests.
Resumo:
This paper argues that trade specialization played an indispensable role in supporting the Industrial Revolution. We calibrate a two-good and two-sector overlapping generations model to England's historical development and investigate how different England's development path would have been if it had not globalized in 1840. The open-economy model is able to closely match the data, but the closed-economy model cannot explain the fall in the value of land relative to wages observed in the 19th century. Without globalization, the transition period in the British economy would have been considerably longer than that observed in the data, and key variables, such as the share of the labor force in agriculture, would have converged to figures very distant from the actual ones.
Resumo:
This paper discusses distribution and the historical phases of capitalism. It assumes that technical progress and growth are taking place and, given that, asks how income is functionally distributed between labor and capital, taking as reference the classical theory of distribution and Marx's falling tendency of the rate of profit. Based on historical experience, it first inverts the model, making the rate of profit the constant variable in the long run and the wage rate the residuum; second, it distinguishes three types of technical progress (capital-saving, neutral and capital-using) and applies them to the history of capitalism, with the UK and France as reference. Given these three types of technical progress, it distinguishes four phases of capitalist growth, of which only the second is consistent with Marx's prediction. The last phase, after World War II, should in principle be capital-saving, consistent with growth of wages above productivity. Instead, since the 1970s wages have been kept stagnant in rich countries because of, first, the fact that the Information and Communication Technology Revolution proved to be highly capital-using, opening room for a new wave of substitution of capital for labor; second, the new competition coming from developing countries; third, the emergence of the technobureaucratic or professional class; and, fourth, the new power of the neoliberal class coalition associating rentier capitalists and financiers.
Resumo:
The objective of these notes is to present a simple mathematical model of the determination of the current-account real exchange rate as defined by Bresser-Pereira (2010), i.e. the real exchange rate that guarantees the intertemporal equilibrium of the balance of payments, and to show the relation between the real exchange rate and productive specialization at the theoretical and empirical levels.
Resumo:
Latin America has recently experienced three cycles of capital inflows, the first two ending in major financial crises. The first took place between 1973 and the 1982 ‘debt-crisis’. The second took place between the 1989 ‘Brady bonds’ agreement (and the beginning of the economic reforms and financial liberalisation that followed) and the Argentinian 2001/2002 crisis, and ended up with four major crises (as well as the 1997 one in East Asia) — Mexico (1994), Brazil (1999), and two in Argentina (1995 and 2001/2). Finally, the third inflow-cycle began in 2003 as soon as international financial markets felt reassured by the surprisingly neo-liberal orientation of President Lula’s government; this cycle intensified in 2004 with the beginning of a (purely speculative) commodity price-boom, and actually strengthened after a brief interlude following the 2008 global financial crash — and at the time of writing (mid-2011) this cycle is still unfolding, although already showing considerable signs of distress. The main aim of this paper is to analyse the financial crises resulting from this second cycle (both in LA and in East Asia) from the perspective of Keynesian/ Minskyian/ Kindlebergian financial economics. I will attempt to show that no matter how diversely these newly financially liberalised Developing Countries tried to deal with the absorption problem created by the subsequent surges of inflow (and they did follow different routes), they invariably ended up in a major crisis. As a result (and despite the insistence of mainstream analysis), these financial crises took place mostly due to factors that were intrinsic (or inherent) to the workings of over-liquid and under-regulated financial markets — and as such, they were both fully deserved and fairly predictable. 
Furthermore, these crises point not just to major market failures, but to a systemic market failure: evidence suggests that these crises were the spontaneous outcome of actions by utility-maximising agents, freely operating in friendly ('light-touch') regulated, over-liquid financial markets. That is, these crises are clear examples that financial markets can be driven by buyers who take little notice of underlying values — i.e., by investors who have incentives to interpret information in a biased fashion in a systematic way. Thus, 'fat tails' also occurred because under these circumstances there is a high likelihood of self-made disastrous events. In other words, markets are not always right — indeed, in the case of financial markets they can be seriously wrong as a whole. Also, as the recent collapse of 'MF Global' indicates, the capacity of 'utility-maximising' agents operating in (excessively) 'friendly-regulated' and over-liquid financial markets to learn from previous mistakes seems rather limited.
Resumo:
This study approaches Jorge Luis Borges's fictional prose from the perspective of mimesis and self-reflexivity. The hypothesis is that the Aleph is a central symbol of Borges's fictional universe: the rewriting and reworking of this symbol throughout his work lead to a reflection on the possibilities and limits of mimesis. The study is divided into three parts of two chapters each. The first part, "Bibliographic revision and conceptual foundations of the inquiry", discusses the author's critical fortune (Chapter 1) and the concepts that sustain the inquiry (Chapter 2). The second part, "On Borges's aesthetic project", sketches the literary project defended by Borges, that is, his conception of literature and his ideological matrix (Chapter 3), alongside his anti-psychologism and his nostalgia for the epos (Chapter 4). The third and last part is entitled "The Aleph and its doubles". Chapter 5 analyses the short story "El Aleph" and considers its centrality in Borges's work, arguing that in this short story Borges elaborates a reflection on mimesis. Chapter 6, in the same vein, analyses four short stories: "Funes el memorioso", "El Libro de Arena", "El evangelio según Marcos" and "Del rigor en la ciencia". The conclusion is that Borges's literature is self-aware of its own process, as its parodic sense and bookish origin demonstrate. Hence, Borges's literature overcomes the mimetic crisis of language and challenges the limits between fiction and reality; however, it does not surrender to the nihilist perspective that would close literature off from the world.
Resumo:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Resumo:
The behavior of the non-perturbative parts of the isovector-vector and isovector and isosinglet axial-vector correlators at Euclidean momenta is studied in the framework of a covariant chiral quark model with non-local quark-quark interactions. The gauge covariance is ensured with the help of the P-exponents, with the corresponding modification of the quark-current interaction vertices taken into account. The low- and high-momentum behavior of the correlators is compared with chiral perturbation theory and with the QCD operator product expansion, respectively. The V−A combination of the correlators obtained in the model reproduces quantitatively the ALEPH and OPAL data on hadronic tau decays, transformed into the Euclidean domain via dispersion relations. The predictions for the electromagnetic π± − π⁰ mass difference and for the pion electric polarizability are also in agreement with the experimental values. The topological susceptibility of the vacuum is evaluated as a function of the momentum, and its first moment is predicted to be χ′(0) ≈ (50 MeV)². In addition, the fulfillment of the Crewther theorem is demonstrated.