Abstract:
An important disconnect in the news-driven view of the business cycle formalized by Beaudry and Portier (2004) is the lack of agreement between VAR and DSGE methodologies over the empirical plausibility of this view. We argue that this disconnect can be largely resolved once we augment a standard DSGE model with a financial channel that amplifies news shocks. Both methodologies suggest that news shocks to the future growth prospects of the economy are significant drivers of U.S. business cycles in the post-Greenspan era (1990-2011), explaining as much as 50% of the forecast error variance in hours worked at cyclical frequencies.
Abstract:
Using the framework of Desmet and Rossi-Hansberg (forthcoming), we present a model of spatial takeoff that is calibrated using spatially-disaggregated occupational data for England in c.1710. The model predicts changes in the spatial distribution of agricultural and manufacturing employment which match data for c.1817 and 1861. The model also matches a number of aggregate changes that characterise the first industrial revolution. Using counterfactual geographical distributions, we show that the initial concentration of productivity can matter for whether and when an industrial takeoff occurs. Subsidies to innovation in either sector can bring forward the date of takeoff, while subsidies to the use of land by manufacturing firms can significantly delay a takeoff because they decrease the spatial concentration of activity.
Abstract:
In this analysis, we examine the relationship between an individual's decision to volunteer and the average level of volunteering in the community where the individual resides. Our theoretical model is based on a coordination game in which volunteering by others is informative about the benefit of volunteering. We demonstrate that the interaction between this information and one's private information makes an individual more likely to volunteer when contributions by his or her peers are higher. We complement this theoretical work with an empirical analysis using Census 2000 Summary File 3 and Current Population Survey (CPS) 2004-2007 September supplement file data. We control for various individual and community characteristics, and employ robustness checks to verify the results of the baseline analysis. We additionally use an innovative instrumental variables strategy to account for reflection bias and for the endogeneity caused by individuals selectively sorting into neighborhoods, which allows us to argue for a causal interpretation. The empirical results in the baseline, as well as in all robustness analyses, confirm the main result of our theoretical model, and we employ a more general structure to further strengthen our results.
Abstract:
The purpose of this paper is to highlight the curiously circular course followed by mainstream macroeconomic thinking in recent times. Having broken from classical orthodoxy in the late 1930s via Keynes’s General Theory, the mainstream conventional wisdom has, over the last three or four decades, regressed rather than progressed, coming to embrace a conception of the working of the macroeconomy which is again of a classical, essentially pre-Keynesian, character. At the core of the analysis presented in the typical contemporary macro textbook is the (neo)classical model of the labour market, which represents employment as determined (given conditions of productivity) by the terms of labour supply. While it is allowed that changes in aggregate demand may temporarily affect output and employment, the contention is that in due course employment will automatically return to its ‘natural’ (full employment) level. Unemployment is therefore identified as a merely frictional or voluntary phenomenon: involuntary unemployment, in other words persisting demand-deficient unemployment, is entirely absent from the picture. Variations in aggregate demand are understood to have a lasting impact only on the price level, not on output and employment. This in effect amounts to a return to the Pigouvian conception targeted by Keynes in the General Theory. We take the view that this reversion to ideas which should by now be obsolete reflects not the discovery of logical or empirical deficiencies in the Keynes analysis, but rather doctrinaire blindness and a failure of scholarship, on account of which essential features of the Keynes theory have been overlooked or misrepresented. There is an urgent need for a critical appraisal of the current conventional macroeconomic wisdom.
Abstract:
This paper argues that the natural rate of unemployment hypothesis, in which equilibrium unemployment is determined by “structural” variables alone, is wrong: it is both implausible and inconsistent with the evidence. Instead, equilibrium unemployment is haunted by hysteresis. The curious history of the natural rate hypothesis is considered, curious because the authors of the hypothesis thought hysteresis to be relevant. The various methods that have been used to model hysteresis in economic systems are outlined, including the Preisach model with its selective, erasable memory properties. The evidence regarding hysteresis effects on output and unemployment is then reviewed. The implications for macroeconomic policy, and for the macroeconomics profession, are discussed.
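The Preisach model mentioned above builds aggregate hysteresis from many simple switching units ("hysterons"), each with its own up and down thresholds; the distribution of thresholds gives the system its selective, erasable memory. A minimal sketch, in which the threshold grid and equal weighting are hypothetical choices for illustration rather than anything from the paper:

```python
# Minimal sketch of a discrete Preisach hysteresis model (illustrative only).

class Hysteron:
    """A relay that switches up at `alpha` and down at `beta` (beta < alpha)."""
    def __init__(self, beta, alpha, state=-1):
        self.beta, self.alpha, self.state = beta, alpha, state

    def apply(self, x):
        if x >= self.alpha:
            self.state = 1
        elif x <= self.beta:
            self.state = -1
        return self.state  # between beta and alpha the relay keeps its memory

def preisach_output(relays, x):
    """Aggregate output: equally weighted mean of all relay states after input x."""
    return sum(r.apply(x) for r in relays) / len(relays)

# A triangular grid of relays with beta < alpha
make_relays = lambda: [Hysteron(b / 10, a / 10)
                       for a in range(1, 11) for b in range(0, a)]

relays_fresh = make_relays()
y_fresh = preisach_output(relays_fresh, 0.3)      # reach 0.3 from below

relays_cycled = make_relays()
preisach_output(relays_cycled, 1.0)               # saturate first...
y_cycled = preisach_output(relays_cycled, 0.3)    # ...then come back to 0.3

# Same input, different histories, different outputs: that is hysteresis.
print(round(y_fresh, 3), round(y_cycled, 3))
```

The gap between the two outputs at the same input value is the remanent memory of the path taken, which is the property the abstract invokes for equilibrium unemployment.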
Abstract:
The efficient markets hypothesis implies that arbitrage opportunities in markets such as those for foreign exchange (FX) would be, at most, short-lived. The present paper surveys the fragmented nature of FX markets, revealing that information in these markets is also likely to be fragmented. The “quant” workforce in the hedge fund featured in The Fear Index novel by Robert Harris would have little or no reason for their existence in an EMH world. The four currency combinatorial analysis of arbitrage sequences contained in Cross, Kozyakin, O’Callaghan, Pokrovskii and Pokrovskiy (2012) is then considered. Their results suggest that arbitrage processes, rather than being self-extinguishing, tend to be periodic in nature. This helps explain the fact that arbitrage dealing tends to be endemic in FX markets.
Abstract:
Econometric analysis has been inconclusive in determining the contribution that increased skills make to macroeconomic performance, whilst conventional growth accounting approaches to the same problem rest on restrictive assumptions. We propose an alternative micro-to-macro method which combines elements of growth accounting and numerical general equilibrium modelling. The usefulness of this approach for applied education policy analysis is demonstrated by evaluating the macroeconomic impact on the Scottish economy of a single graduation cohort from further education colleges. We find the macroeconomic impact to be significant. From a policy point of view this supports a revival of interest in the conventional teaching role of education institutions.
Abstract:
Much attention in recent years has turned to the potential of behavioural insights to improve the performance of government policy. One behavioural concept of interest is the effect of a cash transfer's label on how the transfer is spent. The Winter Fuel Payment (WFP) is a labelled cash transfer intended to offset the costs of keeping older households warm in the winter. Previous research has shown that households spend a higher proportion of the WFP on energy expenditures due to its label (Beatty et al., 2011). If households interpret the WFP as money for their energy bills, it may reduce their willingness to undertake investments which help achieve the same goal, such as the adoption of renewable energy technologies. In this paper we show that the WFP has distortionary effects on the renewable technology market. Exploiting the sharp eligibility criteria of the WFP in a Regression Discontinuity Design, this analysis finds a reduction in the propensity to install renewable energy technologies of around 2.7 percentage points due to the WFP. This is a considerable number: it implies that 62% of households (whose oldest member turns 60) that would otherwise have invested in renewable energy refrain from doing so after receiving the WFP. This analysis suggests that the labelling effect spreads to products related to the labelled good. In this case, households use too much energy from sources which generate pollution and too little from relatively cleaner technologies.
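The Regression Discontinuity Design described above can be sketched in a few lines: compare the installation rate of households just above and just below the age-60 eligibility cutoff, and read the jump at the cutoff as the effect of the WFP. A minimal illustration on simulated data; every magnitude below, including the built-in 2.7 percentage point effect, is hypothetical rather than the paper's data:

```python
# Hypothetical sketch of a sharp regression discontinuity: eligibility jumps
# at the age-60 cutoff, and the jump in the outcome (propensity to install
# renewables) at that cutoff estimates the treatment effect. Simulated data.
import random

random.seed(0)
CUTOFF = 60.0

def simulate_household():
    age = random.uniform(50, 70)           # age of oldest household member
    eligible = age >= CUTOFF               # sharp eligibility rule
    base = 0.10 + 0.002 * (age - CUTOFF)   # smooth trend in install propensity
    p = base - (0.027 if eligible else 0)  # built-in -2.7pp labelling effect
    installed = 1 if random.random() < p else 0
    return age, installed

data = [simulate_household() for _ in range(200_000)]

def local_mean(side, bandwidth=2.0):
    """Mean outcome just below ('left') or just above ('right') the cutoff."""
    if side == "left":
        ys = [y for age, y in data if CUTOFF - bandwidth <= age < CUTOFF]
    else:
        ys = [y for age, y in data if CUTOFF <= age < CUTOFF + bandwidth]
    return sum(ys) / len(ys)

rd_estimate = local_mean("right") - local_mean("left")
print(f"estimated jump at cutoff: {rd_estimate:.3f}")  # a negative jump
```

In practice one would fit local linear regressions on each side of the cutoff rather than raw local means, which removes the small bias that the smooth age trend induces inside the bandwidth.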
Abstract:
How preferences respond to the varying stress of economic environments is a key question for behavioral economics and public policy. We conducted a laboratory experiment to investigate the effects of stress on financial decision making among individuals aged 50 and older. Using the cold pressor task as a physiological stressor, and a series of intelligence tests as cognitive stressors, we find that stress increases subjective discounting rates, has no effect on the degree of risk-aversion, and substantially lowers the effort individuals make to learn about financial decisions.
Abstract:
Genuine Savings has emerged as a widely-used indicator of sustainable development. In this paper, we use long-term data stretching back to 1870 to undertake empirical tests of the relationship between Genuine Savings (GS) and future well-being for three countries: Britain, the USA and Germany. Our tests are based on an underlying theoretical relationship between GS and changes in the present value of future consumption. Based on both single country and panel results, we find evidence supporting the existence of a cointegrating (long run equilibrium) relationship between GS and future well-being, and fail to reject the basic theoretical result on the relationship between these two macroeconomic variables. This provides some support for the GS measure of weak sustainability. We also show the effects of modelling shocks, such as World War Two and the Great Depression.
Abstract:
This paper compares how increases in experience versus increases in knowledge about a public good affect willingness to pay (WTP) for its provision. This is challenging because while consumers are often certain about their previous experiences with a good, they may be uncertain about the accuracy of their knowledge. We therefore design and conduct a field experiment in which treated subjects receive a precise and objective signal regarding their knowledge about a public good before estimating their WTP for it. Using data for two different public goods, we show qualitative equivalence of the effect of knowledge and experience on valuation for a public good. Surprisingly, though, we find that the causal effect of objective signals about the accuracy of a subject’s knowledge of a public good can dramatically affect their valuation for it: treatment causes an increase of $150-$200 in WTP for well-informed individuals. We find no such effect for less informed subjects. Our results imply that WTP estimates for public goods are a function not only of respondents' true information states but also of their beliefs about those information states.
Abstract:
In this study we elicit agents’ prior information set regarding a public good, exogenously give information treatments to survey respondents, and subsequently elicit willingness to pay (WTP) for the good and posterior information sets. The design of this field experiment allows us to perform theoretically motivated hypothesis testing between different updating rules: non-informative updating, Bayesian updating, and incomplete updating. We find causal evidence that agents imperfectly update their information sets. We also find causal evidence that the amount of additional information provided to subjects relative to their pre-existing information levels can affect stated WTP in ways consistent with overload from too much learning. This result raises important (though familiar) issues for the use of stated preference methods in policy analysis.
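The three updating rules being contrasted can be illustrated with a simple precision-weighted (normal-normal) sketch, where a hypothetical weight gamma interpolates between no updating (gamma = 0), full Bayesian updating (gamma = 1), and incomplete updating in between. None of the numbers below come from the study:

```python
# Illustrative sketch (not the paper's estimator) of three updating rules.
# An agent holds a normal prior about a good's quality and receives a signal.

def posterior_mean(prior_mean, prior_var, signal, signal_var, gamma=1.0):
    """Precision-weighted update; `gamma` scales how much of the Bayesian
    update is actually applied (a hypothetical 'updating weight')."""
    bayes_weight = prior_var / (prior_var + signal_var)
    full_update = bayes_weight * (signal - prior_mean)
    return prior_mean + gamma * full_update

prior_mean, prior_var = 10.0, 4.0   # hypothetical prior belief
signal, signal_var = 14.0, 4.0      # hypothetical information treatment

non_informative = posterior_mean(prior_mean, prior_var, signal, signal_var, gamma=0.0)
bayesian        = posterior_mean(prior_mean, prior_var, signal, signal_var, gamma=1.0)
incomplete      = posterior_mean(prior_mean, prior_var, signal, signal_var, gamma=0.5)

print(non_informative, bayesian, incomplete)  # 10.0 12.0 11.0
```

Estimating where elicited posteriors fall between the gamma = 0 and gamma = 1 benchmarks is one way to frame the hypothesis test between the three rules.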
Abstract:
Illegal hunting for bushmeat is regarded as an important cause of biodiversity decline in Africa. We use a stated preferences method to obtain information on determinants of demand for bushmeat in villages around the Serengeti National Park, Tanzania. We estimate the effects of changes in the own price of bushmeat and in the prices of two substitute protein sources – fish and chicken. Promoting the availability of protein substitutes at lower prices would be effective at reducing pressures on wildlife. Supply-side measures that raise the price of bushmeat would also be effective.
Abstract:
We model the choice behaviour of an agent who suffers from imperfect attention. We define inattention axiomatically through preferences over menus and endowed alternatives: an agent is inattentive if it is better to be endowed with an alternative a than to be allowed to pick a from a menu in which a is the best alternative. This property and vNM rationality on the domain of menus and alternatives imply that the agent notices each alternative with a given menu-dependent probability (attention parameter) and maximises a menu-independent utility function over the alternatives he notices. Preference for flexibility restricts the model to menu-independent attention parameters, as in Manzini and Mariotti [19]. Our theory explains anomalies (e.g. the attraction and compromise effects) that the Random Utility Model cannot accommodate.
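The choice rule this representation delivers can be sketched directly: an alternative is chosen from a menu exactly when it is noticed and every better alternative goes unnoticed. A small illustration with hypothetical utilities and menu-independent attention parameters (the Manzini-Mariotti special case referenced above):

```python
# Sketch of the stochastic-attention choice rule (all parameters hypothetical).

utility   = {"a": 3.0, "b": 2.0, "c": 1.0}   # menu-independent utility
attention = {"a": 0.5, "b": 0.9, "c": 0.9}   # menu-independent attention

def choice_probability(menu, chosen):
    """P(chosen | menu): `chosen` must be noticed, and every alternative with
    higher utility must go unnoticed; noticing is independent across items."""
    better = [x for x in menu if utility[x] > utility[chosen]]
    p_miss_better = 1.0
    for x in better:
        p_miss_better *= (1.0 - attention[x])
    return attention[chosen] * p_miss_better

menu = ["a", "b", "c"]
print({x: round(choice_probability(menu, x), 3) for x in menu})
# a: 0.5, b: 0.45, c: 0.045 (residual 0.005: nothing noticed, default kept)
```

The numbers show the inattention axiom at work: being endowed with the best alternative a guarantees it, whereas choosing from the menu yields a only half the time, and a strictly worse alternative like b is picked with substantial probability.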