101 results for University of St. Andrews.
Abstract:
We compare three methods for the elicitation of time preferences in an experimental setting: the Becker-DeGroot-Marschak procedure (BDM); the second price auction; and the multiple price list format. The first two methods have rarely been used to elicit time preferences. All methods used are perfectly equivalent from a decision theoretic point of view, and they should induce the same ‘truthful’ revelation in dominant strategies. In spite of this, we find that framing does matter: the money discount rates elicited with the multiple price list tend to be higher than those elicited with the other two methods. In addition, our results shed some light on attitudes towards time, and they permit a broad classification of subjects depending on how the size of the elicited values varies with the time horizon.
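For orientation, the standard mapping from an elicited indifference point to a money discount rate (a minimal sketch; the paper's exact specification is not reproduced here): if a subject is indifferent between amount $x$ now and amount $y$ at horizon $t$ years, the continuously compounded money discount rate solves

    $x = y e^{-rt} \quad\Rightarrow\quad r = \frac{1}{t}\ln(y/x)$

Since all three mechanisms are incentive compatible, a truthful subject should report the same indifference point under each, and hence the same $r$; the framing result above is a departure from this benchmark.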
Abstract:
In the mid-1940s, the American film industry was approaching its golden era as studios started mass-producing iconic feature films. Academics have actively suggested that the escalating popularity of Hollywood stars is directly linked to box office success. Using data collected in 2007, this paper carries out an empirical investigation of how different factors, including star power, affect the revenue of ‘home-run’ movies in Hollywood. Due to the subjective nature of star power, two different approaches were used: (1) the number of Academy Award nominations and wins of the key players, and (2) the average lifetime gross revenue of films involving the key players preceding the sample year. It is found that the number of Academy Award nominations and wins was not statistically significant in generating box office revenue, whereas star power based on the second approach was statistically significant. Other significant factors were critics’ reviews, screen coverage and top distributor, while the number of Academy Awards, MPAA rating, seasonality, being a sequel and popular genre were not statistically significant.
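A minimal sketch of how the second star-power measure could be constructed (illustrative only; the column names, data and deduplication choice are assumptions, not the paper's dataset):

    # Hypothetical credits table: one row per (film, key player).
    import pandas as pd

    credits = pd.DataFrame({
        "film_id": [1, 1, 2, 3],
        "player":  ["A", "B", "A", "B"],
        "year":    [2001, 2001, 2004, 2006],
        "gross":   [120.0, 120.0, 80.0, 150.0],   # lifetime gross, USD millions
    })

    def star_power(credits: pd.DataFrame, film_year: int, players: list) -> float:
        """Average lifetime gross of distinct pre-`film_year` films
        featuring any of the film's key players."""
        prior = credits[(credits["year"] < film_year)
                        & (credits["player"].isin(players))]
        prior = prior.drop_duplicates("film_id")
        return float(prior["gross"].mean()) if not prior.empty else 0.0

    # Star power of a hypothetical 2007 film starring A and B:
    print(star_power(credits, 2007, ["A", "B"]))   # (120 + 80 + 150) / 3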
Abstract:
In the context of the two-stage threshold model of decision making, with the agent’s choices determined by the interaction of three “structural variables,” we study the restrictions on behavior that arise when one or more of these variables are exogenously known. Our results supply necessary and sufficient conditions for consistency with the model for all possible states of partial knowledge, and for both single- and multivalued choice functions.
Abstract:
The paper uses a range of primary-source empirical evidence to address the question: ‘why is it so hard to value intangible assets?’ The setting is venture capital investment in high technology companies. While the investors are risk specialists and financial experts, the entrepreneurs are more knowledgeable about product innovation. Thus the context lends itself to analysis within a principal-agent framework, in which information asymmetry may give rise to adverse selection, pre-contract, and moral hazard, post-contract. We examine how the investor might attenuate such problems and attach a value to such high-tech investments in what are often merely intangible assets, through expert due diligence, monitoring and control. Qualitative evidence is used to qualify the clear-cut picture provided by a principal-agent approach into a more mixed picture in which the ‘art and science’ of investment appraisal are utilised by both parties alike.
Abstract:
What is the seigniorage-maximizing level of inflation? Formulae for the seigniorage-maximizing inflation rate (SMIR) from four models are compared. Two sticky-price models arrive at very different quantitative recommendations, although both predict somewhat lower SMIRs than Cagan’s formula and a variant of a flex-price model due to Kimbrough (2006). The models differ markedly in how inflation distorts the labour market: the Calvo model implies that inflation and output are negatively related and that output is falling in price stickiness, whilst the Rotemberg cost-of-price-adjustment model implies exactly the opposite. Interestingly, if our version of the Calvo model is to be believed, the level of inflation experienced recently in advanced economies such as the USA and the UK may be quite close to the SMIR.
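For comparison, Cagan’s benchmark can be stated in two lines (a standard textbook sketch; the sticky-price models’ formulae are not reproduced here): with semi-log money demand $m(\pi) = e^{-\alpha\pi}$, steady-state seigniorage is $S(\pi) = \pi\, e^{-\alpha\pi}$, and

    $S'(\pi) = (1 - \alpha\pi)\, e^{-\alpha\pi} = 0 \quad\Rightarrow\quad \pi^{*} = 1/\alpha$

so the SMIR is the reciprocal of the semi-elasticity of money demand with respect to inflation.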
Abstract:
This paper reports (a) new primary-source evidence on, and (b) statistical and econometric analysis of, high technology clusters in Scotland. It focuses on the following sectors: software, life sciences, microelectronics, optoelectronics, and digital media. Evidence from a postal and e-mailed questionnaire is presented and discussed under the headings of: performance, resources, collaboration & cooperation, embeddedness, and innovation. The sampled firms are characterised as being small (viz. micro-firms and SMEs), knowledge intensive (largely graduate staff), research intensive (mean R&D spend of GBP 842k), and internationalised (mainly selling to markets beyond Europe). Preliminary statistical evidence is presented on Gibrat’s Law (independence of growth and size) and the Schumpeterian Hypothesis (scale economies in R&D). Estimates suggest a short-run equilibrium size of just 100 employees, but a long-run equilibrium size of 1,000 employees. Further, to achieve the Schumpeterian effect (of marked scale economies in R&D), estimates suggest that firms have to grow to much larger sizes, beyond 3,000 employees. We argue that the principal way of achieving the latter scale may need to be by takeovers and mergers, rather than by internally driven growth.
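A minimal sketch of a Gibrat’s Law test on simulated data (illustrative; the paper’s estimator, specification and data are not reproduced): Gibrat’s Law holds if growth is independent of initial size, i.e. if the slope below is zero.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    log_size0 = rng.normal(3.0, 1.0, 200)        # log initial employment
    growth = 0.05 + rng.normal(0.0, 0.2, 200)    # log growth, size-independent by construction

    fit = sm.OLS(growth, sm.add_constant(log_size0)).fit()
    # Under Gibrat's Law we fail to reject a zero slope on log initial size.
    print(fit.params[1], fit.pvalues[1])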
Abstract:
We show that a flex-price two-sector open economy DSGE model can explain the poor degree of international risk sharing and the exchange rate disconnect. We use a suite of model evaluation measures and examine the role of (i) traded and non-traded sectors; (ii) financial market incompleteness; (iii) preference shocks; (iv) deviations from the UIP condition for the exchange rate; and (v) creditor status in net foreign assets. We find that there is a good case for both traded and non-traded productivity shocks, as well as UIP deviations, in explaining the puzzles.
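Two of these ingredients can be stated compactly in standard notation (a sketch; the paper’s exact specification may differ): under complete markets and CRRA utility, the Backus-Smith condition links the real exchange rate to relative consumption, $q_t = \sigma (c_t - c_t^{*})$ in logs, implying a strong positive correlation that the data reject (the risk-sharing puzzle); the UIP deviation enters as a wedge $\phi_t$ in $i_t - i_t^{*} = E_t \Delta s_{t+1} + \phi_t$, which gives the model room to generate the observed exchange rate disconnect.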
Abstract:
Employing the financial accelerator (FA) model of Bernanke, Gertler and Gilchrist (1999), enhanced to include a shock to the FA mechanism, we construct and study shocks to the efficiency of the financial sector in post-war US business cycles. We find that financial shocks are very tightly linked with the onset of recessions, more so than TFP or monetary shocks. The financial shock invariably remains contractionary for some time after recessions have ended. The shock accounts for a large part of the variance of GDP and is strongly negatively correlated with the external finance premium. Second-moment comparisons across variants of the model with and without a (stochastic) FA mechanism suggest that the stochastic FA model helps us understand the data.
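In BGG-type models the FA mechanism operates through the external finance premium (a sketch in standard notation; the exact placement of the paper’s shock may differ):

    $E_t R^{k}_{t+1} / R_t = s\left( N_t / (Q_t K_t) \right), \qquad s' < 0$

so the premium falls as entrepreneurial net worth $N_t$ rises relative to the value of the capital stock $Q_t K_t$. A shock to the efficiency of the financial sector can then be introduced as a disturbance shifting $s(\cdot)$.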
Abstract:
Should two-band income taxes be progressive given a general income distribution? We provide a negative answer under utilitarian and max-min welfare functions. While this result clarifies some ambiguities in the literature, it does not rule out progressive taxes in general. If we maximize the total or weighted utility of the poor, as often intended by society, progressive taxes can be justified, especially when the ‘rich’ are very rich. Under these objectives we obtain new necessary conditions for progressive taxes, which depend only on aggregate features of income distributions. The validity of these conditions is examined using plausible income distributions.
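The object of study can be written down directly (a sketch consistent with the abstract’s terminology): a two-band income tax with threshold $z$ and marginal rates $t_1, t_2$ is

    $T(y) = t_1 \min(y, z) + t_2 \max(y - z, 0)$

and it is progressive, in the sense of a rising average tax rate, exactly when $t_2 > t_1$. The negative answer above says that under utilitarian and max-min objectives the optimal pair need not satisfy $t_2 > t_1$.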
Abstract:
We develop tests of the proportional hazards assumption, with respect to a continuous covariate, in the presence of unobserved heterogeneity with unknown distribution at the individual observation level. The proposed tests are especially powerful against ordered alternatives useful for modeling non-proportional hazards situations. In contrast to the case when the heterogeneity distribution is known up to finite-dimensional parameters, the null hypothesis for the current problem is similar to a test for absence of covariate dependence. However, the two testing problems differ in the nature of the relevant alternative hypotheses. We develop tests for both problems against ordered alternatives. Small sample performance and an application to real data highlight the usefulness of the framework and methodology.
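The setting can be written in standard notation (a sketch; details of the paper’s model may differ): the mixed proportional hazards model is

    $\lambda(t \mid x, u) = u\, \lambda_0(t)\, e^{\beta x}$

with baseline hazard $\lambda_0$, continuous covariate $x$, and unobserved heterogeneity $u$ of unknown distribution. Proportionality means the ratio of hazards at two covariate values does not vary with $t$; with $u$ unrestricted, the natural null brings the problem close to testing $\beta = 0$, i.e. absence of covariate dependence, as noted above.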
Abstract:
This paper develops a general theoretical framework within which a heterogeneous group of taxpayers confront a market that supplies a variety of schemes for reducing tax liability, and uses this framework to explore the impact of a wide range of anti-avoidance policies. Schemes differ in their legal effectiveness and hence in the risks to which they expose taxpayers - risks which go beyond the risk of audit considered in the conventional literature on evasion. Given the individual taxpayer’s circumstances, the prices charged for the schemes and the policy environment, the model predicts (i) whether or not any given taxpayer will acquire a scheme, and (ii) if they do so, which type of scheme they will acquire. The paper then analyses how these decisions, and hence the tax gap, are influenced by four generic types of policy: Disclosure – earlier information leading to faster closure of loopholes; Penalties – introduction of penalties for failed avoidance; Policy Design – fundamental policy changes that design out opportunities for avoidance; Product Register – the introduction of GAARs or mini-GAARs that give greater clarity about how different types of scheme will be treated. The paper shows that when considering the indirect/behavioural effects of policies on the tax gap it is important to recognise that these operate on two different margins. First, policies have deterrence effects: their impact on the number of taxpayers choosing to acquire different types of scheme as distinct from acquiring no scheme at all. There will be a range of such deterrence effects reflecting the range of schemes available in the market. Secondly, since different schemes generate different tax gaps, policies will also have switching effects as they induce taxpayers who previously acquired one type of scheme to acquire another. The first three types of policy generate positive deterrence effects but differ in the switching effects they produce. The fourth type of policy produces mixed deterrence effects.
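A stylized sketch of the two margins (illustrative only; the paper’s model is far richer, and all names and parameters below are hypothetical): a taxpayer compares the expected cost of each scheme, its price plus the chance of legal failure times the tax and penalty then due, against simply paying the tax. Deterrence operates on whether any scheme beats ‘no scheme’; switching operates on which scheme wins.

    def choose_scheme(tax_due, schemes):
        """schemes: list of (name, price, p_fail, penalty_rate) - all hypothetical."""
        best_name, best_cost = "no scheme", tax_due
        for name, price, p_fail, penalty_rate in schemes:
            # If the scheme fails legally, the tax falls due plus a penalty.
            expected = price + p_fail * tax_due * (1.0 + penalty_rate)
            if expected < best_cost:
                best_name, best_cost = name, expected
        return best_name, best_cost

    # Raising the penalty rate can deter (push taxpayers to 'no scheme')
    # or switch them towards safer, more expensive schemes:
    print(choose_scheme(100.0, [("aggressive", 10.0, 0.6, 0.5),
                                ("robust", 25.0, 0.2, 0.5)]))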
Abstract:
This paper examines the rise in European unemployment since the 1970s by introducing endogenous growth into an otherwise standard New Keynesian model with capital accumulation and unemployment. We subject the model to an uncorrelated cost push shock, in order to mimic a scenario akin to the one faced by central banks at the end of the 1970s. Monetary policy implements a disinflation by following an interest feedback rule calibrated to an estimate of a Bundesbank reaction function. Forty quarters after the shock has vanished, unemployment is still about 1.8 percentage points above its steady state. Our model also broadly reproduces cross-country differences in unemployment by drawing on cross-country differences in the size of the cost push shock and the associated disinflation, the monetary policy reaction function and the wage setting structure.
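The policy block referred to has the familiar form (a sketch; the estimated Bundesbank coefficients are not reproduced here):

    $i_t = \rho\, i_{t-1} + (1 - \rho)\left( \bar{\imath} + \phi_\pi \pi_t + \phi_y y_t \right)$

so the disinflation is generated by the rule’s systematic response to inflation, $\phi_\pi$, rather than by an announced inflation path.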
Abstract:
Until recently, much effort has been devoted to the estimation of panel data regression models without adequate attention being paid to the drivers of diffusion and interaction across cross-section and spatial units. We discuss some new methodologies in this emerging area and demonstrate their use in the measurement of, and inference on, cross-section and spatial interactions. Specifically, we highlight the important distinction between spatial dependence driven by unobserved common factors and that based on a spatial weights matrix. We argue that purely factor-driven models of spatial dependence may be somewhat inadequate because of their connection with the exchangeability assumption. Limitations and potential enhancements of the existing methods are discussed, and several directions for new research are highlighted.
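The distinction can be made concrete in standard notation (a sketch; the surveyed estimators are not reproduced): spatial-weights dependence takes the form

    $y_{it} = \rho \sum_j w_{ij}\, y_{jt} + x_{it}'\beta + \varepsilon_{it}$

with a known $n \times n$ weights matrix $W = (w_{ij})$ encoding geographic or economic distance, whereas factor-driven dependence takes the form

    $y_{it} = x_{it}'\beta + \gamma_i' f_t + e_{it}$

with unobserved common factors $f_t$ and unit-specific loadings $\gamma_i$. Loosely, the factor structure treats units symmetrically once loadings are accounted for, which is the exchangeability connection mentioned above; $W$-based models break that symmetry by design.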
Abstract:
This paper examines the optimal design of climate change policies in a context where governments want to encourage the private sector to undertake significant immediate investment in developing cleaner technologies, but the carbon taxes and other environmental policies that could in principle stimulate such investment will be imposed over a very long future. The conventional claim by environmental economists is that environmental policies alone are sufficient to induce firms to undertake optimal investment. However, this argument requires governments to be able to commit to these future taxes, and it is far from clear that governments have this degree of commitment. We assume instead that governments cannot commit, and so both they and the private sector have to contemplate the possibility of there being governments in power in the future that give different (relative) weights to the environment. We show that this lack of commitment has a significant asymmetric effect. Compared to the situation where governments can commit, it increases the incentive of the current government to have the investment undertaken, but reduces the incentive of the private sector to invest. Consequently governments may need to use additional policy instruments – such as R&D subsidies – to stimulate the required investment.
Abstract:
We introduce duration-dependent skill decay among the unemployed into a New Keynesian model with hiring frictions developed by Blanchard/Gali (2008). If the central bank responds only to (current, lagged or expected future) inflation and quarterly skill decay is above a threshold level, determinacy requires a coefficient on inflation smaller than one. The threshold level is plausible with little steady-state hiring and firing (“Continental European calibration”) but implausibly high in the opposite case (“American calibration”). Neither interest rate smoothing nor responding to the output gap helps to restore determinacy if skill decay exceeds the threshold level. However, a modest response to unemployment guarantees determinacy. Moreover, under indeterminacy, both an adverse sunspot shock and an adverse technology shock increase unemployment extremely persistently.
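Schematically (a sketch based only on the statements above, not the paper’s derivation): with an inflation-only rule $i_t = \phi_\pi \pi_t$, the usual Taylor principle requires $\phi_\pi > 1$ for determinacy; the result here says that once quarterly skill decay exceeds the threshold, determinacy instead requires $\phi_\pi < 1$. Augmenting the rule with a modest unemployment term, e.g. $i_t = \phi_\pi \pi_t + \phi_u (u_t - \bar{u})$ with $\phi_u$ small in magnitude and of stabilizing sign, restores determinacy.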