999 results for "Optimal experience"


Relevance: 20.00%

Abstract:

This paper develops and estimates a model of demand for environmental public goods in which consumers learn about their preferences through consumption experiences. We develop a theoretical model of Bayesian updating, perform comparative statics on the model, and show how it can be consistently incorporated into a reduced-form econometric model. We then estimate the model using data collected for two environmental goods. In both cases we find support for the theoretical prediction that additional experience makes consumers more certain about their preferences, in both mean and variance.
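The learning mechanism described can be sketched with a standard normal-normal Bayesian update, in which each consumption experience delivers a noisy signal of the consumer's true valuation. All distributions, function names and parameter values below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

# Minimal normal-normal Bayesian updating sketch: a consumer learns her
# true valuation theta of an environmental good from noisy signals.

def update(mu, tau2, x, sigma2):
    """One conjugate update; returns the posterior mean and variance."""
    post_var = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
    post_mean = post_var * (mu / tau2 + x / sigma2)
    return post_mean, post_var

rng = np.random.default_rng(0)
theta, sigma2 = 5.0, 4.0        # true valuation and signal noise (assumed)
mu, tau2 = 0.0, 10.0            # diffuse prior belief
for _ in range(20):             # twenty consumption experiences
    x = rng.normal(theta, np.sqrt(sigma2))
    mu, tau2 = update(mu, tau2, x, sigma2)
# tau2 shrinks deterministically with every signal: more experience
# means a tighter (more certain) belief, as in the comparative statics.
```

Note that the posterior variance falls with each signal regardless of the signal's realised value, which is the sense in which additional experience necessarily makes the consumer more certain.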

Relevance: 20.00%

Abstract:

This paper examines whether efficiency considerations require optimal labour income taxation to be progressive or regressive in a model with skill heterogeneity, endogenous skill acquisition and a production sector with capital-skill complementarity. We find that wage inequality driven by the resource requirements of skill creation implies progressive labour income taxation in the steady state as well as along the transition path from the exogenous-policy to the optimal-policy steady state. These results are explained by a lower labour supply elasticity for skilled than for unskilled labour, which results from the introduction of the skill acquisition technology.

Relevance: 20.00%

Abstract:

We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule) after allowing for Markov switching in policy maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro-area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro-area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro-area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.

Relevance: 20.00%

Abstract:

In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. That analysis considered just a single information structure under an Effects-Based standard, and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.

Relevance: 20.00%

Abstract:

We determine the optimal combination of a universal benefit, B, and a categorical benefit, C, for an economy in which individuals differ in both their ability to work - modelled as an exogenous zero quantity constraint on labour supply - and, conditional on being able to work, their productivity at work. C is targeted at those unable to work, and is conditioned in two dimensions: ex-ante an individual must be unable to work and be awarded the benefit, whilst ex-post a recipient must not subsequently work. However, the ex-ante conditionality may be imperfectly enforced due to Type I (false rejection) and Type II (false award) classification errors, whilst, in addition, the ex-post conditionality may be imperfectly enforced. If there are no classification errors - and thus no enforcement issues - it is always optimal to set C>0, whilst B=0 only if the benefit budget is sufficiently small. However, when classification errors occur, B=0 only if there are no Type I errors and the benefit budget is sufficiently small, while the conditions under which C>0 depend on the enforcement of the ex-post conditionality. We consider two discrete alternatives. Under No Enforcement, C>0 only if the test administering C has some discriminatory power. In addition, social welfare is decreasing in the propensity to make each type of error. However, under Full Enforcement, C>0 for all levels of discriminatory power. Furthermore, whilst social welfare is decreasing in the propensity to make Type I errors, there are certain conditions under which it is increasing in the propensity to make Type II errors. This implies that there may be conditions under which it would be welfare enhancing to lower the chosen eligibility threshold, supporting the suggestion by Goodin (1985) to "err on the side of kindness".

Relevance: 20.00%

Abstract:

In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have an incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. We then define a novel condition on meeting technologies, which we call “invariance”, and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
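Mechanically, the posted mechanism described is a second-price auction with reserve equal to the seller's own valuation plus a lump-sum transfer from every buyer met. A hypothetical sketch (the function name and all numbers are illustrative, not from the paper):

```python
# Second-price auction with reserve v0 (the seller's valuation) and a
# lump-sum transfer (fee) paid by every buyer the seller meets.

def run_auction(bids, v0, fee):
    """Return (seller revenue, winning buyer's index or None)."""
    revenue = fee * len(bids)                 # transfer from each buyer met
    eligible = sorted(
        ((b, i) for i, b in enumerate(bids) if b >= v0), reverse=True
    )
    if not eligible:
        return revenue, None                  # good stays with the seller
    winner = eligible[0][1]
    # Winner pays the second-highest eligible bid, or the reserve v0.
    price = eligible[1][0] if len(eligible) > 1 else v0
    return revenue + price, winner
```

For example, with bids (0.9, 0.5, 0.3), reserve 0.4 and fee 0.1, two bids clear the reserve: the highest bidder wins at a price of 0.5 and total seller revenue is 0.8.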

Relevance: 20.00%

Abstract:

Second Chance School programmes are active in a number of European countries. These schools offer vulnerable young adults an alternative opportunity to enhance their employability skills by alternating education with work experience. People enrolling in these programmes disengaged from school at an early age and have already experienced unemployment or are at risk of entering it. This paper examines the impact of Second Chance Schools on their participants’ aspirations towards the labour market through skill acquisition. We identify the perceptions of Second Chance School interns regarding entry into professional life. A third of them, for example, consider their attitude or their surroundings a barrier preventing them from getting a job. However, our results emphasise the role of the interns’ coach in improving their aspirations towards the labour market. We also show that, compared with male interns, female interns have a stronger (positive) perception of the school as a place where they can gain skills.

Relevance: 20.00%

Abstract:

INTRODUCTION: When a child is seen in a clinic with a headache, stroke is certainly not the first item on the list of differential diagnoses. In western countries, stroke is typically associated with adults and the elderly. Although rare, haemorrhagic strokes are not exceptional in the paediatric population, as their incidence is around 1/100 000/year. Prompt diagnosis is essential, since delayed treatment may lead to a disastrous prognosis in these children. MATERIALS AND METHODS: This is a retrospective review of paediatric cases of spontaneous cerebral haemorrhage that presented to two university hospitals over the last ten years. The experience of these primary and tertiary referral centres comprises 22 consecutive cases, analysed according to aetiology, presenting symptoms, treatment and outcome. RESULTS: 77% of the children diagnosed with haemorrhagic stroke presented with headaches. 41% of them had a sudden onset, while 9% developed headaches over a period of hours to weeks. While 9% presented only with headaches, the majority had either subtle (diplopia, balance problems) or obvious (focal deficits, unilateral weakness and decreased level of consciousness) concomitant neurological signs. 55% had an arteriovenous malformation (AVM), 18% had an aneurysm and 14% had a cavernous malformation. In 14% the aetiology could not be determined. The majority of haemorrhages (82%) were supratentorial, while 18% bled into the posterior fossa. All children underwent an emergency cerebral CT scan followed by specific investigations. The treatment depended on the aetiology as well as the mass effect of the haematoma. In 23% an emergency evacuation of the haematoma was performed. Two children (9%) died, and 75% had a favourable clinical outcome. CONCLUSION: Headaches in children are a common problem, and a small minority may reveal an intracranial haemorrhage with a poor prognosis if not treated promptly. Although characterisation of headaches is more difficult in a paediatric population, sudden, unusual or intense headaches should lead to an imaging work-up. Any neurological finding, even one as subtle as hemianopsia or dysmetria, should alert the physician and should be followed by emergency imaging investigation. If the cerebral CT reveals a haemorrhage, the child should be referred immediately to a neurosurgical referral centre without further investigation. The outcome is grim for children presenting in coma with fixed, dilated pupils. However, the overall long-term outcome for children after spontaneous intracranial haemorrhage is not dismal and depends critically on specialised management.

Relevance: 20.00%

Abstract:

Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than that obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
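Of the three methods compared, value function iteration is the simplest to sketch. The toy below applies it to a textbook deterministic growth model with log utility and full depreciation (which has a known closed-form policy), not to the paper's fiscal-policy model; all parameter values and the grid are illustrative assumptions:

```python
import numpy as np

alpha, beta = 0.3, 0.95                    # technology and discounting (assumed)
kgrid = np.linspace(0.05, 0.5, 200)        # capital grid
y = kgrid[:, None] ** alpha                # output for each current k
c = y - kgrid[None, :]                     # consumption for each (k, k') pair
util = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros(len(kgrid))
for _ in range(2000):                      # Bellman iteration to a fixed point
    V_new = np.max(util + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

# Recover the policy k'(k); with log utility and full depreciation the
# exact policy is k' = alpha * beta * k**alpha, a useful accuracy check.
policy = kgrid[np.argmax(util + beta * V[None, :], axis=1)]
```

Because the Bellman operator is a contraction with modulus beta, the loop converges from any initial guess; the grid restriction is what limits the accuracy of the recovered policy.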

Relevance: 20.00%

Abstract:

This paper considers the optimal degree of discretion in monetary policy when the central bank conducts policy based on its private information about the state of the economy and is unable to commit. Society seeks to maximize social welfare by imposing restrictions on the central bank's actions over time, and the central bank takes these restrictions and the New Keynesian Phillips curve as constraints. By solving a dynamic mechanism design problem we find that it is optimal to grant "constrained discretion" to the central bank by imposing both upper and lower bounds on permissible inflation, and that these bounds must be set in a history-dependent way. The optimal degree of discretion varies over time with the severity of the time-inconsistency problem, and, although no discretion is optimal when the time-inconsistency problem is very severe, our numerical experiment suggests that no discretion is a transient phenomenon, and that some discretion is granted eventually.

Relevance: 20.00%

Abstract:

Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction.

Keywords: industrial location, count data models, spatial statistics
JEL classification: C25, C52, R11, R30
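The geographical-scope calculation can be illustrated by widening the radius within which neighbouring sites' values are added to each site's covariate. The coordinates, covariate values, and radii below are synthetic assumptions, and no count-data model is estimated:

```python
import numpy as np

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(50, 2))   # synthetic site locations
z = rng.gamma(2.0, 1.0, size=50)             # raw covariate value at each site

# Pairwise distances between all sites.
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)

def spatial_scope(radius):
    """Covariate summed over all sites within `radius` (self included)."""
    return (dists <= radius) @ z

# Widening the radius extends the covariate's geographical scope;
# radius 0 reproduces the purely administrative (own-site) measure.
scoped = {r: spatial_scope(r) for r in (0.0, 10.0, 25.0)}
```

Each scoped covariate could then enter a Poisson or negative binomial location-choice model, with the radius chosen by comparing model fit across candidate distances.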

Relevance: 20.00%

Abstract:

We study the lysis timing of a bacteriophage population by means of a continuously infection-age-structured population dynamics model. The features of the model are the infection process of bacteria, the natural death process, and the lysis process, which consists of the replication of bacteriophage viruses inside bacteria and their destruction. We consider that the length of the lysis timing (or latent period) is distributed according to a general probability distribution function. We carry out an optimisation procedure and find the latent period corresponding to the maximal fitness (i.e. maximal growth rate) of the bacteriophage population.
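A toy version of the optimisation described: pick the latent period that maximises the phage growth rate. The linear burst-size law, the death rate, and the explicit growth-rate formula below are illustrative assumptions, not the paper's general distributed-delay model:

```python
import numpy as np

E, m, d = 0.5, 40.0, 0.1        # eclipse period, burst-size slope, death rate (assumed)

def growth_rate(L):
    """Approximate Malthusian fitness for a latent period L > E."""
    burst = m * (L - E)          # burst size grows the longer lysis is delayed
    return (np.log(burst) - d * L) / L

L = np.linspace(E + 0.01, 20.0, 5000)
r = growth_rate(L)
L_opt = float(L[np.argmax(r)])   # latent period with maximal growth rate
```

The trade-off is visible in the formula: delaying lysis raises the burst size but also lengthens the generation time and exposes infected cells to death, so fitness is maximised at an interior latent period.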

Relevance: 20.00%

Abstract:

When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method (HBIM) is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and results compared to the approximate solution. An ablation problem is also analysed and results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem in which the boundary temperature increases exponentially is then analysed; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed; in this case no analytical or numerical results are available against which to assess the accuracy.
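The exponent-selection idea can be sketched on the standard thermal problem u_t = u_xx with u(0, t) = 1, u(x, 0) = 0, where the HBIM profile is u = (1 - x/δ)^n with δ(t) = √(2n(n+1)t) from the heat balance integral. The paper's method minimises an error function requiring no exact solution; purely as an illustration, the sketch below instead selects n by minimising the squared distance to the known exact solution erfc(x/(2√t)) over a grid of exponents:

```python
import math
import numpy as np

t = 1.0
x = np.linspace(0.0, 6.0, 601)
exact = np.array([math.erfc(xi / (2.0 * math.sqrt(t))) for xi in x])

def hbim_profile(n):
    """HBIM profile (1 - x/delta)**n, zero beyond the penetration depth."""
    delta = math.sqrt(2.0 * n * (n + 1.0) * t)
    return (1.0 - np.minimum(x / delta, 1.0)) ** n

ns = np.linspace(1.0, 5.0, 401)
errs = [float(np.sum((hbim_profile(n) - exact) ** 2)) for n in ns]
n_opt = float(ns[int(np.argmin(errs))])   # best-fitting exponent on this grid
```

The best exponent is interior rather than an integer, which is the point of letting the solution process determine it instead of fixing n = 2 in advance.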