852 results for optimal taxation


Relevance: 20.00%

Abstract:

We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule), allowing for Markov switching in policymaker preferences and shock volatilities. This reveals several changes in Euro area policy making: the anti-inflation stance strengthened in the early years of the ERM, was lost around the time of German reunification, and was only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is discretion, with no evidence of commitment in the Euro area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price-level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.
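
As a toy illustration of the discretion concept above (not the authors' estimated DSGE model), the following sketch solves the static discretionary policy problem, minimizing a weighted loss in inflation and the output gap subject to a Phillips curve, while the policymaker's inflation weight switches between a dovish and a hawkish regime via a Markov chain. All parameter values are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

kappa = 0.3               # Phillips-curve slope: pi = kappa*x + u (assumed)
weights = (0.5, 4.0)      # inflation weight per regime: dovish, hawkish (assumed)
P = np.array([[0.95, 0.05],   # regime transition probabilities (assumed)
              [0.05, 0.95]])

T, s = 20_000, 0
pi = np.empty(T)
regime = np.empty(T, dtype=int)
for t in range(T):
    u = rng.normal()                       # cost-push shock
    lam = weights[s]
    # Discretion: minimize lam*pi^2 + x^2 subject to pi = kappa*x + u,
    # which gives pi = u / (1 + lam*kappa^2).
    pi[t] = u / (1.0 + lam * kappa**2)
    regime[t] = s
    s = rng.choice(2, p=P[s])              # draw next regime

for r, lam in enumerate(weights):
    print(f"regime {r} (weight {lam}): sd(pi) = {pi[regime == r].std():.3f}, "
          f"analytic = {1.0 / (1.0 + lam * kappa**2):.3f}")
```

A more hawkish regime (higher inflation weight) mechanically lowers inflation volatility for the same shocks, which is the sense in which regime switches can masquerade as 'good luck'.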

Relevance: 20.00%

Abstract:

In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics but formalising the concept through various information structures, which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis of our previous paper, Katsoulacos and Ulph (2009), which involved a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard, and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.

Relevance: 20.00%

Abstract:

We determine the optimal combination of a universal benefit, B, and a categorical benefit, C, for an economy in which individuals differ both in their ability to work, modelled as an exogenous zero quantity constraint on labour supply, and, conditional on being able to work, in their productivity at work. C is targeted at those unable to work, and is conditioned in two dimensions: ex ante, an individual must be unable to work and be awarded the benefit, whilst ex post, a recipient must not subsequently work. However, the ex-ante conditionality may be imperfectly enforced due to Type I (false rejection) and Type II (false award) classification errors, whilst, in addition, the ex-post conditionality may be imperfectly enforced. If there are no classification errors, and thus no enforcement issues, it is always optimal to set C>0, whilst B=0 only if the benefit budget is sufficiently small. However, when classification errors occur, B=0 only if there are no Type I errors and the benefit budget is sufficiently small, while the conditions under which C>0 depend on the enforcement of the ex-post conditionality. We consider two discrete alternatives. Under No Enforcement, C>0 only if the test administering C has some discriminatory power. In addition, social welfare is decreasing in the propensity to make each type of error. Under Full Enforcement, by contrast, C>0 for all levels of discriminatory power. Furthermore, whilst social welfare is decreasing in the propensity to make Type I errors, there are conditions under which it is increasing in the propensity to make Type II errors. This implies that there may be conditions under which it would be welfare-enhancing to lower the chosen eligibility threshold, supporting the suggestion by Goodin (1985) to "err on the side of kindness".
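
A stylized numerical sketch of the B-versus-C trade-off, under assumed parameter values and log utility rather than the paper's formal model: a fixed budget is split between a universal benefit and a categorical benefit awarded by an error-prone test, with the ex-post work condition not enforced (Type II awardees keep C and work).

```python
import numpy as np

d, w, R = 0.10, 1.0, 0.20   # disabled share, wage, benefit budget (assumed)
e1, e2 = 0.10, 0.20         # Type I (false rejection), Type II (false award)
u = np.log                  # utilitarian welfare with log utility (assumed)

best = None
for C in np.linspace(0.0, 1.5, 1501):
    spend_C = C * ((1 - e1) * d + e2 * (1 - d))   # expected categorical outlay
    B = R - spend_C                               # remainder paid universally
    if B <= 1e-9:                                 # keep log(B) finite
        break
    W = (d * ((1 - e1) * u(B + C) + e1 * u(B))            # unable to work
         + (1 - d) * (e2 * u(w + B + C) + (1 - e2) * u(w + B)))  # able, working
    if best is None or W > best[0]:
        best = (W, B, C)

W, B, C = best
print(f"optimal split: B = {B:.3f}, C = {C:.3f} (welfare {W:.4f})")
```

With Type I errors present (e1 > 0), driving B to zero leaves falsely rejected disabled individuals with nothing, which is why the interior optimum keeps B > 0, in line with the abstract's result.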

Relevance: 20.00%

Abstract:

In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have an incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. However, we define a novel condition on meeting technologies, which we call “invariance,” and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
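
A minimal simulation of the efficiency claim, with uniform buyer valuations assumed for illustration (the paper's meeting technology and transfers are abstracted away): an auction whose reserve equals the seller's own valuation trades exactly when some buyer values the good more than the seller, whereas a higher, revenue-motivated reserve destroys surplus.

```python
import numpy as np

rng = np.random.default_rng(2)
seller_value = 0.4
trials, n_buyers = 100_000, 3
vals = rng.uniform(0.0, 1.0, size=(trials, n_buyers))  # buyer valuations
best = vals.max(axis=1)
first_best = np.maximum(best, seller_value)     # efficient allocation value

for reserve in (seller_value, 0.7):             # own valuation vs a higher reserve
    trade = best >= reserve
    realized = np.where(trade, best, seller_value)
    print(f"reserve = {reserve:.1f}: efficiency = "
          f"{realized.mean() / first_best.mean():.4f}")
```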

Relevance: 20.00%

Abstract:

This paper considers the optimal degree of discretion in monetary policy when the central bank conducts policy based on its private information about the state of the economy and is unable to commit. Society seeks to maximize social welfare by imposing restrictions on the central bank's actions over time, and the central bank takes these restrictions and the New Keynesian Phillips curve as constraints. By solving a dynamic mechanism design problem we find that it is optimal to grant "constrained discretion" to the central bank by imposing both upper and lower bounds on permissible inflation, and that these bounds must be set in a history-dependent way. The optimal degree of discretion varies over time with the severity of the time-inconsistency problem, and, although no discretion is optimal when the time-inconsistency problem is very severe, our numerical experiment suggests that no discretion is a transient phenomenon and that some discretion is granted eventually.

Relevance: 20.00%

Abstract:

Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables, using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction.

Keywords: industrial location, count data models, spatial statistics
JEL classification: C25, C52, R11, R30
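
The point about the geographical scope of covariates can be sketched on synthetic data (the paper's actual covariates, spatial weights and specifications differ): when the true location process responds to neighbouring values as well, a Poisson count model that measures the explanatory variable only at the site itself fits worse than one using a distance-weighted version.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
coords = rng.uniform(0.0, 10.0, size=(n, 2))   # site centroids (synthetic)
x = rng.normal(size=n)                         # a local determinant (synthetic)

# Inverse-distance weights within a 2-unit cutoff, row-normalized.
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
W = np.where((dist > 0) & (dist < 2.0), 1.0 / np.maximum(dist, 1e-12), 0.0)
W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
x_scope = 0.5 * x + 0.5 * (W @ x)              # own value plus neighbours

# Location counts generated so the spatially extended variable matters.
y = rng.poisson(np.exp(0.5 + 0.8 * x_scope))

for label, z in (("site-level covariate", x),
                 ("distance-weighted covariate", x_scope)):
    res = sm.GLM(y, sm.add_constant(z), family=sm.families.Poisson()).fit()
    print(f"{label}: coefficient = {res.params[1]:.3f}, AIC = {res.aic:.1f}")
```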

Relevance: 20.00%

Abstract:

We study the lysis timing of a bacteriophage population by means of a continuously infection-age-structured population dynamics model. The features of the model are the infection process of bacteria, the natural death process, and the lysis process, which consists of the replication of bacteriophage viruses inside bacteria and their destruction. We consider that the length of the lysis timing (or latent period) is distributed according to a general probability distribution function. We carry out an optimization procedure and find the latent period corresponding to the maximal fitness (i.e. maximal growth rate) of the bacteriophage population.
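
For a flavour of the optimization, here is a minimal fitness calculation with assumed functional forms (a fixed latent period rather than the paper's general distribution, a linear burst size after an eclipse period, and constant mortality of infected cells): if an infected cell survives to lysis with probability exp(-m*tau) and then releases b(tau) phages after delay tau, the exponential growth rate solves b(tau)*exp(-m*tau)*exp(-r*tau) = 1, i.e. r = (ln b(tau) - m*tau)/tau.

```python
import numpy as np
from scipy.optimize import minimize_scalar

k, e, m = 10.0, 0.5, 0.1   # maturation rate, eclipse period, mortality (assumed)

def growth_rate(tau):
    burst = k * (tau - e)                      # virions released at lysis
    return (np.log(burst) - m * tau) / tau     # delay- and mortality-discounted

res = minimize_scalar(lambda t: -growth_rate(t),
                      bounds=(e + 0.01, 50.0), method="bounded")
print(f"fitness-maximizing latent period: {res.x:.2f} "
      f"(growth rate {-res.fun:.3f})")
```

Lysing early wastes the linear accumulation of virions; lysing late discounts offspring by both mortality and generation time, so an interior optimum exists.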

Relevance: 20.00%

Abstract:

When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of power for the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and results compared to the approximate solution. An ablation problem is also analysed and results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem where the boundary temperature increases exponentially is then analysed; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed; in this case no analytical or numerical results are available against which to assess the accuracy.
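
A minimal sketch of the exponent-selection idea in the simplest setting (a semi-infinite solid at unit surface temperature, diffusivity one; a stripped-down illustration, not the paper's formulation): approximate the temperature by u = (1 - x/delta)^n, obtain delta(t) = sqrt(2n(n+1)t) from the heat balance integral, and then choose n by minimising the integrated squared residual of the heat equation, an error function requiring no exact solution.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize_scalar

def residual_norm(n, t=1.0, pts=4000):
    # Heat balance integral for u = (1 - x/delta)^n gives
    # delta(t) = sqrt(2*n*(n+1)*t) with unit surface temperature/diffusivity.
    delta = np.sqrt(2.0 * n * (n + 1) * t)
    x = np.linspace(0.0, 0.999 * delta, pts)   # stop short of delta
    s = 1.0 - x / delta
    ddot = n * (n + 1) / delta                     # d(delta)/dt
    u_t = n * s**(n - 1) * (x / delta**2) * ddot   # du/dt along the profile
    u_xx = n * (n - 1) * s**(n - 2) / delta**2     # d2u/dx2
    return trapezoid((u_t - u_xx) ** 2, x)         # error function E(n)

res = minimize_scalar(residual_norm, bounds=(1.6, 5.0), method="bounded")
print(f"exponent minimizing the residual: n = {res.x:.3f}")
```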

Relevance: 20.00%

Abstract:

The objective of this paper is to clarify the interactive nature of the leader-follower relationship when both players are endogenously risk-averse. The analysis is placed in the context of a dynamic closed-loop Stackelberg game with private information. The case of a risk-neutral leader, very often discussed in the literature, is only a borderline possibility in the present study. Each player in the game is characterized by a risk-averse type which is unknown to his opponent. The goal of the leader is to implement an optimal incentive compatible risk-sharing contract. The proposed approach provides a qualitative analysis of adaptive risk behavior profiles for asymmetrically informed players in the context of dynamic strategic interactions modelled as incentive Stackelberg games.

Relevance: 20.00%

Abstract:

We introduce a model of redistributive income taxation and public expenditure. This joint treatment permits analyzing the interdependencies between the two policies: one cannot be chosen independently of the other. Empirical evidence reveals that partisan confrontation essentially falls on expenditure policies rather than on income taxation. We examine the case in which the expenditure policy (or the size of government) is chosen by majority voting and income taxation is consistently adjusted. This adjustment consists of designing the income tax schedule that, given the expenditure policy, achieves consensus among the population. The model determines the consensus income tax schedule, the composition of public expenditure and the size of government. The main results are that inequality is negatively related to the size of government and to the pro-rich bias in public expenditure, and positively or negatively related to the marginal income tax, depending on substitutability between government supplied and market goods. These implications are validated using OECD data.

Relevance: 20.00%

Abstract:

The Treaty Establishing the European Community, in force until 1 December 2009, already established in its Article 2 that the mission of the then European Community, now the European Union, is to promote a harmonious, balanced and sustainable development of economic activities throughout the Community. This mission is to be achieved by establishing a Common Market and an Economic and Monetary Union and by implementing Common Policies. One of the instruments for attaining these objectives is the free movement of people, services and capital within the Common and Internal Market of the European Union. The European Union is characterized by the affirmation of the full freedom of movement of capital, services, and natural and legal persons; a freedom already proclaimed by the Maastricht Treaty, through the suppression of whatever obstacles stand in the way of the objectives set out above. Title III of the old TEC, now Title IV of the Treaty on the Functioning of the European Union, covered the free movement of people, services and capital. The inclusion of this mechanism in one of the founding texts of the European Union indicates the importance of this freedom for the development of the Union's objectives. Having underlined the relevance of the free movement of people, services and capital, this paper concentrates on one of these freedoms: the free movement of capital. In order to analyse the free movement of capital within the European framework in detail, we start from the existing case law of the Court of Justice of the European Union. Jurisprudence is essential to understanding how Community legislation is interpreted; for this reason, we develop this work through judgments handed down by the Court. In this way we can observe how Member States' regulating laws and European Community law affect the free movement of capital. The starting point of this paper is the Judgment of the European Court of Justice of 12 February 2009 in Case C-67/08, known as the Block case. Following the Luxembourg Court's reasoning in that case, we examine how the free movement of capital can be affected by the current disparity of Member States' legislation. This disparity can produce double taxation, owing to the lack of harmonized tax legislation within the internal market and the absence of treaties to avoid double taxation within the European Union. Developing this idea, we show how double taxation can, at least indirectly, infringe the free movement of capital.

Relevance: 20.00%

Abstract:

The paper discusses the use of new techniques to select processes for protein recovery, separation and purification. It describes a rational approach that uses fundamental databases of protein molecules to simplify the complex problem of choosing high-resolution separation methods for multicomponent mixtures, and examines the role of modern computer techniques in helping to solve these questions.

Relevance: 20.00%

Abstract:

Objectives: Imatinib has been increasingly proposed for therapeutic drug monitoring (TDM), as trough concentrations (Cmin) correlate with response rates in CML patients. This analysis aimed to evaluate the impact of imatinib exposure on optimal molecular response rates in a large European cohort of patients followed by centralized TDM.

Methods: Sequential PK/PD analysis was performed in NONMEM 7 on 2230 plasma (PK) samples obtained along with molecular response (PD) data from 1299 CML patients. Model-based individual Bayesian estimates of exposure, parameterized as initial-dose-adjusted, log-normalized Cmin (log-Cmin) or clearance (CL), were investigated as potential predictors of optimal molecular response, while accounting for time under treatment (stratified at 3 years), gender, CML phase, age, potentially interacting comedication, and TDM frequency. The PK/PD analysis used mixed-effect logistic regression (iterative two-stage method) to account for intra-patient correlation.

Results: In univariate analyses, CL, log-Cmin, time under treatment, TDM frequency, gender (all p<0.01) and CML phase (p=0.02) were significant predictors of the outcome. In multivariate analyses, all but log-Cmin remained significant (p<0.05). Our model estimates a 54.1% probability of optimal molecular response in a female patient with a median CL of 14.4 L/h, increasing by 4.7% with a 35% decrease in CL (10th percentile of the CL distribution) and decreasing by 6% with a 45% increase in CL (90th percentile). Male patients were less likely than females to be in optimal response (odds ratio: 0.62, p<0.001), with an estimated probability of 42.3%.

Conclusions: Beyond CML phase and time on treatment, which are expectedly correlated with the outcome, an effect of initial imatinib exposure on the probability of achieving optimal molecular response was confirmed under field conditions by this multivariate analysis. Interestingly, male patients had a higher risk of suboptimal response, which might not derive exclusively from their 18.5% higher CL, but also from their reported lower adherence to treatment. A prospective longitudinal study would be desirable to confirm the clinical importance of the identified covariates and to exclude biases possibly affecting this observational survey.
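
As a quick consistency check of the reported figures, converting the female response probability through the male odds ratio recovers the male probability quoted in the abstract (up to rounding of the odds ratio):

```python
# Consistency check using only the numbers reported in the abstract.
p_female = 0.541                        # reported optimal-response probability
odds_female = p_female / (1 - p_female)
odds_male = 0.62 * odds_female          # reported male/female odds ratio
p_male = odds_male / (1 + odds_male)
print(f"implied male probability: {p_male:.1%}")  # ~42.2%, vs 42.3% reported
```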

Relevance: 20.00%

Abstract:

We consider a market where firms hire workers to run their projects, and such projects differ in profitability. In any period, each firm needs two workers to successfully run its project: a junior agent, with no specific skills, and a senior worker, whose effort is not verifiable. Senior workers differ in ability, and their competence is revealed after they have worked as juniors in the market. We study the length of the contractual relationships between firms and workers in an environment where the matching between firms and workers is the result of market interaction. We show that, although long-term contracts are the optimal choice for firms in a one-firm-one-worker set-up, market forces often induce firms to use short-term contracts. Unless the market consists only of firms with very profitable projects, firms operating highly profitable projects offer short-term contracts to secure the services of high-ability workers, while those with less lucrative projects also use short-term contracts to save on junior workers' wages. Intermediate firms may (or may not) hire workers through long-term contracts.