870 results for Optimal location
Abstract:
This paper examines whether efficiency considerations require optimal labour income taxation to be progressive or regressive in a model with skill heterogeneity, endogenous skill acquisition and a production sector with capital-skill complementarity. We find that wage inequality driven by the resource requirements of skill creation implies progressive labour income taxation both in the steady state and along the transition path from the exogenous-policy to the optimal-policy steady state. These results are explained by a lower labour supply elasticity for skilled relative to unskilled labour, which follows from the introduction of the skill acquisition technology.
Abstract:
We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule) after allowing for Markov switching in policy maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro-area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.
Abstract:
In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.
Abstract:
We determine the optimal combination of a universal benefit, B, and a categorical benefit, C, for an economy in which individuals differ in both their ability to work - modelled as an exogenous zero quantity constraint on labour supply - and, conditional on being able to work, their productivity at work. C is targeted at those unable to work, and is conditioned in two dimensions: ex-ante, an individual must be unable to work and be awarded the benefit, whilst ex-post, a recipient must not subsequently work. However, the ex-ante conditionality may be imperfectly enforced due to Type I (false rejection) and Type II (false award) classification errors, whilst, in addition, the ex-post conditionality may be imperfectly enforced. If there are no classification errors - and thus no enforcement issues - it is always optimal to set C>0, whilst B=0 only if the benefit budget is sufficiently small. However, when classification errors occur, B=0 only if there are no Type I errors and the benefit budget is sufficiently small, while the conditions under which C>0 depend on the enforcement of the ex-post conditionality. We consider two discrete alternatives. Under No Enforcement, C>0 only if the test administering C has some discriminatory power. In addition, social welfare is decreasing in the propensity to make each type of error. However, under Full Enforcement, C>0 for all levels of discriminatory power. Furthermore, whilst social welfare is decreasing in the propensity to make Type I errors, there are certain conditions under which it is increasing in the propensity to make Type II errors. This implies that there may be conditions under which it would be welfare enhancing to lower the chosen eligibility threshold - supporting the suggestion by Goodin (1985) to "err on the side of kindness".
Abstract:
In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have an incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. However, we define a novel condition on meeting technologies, which we call “invariance,” and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
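A minimal simulation sketch of the posted mechanism described above: a second-price auction whose reserve price equals the seller's own valuation, together with a lump-sum transfer paid by (or to) every buyer the seller meets. The uniform valuations, the fixed number of buyers and the transfer levels below are illustrative assumptions rather than the paper's environment; the sketch only illustrates that the meeting transfer splits surplus without changing the (efficient) allocation.

import numpy as np

rng = np.random.default_rng(0)

def posted_auction(seller_value, meeting_transfer, n_buyers, n_draws=100_000):
    """Second-price auction with reserve = seller's valuation plus a meeting fee.

    Buyer valuations are i.i.d. U[0, 1] -- an illustrative assumption.  Each
    buyer who meets the seller pays the lump-sum `meeting_transfer` (which may
    be negative); the object sells to the highest bidder at
    max(second-highest bid, reserve) whenever the top bid exceeds the reserve.
    Requires n_buyers >= 2.
    """
    values = rng.uniform(0.0, 1.0, size=(n_draws, n_buyers))
    top = values.max(axis=1)
    second = np.sort(values, axis=1)[:, -2]
    sold = top > seller_value
    price = np.where(sold, np.maximum(second, seller_value), 0.0)

    seller_payoff = np.where(sold, price, seller_value) + n_buyers * meeting_transfer
    buyer_surplus = np.where(sold, top - price, 0.0) - n_buyers * meeting_transfer
    total_surplus = np.where(sold, top, seller_value)   # transfers net out

    return seller_payoff.mean(), buyer_surplus.mean(), total_surplus.mean()

# Varying the transfer reallocates surplus between the two sides but leaves
# total surplus (and hence efficiency) unchanged.
for t in (0.0, 0.05):
    s, b, tot = posted_auction(seller_value=0.3, meeting_transfer=t, n_buyers=2)
    print(f"transfer={t:.2f}  seller={s:.3f}  buyers={b:.3f}  total={tot:.3f}")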
Abstract:
Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than that obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
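As a rough illustration of the value-function-iteration machinery mentioned above, the sketch below applies it to a textbook stochastic growth model rather than to the paper's Markov-perfect fiscal-policy problem; all parameter values, grids and the two-state productivity process are assumptions.

import numpy as np

# Value function iteration on a textbook stochastic growth model -- an
# illustrative stand-in, not the paper's fiscal-policy problem.
beta, alpha, delta = 0.96, 0.36, 0.08
z_grid = np.array([0.97, 1.03])                  # productivity states
P = np.array([[0.9, 0.1], [0.1, 0.9]])           # Markov transition matrix
k_grid = np.linspace(0.5, 8.0, 200)              # capital grid

def utility(c):
    """Log utility with a large penalty for infeasible (non-positive) consumption."""
    with np.errstate(invalid="ignore", divide="ignore"):
        u = np.log(c)
    return np.where(c > 0, u, -1e10)

# U[i, j, s]: utility when capital is k_i today, k_j tomorrow, productivity state s
resources = z_grid[None, None, :] * k_grid[:, None, None] ** alpha \
            + (1.0 - delta) * k_grid[:, None, None]
U = utility(resources - k_grid[None, :, None])

V = np.zeros((k_grid.size, z_grid.size))
for _ in range(2000):
    EV = V @ P.T                                 # EV[j, s] = E[V(k_j, z') | z_s]
    V_new = (U + beta * EV[None, :, :]).max(axis=1)
    diff = np.abs(V_new - V).max()
    V = V_new
    if diff < 1e-8:
        break

policy = (U + beta * (V @ P.T)[None, :, :]).argmax(axis=1)
print("optimal next-period capital at mid-grid, low state:",
      k_grid[policy[k_grid.size // 2, 0]])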
Abstract:
This paper considers the optimal degree of discretion in monetary policy when the central bank conducts policy based on its private information about the state of the economy and is unable to commit. Society seeks to maximize social welfare by imposing restrictions on the central bank's actions over time, and the central bank takes these restrictions and the New Keynesian Phillips curve as constraints. By solving a dynamic mechanism design problem we find that it is optimal to grant "constrained discretion" to the central bank by imposing both upper and lower bounds on permissible inflation, and that these bounds must be set in a history-dependent way. The optimal degree of discretion varies over time with the severity of the time-inconsistency problem, and, although no discretion is optimal when the time-inconsistency problem is very severe, our numerical experiment suggests that no-discretion is a transient phenomenon, and that some discretion is granted eventually.
Abstract:
We study the lysis timing of a bacteriophage population by means of a continuously infection-age-structured population dynamics model. The features of the model are the infection process of bacteria, the natural death process, and the lysis process, which comprises the replication of bacteriophage viruses inside bacteria and the destruction of the host cells. We consider that the length of the lysis timing (or latent period) is distributed according to a general probability distribution function. We carry out an optimization procedure and find the latent period corresponding to the maximal fitness (i.e. maximal growth rate) of the bacteriophage population.
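A sketch of the optimization step under a deliberately simplified, hypothetical fitness function: the burst size grows linearly with the latent period beyond an eclipse phase, infected cells die at a constant rate during the latent period, and the growth-rate proxy is the log of the surviving burst size per unit generation time. Neither the functional forms nor the parameter values come from the paper; they only illustrate maximizing a fitness measure over the latent period.

import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical trade-off: waiting longer yields a larger burst, b(L) = R*(L - E)
# for L > E, but the infected cell must survive the latent period (death rate d)
# and each generation takes L time units.  All values are assumptions made only
# to illustrate the optimization, not the paper's model.
R, E, d = 10.0, 0.5, 0.2      # maturation rate, eclipse period, death rate

def growth_rate(L):
    """Growth-rate proxy: log of the surviving burst size per unit generation time."""
    burst = R * (L - E)
    if burst <= 1.0:
        return -1e6               # no net growth possible
    return (np.log(burst) - d * L) / L

# Maximise the proxy over the latent period (minimise its negative).
res = minimize_scalar(lambda L: -growth_rate(L), bounds=(E + 0.2, 20.0), method="bounded")
print(f"latent period maximising the proxy: {res.x:.2f}  (growth rate {growth_rate(res.x):.3f})")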
Abstract:
When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and the results are compared to the approximate solution. An ablation problem is also analysed and the results are compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem where the boundary temperature increases exponentially is then analysed. This highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed, in this case without analytical or numerical results against which to assess the accuracy.
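A small numerical illustration of the exponent-selection idea for the standard thermal problem (semi-infinite solid with a unit step in boundary temperature). The paper's method chooses the exponent by minimising an error function that requires no exact solution; purely for illustration, the sketch below instead picks the exponent minimising the mean-squared mismatch between the HBIM profile (1 - x/delta)^n, with delta = sqrt(2 n (n+1) alpha t), and the known similarity solution erfc(x / (2 sqrt(alpha t))). Grid ranges and parameter values are assumptions.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import erfc

# Standard thermal problem: semi-infinite solid, T(x, 0) = 0, T(0, t) = 1.
alpha, t = 1.0, 1.0
x = np.linspace(0.0, 6.0, 2000)
T_exact = erfc(x / (2.0 * np.sqrt(alpha * t)))

def hbim_profile(n):
    """HBIM approximation (1 - x/delta)^n with the heat-balance penetration depth."""
    delta = np.sqrt(2.0 * n * (n + 1.0) * alpha * t)
    base = np.clip(1.0 - x / delta, 0.0, None)   # zero beyond the penetration depth
    return base ** n

def mismatch(n):
    return np.mean((hbim_profile(n) - T_exact) ** 2)

res = minimize_scalar(mismatch, bounds=(1.0, 5.0), method="bounded")
print(f"best-fit exponent n ~ {res.x:.3f}, mean-squared error ~ {mismatch(res.x):.2e}")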
Dynamic Stackelberg game with risk-averse players: optimal risk-sharing under asymmetric information
Abstract:
The objective of this paper is to clarify the interactive nature of the leader-follower relationship when both players are endogenously risk-averse. The analysis is placed in the context of a dynamic closed-loop Stackelberg game with private information. The case of a risk-neutral leader, very often discussed in the literature, is only a borderline possibility in the present study. Each player in the game is characterized by a risk-averse type which is unknown to his opponent. The goal of the leader is to implement an optimal incentive compatible risk-sharing contract. The proposed approach provides a qualitative analysis of adaptive risk behavior profiles for asymmetrically informed players in the context of dynamic strategic interactions modelled as incentive Stackelberg games.
Abstract:
The paper discusses the utilization of new techniques to select processes for protein recovery, separation and purification. It describes a rational approach that uses fundamental databases of protein molecules to simplify the complex problem of choosing high-resolution separation methods for multi-component mixtures. It examines the role of modern computer techniques in helping to solve these questions.
Abstract:
Gastroschisis is a common congenital abdominal wall defect. It is almost always diagnosed prenatally thanks to routine maternal serum screening and ultrasound screening programs. In the majority of cases, the condition is isolated (i.e. not associated with chromosomal or other anatomical anomalies). Prenatal diagnosis allows for planning the timing, mode and location of delivery. Controversies persist concerning the optimal antenatal monitoring strategy. Compelling evidence supports elective delivery at 37 weeks' gestation in a tertiary pediatric center. Cesarean section should be reserved for routine obstetrical indications. Prognosis of infants with gastroschisis is primarily determined by the degree of bowel injury, which is difficult to assess antenatally. Prenatal counseling usually addresses gastroschisis issues. However, parental concerns are mainly focused on long-term postnatal outcomes including gastrointestinal function and neurodevelopment. Although infants born with gastroschisis often endure a difficult neonatal course, they experience few long-term complications. This manuscript, which is structured around common parental questions and concerns, reviews the evidence pertaining to the antenatal, neonatal and long-term implications of a fetal gastroschisis diagnosis and is aimed at helping healthcare professionals counsel expecting parents. © 2013 John Wiley & Sons, Ltd.
Abstract:
PURPOSE: The aim of this study was to determine whether tumor location proximal or distal to the splenic flexure is associated with distinct molecular patterns and can predict clinical outcome in a homogeneous group of patients with Dukes B (T3-T4, N0, M0) colorectal cancer. It has been hypothesized that proximal and distal colorectal cancer may arise through different pathogenetic mechanisms. Although p53 and Ki-ras gene mutations occur frequently in distal tumors, another form of genomic instability associated with defective DNA mismatch repair has been predominantly identified in the proximal colon. To date, however, the clinical usefulness of these molecular characteristics remains unproven. METHODS: A total of 126 patients with a lymph node-negative sporadic colon or rectum adenocarcinoma were prospectively assessed with the endpoint of death by cancer. No patient received either radiotherapy or chemotherapy. p53 protein was studied by immunohistochemistry using DO-7 monoclonal antibody, and p53 and Ki-ras gene mutations were detected by single strand conformation polymorphism assay. RESULTS: During a mean follow-up of 67 months, the overall five-year survival was 70 percent. Nuclear p53 staining was found in 57 tumors (47 percent), and was more frequent in distal than in proximal tumors (55 vs. 21 percent; chi-squared test, P < 0.001). For the whole group, p53 protein expression correlated with poor survival in univariate and multivariate analysis (log-rank test, P = 0.01; hazard ratio = 2.16; 95 percent confidence interval = 1.12-4.11, P = 0.02). Distal colon tumors and rectal tumors exhibited similar molecular patterns and showed no difference in clinical outcome. In comparison with distal colorectal cancer, proximal tumors were found to be statistically significantly different on the following factors: mucinous content (P = 0.008), degree of histologic differentiation (P = 0.012), p53 protein expression, and gene mutation (P = 0.001 and 0.01 respectively). Finally, patients with proximal tumors had a marginally better survival than those with distal colon or rectal cancers (log-rank test, P = 0.045). CONCLUSION: In this series of Dukes B colorectal cancers, p53 protein expression was an independent factor for survival, which also correlated with tumor location. Eighty-six percent of p53-positive tumors were located in the distal colon and rectum. Distal colon and rectum tumors had similar molecular and clinical characteristics. In contrast, proximal neoplasms seem to represent a distinct entity, with specific histopathologic characteristics, molecular patterns, and clinical outcome. Location of the neoplasm in reference to the splenic flexure should be considered before group stratification in future trials of adjuvant chemotherapy in patients with Dukes B tumors.
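For readers unfamiliar with the log-rank comparison reported above, here is a self-contained sketch of the two-sample test on synthetic censored survival data; the rates, sample sizes and follow-up below are invented for illustration and bear no relation to the study's data or results.

import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)

# Synthetic, illustrative data only: exponential survival times with
# administrative censoring at the end of follow-up.
def simulate(n, rate, follow_up=60.0):
    event_time = rng.exponential(1.0 / rate, n)
    time = np.minimum(event_time, follow_up)
    observed = event_time <= follow_up
    return time, observed

t1, e1 = simulate(40, 0.005)      # "proximal" group (hypothetical)
t2, e2 = simulate(86, 0.010)      # "distal" group (hypothetical)

def logrank(t1, e1, t2, e2):
    """Two-sample log-rank test (chi-square statistic with 1 degree of freedom)."""
    event_times = np.unique(np.concatenate([t1[e1], t2[e2]]))
    O1 = E1 = V = 0.0
    for t in event_times:
        n1, n2 = np.sum(t1 >= t), np.sum(t2 >= t)          # at risk just before t
        d1, d2 = np.sum((t1 == t) & e1), np.sum((t2 == t) & e2)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        O1 += d1                                           # observed events, group 1
        E1 += d * n1 / n                                   # expected under the null
        V  += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)  # hypergeometric variance
    stat = (O1 - E1) ** 2 / V
    return stat, chi2.sf(stat, df=1)

stat, p = logrank(t1, e1, t2, e2)
print(f"log-rank chi-square = {stat:.2f}, p = {p:.3f}")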
Abstract:
The objective of this research is to provide evidence on the sources of agglomeration economies for the Spanish case. Among all the approaches taken in the literature to measure agglomeration economies, we analyse them through the location decisions of manufacturing firms. The recent literature has highlighted that an analysis based on the localization/urbanization dichotomy (relationships within a single industry) is not sufficient to understand agglomeration economies. Relationships between different industries, however, do prove significant when examining why firms belonging to different industries locate next to one another. With this in mind, we try to explain which relationships between industries can account for co-agglomeration. To do so, we focus on those inter-industry relationships defined by Marshall's agglomeration mechanisms, namely labor market pooling, input sharing and knowledge spillovers. We capture labor market pooling through the extent to which two industries employ the same workers (occupational classification). With Marshall's second mechanism, input sharing, we capture the extent to which two industries have a buyer/seller relationship. Finally, for knowledge spillovers we consider two industries that use the same technologies. In order to capture all the effects of the agglomeration mechanisms in Spain, we work with two geographical scales, municipalities and local labour markets. The existing literature has never reached agreement on the geographical scale at which Marshall's mechanisms work best, so we cover all the potential geographical units.