914 results for Distorted probabilities


Relevance:

10.00%

Publisher:

Abstract:

How should an equity-motivated policy-maker allocate public capital (infrastructure) across regions? Should it aim at reducing interregional differences in per capita output, or at maximizing total output? Such a normative question is examined in a model where the policy-maker is exclusively concerned about personal inequality and has access to two policy instruments: (i) a personal tax-transfer system (taxation is distortionary), and (ii) the regional allocation of public investment. I show that the case for public investment as a significant instrument for interpersonal redistribution is rather weak. In the most favorable case, when the tax code is constrained to be uniform across regions, it is optimal to distort the allocation of public investment in favor of the poor regions, but only to a limited extent. The reason is that poor individuals are relatively more sensitive to public transfers, which are maximized by allocating public investment efficiently. If the tax code can vary across regions, then the optimal policy may involve an allocation of public investment distorted in favor of the rich regions.

Relevance:

10.00%

Publisher:

Abstract:

Recently, several school districts in the US have adopted or are considering adopting the Student-Optimal Stable Mechanism or the Top Trading Cycles Mechanism to assign children to public schools. There is clear evidence that, for school districts that employ (variants of) the so-called Boston Mechanism, the transition would lead to efficiency gains. The first two mechanisms are strategy-proof, but in practice student assignment procedures prevent students from submitting a preference list that contains all their acceptable schools. Therefore, any desirable property of the mechanisms is likely to get distorted. We study the nontrivial preference revelation game where students can only declare up to a fixed number (quota) of schools to be acceptable. We focus on the stability of the Nash equilibrium outcomes. Our main results identify rather stringent necessary and sufficient conditions on the priorities to guarantee stability. This stands in sharp contrast with the Boston Mechanism, which yields stable Nash equilibrium outcomes independently of the quota. Hence, the transition to either of the two mechanisms is likely to come with a higher risk that students seek legal action, as lower-priority students may occupy more preferred schools.

Relevance:

10.00%

Publisher:

Abstract:

Background. Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. Methods. On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). Results. Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. Conclusions. Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
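
To make the two bounds concrete, the sketch below computes a lower estimate as the share of patients with confirmed resistance or virologic failure, and an upper estimate as the frequency-weighted average of risk-group-specific resistance probabilities. Group labels, sizes, counts, and probabilities are hypothetical illustrations, not the cohort's data.

```python
# Sketch of the two prevalence bounds described above. All numbers are
# hypothetical placeholders, not the study's data.

risk_groups = {
    # name: (number of ART-exposed patients, assumed probability of resistance mutations)
    "mono/dual NRTI exposure": (1500, 0.75),
    "other combination ART": (3500, 0.55),
    "boosted PI / NNRTI first line": (3064, 0.12),
}

n_total = sum(n for n, _ in risk_groups.values())

# Lower estimate: proportion with virologic failure or confirmed drug resistance.
n_confirmed = 3000  # hypothetical count
lower_estimate = n_confirmed / n_total

# Upper estimate: frequency-weighted average of risk-group-specific probabilities.
upper_estimate = sum(n * p for n, p in risk_groups.values()) / n_total

print(f"lower estimate: {lower_estimate:.0%}, upper estimate: {upper_estimate:.0%}")
```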

Relevance:

10.00%

Publisher:

Abstract:

This paper develops a comprehensive framework for the quantitative analysis of the private and fiscal returns to schooling and of the effect of public policies on private incentives to invest in education. This framework is applied to 14 member states of the European Union. For each of these countries, we construct estimates of the private return to an additional year of schooling for an individual of average attainment, taking into account the effects of education on wages and employment probabilities after allowing for academic failure rates, the direct and opportunity costs of schooling, and the impact of personal taxes, social security contributions and unemployment and pension benefits on net incomes. We also construct a set of effective tax and subsidy rates that measure the effects of different public policies on the private returns to education, and measures of the fiscal returns to schooling that capture the long-term effects of a marginal increase in attainment on public finances under conditions that approximate general equilibrium.
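
The private return described here is, in essence, an internal rate of return on incremental net income streams. The sketch below illustrates that calculation under heavily simplified, hypothetical assumptions (horizon, wage premium, employment probabilities, direct costs); the actual framework additionally accounts for academic failure rates, social security contributions, and unemployment and pension benefits.

```python
# Minimal sketch: internal rate of return to one additional year of schooling,
# comparing two streams of expected net income. All figures are hypothetical.
import numpy as np

years = 45                        # working horizon after the schooling decision
wage_base = 30000.0               # net annual income without the extra year
wage_premium = 0.06               # assumed net wage gain from one more year
emp_base, emp_extra = 0.90, 0.93  # employment probabilities (hypothetical)
direct_cost = 2000.0              # tuition and other direct costs of the extra year

# Expected net income streams with and without the additional year of schooling.
without = np.full(years, wage_base * emp_base)
with_schooling = np.concatenate((
    [-direct_cost],  # year in school: direct costs, no earnings (opportunity cost is implicit)
    np.full(years - 1, wage_base * (1 + wage_premium) * emp_extra),
))
incremental = with_schooling - without

def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Bisection on the discount rate that sets the incremental NPV to zero.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if npv(mid, incremental) > 0:
        lo = mid
    else:
        hi = mid
print(f"approximate private return to one extra year: {lo:.1%}")
```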

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women, and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in the EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost-effective than "screen women at risk". For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratio of these two strategies compared with the reference was 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on cost-effectiveness ratios of the usual screening strategies for osteoporosis.
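
The incremental cost-effectiveness ratios reported above follow the standard definition: extra cost divided by extra effect relative to the reference strategy. A minimal sketch with hypothetical costs and effects:

```python
# Incremental cost-effectiveness ratio (ICER) of each screening strategy versus
# the "no screening" reference. Costs and effects are hypothetical placeholders.

strategies = {
    # name: (expected cost per woman, expected years without hip fracture over 10 years)
    "no screening": (1000.0, 9.60),
    "screen women at risk": (1300.0, 9.64),
    "screen all": (1500.0, 9.72),
}

ref_cost, ref_effect = strategies["no screening"]

for name, (cost, effect) in strategies.items():
    if name == "no screening":
        continue
    icer = (cost - ref_cost) / (effect - ref_effect)
    print(f"{name}: {icer:,.0f} euros per hip-fracture-free year gained")
```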

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obfuscating distinctions in individual performance and brain mechanisms that are better characterised by the inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. So far, this was typically based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at an average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Auditory meaningful sounds of common objects were used for a target detection task [2]. On each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: Results show that the occurrence of each map is structured in time and consistent across trials, both at the single-subject and at the group level. Conducting separate analyses of ERPs at single-subject and group levels, we could quantify the consistency of identified topographies and their time course of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level. Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it makes it possible to statistically evaluate single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding behaviour and brain activity interdependencies at both the single-subject and the group level. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
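
The two analysis steps can be illustrated compactly. The sketch below, run on synthetic data, fits a mixture of Gaussians to single-trial topographies and then uses the per-sample posterior probabilities to summarize how consistently a map occurs across trials and time points; the data dimensions, number of maps, and use of scikit-learn are assumptions for illustration, not the authors' implementation.

```python
# Sketch of the two-step single-trial topographic analysis described above:
# (1) fit a mixture of Gaussians to single-trial scalp topographies,
# (2) use the per-sample posterior (conditional) probabilities of each Gaussian
#     to describe how maps are structured across trials and time points.
# Data here are synthetic; dimensions and number of maps are arbitrary choices.
import numpy as np
from sklearn.mixture import GaussianMixture

n_trials, n_times, n_channels = 100, 50, 64
rng = np.random.default_rng(0)
eeg = rng.standard_normal((n_trials, n_times, n_channels))  # stand-in for real EEG

# Step 1: representative topographies = Gaussian means over all single-trial maps.
samples = eeg.reshape(-1, n_channels)
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
gmm.fit(samples)
template_maps = gmm.means_  # (4, n_channels)

# Step 2: posterior probability of each map for every trial and time point.
posteriors = gmm.predict_proba(samples).reshape(n_trials, n_times, -1)

# Example summary: how consistently map 0 dominates at each time point across trials.
dominance = (posteriors.argmax(axis=2) == 0).mean(axis=0)  # (n_times,)
print(dominance[:10])
```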

Relevance:

10.00%

Publisher:

Abstract:

In the literature the outcome of contests is either interpreted as win probabilities or as shares of the prize. With this in mind, we examine two approaches to contest success functions. In the first we analyze the implications of contestants' incomplete information concerning the "type" of the contest administrator. While in the case of two contestants this approach can rationalize prominent contest success functions, we show that it runs into difficulties when there are more agents. Our second approach interprets contest success functions as sharing rules and establishes a connection to bargaining and claims problems which is independent of the number of contestants. Both approaches provide foundations for popular contest success functions and guidelines for the definition of new ones. Keywords: Endogenous Contests, Contest Success Function. JEL Classification: C72 (Noncooperative Games), D72 (Economic Models of Political Processes: Rent-Seeking, Elections), D74 (Conflict; Conflict Resolution; Alliances).
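
For reference, the most prominent member of the ratio-form family in this literature, the Tullock contest success function, can be written in standard notation (not necessarily the paper's) as:

```latex
% Ratio-form (Tullock) contest success function for n contestants with efforts
% x_1, ..., x_n; p_i is read either as i's win probability or as i's prize share.
\[
  p_i(x_1,\dots,x_n) \;=\; \frac{f(x_i)}{\sum_{j=1}^{n} f(x_j)},
  \qquad f \text{ increasing, e.g. } f(x) = x^{r},\ r > 0 .
\]
```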

Relevance:

10.00%

Publisher:

Abstract:

The paper studies the interaction between cyclical uncertainty and investment in a stochastic real option framework where demand shifts stochastically between three different states, each with different rates of drift and volatility. In our setting the shifts are governed by a three-state Markov switching model with constant transition probabilities. The magnitude of the link between cyclical uncertainty and investment is quantified using simulations of the model. The chief implication of the model is that recessions and financial turmoil are important catalysts for waiting. In other words, our model shows that macroeconomic risk acts as an important deterrent to investments.
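
A minimal simulation of the driving process helps fix ideas: demand evolves with a regime-dependent drift and volatility, and the regime switches according to a three-state Markov chain with constant transition probabilities. All parameter values below are hypothetical, not the paper's calibration, and the real-option valuation step is omitted.

```python
# Sketch: simulate log demand under a three-state Markov switching process with
# constant transition probabilities and regime-specific drift/volatility.
# All parameter values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)

# Transition matrix P[i, j] = probability of moving from state i to state j.
P = np.array([
    [0.95, 0.04, 0.01],   # expansion
    [0.05, 0.90, 0.05],   # normal times
    [0.02, 0.08, 0.90],   # recession / financial turmoil
])
drift = np.array([0.03, 0.01, -0.02])  # per-period drift by state
vol = np.array([0.05, 0.10, 0.25])     # per-period volatility by state

T = 200
state = 1
log_demand = np.zeros(T)
for t in range(1, T):
    state = rng.choice(3, p=P[state])
    log_demand[t] = log_demand[t - 1] + drift[state] + vol[state] * rng.standard_normal()

# In a real-option setting, the high-volatility recession state raises the value
# of waiting, which is the mechanism behind the paper's main implication.
print(log_demand[-5:])
```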

Relevance:

10.00%

Publisher:

Abstract:

The unconditional expectation of social welfare is often used to assess alternative macroeconomic policy rules in applied quantitative research. It is shown that it is generally possible to derive a linear-quadratic problem that approximates the exact non-linear problem where the unconditional expectation of the objective is maximised and the steady state is distorted. Thus, the measure of policy performance is a linear combination of second moments of economic variables, which is relatively easy to compute numerically and can be used to rank alternative policy rules. The approach is applied to a simple Calvo-type model under various monetary policy rules.
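
In standard linear-quadratic notation (not necessarily the paper's), the resulting policy-performance measure has the following illustrative form, where the hatted variables are deviations from the distorted steady state and H collects the second-order welfare weights:

```latex
% Illustrative second-order form of the policy-performance measure:
\[
  \mathbb{E}[W] \;\approx\; \bar{W}
  \;+\; \tfrac{1}{2}\sum_{i}\sum_{j} H_{ij}\,
        \mathbb{E}\!\left[\hat{x}_{i}\,\hat{x}_{j}\right],
\]
% a constant plus a linear combination of unconditional second moments
% E[\hat{x}_i \hat{x}_j], which is what makes alternative policy rules
% easy to rank numerically.
```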

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Recommended oral voriconazole (VRC) doses are lower than intravenous doses. Because plasma concentrations impact efficacy and safety of therapy, optimizing individual drug exposure may improve these outcomes. METHODS: A population pharmacokinetic analysis (NONMEM) was performed on 505 plasma concentration measurements involving 55 patients with invasive mycoses who received recommended VRC doses. RESULTS: A 1-compartment model with first-order absorption and elimination best fitted the data. VRC clearance was 5.2 L/h, the volume of distribution was 92 L, the absorption rate constant was 1.1 hour(-1), and oral bioavailability was 0.63. Severe cholestasis decreased VRC elimination by 52%. A large interpatient variability was observed on clearance (coefficient of variation [CV], 40%) and bioavailability (CV 84%), and an interoccasion variability was observed on bioavailability (CV, 93%). Lack of response to therapy occurred in 12 of 55 patients (22%), and grade 3 neurotoxicity occurred in 5 of 55 patients (9%). A logistic multivariate regression analysis revealed an independent association between VRC trough concentrations and probability of response or neurotoxicity by identifying a therapeutic range of 1.5 mg/L (>85% probability of response) to 4.5 mg/L (<15% probability of neurotoxicity). Population-based simulations with the recommended 200 mg oral or 300 mg intravenous twice-daily regimens predicted probabilities of 49% and 87%, respectively, for achievement of 1.5 mg/L and of 8% and 37%, respectively, for achievement of 4.5 mg/L. With 300-400 mg twice-daily oral doses and 200-300 mg twice-daily intravenous doses, the predicted probabilities of achieving the lower target concentration were 68%-78% for the oral regimen and 70%-87% for the intravenous regimen, and the predicted probabilities of achieving the upper target concentration were 19%-29% for the oral regimen and 18%-37% for the intravenous regimen. CONCLUSIONS: Higher oral than intravenous VRC doses, followed by individualized adjustments based on measured plasma concentrations, improve achievement of the therapeutic target that maximizes the probability of therapeutic response and minimizes the probability of neurotoxicity. These findings challenge dose recommendations for VRC.
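
For intuition on the dose predictions, the sketch below evaluates the fitted structural model, a 1-compartment model with first-order absorption and elimination, at the typical-value estimates quoted above (CL = 5.2 L/h, V = 92 L, ka = 1.1 h⁻¹, F = 0.63). The steady-state superposition formula and the dose scenarios are standard pharmacokinetics rather than the study's simulation code, and the sketch ignores the large inter-patient and inter-occasion variability that drives the reported probabilities.

```python
# Typical steady-state troughs for twice-daily oral voriconazole under a
# 1-compartment model with first-order absorption and elimination.
import numpy as np

CL, V, ka, F = 5.2, 92.0, 1.1, 0.63  # typical-population estimates from the abstract
ke = CL / V                          # first-order elimination rate constant (1/h)

def c_ss(t, dose, tau):
    """Steady-state concentration (mg/L) at time t (h) after an oral dose given every tau hours."""
    return (F * dose * ka) / (V * (ka - ke)) * (
        np.exp(-ke * t) / (1 - np.exp(-ke * tau))
        - np.exp(-ka * t) / (1 - np.exp(-ka * tau))
    )

tau = 12.0  # twice-daily dosing interval
for dose in (200, 300, 400):
    print(f"{dose} mg twice daily: predicted typical trough ≈ {c_ss(tau, dose, tau):.1f} mg/L "
          f"(target window 1.5-4.5 mg/L)")
```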

Relevance:

10.00%

Publisher:

Abstract:

Recent work on optimal monetary and fiscal policy in New Keynesian models suggests that it is optimal to allow steady-state debt to follow a random walk. Leith and Wren-Lewis (2012) consider the nature of the time inconsistency involved in such a policy and its implication for discretionary policy-making. We show that governments are tempted, given inflationary expectations, to utilize their monetary and fiscal instruments in the initial period to change the ultimate debt burden they need to service. We demonstrate that this temptation is only eliminated if, following shocks, the new steady-state debt is equal to the original (efficient) debt level, even though there is no explicit debt target in the government's objective function. Analytically and in a series of numerical simulations we show that which instrument is used to stabilize the debt depends crucially on the degree of nominal inertia and the size of the debt stock. We also show that the welfare consequences of introducing debt are negligible for precommitment policies, but can be significant for discretionary policy. Finally, we assess the credibility of commitment policy by considering a quasi-commitment policy which allows for different probabilities of reneging on past promises. This online Appendix extends the results of this paper.

Relevance:

10.00%

Publisher:

Abstract:

We present an envelope theorem for establishing first-order conditions in decision problems involving continuous and discrete choices. Our theorem accommodates general dynamic programming problems, even with unbounded marginal utilities. And, unlike classical envelope theorems that focus only on differentiating value functions, we accommodate other endogenous functions such as default probabilities and interest rates. Our main technical ingredient is how we establish the differentiability of a function at a point: we sandwich the function between two differentiable functions from above and below. Our theory is widely applicable. In unsecured credit models, neither interest rates nor continuation values are globally differentiable. Nevertheless, we establish an Euler equation involving marginal prices and values. In adjustment cost models, we show that first-order conditions apply universally, even if optimal policies are not (S,s). Finally, we incorporate indivisible choices into a classic dynamic insurance analysis.
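
The sandwiching ingredient mentioned above can be stated as a simple lemma; the notation here is illustrative rather than the paper's:

```latex
% Sandwich lemma (illustrative notation): if, for all x near x_0,
%   l(x) <= f(x) <= u(x),   l(x_0) = f(x_0) = u(x_0),
% with l and u differentiable at x_0 and l'(x_0) = u'(x_0), then f is
% differentiable at x_0 and
\[
  f'(x_0) \;=\; l'(x_0) \;=\; u'(x_0).
\]
```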

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an axiomatic characterization of difference-form group contests, that is, contests fought among groups where the probability of victory depends on the difference of their effective efforts. This axiomatization rests on the property of Equalizing Consistency, stating that the difference between winning probabilities in the grand contest and in the smaller contest should be identical across all participants in the smaller contest. This property overcomes some of the drawbacks of the widely used ratio-form contest success functions. Our characterization shows that the criticisms commonly held against difference-form contest success functions, such as lack of scale invariance and zero elasticity of augmentation, are unfounded. By clarifying the properties of this family of contest success functions, this axiomatization can help researchers find the functional form best suited to their application of interest.
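
As an illustration of the family being axiomatized, one simple difference-form contest success function for G groups, in standard notation that is not necessarily the paper's and valid on the range of efforts where all probabilities stay in [0, 1], is:

```latex
% A simple difference-form contest success function for G groups with
% effective efforts s_1, ..., s_G:
\[
  p_g(s_1,\dots,s_G) \;=\; \frac{1}{G}
  \;+\; \alpha\Bigl(s_g - \tfrac{1}{G}\textstyle\sum_{h=1}^{G} s_h\Bigr),
  \qquad \alpha > 0,
\]
% so each group's winning probability depends only on how its effective effort
% differs from the average, and the probabilities sum to one.
```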

Relevance:

10.00%

Publisher:

Abstract:

What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the trace donor's possible genotypes with a loss function describing the scientist's preferences concerning false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele x_i on a single electropherogram, and (2) the case of observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points when the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
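
The decision rule described above reduces, for one locus, to an expected-loss minimization. The sketch below illustrates it for the single-peak case; the posterior probabilities, the candidate designations, and the loss values are hypothetical placeholders rather than the paper's model.

```python
# Expected-loss genotype designation for one locus, observing a single peak for
# allele x_i. Posterior probabilities, candidate designations, and loss values
# are hypothetical placeholders.

# Posterior over the trace donor's genotype (drop-out of a partner allele "F"
# is possible when only one peak is observed).
posterior = {
    ("x_i", "x_i"): 0.55,  # homozygote
    ("x_i", "F"): 0.45,    # heterozygote with a dropped-out allele
}

designations = [("x_i", "x_i"), ("x_i", "F")]

def loss(designation, true_genotype):
    """Hypothetical losses: a false exclusion is penalized more than a false inclusion."""
    if designation == true_genotype:
        return 0.0
    if designation == ("x_i", "x_i") and true_genotype == ("x_i", "F"):
        return 10.0  # searching with the homozygote could falsely exclude the true donor
    return 1.0       # the broader designation mainly risks extra false inclusions

expected_loss = {d: sum(p * loss(d, g) for g, p in posterior.items()) for d in designations}
best = min(expected_loss, key=expected_loss.get)
print(expected_loss, "-> choose", best)
```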