914 results for Distorted probabilities
Abstract:
Pigouvian taxes are typically imposed in situations where there is imperfect knowledge of the extent of damage caused by a producing firm. A regulator imposing imperfectly informed Pigouvian taxes may cause the firms that should (should not) produce to shut down (produce). In this paper we use a Bayesian information framework to identify optimal signal-conditioned taxes in the presence of such losses. The tax system involves reducing (increasing) taxes on firms identified as causing high (low) damage. Unfortunately, when an abatement decision has to be made, the tax system that minimizes production distortions also dampens the incentive to abate. In the absence of wrong-firm concerns, a regulator can solve the problem by not adjusting taxes for signal noise. When wrong-firm losses are a concern, the regulator has to trade off losses from distorted production incentives against losses from distorted abatement incentives. The most appropriate policy may involve a combination of instruments.
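A minimal two-type sketch of the Bayesian logic described in this abstract (our notation and a symmetric signal structure, assumed purely for illustration):

```latex
% Illustrative only: a signal-conditioned Pigouvian tax as posterior expected damage.
% Suppose marginal damage is high (d_H) or low (d_L) with prior P(d_H) = p, and the
% regulator observes a signal s with accuracy q = P(s=H | d_H) = P(s=L | d_L).
\[
  P(d_H \mid s = H) \;=\; \frac{q\,p}{q\,p + (1-q)(1-p)},
\]
\[
  t^*(s) \;=\; \mathbb{E}[d \mid s]
         \;=\; d_H\,P(d_H \mid s) + d_L\,\bigl(1 - P(d_H \mid s)\bigr).
\]
% As the signal gets noisier (q -> 1/2), both posteriors shrink toward the prior,
% so t*(H) falls below d_H and t*(L) rises above d_L: taxes are reduced (increased)
% on firms identified as high (low) damage, exactly the adjustment described above.
```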
Abstract:
Osteoporosis is a serious worldwide epidemic. FRAX® is a web-based tool, developed by the Sheffield WHO Collaborating Center team, that integrates clinical risk factors and femoral neck BMD and calculates the 10-year fracture probability in order to help health care professionals identify patients who need treatment. However, only 31 countries have a FRAX® calculator. In the absence of a FRAX® model for a particular country, it has been suggested to use a surrogate country for which the epidemiology of osteoporosis most closely approximates the index country. More specific recommendations for clinicians in these countries are not available. In North America, concerns have also been raised regarding the assumptions used to construct the US ethnic-specific FRAX® calculators, in particular the correction factors applied to derive fracture probabilities in Blacks, Asians, and Hispanics in comparison to Whites. In addition, questions have been raised about calculating fracture risk in other ethnic groups, e.g., Native Americans and First Canadians. The International Society for Clinical Densitometry (ISCD), in conjunction with the International Osteoporosis Foundation (IOF), assembled an international panel of experts that ultimately developed joint Official Positions of the ISCD and IOF advising clinicians regarding FRAX® usage. As part of the process, the charge of the FRAX® International Task Force was to review and synthesize data regarding geographic and race/ethnic variability in hip fractures and non-hip osteoporotic fractures, and to make recommendations about the use of FRAX® in ethnic groups and countries without a FRAX® calculator. This synthesis was presented to the expert panel and constitutes the data on which the subsequent Official Positions are predicated. A summary of the International Task Force's composition and charge is presented here.
Abstract:
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
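To make the mixing idea concrete, here is a standard textbook instance in our own notation (a sketch, not a formula taken from the paper):

```latex
% In a compound Poisson model with claim rate \lambda, premium rate c, and
% Exp(\theta) claim sizes, the ruin probability with initial capital u is
% (valid under the net profit condition c\theta > \lambda)
\[
  \psi_\theta(u) \;=\; \frac{\lambda}{c\,\theta}\, e^{-(\theta - \lambda/c)\,u}.
\]
% If the claim-size distribution is completely monotone, i.e. a mixture of
% exponentials with mixing distribution M, conditioning on \theta and then
% mixing yields an explicit formula:
\[
  \psi(u) \;=\; \int_0^\infty \frac{\lambda}{c\,\theta}\,
                e^{-(\theta - \lambda/c)\,u}\, \mathrm{d}M(\theta).
\]
% The same conditioning trick extends to dependent claims joined by an
% Archimedean survival copula, where the shared mixing variable is what
% induces the dependence.
```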
Abstract:
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter offers an opinion on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al. (2011), published recently in this journal. Contrary to Budowle et al. (2011), we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation, and (iii) does not require new guidelines issued by the forensic DNA community, as long as probability is properly considered as an expression of personal belief. Please see related article: http://www.investigativegenetics.com/content/3/1/3
Abstract:
Mutations in LACERATA (LCR), FIDDLEHEAD (FDH), and BODYGUARD (BDG) cause a complex developmental syndrome that is consistent with an important role for these Arabidopsis genes in cuticle biogenesis. The genesis of their pleiotropic phenotypes is, however, poorly understood. We provide evidence that neither distorted depositions of cutin, nor deficiencies in the chemical composition of cuticular lipids, account for these features, suggesting instead that the mutants alleviate the functional disorder of the cuticle by reinforcing their defenses. To better understand how plants adapt to these mutations, we performed a genome-wide gene expression analysis. We found that apparent compensatory transcriptional responses in these mutants involve the induction of wax, cutin, cell wall, and defense genes. To gain greater insight into the mechanism by which cuticular mutations trigger this response in the plants, we performed an overlap meta-analysis, termed MASTA (MicroArray overlap Search Tool and Analysis), of differentially expressed genes. This suggested that different cell integrity pathways are recruited in cesA cellulose synthase and cuticular mutants. Using MASTA for an in silico suppressor/enhancer screen, we identified SERRATE (SE), which encodes a protein of RNA-processing multi-protein complexes, as a likely enhancer. In confirmation of this notion, the se lcr and se bdg double mutants lack the severe leaf deformations, as well as the organ fusions, that are typical of lcr and bdg and other cuticular mutants. Also, lcr does not confer resistance to Botrytis cinerea in a se mutant background. We propose a role for SERRATE-mediated RNA signaling in the cuticle integrity pathway.
Abstract:
In this paper we present a simple theory-based measure of the variations in aggregate economic efficiency: the gap between the marginal product of labor and the household's consumption/leisure tradeoff. We show that this indicator corresponds to the inverse of the markup of price over social marginal cost, and give some evidence in support of this interpretation. We then show that, with some auxiliary assumptions, our gap variable may be used to measure the efficiency costs of business fluctuations. We find that the latter costs are modest on average. However, to the extent that the flexible-price equilibrium is distorted, the gross efficiency losses from recessions and gains from booms may be large. Indeed, we find that the major recessions involved large efficiency losses. These results hold for reasonable parameterizations of the Frisch elasticity of labor supply, the coefficient of relative risk aversion, and steady-state distortions.
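For readers who want the gap in symbols, a minimal sketch under assumed functional forms (our notation; a competitive labor market, so the real wage equals the MRS):

```latex
% With production Y = A N^{1-\alpha} and period utility
% \log C - \chi N^{1+\varphi}/(1+\varphi):
\[
  MPN_t = (1-\alpha)\,\frac{Y_t}{N_t}, \qquad
  MRS_t = \chi\, C_t\, N_t^{\varphi},
\]
\[
  GAP_t \;\equiv\; \frac{MRS_t}{MPN_t} \;=\; \frac{1}{\mu_t},
\]
% where \mu_t is the markup of price over social marginal cost. An undistorted
% flexible-price benchmark has MRS = MPN and GAP = 1; fluctuations in GAP_t
% then proxy fluctuations in aggregate efficiency.
```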
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space; no assumption about the underlying distribution is made, and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, both of which are useful features for the Chernobyl fallout study.
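Although the paper uses the full BME machinery, the hard/soft-data logic can be illustrated with a toy numerical sketch. All locations, values, and the covariance model below are invented for illustration, and the rejection sampler merely approximates the kind of posterior BME produces:

```python
# Toy sketch of combining exact ("hard") and interval-type ("soft") data in a
# Gaussian-field setting; not the paper's code or the actual BME algorithm.
import numpy as np

rng = np.random.default_rng(0)

def exp_cov(X1, X2, sill=1.0, length=30.0):
    """Exponential covariance between two arrays of 2-D locations."""
    d = np.linalg.norm(X1[:, None, :] - X2[None, :, :], axis=-1)
    return sill * np.exp(-d / length)

# General knowledge: a zero-mean Gaussian random field (the prior).
hard_X = np.array([[0.0, 0.0], [10.0, 5.0], [20.0, 0.0]])  # exact measurements
hard_z = np.array([1.2, 0.8, 1.5])                         # e.g., 137Cs levels
soft_X = np.array([[5.0, 10.0]])                           # soft datum: interval only
soft_lo, soft_hi = 0.5, 1.0
target = np.array([[8.0, 4.0]])                            # unsampled location

# Condition (soft, target) jointly on the hard data ...
Xq = np.vstack([soft_X, target])
K_hh = exp_cov(hard_X, hard_X) + 1e-9 * np.eye(len(hard_X))
K_qh = exp_cov(Xq, hard_X)
A = np.linalg.solve(K_hh, K_qh.T).T
mu = A @ hard_z                          # conditional mean of (soft, target)
Sigma = exp_cov(Xq, Xq) - A @ K_qh.T     # conditional covariance

# ... then enforce the soft interval by rejection sampling, approximating the
# posterior at the target location given both data types.
draws = rng.multivariate_normal(mu, Sigma, size=50_000)
keep = draws[(draws[:, 0] >= soft_lo) & (draws[:, 0] <= soft_hi), 1]
print(f"posterior mean at target: {keep.mean():.3f} (std {keep.std():.3f})")
```

Comparing `keep` with the hard-data-only conditional (dropping the soft row) reproduces, in miniature, the paper's hard-versus-hard-plus-soft comparison.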
Abstract:
One of the assumptions of the Capacitated Facility Location Problem (CFLP) is that demand is known and fixed. Most often, this is not the case when managers take strategic decisions such as locating facilities and assigning demand points to those facilities. In this paper we consider demand as stochastic and we model each of the facilities as an independent queue. Stochastic models of manufacturing systems and deterministic location models are put together in order to obtain a formula for the backlogging probability at a potential facility location. Several solution techniques have been proposed to solve the CFLP. One of the most recently proposed heuristics, a Reactive Greedy Adaptive Search Procedure, is implemented in order to solve the model formulated. We present some computational experiments in order to evaluate the heuristic's performance and to illustrate the use of this new formulation for the CFLP. The paper finishes with a simple simulation exercise.
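As a hedged sketch of how a queueing model yields such a probability, take the simplest case of one M/M/1 queue per open facility (our simplification, not necessarily the paper's exact formulation):

```latex
% If facility i faces Poisson demand at rate \lambda_i (the summed rates of the
% demand points assigned to it) and serves at rate \mu_i, the M/M/1 steady
% state has utilization \rho_i = \lambda_i/\mu_i < 1 and
\[
  P(\text{more than } K \text{ orders at facility } i) \;=\; \rho_i^{\,K+1},
\]
% so a service-level constraint of the form \rho_i^{K+1} \le \beta caps the
% backlogging probability and couples the assignment variables (through
% \lambda_i) to the location decisions.
```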
Abstract:
Revenue management practices often include overbooking capacity to account for customers who make reservations but do not show up. In this paper, we consider the network revenue management problem with no-shows and overbooking, where the show-up probabilities are specific to each product. No-show rates differ significantly by product (for instance, each itinerary and fare combination for an airline), as sale restrictions and demand characteristics vary by product. However, models that consider no-show rates for each individual product are difficult to handle, as the state space in dynamic programming formulations (or the variable space in approximations) increases significantly. In this paper, we propose a randomized linear program to jointly make the capacity control and overbooking decisions with product-specific no-shows. We establish that our formulation gives an upper bound on the optimal expected total profit and that our upper bound is tighter than a deterministic linear programming upper bound that appears in the existing literature. Furthermore, we show that our upper bound is asymptotically tight in a regime where the leg capacities and the expected demand are scaled linearly at the same rate. We also describe how the randomized linear program can be used to obtain a bid-price control policy. Computational experiments indicate that our approach is quite fast, able to scale to industrial problems, and can provide significant improvements over standard benchmarks.
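A hedged sketch of the deterministic benchmark and the randomization idea (notation ours, omitting denied-boarding penalties for brevity: fares f_j, show-up probabilities p_j, leg capacities c_i, and a_{ij} = 1 if product j uses leg i):

```latex
% Deterministic LP (DLP) benchmark with product-specific show-ups; the booking
% limits x_j may exceed physical capacity because only a fraction p_j shows up:
\[
  z^{DLP} \;=\; \max_{x \ge 0}\;
  \Bigl\{ \sum_j f_j x_j \;:\;
          \sum_j a_{ij}\, p_j\, x_j \le c_i \;\;\forall i,\quad
          x_j \le \mathbb{E}[D_j] \;\;\forall j \Bigr\}.
\]
% The randomized variant solves the same program with \mathbb{E}[D_j] replaced
% by sampled demand realizations D_j(\omega) and averages the optimal values
% over samples; the paper establishes that this average is a tighter upper
% bound than the DLP value, and the leg-capacity duals supply bid prices.
```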
Abstract:
Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
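One simple member of this class, written in our notation as an illustration (not necessarily the specific rule the paper proposes):

```latex
% The agent reports an interval [a, b] and is paid deterministically from the
% interval and the realization x:
\[
  S(a, b; x) \;=\;
  \begin{cases}
    K - g(b - a) & \text{if } x \in [a, b],\\[2pt]
    0            & \text{otherwise,}
  \end{cases}
\]
% with K > 0 and g increasing. The width penalty g rewards precision and the
% hit payment rewards coverage, so for a single-peaked belief the chosen
% interval reveals both the location and the dispersion of the distribution.
```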
Abstract:
This paper explores biases in the elicitation of utilities under risk and the contribution that generalizations of expected utility can make to the resolution of these biases. We used five methods to measure utilities under risk and found clear violations of expected utility. Of the theories studied, prospect theory was most consistent with our data. The main improvement of prospect theory over expected utility was in comparisons between a riskless and a risky prospect (riskless-risk methods). We observed no improvement over expected utility in comparisons between two risky prospects (risk-risk methods). An explanation for why we found no improvement of prospect theory over expected utility in risk-risk methods may be that there was less overweighting of small probabilities in our study than has commonly been observed.
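For reference, a common parametric form of prospect theory's value and probability-weighting functions (Tversky and Kahneman 1992; the parameters here are illustrative, not the paper's estimates):

```latex
% Value of a simple prospect paying x > 0 with probability p (zero otherwise):
\[
  V(x, p) \;=\; w(p)\, v(x), \qquad
  v(x) = x^{\alpha}, \qquad
  w(p) = \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}},
\]
% with \alpha, \gamma \in (0, 1). For \gamma < 1, w overweights small
% probabilities and underweights large ones; weaker overweighting of small p
% (\gamma closer to 1) shrinks the predicted advantage of prospect theory over
% expected utility in the risk-risk comparisons discussed above.
```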
Abstract:
An important problem in descriptive and prescriptive research in decision making is to identify regions of rationality, i.e., the areas for which heuristics are and are not effective. To map the contours of such regions, we derive probabilities that heuristics identify the best of m alternatives (m > 2) characterized by k attributes or cues (k > 1). The heuristics include a single variable (lexicographic), variations of elimination-by-aspects, equal weighting, hybrids of the preceding, and models exploiting dominance. We use twenty simulated and four empirical datasets for illustration. We further provide an overview by regressing heuristic performance on factors characterizing environments. Overall, sensible heuristics generally yield similar choices in many environments. However, selection of the appropriate heuristic can be important in some regions (e.g., if there is low inter-correlation among attributes/cues). Since our work assumes a hit or miss decision criterion, we conclude by outlining extensions for exploring the effects of different loss functions.
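As an illustration of how such hit probabilities can be estimated by simulation, here is a toy sketch of a lexicographic heuristic's hit rate; the weights and the data-generating process are assumptions for illustration, not the paper's datasets:

```python
# Toy sketch: Monte Carlo hit rate of a lexicographic heuristic, i.e. the
# probability that it picks the best of m alternatives described by k cues.
import numpy as np

rng = np.random.default_rng(0)

def lexicographic_hit_rate(m=3, k=4, trials=10_000):
    # Assumed environment: true value is a weighted sum of k standardized
    # cues, with weights declining in cue validity.
    weights = np.linspace(1.0, 0.2, k)
    cue_order = np.argsort(-weights)          # inspect cues by validity
    hits = 0
    for _ in range(trials):
        cues = rng.normal(size=(m, k))
        best = (cues @ weights).argmax()      # the truly best alternative
        # Lexicographic choice: the first cue that discriminates decides.
        for j in cue_order:
            leaders = np.flatnonzero(cues[:, j] == cues[:, j].max())
            if leaders.size == 1:
                choice = leaders[0]
                break
        else:
            choice = leaders[0]               # unresolved ties: pick a leader
        hits += (choice == best)
    return hits / trials

print(f"lexicographic hit rate: {lexicographic_hit_rate():.3f}")
```

Varying m, k, the weight profile, and the inter-cue correlation in such a simulation is one way to trace the contours of a region of rationality.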
Abstract:
I show that intellectual property rights yield static efficiency gains, irrespective of their dynamic role in fostering innovation. I develop a property-rights model of firm organization with two dimensions of non-contractible investment. In equilibrium, the first best is attained if and only if ownership of tangible and intangible assets is equally protected. If IP rights are weaker, firm structure is distorted and efficiency declines: the entrepreneur must either integrate her suppliers, which prompts a decline in their investment; or else risk their defection, which entails a waste of her human capital. My model predicts greater prevalence of vertical integration where IP rights are weaker, and a switch from integration to outsourcing over the product cycle. Both empirical predictions are consistent with evidence on multinational companies. As a normative implication, I find that IP rights should be strong but narrowly defined, to protect a business without holding up its potential spin-offs.
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time when public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature but that has grown in importance in recent years. Using administrative and household-level data, we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as the early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
Abstract:
This paper explores three aspects of strategic uncertainty: its relation to risk, the predictability of behavior, and the subjective beliefs of players. In a laboratory experiment we measure subjects' certainty equivalents for three coordination games and one lottery. Behavior in coordination games is related to risk aversion, experience seeking, and age. From the distribution of certainty equivalents we estimate probabilities for successful coordination in a wide range of games. For many games, success of coordination is predictable with a reasonable error rate. The best response to observed behavior is close to the global-game solution. Comparing choices in coordination games with revealed risk aversion, we estimate subjective probabilities for successful coordination. In games with a low coordination requirement, most subjects underestimate the probability of success. In games with a high coordination requirement, most subjects overestimate this probability. Estimating probabilistic decision models, we show that the quality of predictions can be improved when individual characteristics are taken into account. Subjects' behavior is consistent with probabilistic beliefs about the aggregate outcome, but inconsistent with probabilistic beliefs about individual behavior.
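For context, the global-game benchmark invoked above has a compact characterization in binary-action coordination games (the standard Laplacian-belief selection, stated in our notation):

```latex
% The global-game solution selects the action that is a best response to a
% uniform ("Laplacian") belief over the fraction \ell of other players who
% choose the risky action A:
\[
  \text{choose } A
  \quad\Longleftrightarrow\quad
  \int_0^1 \bigl( u(A, \ell) - u(B, \ell) \bigr)\, \mathrm{d}\ell \;>\; 0,
\]
% where u(\cdot, \ell) is an action's payoff when a fraction \ell of the
% others coordinate on A. This is the benchmark against which the observed
% best responses are compared above.
```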