932 results for Uncertainty in governance


Relevance:

100.00%

Publisher:

Abstract:

This paper studies repeated games in which the timing of the repetitions of the stage game is neither known nor controlled by the players. We call this feature random monitoring. Kawamori (2004) shows that perfect random monitoring is always better than the canonical case. Surprisingly, when monitoring is public, the result is less clear-cut and does not generalize in a straightforward way unless the public signals are sufficiently informative about players' actions and/or the players are patient enough. In addition to a discount effect, which tends to consistently favor the provision of incentives, we find an information effect, associated with the impact of time uncertainty on the distribution of public signals. Whether payoff improvements are possible depends crucially on the direction and strength of these effects. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
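
As an illustrative aside on the discount effect only: the timing assumption below (a geometric distribution for the time until the next repetition, with parameter p) is ours, not the paper's. Under that assumption, the effective discount factor applied to the next stage is

\delta_{\mathrm{eff}} = \mathbb{E}[\delta^{\tau}] = \sum_{k=1}^{\infty} p(1-p)^{k-1}\,\delta^{k} = \frac{p\,\delta}{1-(1-p)\,\delta},

so time uncertainty enters the players' incentive constraints through \delta_{\mathrm{eff}} rather than \delta alone. The information effect discussed in the abstract acts on the distribution of public signals and is not captured by this expression.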

Relevance:

100.00%

Publisher:

Abstract:

We compare behavior in modified dictator games with and without role uncertainty. Subjects choose between a selfish action, a costly surplus-creating action (altruistic behavior) and a costly surplus-destroying action (spiteful behavior). While costly surplus-creating actions are the most frequent under role uncertainty (64%), selfish actions become the most frequent without role uncertainty (69%). Also, the frequency of surplus-destroying choices is negligible with role uncertainty (1%) but not so without it (11%). A classification of subjects into four different types of interdependent preferences (Selfish, Social Welfare maximizing, Inequity Averse and Competitive) shows that the use of role uncertainty overestimates the prevalence of Social Welfare maximizing preferences in the subject population (from 74% with role uncertainty to 21% without it) and underestimates Selfish and Inequity Averse preferences. An additional treatment, in which subjects undertake an understanding test before participating in the experiment with role uncertainty, shows that the vast majority of subjects (93%) correctly understand the payoff mechanism with role uncertainty, and yet surplus-creating actions remained the most frequent. Our results warn against the use of role uncertainty in experiments that aim to measure the prevalence of interdependent preferences.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores three aspects of strategic uncertainty: its relation to risk, the predictability of behavior, and the subjective beliefs of players. In a laboratory experiment we measure subjects' certainty equivalents for three coordination games and one lottery. Behavior in coordination games is related to risk aversion, experience seeking, and age. From the distribution of certainty equivalents we estimate probabilities for successful coordination in a wide range of games. For many games, the success of coordination is predictable with a reasonable error rate. The best response to observed behavior is close to the global-game solution. Comparing choices in coordination games with revealed risk aversion, we estimate subjective probabilities for successful coordination. In games with a low coordination requirement, most subjects underestimate the probability of success. In games with a high coordination requirement, most subjects overestimate this probability. Estimating probabilistic decision models, we show that the quality of predictions can be improved when individual characteristics are taken into account. Subjects' behavior is consistent with probabilistic beliefs about the aggregate outcome, but inconsistent with probabilistic beliefs about individual behavior.

Relevance:

100.00%

Publisher:

Abstract:

Winter weather in Iowa is often unpredictable and can have an adverse impact on traffic flow. The Iowa Department of Transportation (Iowa DOT) attempts to lessen the impact of winter weather events on traffic speeds with various proactive maintenance operations. In order to assess the performance of these maintenance operations, it would be beneficial to develop a model for expected speed reduction based on weather variables and normal maintenance schedules. Such a model would allow the Iowa DOT to identify situations in which speed reductions were much greater or less than would be expected for a given set of storm conditions, and to make modifications to improve efficiency and effectiveness. The objective of this work was to predict speed changes relative to the baseline speed under normal conditions, based on nominal maintenance schedules and winter weather covariates (snow type, temperature, and wind speed) as measured by roadside weather stations. This allows for an assessment of the impact of winter weather covariates on traffic speed changes, and estimation of the effect of regular maintenance passes. The researchers chose events from Adair County, Iowa, and fit a linear model incorporating the covariates mentioned previously. A Bayesian analysis was conducted to estimate the values of the parameters of this model. Specifically, the analysis produces a distribution for the parameter value that represents the impact of maintenance on traffic speeds. The effect of maintenance is not a constant but rather a value about which the researchers have some uncertainty, and this distribution represents what they know about the effects of maintenance. Similarly, the distributions for the effects of winter weather covariates can be examined. Plots of observed and expected traffic speed changes allow a visual assessment of the model fit. Future work involves expanding this model to incorporate many events at multiple locations. This would allow for assessment of the impact of winter weather maintenance across various situations, and eventually identify locations and times in which maintenance could be improved.
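
A minimal sketch of the kind of Bayesian linear model described above, written in Python with hypothetical variable names, simulated data, and a conjugate normal prior with known noise variance assumed purely for illustration (the report does not specify its likelihood, priors, or covariate coding):

import numpy as np

# Hypothetical design matrix: intercept, snow-type indicator, temperature,
# wind speed, and cumulative maintenance passes (all columns are illustrative).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    np.ones(n),
    rng.integers(0, 2, n),        # snow type (0/1)
    rng.normal(-5, 4, n),         # air temperature (deg C)
    rng.normal(20, 8, n),         # wind speed (km/h)
    rng.poisson(3, n),            # maintenance passes so far
])
true_beta = np.array([-5.0, -8.0, 0.5, -0.3, 2.0])
y = X @ true_beta + rng.normal(0, 5, n)      # speed change vs. baseline (mph)

# Conjugate Bayesian linear regression with known noise sd (sigma = 5)
# and a N(0, tau^2 I) prior on the coefficients.
sigma2, tau2 = 25.0, 100.0
prior_prec = np.eye(X.shape[1]) / tau2
post_cov = np.linalg.inv(prior_prec + X.T @ X / sigma2)
post_mean = post_cov @ (X.T @ y / sigma2)

# Posterior distribution of the maintenance-effect coefficient (last column):
m, s = post_mean[-1], np.sqrt(post_cov[-1, -1])
print(f"maintenance effect: posterior mean {m:.2f} mph, sd {s:.2f}")

The key output is a full posterior for the maintenance coefficient rather than a single point estimate, which matches the report's emphasis on representing uncertainty about the effect of maintenance.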

Relevance:

100.00%

Publisher:

Abstract:

The use of the Bayes factor (BF), or likelihood ratio, as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems on which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
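
For concreteness, a toy numerical example of the BF as a likelihood ratio under two fully specified propositions (the normal models and all numbers below are illustrative assumptions, not taken from the article):

from scipy.stats import norm

# Hypothetical trace measurement (e.g. a glass refractive index) evaluated under
# Hp: the fragment comes from the broken window (mean 1.5185, sd 0.0004)
# Hd: the fragment comes from the background population (mean 1.5170, sd 0.0010)
x = 1.5183
bf = norm.pdf(x, loc=1.5185, scale=0.0004) / norm.pdf(x, loc=1.5170, scale=0.0010)
print(f"BF = {bf:.1f}")   # a single value, as the article argues should be reported

Once the propositions and their models are fixed, the BF is a single number; the article's point is that placing an interval or a distribution on this number misconstrues what it represents.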

Relevance:

100.00%

Publisher:

Abstract:

Outsourcing is a common strategy for companies looking for cost savings and improvements in performance. This has been especially prevalent in logistics, where warehousing and transportation are typical targets for outsourcing. However, while the benefits from logistics outsourcing are clear on paper, there are several cases in which companies fail to realize these benefits. The most commonly cited reasons for this are poor information flow between the company and the third-party logistics partner, and a lack of integration between the two partners. Uncertainty stems from a lack of information, and it can cripple the whole outsourcing operation. This is where enterprise resource planning (ERP) systems step in, as they can have a significant role in improving the flow of information and integration, which consequently mitigates uncertainty. The purpose of the study is to examine whether ERP systems have an effect on a company's decision to outsource logistics operations. Alongside the rapid advancements in technology during the past decades, ERP systems have also evolved. Therefore, empirical research on the subject needs constant revision, as it can quickly become outdated due to ERP systems gaining more advanced capabilities every year. The research was conducted as a qualitative single-case study of a Finnish manufacturing firm that had outsourced warehousing and transportation operations in the Swedish market. The empirical data was gathered through semi-structured interviews with three employees of the case company who were closely involved in the outsourcing operation. The theoretical framework used to analyze the empirical data was based on Transaction Cost Economics theory. The results of the study were aligned with the theoretical framework, in that the ERP system of the case company was seen as an enabler of its logistics outsourcing operation. However, the full theoretical benefits of ERP systems concerning extended enterprise functionality and flexibility were not attained because the case company had an older version of its ERP system. This emphasizes the importance of having up-to-date technology in order to overcome the shortcomings of ERP systems in outsourcing situations.

Relevance:

100.00%

Publisher:

Abstract:

Faced with the realities of a changing climate, decision makers in a wide variety of organisations are increasingly seeking quantitative predictions of regional and local climate. An important issue for these decision makers, and for organisations that fund climate research, is the potential for climate science to deliver improvements, especially reductions in uncertainty, in such predictions. Uncertainty in climate predictions arises from three distinct sources: internal variability, model uncertainty and scenario uncertainty. Using data from a suite of climate models we separate and quantify these sources. For predictions of changes in surface air temperature on decadal timescales and regional spatial scales, we show that uncertainty for the next few decades is dominated by sources (model uncertainty and internal variability) that are potentially reducible through progress in climate science. Furthermore, we find that model uncertainty is of greater importance than internal variability. Our findings have implications for managing adaptation to a changing climate. Because the costs of adaptation are very large, and greater uncertainty about future climate is likely to be associated with more expensive adaptation, reducing uncertainty in climate predictions is potentially of enormous economic value. We highlight the need for much more work to compare: a) the cost of various degrees of adaptation, given current levels of uncertainty; and b) the cost of new investments in climate science to reduce current levels of uncertainty. Our study also highlights the importance of targeting climate science investments on the most promising opportunities to reduce prediction uncertainty.
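
A minimal sketch of one way to separate the three sources of uncertainty from a multi-model, multi-scenario ensemble (the array shapes and the simple variance decomposition below are assumptions made for illustration; they are not the paper's exact methodology):

import numpy as np

# Hypothetical ensemble of decadal-mean regional temperature changes:
# axes are (scenario, model, realization), e.g. 3 scenarios x 10 models x 5 runs.
rng = np.random.default_rng(1)
ens = (rng.normal(1.0, 0.3, (3, 1, 1))      # scenario signal
       + rng.normal(0.0, 0.4, (1, 10, 1))   # model-to-model differences
       + rng.normal(0.0, 0.2, (3, 10, 5)))  # internal variability

internal = ens.var(axis=2).mean()                  # spread across realizations
model = ens.mean(axis=2).var(axis=1).mean()        # spread across models
scenario = ens.mean(axis=2).mean(axis=1).var()     # spread across scenarios
total = internal + model + scenario
for name, v in [("internal", internal), ("model", model), ("scenario", scenario)]:
    print(f"{name:8s}: {100 * v / total:.0f}% of total variance")

In such a decomposition, the internal-variability and model components correspond to the sources the abstract identifies as potentially reducible, while the scenario component reflects uncertainty about future emissions.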

Relevance:

100.00%

Publisher:

Abstract:

21st century climate change is projected to result in an intensification of the global hydrological cycle, but there is substantial uncertainty in how this will impact freshwater availability. A relatively overlooked aspect of this uncertainty pertains to how different methods of estimating potential evapotranspiration (PET) respond to a changing climate. Here we investigate the global response of six different PET methods to a 2 °C rise in global mean temperature. All methods suggest an increase in PET associated with a warming climate. However, differences of over 100% in the PET climate change signal are found between methods. Analysis of a precipitation/PET aridity index and regional water surplus indicates that, for certain regions and GCMs, the choice of PET method can actually determine the direction of projections of future water resources. As such, the method dependence of the PET climate change signal is an important source of uncertainty in projections of future freshwater availability.
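
A small illustration of how the choice of PET method can flip the sign of a projected change in a P/PET aridity index (the two stand-in "methods" and all numbers below are invented for illustration):

# Hypothetical annual precipitation and PET (mm/yr) for a baseline and a +2 degC world,
# with two stand-in PET "methods" that respond differently to warming.
P_base, P_warm = 600.0, 620.0
pet_base = {"method_A": 900.0, "method_B": 900.0}
pet_warm = {"method_A": 920.0, "method_B": 990.0}   # method B's PET rises more strongly

for m in pet_base:
    ai_base = P_base / pet_base[m]    # aridity index P/PET
    ai_warm = P_warm / pet_warm[m]
    trend = "wetter" if ai_warm > ai_base else "drier"
    print(f"{m}: {ai_base:.3f} -> {ai_warm:.3f} ({trend})")

With an identical precipitation change, one stand-in method projects slight wetting and the other projects drying, which is the sign-flipping behavior the abstract reports for some regions and GCMs.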

Relevance:

100.00%

Publisher:

Abstract:

A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus on DO alone increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
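
A minimal sketch of the generalized (regionalized) sensitivity analysis idea applied to a toy dissolved-oxygen calculation, using a Kolmogorov-Smirnov distance to compare behavioral and non-behavioral parameter samples (the toy model, parameter ranges, and acceptance band are illustrative assumptions, not the Q(2) equations):

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
n = 5000

# Sample two toy parameters: reaeration rate ka and BOD decay rate kd (1/day).
ka = rng.uniform(0.1, 2.0, n)
kd = rng.uniform(0.05, 1.0, n)

# Crude stand-in for a steady-state DO calculation downstream of a BOD load.
L0, DO_sat = 8.0, 9.0
deficit = 0.5 * kd * L0 / np.maximum(ka - kd, 1e-6)
do_sim = DO_sat - deficit

# Behavioral runs: simulated DO falls within a plausible observed range (mg/L).
behavioral = (do_sim > 5.0) & (do_sim < 9.0)

# A large KS distance between behavioral and non-behavioral samples of a parameter
# indicates that the parameter strongly controls whether the model behaves acceptably.
for name, values in [("ka", ka), ("kd", kd)]:
    res = ks_2samp(values[behavioral], values[~behavioral])
    print(f"{name}: KS distance = {res.statistic:.2f}")

The same classify-and-compare logic underlies the probabilistic calibration described in the abstract: parameter regions that consistently produce behavioral simulations are retained with higher weight.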

Relevance:

100.00%

Publisher:

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe. The implementation of these policies involves large expenditures, and it is reasonable for policymakers to ask what degree of certainty can be attached to the underlying critical load and exceedance estimates. This paper is a literature review of studies which attempt to estimate the uncertainty attached to critical loads. Critical load models and uncertainty analysis are briefly outlined. Most studies have used Monte Carlo analysis of some form to investigate the propagation of uncertainties in the definition of the input parameters through to uncertainties in critical loads. Though the input parameters are often poorly known, the critical load uncertainties are typically surprisingly small because of a "compensation of errors" mechanism. These results depend on the quality of the uncertainty estimates of the input parameters, and a "pedigree" classification for these is proposed. Sensitivity analysis shows that some input parameters are more important in influencing critical load uncertainty than others, but there have not been enough studies to form a general picture. Methods used for dealing with spatial variation are briefly discussed. Application of alternative models to the same site or modifications of existing models can lead to widely differing critical loads, indicating that research into the underlying science needs to continue.

Relevance:

100.00%

Publisher:

Abstract:

This paper reports an uncertainty analysis of critical loads for acid deposition for a site in southern England, using the Steady State Mass Balance Model. The uncertainty bounds, distribution type and correlation structure for each of the 18 input parameters were considered explicitly, and the overall uncertainty was estimated by Monte Carlo methods. Estimates of deposition uncertainty were made from measured data and an atmospheric dispersion model, and hence the uncertainty in exceedance could also be calculated. The uncertainties of the calculated critical loads were generally much lower than those of the input parameters due to a "compensation of errors" mechanism: coefficients of variation ranged from 13% for CLmaxN to 37% for CL(A). With 1990 deposition, the probability that the critical load was exceeded was > 0.99; to reduce this probability to 0.50, a 63% reduction in deposition is required; to reduce it to 0.05, an 82% reduction. With 1997 deposition, which was lower than that in 1990, exceedance probabilities declined and uncertainties in exceedance narrowed, as deposition uncertainty had less effect. The parameters contributing most to the uncertainty in critical loads were weathering rates, base cation uptake rates, and the choice of critical chemical value, indicating possible research priorities. However, the different critical load parameters were to some extent sensitive to different input parameters. The application of such probabilistic results to environmental regulation is discussed.
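
A compact sketch of why Monte Carlo propagation can give a critical-load coefficient of variation well below those of the inputs: when the output is (approximately) a combination of several independently uncertain terms, their absolute variances add, so the relative uncertainty of the total can be smaller than that of any single term. The mass-balance terms, magnitudes, and distributions below are placeholders chosen to demonstrate the statistical point, not the paper's 18-parameter model:

import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Placeholder mass-balance terms (keq/ha/yr), each with roughly 30-40% CV.
bc_deposition = rng.lognormal(mean=np.log(0.8), sigma=0.30, size=n)
weathering = rng.lognormal(mean=np.log(1.0), sigma=0.35, size=n)
bc_uptake = rng.lognormal(mean=np.log(0.3), sigma=0.40, size=n)
anc_leaching = rng.lognormal(mean=np.log(0.7), sigma=0.30, size=n)

# Simplified combination for illustration only; not the exact Steady State Mass Balance equation.
cl = bc_deposition + weathering - bc_uptake + anc_leaching

cv = lambda x: x.std() / x.mean()
for name, x in [("BC deposition", bc_deposition), ("weathering", weathering),
                ("BC uptake", bc_uptake), ("ANC leaching", anc_leaching),
                ("critical load", cl)]:
    print(f"{name:14s}: CV = {cv(x):.0%}")

Because the absolute uncertainties of the terms combine while their means accumulate, the critical-load CV printed last comes out smaller than the input CVs, mirroring the "compensation of errors" mechanism reported in the paper.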

Relevance:

100.00%

Publisher:

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe and elsewhere. They are assessed by several elaborate and ingenious models, each of which requires many parameters and has to be applied on a spatially distributed basis. Often the values of the input parameters are poorly known, calling into question the validity of the calculated critical loads. This paper attempts to quantify the uncertainty in the critical loads due to this "parameter uncertainty", using examples from the UK. Models used for calculating critical loads for deposition of acidity and nitrogen in forest and heathland ecosystems were tested at four contrasting sites. Uncertainty was assessed by Monte Carlo methods. Each input parameter or variable was assigned a value, range and distribution in as objective a fashion as possible. Each model was run 5000 times at each site using parameters sampled from these input distributions. Output distributions of various critical load parameters were calculated. The results were surprising. Confidence limits of the calculated critical loads were typically considerably narrower than those of most of the input parameters. This may be due to a "compensation of errors" mechanism. The range of possible critical load values at a given site is, however, rather wide, and the tails of the distributions are typically long. The deposition reductions required for a high level of confidence that the critical load is not exceeded are thus likely to be large. The implication for pollutant regulation is that requiring a high probability of non-exceedance is likely to carry high costs. The relative contribution of the input variables to critical load uncertainty varied from site to site: any input variable could be important, and thus it was not possible to identify variables as likely targets for research into narrowing uncertainties. Sites where a number of good measurements of input parameters were available had lower uncertainties, so the use of in situ measurement could be a valuable way of reducing critical load uncertainty at particularly valuable or disputed sites. From a restricted number of samples, uncertainties in heathland critical loads appear comparable to those of coniferous forest, and nutrient nitrogen critical loads to those of acidity. It was important to include correlations between input variables in the Monte Carlo analysis, but the choice of statistical distribution type was of lesser importance. Overall, the analysis provided objective support for the continued use of critical loads in policy development. (c) 2007 Elsevier B.V. All rights reserved.