932 results for Uncertainty in governance


Relevance: 100.00%

Publisher:

Abstract:

Climate change is expected to have wide-ranging impacts on urban areas and creates additional challenges for sustainable development. Urban areas are inextricably linked with climate change: they are major contributors to it, while also being particularly vulnerable to its impacts. Climate change presents a new challenge to urban areas, not only because of the expected rises in temperature and sea level, but also because of the current failure to address fully the institutional barriers that prevent preparatory action, and the feedbacks between urban systems and agents. Despite the importance of climate change, few cities in developing countries are attempting to address these issues systematically as part of their governance and planning processes. While there is a growing literature on the risks and vulnerabilities related to climate change, there is as yet limited research on the development of institutional responses, the dissemination of relevant knowledge, and the evaluation of tools for practical planning responses by decision makers at the city level. This thesis questions the dominant assumptions about the capacity of institutions and the potential of adaptive planning. It argues that achieving a balance between climate change impacts and local government decision-making capacity is vital for successful adaptation to the impacts of climate change. Urban spatial planning and wider environmental planning not only play a major role in reducing and mitigating risks but also have a key role in adapting to uncertainty over future risk. The research focuses on a single province-level city - Ho Chi Minh City, the largest city in Vietnam - as the principal case study to explore this argument, examining the linkages between urban planning systems, the structures of governance, and climate change adaptation planning. In conclusion, it proposes a specific framework offering insights into some of the more practical considerations, and the approach emphasises the importance of vertical and horizontal coordination in governance and urban planning.

Relevance: 100.00%

Publisher:

Abstract:

The uncertainty associated with natural magnitudes and processes is conspicuous in water resources and groundwater evaluation. This uncertainty has an intrinsic component and a part that can be reduced to some extent by increasing knowledge, improving monitoring coverage, continuously processing data with greater accuracy, and addressing the related economic and social aspects involved. Reducing uncertainty has a cost that may not be justified by the attainable improvement, but that cost has to be known in order to make the right decisions. With this in mind, this paper contributes general comments on the evaluation of groundwater resources in the semiarid Canary Islands and on some of the main sources of uncertainty; a full treatment of that uncertainty, or of how to reduce it, is not attempted.

Relevance: 100.00%

Publisher:

Abstract:

Government actors create law against a backdrop of uncertainty. Limited information, unpredictable events, and lack of understanding interfere with accurately predicting a legal regime’s costs, benefits, and effects on other legal and social programs and institutions. Does the availability of no-fault divorce increase the number of terminated marriages? Will bulk collection of telecommunications information about American citizens reveal terrorist plots? Can a sensitive species breed in the presence of oil and gas wells? The answers to these questions are far from clear, but lawmakers must act nonetheless. The problems posed by uncertainty cut across legal fields. Scholars and regulators in a variety of contexts recognize the importance of uncertainty, but no systematic, generally applicable framework exists for determining how law should account for gaps in information. This Article suggests such a framework and develops a novel typology of strategies for accounting for uncertainty in governance. This typology includes “static law,” as well as three varieties of “dynamic law.” “Static law” is a legal rule initially intended to last in perpetuity. “Dynamic law” is intended to change, and includes: (1) durational regulation, or fixed legal rules with periodic opportunities for amendment or repeal; (2) adaptive regulation, or malleable legal rules with procedural mechanisms allowing the rules to change; and (3) contingent regulation, or malleable legal rules with triggering mechanisms that substantively change the rules. Each of these strategies, alone or in combination, may best address the uncertainty inherent in a particular lawmaking effort. This Article provides a diagnostic framework that lawmakers can use to identify optimal strategies. Ultimately, this approach to uncertainty yields immediate practical benefits by enabling lawmakers to better structure governance.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents a personal view of the interaction between the analysis of choice under uncertainty and the analysis of production under uncertainty. Interest in the foundations of the theory of choice under uncertainty was stimulated by applications of expected utility theory such as the Sandmo model of production under uncertainty. This interest led to the development of generalized models including rank-dependent expected utility theory. In turn, the development of generalized expected utility models raised the question of whether such models could be used in the analysis of applied problems such as those involving production under uncertainty. Finally, the revival of the state-contingent approach led to the recognition of a fundamental duality between choice problems and production problems.
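
The rank-dependent model mentioned here replaces probabilities with decision weights derived from a probability-weighting function. As a rough illustration (not drawn from the paper), the sketch below computes rank-dependent expected utility for a discrete gamble; the square-root utility and the weighting function w(p) = p^0.7 are arbitrary assumptions made only for this example.

```python
import numpy as np

def rdeu(outcomes, probs, u=np.sqrt, w=lambda p: p ** 0.7):
    """Rank-dependent expected utility of a finite gamble (Quiggin's form).

    Outcomes are sorted in increasing order and each decision weight is a
    difference of the weighting function w evaluated at successive
    decumulative probabilities; with w(p) = p this collapses to ordinary
    expected utility.
    """
    order = np.argsort(outcomes)
    x = np.asarray(outcomes, dtype=float)[order]
    p = np.asarray(probs, dtype=float)[order]
    decum = np.append(np.cumsum(p[::-1])[::-1], 0.0)  # P(X >= x_i), then 0
    weights = w(decum[:-1]) - w(decum[1:])
    return float(np.sum(weights * u(x)))

# A 50/50 gamble over 100 or 0, with square-root utility.
print(rdeu([0, 100], [0.5, 0.5], w=lambda p: p))  # plain expected utility: 5.0
print(rdeu([0, 100], [0.5, 0.5]))                 # rank-dependent value differs
```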

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we consider a mixed market with uncertain demand, involving one private firm and one public firm, both with quadratic costs. The model is a two-stage game in which players choose to make their output decisions either in stage 1 or in stage 2, and the demand is unknown until the end of the first stage. We compute the equilibrium output levels for each possible role, and we also determine the firms' ex-ante and ex-post payoff functions.
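
The paper's model is not reproduced here, but a toy sketch may help fix ideas: assume linear inverse demand P = a - Q with a random intercept a, quadratic cost q²/2 for each firm, a welfare-maximizing public firm, and a profit-maximizing private firm. The code contrasts outputs chosen after demand is realized (stage 2) with outputs committed knowing only the demand distribution (stage 1); all functional forms and parameters are assumptions made for illustration only.

```python
import numpy as np

def equilibrium_outputs(a, iters=100):
    """Simultaneous-move equilibrium for a demand intercept a, found by
    iterating best responses: the private firm maximizes profit
    (a - Q) q - q^2 / 2, the public firm maximizes total surplus."""
    q_priv, q_pub = 0.0, 0.0
    for _ in range(iters):
        q_priv = max(0.0, (a - q_pub) / 3.0)   # private firm's first-order condition
        q_pub = max(0.0, (a - q_priv) / 2.0)   # public firm's first-order condition
    return q_priv, q_pub

def private_profit(q_priv, q_pub, a):
    return (a - q_priv - q_pub) * q_priv - 0.5 * q_priv ** 2

rng = np.random.default_rng(0)
a_draws = rng.uniform(8.0, 12.0, size=5_000)   # uncertain demand intercept

# Stage-2 choices: outputs set after the demand realization is observed.
ex_post = np.array([equilibrium_outputs(a) for a in a_draws])
# Stage-1 choices: by certainty equivalence in this linear-quadratic toy model,
# committing under the expected intercept is the ex-ante equilibrium.
ex_ante = equilibrium_outputs(a_draws.mean())

profit_ex_post = np.mean([private_profit(qp, qg, a)
                          for (qp, qg), a in zip(ex_post, a_draws)])
profit_ex_ante = np.mean(private_profit(ex_ante[0], ex_ante[1], a_draws))
print("expected private profit, outputs chosen ex post :", round(profit_ex_post, 3))
print("expected private profit, outputs committed ex ante:", round(profit_ex_ante, 3))
```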

Relevance: 100.00%

Publisher:

Abstract:

In a symmetric differentiated Stackelberg duopoly model in which the leading and following firms hold asymmetric demand information, we show that the leading firm does not necessarily have an advantage over the following one. The reason is that the second mover can adjust its output level after observing the realized demand, while the first mover chooses its output level knowing only the demand distribution.
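
As a back-of-the-envelope illustration (a homogeneous-good, zero-cost, linear-demand toy rather than the paper's differentiated model), the sketch below lets the leader commit knowing only the mean of the demand intercept while the follower best-responds to the realized intercept; with enough demand variance the follower's expected profit exceeds the leader's. Non-negativity of outputs is ignored for simplicity, and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.uniform(0.0, 20.0, size=200_000)   # demand intercept: mean 10, high variance

q_leader = a.mean() / 2.0                  # leader's (unconstrained) optimum given only E[a]
q_follower = (a - q_leader) / 2.0          # follower best-responds to the realized intercept
price = a - q_leader - q_follower          # inverse demand P = a - q_L - q_F

# Non-negativity of quantities is ignored in this linear toy model.
print("leader expected profit  :", round(np.mean(price * q_leader), 2))
print("follower expected profit:", round(np.mean(price * q_follower), 2))
# With demand variance this large, the second mover earns more on average.
```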

Relevance: 100.00%

Publisher:

Abstract:

This paper was first presented at the 2012 EU-SPRI Conference "Towards Transformative Governance? - Responses to mission-oriented innovation policy paradigms", Fraunhofer ISI, Karlsruhe, June 2012.

Relevance: 100.00%

Publisher:

Abstract:

Relationships between the accuracy and speed of decision-making, or speed-accuracy tradeoffs (SAT), have been extensively studied. However, the range of SAT observed varies widely across studies for reasons that are unclear. Several explanations have been proposed, including motivation or incentive for speed vs. accuracy, species, and modality, but none of these hypotheses has been directly tested. An alternative explanation is that the different degrees of SAT are related to the nature of the task being performed. Here, we addressed this problem by comparing SAT in two odor-guided decision tasks that were identical except for the nature of the task uncertainty: an odor mixture categorization task, in which the distinguishing information is reduced by making the stimuli more similar to each other, and an odor identification task, in which the information is reduced by lowering the intensity over a range of three log steps. (...)

Relevance: 100.00%

Publisher:

Abstract:

The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-k_t algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ p_T^jet < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, of less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ p_T^jet < 500 GeV. For central jets at lower p_T, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for p_T^jet > 1 TeV. The calibration of forward jets is derived from dijet p_T balance measurements. The resulting uncertainty reaches its largest value of 6% for low-p_T jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.
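
The abstract quotes combined uncertainty ranges; when components (in situ balance, pile-up, flavour composition, and so on) can be treated as independent, they are conventionally added in quadrature. The component values in the sketch below are placeholders, not ATLAS results.

```python
import math

# Hypothetical fractional JES uncertainty components for one (pT, eta) bin.
components = {"in situ balance": 0.010, "pile-up": 0.005, "flavour response": 0.008}

# Independent components combine in quadrature.
total = math.sqrt(sum(v ** 2 for v in components.values()))
print(f"total fractional JES uncertainty ~ {total:.3f}")  # ~ 0.014 with these placeholders
```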

Relevance: 100.00%

Publisher:

Abstract:

Master's dissertation in Industrial Engineering

Relevance: 100.00%

Publisher:

Abstract:

Keywords: Scheduling, job shop, uncertainty, mixed (disjunctive) graph, stability analysis

Relevance: 100.00%

Publisher:

Abstract:

We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between or average over all possible combinations of restricted PVARs where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus dealing with overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that our methods perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
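
The BMA step can be illustrated without the PVAR machinery: posterior model probabilities are proportional to prior probability times marginal likelihood, and quantities of interest are averaged with those weights. The sketch below uses simple regressions as stand-ins for restricted specifications and a BIC-style approximation to the log marginal likelihood; the data, the candidate models, and the approximation are all assumptions made for illustration.

```python
import numpy as np

def bic_weights(log_marg_like):
    """Posterior model probabilities from (approximate) log marginal
    likelihoods, assuming a flat prior over models."""
    l = np.asarray(log_marg_like, dtype=float)
    w = np.exp(l - l.max())
    return w / w.sum()

def log_ml_bic(y, regressors):
    """-BIC/2 as a rough log marginal likelihood approximation for OLS."""
    X = np.column_stack([np.ones_like(y)] + regressors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / len(y)
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    return loglik - 0.5 * X.shape[1] * np.log(len(y))

# Toy data: y depends on x1 only; candidate models differ in which predictors
# they include (a stand-in for restricted PVAR specifications).
rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.5 * x1 + rng.normal(scale=1.0, size=n)

models = {"x1 only": [x1], "x2 only": [x2], "x1 and x2": [x1, x2]}
weights = bic_weights([log_ml_bic(y, regs) for regs in models.values()])
for name, w in zip(models, weights):
    print(f"{name:10s} posterior probability ~ {w:.3f}")
```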

Relevance: 100.00%

Publisher:

Abstract:

We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random-walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in the coefficients' estimation, and uncertainty about the precise degree of the coefficients' variability, as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
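
None of the paper's Bayesian machinery is attempted here; as a rough, self-contained illustration of the kind of out-of-sample comparison described, the sketch below pits a rolling regression (which can track an abrupt coefficient break) against the no-change random-walk benchmark on simulated data. The data-generating process, window length, and break point are all made up.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300
x = rng.normal(size=T)                               # lagged predictor
beta = np.where(np.arange(T) < 150, 0.8, -0.4)       # coefficient with a sudden break
eps = rng.normal(scale=0.5, size=T)
y = np.r_[eps[0], beta[1:] * x[:-1] + eps[1:]]       # y[t] = beta[t] * x[t-1] + noise

window = 60
model_err, rw_err = [], []
for t in range(window + 1, T):
    # Rolling OLS on the most recent `window` observations stands in for a
    # model whose coefficient is allowed to change abruptly over time.
    xs, ys = x[t - window - 1:t - 1], y[t - window:t]
    b = (xs @ ys) / (xs @ xs)
    model_err.append(y[t] - b * x[t - 1])            # rolling-model forecast error
    rw_err.append(y[t])                              # random walk predicts no change

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print("out-of-sample RMSE, model / random walk:",
      round(rmse(model_err) / rmse(rw_err), 3))
```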

Relevance: 100.00%

Publisher:

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data-sharing initiatives involving species occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately.

2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate by a random distance drawn from a normal distribution with a mean of zero and a standard deviation of 5 km (a minimal sketch of this degradation step appears after this summary). We evaluated the influence of error on the performance of 10 commonly used distribution modelling techniques applied to 40 species in four distinct geographical regions.

3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best-performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors.

4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
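
The error treatment described in point 2 is easy to reproduce in outline. The sketch below assumes occurrence coordinates are already in a projected system measured in kilometres (the paper's exact projection and workflow are not given) and perturbs each coordinate with zero-mean Gaussian noise with a 5 km standard deviation; the sample records are hypothetical.

```python
import numpy as np

def degrade_coordinates(xy_km, sd_km=5.0, seed=None):
    """Simulate locational error by adding independent zero-mean Gaussian
    noise (default standard deviation 5 km) to each projected coordinate."""
    rng = np.random.default_rng(seed)
    xy_km = np.asarray(xy_km, dtype=float)
    return xy_km + rng.normal(loc=0.0, scale=sd_km, size=xy_km.shape)

# Hypothetical occurrence records as projected (easting, northing) pairs in km.
occurrences = np.array([[512.3, 4649.8], [530.1, 4702.4], [498.7, 4655.0]])
degraded = degrade_coordinates(occurrences, seed=42)
print(degraded - occurrences)   # per-record offsets, mostly within about +/-10 km
```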