941 results for Uncertainty in generation
Abstract:
This paper presents a mathematical model and a methodology for solving a transmission network expansion planning problem that considers uncertainty in demand and generation. The methodology finds the optimal transmission network expansion plan that allows the power system to operate adequately in an environment with uncertainty. The model results in an optimization problem that is solved using a specialized genetic algorithm. The results obtained for known test systems from the literature show that cheaper plans can be found that satisfy the uncertainty in demand and generation. ©2008 IEEE.
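The solver here is a specialized genetic algorithm for network expansion. As a hedged illustration only (the authors' encoding, operators, and fitness function are not given in this abstract), a generic GA loop over binary build/no-build plans might look like the following, with a stand-in penalty for infeasible plans:

```python
import random

# Hypothetical setup: each candidate plan is a bit vector over candidate lines.
N_LINES, POP, GENS = 20, 50, 100
LINE_COST = [random.uniform(1.0, 5.0) for _ in range(N_LINES)]

def fitness(plan):
    """Illustrative fitness: investment cost plus a penalty for unmet demand.
    A real model would run a power-flow check over demand/generation scenarios."""
    cost = sum(c for c, built in zip(LINE_COST, plan) if built)
    unmet = max(0, 8 - sum(plan))          # stand-in for a load-shedding check
    return cost + 100.0 * unmet            # heavy penalty for infeasibility

def crossover(a, b):
    cut = random.randrange(1, N_LINES)
    return a[:cut] + b[cut:]

def mutate(plan, rate=0.02):
    return [bit ^ (random.random() < rate) for bit in plan]

pop = [[random.randint(0, 1) for _ in range(N_LINES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[:POP // 5]                 # keep the cheapest acceptable plans
    children = [mutate(crossover(*random.sample(elite, 2)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = min(pop, key=fitness)
print("best plan:", best, "cost:", round(fitness(best), 2))
```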
Abstract:
We separate and quantify the sources of uncertainty in projections of regional (~2,500 km) precipitation changes for the twenty-first century using the CMIP3 multi-model ensemble, allowing a direct comparison with a similar analysis for regional temperature changes. For decadal means of seasonal mean precipitation, internal variability is the dominant uncertainty for predictions of the first decade everywhere, and for many regions until the third decade ahead. Model uncertainty is generally the dominant source of uncertainty for longer lead times. Scenario uncertainty is found to be small or negligible for all regions and lead times, apart from close to the poles at the end of the century. For the global mean, model uncertainty dominates at all lead times. The signal-to-noise ratio (S/N) of the precipitation projections is highest at the poles but less than 1 almost everywhere else, and is far lower than for temperature projections. In particular, the tropics have the highest S/N for temperature, but the lowest for precipitation. We also estimate a ‘potential S/N’ by assuming that model uncertainty could be reduced to zero, and show that, for regional precipitation, the gains in S/N are fairly modest, especially for predictions of the next few decades. This finding suggests that adaptation decisions will need to be made in the context of high uncertainty concerning regional changes in precipitation. The potential to narrow uncertainty in regional temperature projections is far greater. These conclusions on S/N are for the current generation of models; the real signal may be larger or smaller than the CMIP3 multi-model mean. Also note that the S/N for extreme precipitation, which is more relevant for many climate impacts, may be larger than for the seasonal mean precipitation considered here.
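As a rough sketch of this kind of uncertainty partitioning (synthetic data and a simplified decomposition, not the authors' exact method), the ensemble variance can be split into scenario, model, and internal-variability terms and combined into an S/N ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for an ensemble: n_scenarios x n_models x n_runs
# projected precipitation changes for one region and lead time.
proj = rng.normal(0.1, 0.3, size=(3, 10, 5))

scen_means  = proj.mean(axis=(1, 2))            # one value per scenario
model_means = proj.mean(axis=2)                 # scenario x model
scenario_var = scen_means.var()
model_var    = (model_means - scen_means[:, None]).var()
internal_var = (proj - model_means[..., None]).var()

signal = proj.mean()                            # multi-model mean change
noise  = np.sqrt(scenario_var + model_var + internal_var)
print(f"S/N = {signal / noise:.2f}")

# The paper's 'potential S/N' thought experiment sets model uncertainty
# to zero, i.e. drops model_var from the noise term.
potential = signal / np.sqrt(scenario_var + internal_var)
print(f"potential S/N = {potential:.2f}")
```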
Abstract:
Poster presented at the 24th European Symposium on Computer Aided Process Engineering (ESCAPE 24), Budapest, Hungary, June 15–18, 2014.
Abstract:
In this work, we analyze the effect of incorporating life cycle inventory (LCI) uncertainty into the multi-objective optimization of chemical supply chains (SC), considering simultaneously their economic and environmental performance. To this end, we present a stochastic multi-scenario mixed-integer linear programming (MILP) model coupled with a two-step transformation scenario generation algorithm, whose unique feature is to provide scenarios in which the LCI random variables are correlated and each of them follows the desired lognormal marginal distribution. The environmental performance is quantified following life cycle assessment (LCA) principles, which are represented in the model formulation through standard algebraic equations. The capabilities of our approach are illustrated through a case study of a petrochemical supply chain. We show that the stochastic solution improves the economic performance of the SC relative to the deterministic one at any level of environmental impact, and that the correlation among environmental burdens provides more realistic scenarios for the decision-making process.
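A plausible reading of the two-step construction (an assumption on our part, not necessarily the paper's exact algorithm) is a Gaussian-copula approach: first draw correlated standard normals, then map each component to its desired lognormal marginal:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1: draw correlated standard normals via a Cholesky factor.
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])          # assumed correlation of log-burdens
L = np.linalg.cholesky(corr)
z = rng.standard_normal((1000, 3)) @ L.T    # 1000 scenarios, 3 LCI entries

# Step 2: map each column to its desired lognormal marginal.
mu    = np.array([0.0, 0.5, -0.2])          # illustrative log-means
sigma = np.array([0.25, 0.40, 0.30])        # illustrative log-std devs
scenarios = np.exp(mu + sigma * z)

# Sanity check: correlation of the log-scenarios recovers `corr`.
print(np.corrcoef(np.log(scenarios), rowvar=False).round(2))
```

Because the exponential map is monotone, the rank correlation of the normals carries over to the lognormal scenarios; matching a target Pearson correlation exactly would require an additional adjustment step.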
Abstract:
This paper presents a personal view of the interaction between the analysis of choice under uncertainty and the analysis of production under uncertainty. Interest in the foundations of the theory of choice under uncertainty was stimulated by applications of expected utility theory such as the Sandmo model of production under uncertainty. This interest led to the development of generalized models including rank-dependent expected utility theory. In turn, the development of generalized expected utility models raised the question of whether such models could be used in the analysis of applied problems such as those involving production under uncertainty. Finally, the revival of the state-contingent approach led to the recognition of a fundamental duality between choice problems and production problems.
Abstract:
In this paper, we consider a mixed market with uncertain demand, involving one private firm and one public firm with quadratic costs. The model is a two-stage game in which each player chooses whether to make its output decision in stage 1 or in stage 2. We assume that demand is unknown until the end of the first stage. We compute the equilibrium output levels in each possible role, and we determine the firms' ex-ante and ex-post payoff functions.
Abstract:
In a symmetric differentiated Stackelberg duopoly model with asymmetric demand information between the leading and following firms, we show that the leading firm does not necessarily have an advantage over the follower. The reason is that the second mover can adjust its output level after observing the realized demand, while the first mover must choose its output level knowing only the demand distribution.
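A minimal worked example of this logic, under our own simplifying assumptions (linear inverse demand, zero costs, no product differentiation): let p = a + ε − q₁ − q₂, where the follower observes the demand shock ε but the leader knows only E[ε] = 0.

```latex
% Follower observes \varepsilon and best-responds:
q_2^*(q_1,\varepsilon) = \tfrac{1}{2}\,(a + \varepsilon - q_1)
% Leader maximizes expected profit E[q_1(a+\varepsilon-q_1)/2]:
q_1^* = \tfrac{a}{2}, \qquad
\mathbb{E}[\pi_1] = \tfrac{a^2}{8}, \qquad
\mathbb{E}[\pi_2] = \mathbb{E}\!\left[(q_2^*)^2\right]
                  = \tfrac{a^2}{16} + \tfrac{\operatorname{Var}(\varepsilon)}{4}.
```

Hence the follower earns more than the leader whenever Var(ε) > a²/4: sufficient demand uncertainty overturns the usual first-mover advantage, which is the mechanism the abstract describes.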
Abstract:
Relationships between the accuracy and speed of decision-making, or speed-accuracy tradeoffs (SAT), have been extensively studied. However, the range of SAT observed varies widely across studies for reasons that are unclear. Several explanations have been proposed, including motivation or incentive for speed vs. accuracy, species, and modality, but none of these hypotheses has been tested directly. An alternative explanation is that the different degrees of SAT are related to the nature of the task being performed. Here, we addressed this problem by comparing SAT in two odor-guided decision tasks that were identical except for the nature of the task uncertainty: an odor mixture categorization task, in which the distinguishing information is reduced by making the stimuli more similar to each other, and an odor identification task, in which the information is reduced by lowering the intensity over a range of three log steps. (...)
Abstract:
The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT(jet) < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, of less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT(jet) < 500 GeV. For central jets at lower pT, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for pT(jet) > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6% for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
Scheduling, job shop, uncertainty, mixed (disjunctive) graph, stability analysis
Abstract:
We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between or average over all possible combinations of restricted PVARs where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus dealing with overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that our methods perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
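As a hedged sketch of the generic BMA step (not the paper's PVAR-specific machinery), posterior model probabilities can be approximated from per-model BIC values and used to average forecasts across restricted specifications:

```python
import numpy as np

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values, using the
    standard exp(-BIC/2) approximation to the marginal likelihood and equal
    prior model probabilities."""
    bics = np.asarray(bics, dtype=float)
    logp = -0.5 * (bics - bics.min())      # subtract the min for stability
    w = np.exp(logp)
    return w / w.sum()

# Hypothetical: three restricted PVAR specifications with these BICs.
weights = bma_weights([1012.4, 1009.8, 1015.1])
forecasts = np.array([0.7, 1.1, 0.4])      # each model's point forecast
print("weights:", weights.round(3))
print("BMA forecast:", float(weights @ forecasts))
```

Model selection (BMS) is the limiting case of putting all weight on the single specification with the highest approximate posterior probability.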
Abstract:
We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in coefficient estimation and uncertainty about the precise degree of coefficient variability as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
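To make the benchmark comparison concrete, here is a minimal sketch of the standard out-of-sample exercise, with synthetic data and an illustrative toy predictor (the random walk forecasts no change, and a model is judged by its RMSE relative to that benchmark):

```python
import numpy as np

rng = np.random.default_rng(2)
rate = np.cumsum(rng.normal(0, 1, 200))        # synthetic exchange rate series

def oos_rmse(series, predict, window=100):
    """Out-of-sample RMSE: forecast series[t+1] from data up to t."""
    errs = [series[t + 1] - predict(series[: t + 1])
            for t in range(window, len(series) - 1)]
    return float(np.sqrt(np.mean(np.square(errs))))

rw_rmse = oos_rmse(rate, lambda past: past[-1])          # random walk: no change
ar_rmse = oos_rmse(rate, lambda past: past[-1]
                   + 0.3 * (past[-1] - past[-2]))        # toy predictive model
print(f"relative RMSE (model/RW): {ar_rmse / rw_rmse:.3f}")  # < 1 beats the RW
```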
Abstract:
1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data-sharing initiatives involving species occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately.
2. We evaluated how uncertainty in georeferences and the associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment, where models were calibrated with the original, accurate data, and (2) an error treatment, where the data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate by a random number drawn from a normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distribution modelling techniques applied to 40 species in four distinct geographical regions.
3. Locational error in occurrences reduced model performance in three of these regions; nevertheless, relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors.
4. Synthesis and applications. To use the vast array of occurrence data that currently exists for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
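The error treatment described in point 2 can be reproduced in a few lines. This sketch assumes projected coordinates in metres (so the 5 km standard deviation maps directly), which is our simplification rather than the authors' stated implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

def degrade_occurrences(xy_m, sd_km=5.0):
    """Simulate locational error: shift each projected coordinate (metres)
    by an independent draw from N(0, sd_km * 1000) in x and in y, as in
    the abstract's error treatment."""
    noise = rng.normal(0.0, sd_km * 1000.0, size=xy_m.shape)
    return xy_m + noise

occurrences = np.array([[532000.0, 4180000.0],   # hypothetical records (m)
                        [534500.0, 4179200.0]])
print(degrade_occurrences(occurrences))
```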