837 results for Robust Scenario Formulation


Relevance:

100.00%

Publisher:

Abstract:

Strategic supply chain optimization (SCO) problems are often modelled as two-stage optimization problems, in which the first-stage variables represent decisions on the development of the supply chain and the second-stage variables represent decisions on its operations. When uncertainty is explicitly considered, the problem becomes an intractable infinite-dimensional optimization problem, which is usually solved approximately via a scenario or a robust approach. This paper proposes a novel synergy of the scenario and robust approaches for strategic SCO under uncertainty. Two formulations are developed, namely, the naïve robust scenario formulation and the affinely adjustable robust scenario formulation. It is shown that both formulations can be recast as tractable deterministic optimization problems if the uncertainty is bounded in the infinity norm, and that the uncertain equality constraints can be reformulated into deterministic constraints without any assumption on the uncertainty region. Case studies of a classical farm planning problem and an energy and bioproduct SCO problem demonstrate the advantages of the proposed formulations over the classical scenario formulation. The proposed formulations not only generate solutions with guaranteed feasibility (or indicate infeasibility of a problem), but also achieve optimal expected economic performance with fewer scenarios.
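To make the infinity-norm reformulation concrete, the following sketch (a generic illustration, not the authors' formulation; all numbers are hypothetical) checks the standard worst-case identity max over ‖δ‖∞ ≤ ρ of (a+δ)ᵀx, which equals aᵀx + ρ‖x‖₁ and is what makes such robust counterparts tractable:

```python
import random

def worst_case_lhs(a, x, rho):
    """Closed-form worst case of (a + delta)^T x over ||delta||_inf <= rho:
    each delta_i = rho * sign(x_i) maximises the left-hand side."""
    nominal = sum(ai * xi for ai, xi in zip(a, x))
    return nominal + rho * sum(abs(xi) for xi in x)

def sampled_worst_case(a, x, rho, trials=20000, seed=0):
    """Monte Carlo check: sample perturbations inside the infinity-norm box."""
    rng = random.Random(seed)
    best = float("-inf")
    for _ in range(trials):
        delta = [rng.uniform(-rho, rho) for _ in a]
        best = max(best, sum((ai + di) * xi for ai, di, xi in zip(a, delta, x)))
    return best

a, x, rho = [2.0, -1.0, 0.5], [1.0, 3.0, -2.0], 0.3
exact = worst_case_lhs(a, x, rho)
approx = sampled_worst_case(a, x, rho)
print(exact, approx)  # the sampled value approaches the closed form from below
```

Imposing `worst_case_lhs(a, x, rho) <= b` therefore enforces the uncertain constraint for every realization in the box, without enumerating realizations.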

Relevance:

100.00%

Publisher:

Abstract:

This paper is concerned with strategic optimization of a typical industrial chemical supply chain, which involves a material purchase and transportation network, several manufacturing plants with on-site material and product inventories, a product transportation network, and several regional markets. In order to address large uncertainties in customer demands at the different regional markets, a novel robust scenario formulation, recently developed by the authors, is tailored and applied to the strategic optimization. Case study results show that the robust scenario formulation works well for this real industrial supply chain system, and that it outperforms the deterministic formulation and the classical scenario-based stochastic programming formulation by generating better expected economic performance and solutions that are guaranteed to be feasible for all uncertainty realizations. The robust scenario problem exhibits a decomposable structure that Benders decomposition can exploit for efficient solution, so the application of Benders decomposition to the strategic optimization is also discussed. The case study results show that Benders decomposition can reduce the solution time by almost an order of magnitude when the number of scenarios in the problem is large.

Relevance:

90.00%

Publisher:

Abstract:

Production companies use raw materials to compose end-products, and often make different products from the same raw materials. This research focuses on producing two end-products that consist of (partly) the same raw materials as cheaply as possible. Each product has its own demand and quality requirements, expressed as quadratic constraints. Minimizing the costs subject to the quadratic constraints is a global optimization problem, which can be difficult because of possible local optima. Therefore, the multimodal character of the (bi-)blend problem is investigated. Standard optimization packages (solvers) in Matlab and GAMS were tested on their ability to solve the problem. In total, 20 test cases were generated or taken from the literature to test the solvers' effectiveness and efficiency. The research also gives insight into adjusting the quadratic constraints of the problem in order to obtain a robust formulation of the bi-blend problem.
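The multimodality that makes such blend problems hard can be illustrated on a toy nonconvex cost (purely illustrative; unrelated to the paper's 20 test cases): a local method started in different basins returns different local optima, which is exactly why global optimization is needed.

```python
def local_minimise(f, df, x0, lr=0.01, steps=5000):
    """Plain gradient descent; converges to whichever local minimum
    the starting point x0 lies in the basin of."""
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# A toy bimodal cost with two basins, near x ~ -1.04 and x ~ +0.96:
f = lambda x: x**4 - 2.0 * x**2 + 0.3 * x
df = lambda x: 4.0 * x**3 - 4.0 * x + 0.3

left = local_minimise(f, df, -2.0)
right = local_minimise(f, df, 2.0)
print(left, right, f(left), f(right))  # two different local optima
```

Only the left basin holds the global minimum here; a solver started at x = 2 never finds it, mirroring the paper's motivation for testing many solvers and starting strategies.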

Relevance:

40.00%

Publisher:

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context where observations are collected and reported by a network of sensors, and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. 
We discuss the issue of whether to treat or ignore extreme values, making the distinction between robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
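As a sketch of why robust likelihoods bound the impact of malfunctioning sensors, here is the Huber function and its influence (psi) function; the tuning constant k = 1.345 is a conventional choice, not necessarily the value used in the paper:

```python
def huber(r, k=1.345):
    """Huber loss: quadratic for small residuals, linear beyond k,
    so a single extreme value has bounded influence on the fit."""
    a = abs(r)
    return 0.5 * r * r if a <= k else k * (a - 0.5 * k)

def huber_psi(r, k=1.345):
    """Influence (derivative): clipped at +/-k instead of growing like r."""
    return max(-k, min(k, r))

residuals = [0.2, -0.5, 1.0, 8.0]        # the last one mimics a faulty sensor
squared_infl = [r for r in residuals]    # d/dr of 0.5*r^2 is r itself
robust_infl = [huber_psi(r) for r in residuals]
print(squared_infl, robust_infl)  # the outlier's influence is capped at 1.345
```

Under a squared loss the faulty reading pulls the covariance estimate with influence 8.0; under the Huber function its influence is capped, which is what stabilises the variogram and the resulting maps.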

Relevance:

30.00%

Publisher:

Abstract:

This work presents a non-linear boundary element formulation applied to the analysis of contact problems. The boundary element method (BEM) is known as a robust and accurate numerical technique for handling this type of problem, because the contact among the solids occurs along their boundaries. The proposed non-linear formulation is based on the use of singular or hyper-singular integral equations by BEM for multi-region contact. When the contact occurs between crack surfaces, the formulation adopted is the dual version of BEM, in which singular and hyper-singular integral equations are defined along the opposite sides of the contact boundaries. The structural non-linear behaviour on the contact is modelled using Coulomb's friction law. The non-linear formulation is based on the tangent operator, in which the derivative of the set of algebraic equations is used to construct the corrections for the non-linear process. This implicit formulation has been shown to be as accurate as the classical approach while being faster to compute. Examples of simple and multi-region contact problems are shown to illustrate the applicability of the proposed scheme. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with the analysis of multiple random crack propagation in two-dimensional domains using the boundary element method (BEM). BEM is known to be a robust and accurate numerical technique for analysing this type of problem. The formulation adopted in this work is based on the dual BEM, for which singular and hyper-singular integral equations are used. We propose an iterative scheme to predict the crack growth path and the crack length increment at each time step. The proposed scheme enables us to simulate localisation and coalescence phenomena, which is the main contribution of this paper. For the fracture mechanics analysis, the displacement correlation technique is applied to evaluate the stress intensity factors. The propagation angle and the equivalent stress intensity factor are calculated using the theory of maximum circumferential stress. Examples of simple and multi-fractured domains, loaded up to rupture, are considered to illustrate the applicability of the proposed scheme. (C) 2010 Elsevier Ltd. All rights reserved.
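The maximum circumferential stress criterion mentioned above has a standard closed form for the kink angle; the sketch below uses a common textbook version of the formulas (an assumption on our part, not necessarily the exact expressions used in the paper):

```python
import math

def kink_angle(KI, KII):
    """Crack propagation angle (radians) from the maximum circumferential
    stress criterion. Pure mode I (KII = 0) propagates straight ahead."""
    if KII == 0.0:
        return 0.0
    return 2.0 * math.atan((KI - math.sqrt(KI * KI + 8.0 * KII * KII))
                           / (4.0 * KII))

def equivalent_sif(KI, KII):
    """Equivalent stress intensity factor along the kink direction."""
    t = kink_angle(KI, KII)
    return math.cos(t / 2.0) * (KI * math.cos(t / 2.0) ** 2
                                - 1.5 * KII * math.sin(t))

print(math.degrees(kink_angle(1.0, 0.0)))   # 0.0: straight growth
print(math.degrees(kink_angle(0.0, 1.0)))   # about -70.5 degrees (pure mode II)
```

The pure mode II angle of roughly −70.5° is the classic benchmark for this criterion, so it is a convenient sanity check for any implementation.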

Relevance:

30.00%

Publisher:

Abstract:

A model predictive controller (MPC) is proposed that is robustly stable for some classes of model uncertainty and for unknown disturbances. The case of open-loop stable systems is considered, where only the inputs and controlled outputs are measured. It is assumed that the controller will work in a scenario where target tracking is also required. The nominal infinite-horizon MPC is extended here to output feedback. The method considers an extended cost function that can be made globally convergent for any finite input horizon considered for the uncertain system, and is based on the explicit inclusion of cost-contracting constraints in the control problem. The controller handles the output feedback case through a non-minimal state-space model built from past output measurements and past input increments. The application of the robust output feedback MPC is illustrated through the simulation of a low-order multivariable system.

Relevance:

30.00%

Publisher:

Abstract:

A highly robust hydrogel device made from a single biopolymer formulation is reported. Owing to the presence of covalent and non-covalent crosslinks, these engineered systems were able to (i) sustain a compressive strength of ca. 20 MPa, (ii) quickly recover upon unloading, and (iii) encapsulate cells with high viability rates.

Relevance:

30.00%

Publisher:

Abstract:

An RP-HPLC-based analytical method for use both in quality control of green tea in a semisolid formulation and in in vitro drug release assays was developed and validated. The method was precise (CV < 5%), accurate (recovery between 98% and 102%), linear (R² > 0.99), robust, and specific for the determination of epigallocatechin 3-gallate (EGCG), caffeine (CAF), and gallic acid (GA). In a diffusion cell chamber, the release rate of EGCG was 8896.01 µg cm⁻². These data show that EGCG will be able to exert its systemic activity when delivered through the transdermal formulation, owing to its good flux rates with the synthetic membrane.
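The validation criteria quoted above (CV, recovery, R²) are straightforward to compute; the calibration data below are hypothetical numbers invented for illustration, not values from the paper:

```python
def cv_percent(values):
    """Coefficient of variation (%) used as the precision criterion."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 100.0 * var ** 0.5 / mean

def recovery_percent(measured, nominal):
    """Accuracy as percent recovery of a spiked nominal concentration."""
    return 100.0 * measured / nominal

def r_squared(x, y):
    """R^2 of the least-squares line through the calibration points."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical calibration data (concentration vs. peak area):
conc = [1.0, 2.0, 4.0, 8.0, 16.0]
area = [10.1, 19.8, 40.5, 79.9, 160.3]
print(r_squared(conc, area), cv_percent([10.1, 10.0, 10.2]))
```

A method passes the quoted criteria when every level's CV stays below 5%, recoveries fall in 98–102%, and the calibration R² exceeds 0.99.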

Relevance:

30.00%

Publisher:

Abstract:

Climate change science is increasingly concerned with methods for managing and integrating sources of uncertainty from emission storylines, climate model projections, and ecosystem model parameterizations. In tropical ecosystems, regional climate projections and modeled ecosystem responses vary greatly, leading to a significant source of uncertainty in global biogeochemical accounting and possible future climate feedbacks. Here, we combine an ensemble of IPCC-AR4 climate change projections for the Amazon Basin (eight general circulation models) with alternative ecosystem parameter sets for the dynamic global vegetation model, LPJmL. We evaluate LPJmL simulations of carbon stocks and fluxes against flux tower and aboveground biomass datasets for individual sites and the entire basin. Variability in LPJmL model sensitivity to future climate change is primarily related to light and water limitations through biochemical and water-balance-related parameters. Temperature-dependent parameters related to plant respiration and photosynthesis appear to be less important than vegetation dynamics (and their parameters) for determining the magnitude of ecosystem response to climate change. Variance partitioning approaches reveal that relationships between uncertainty from ecosystem dynamics and climate projections are dependent on geographic location and the targeted ecosystem process. Parameter uncertainty from the LPJmL model does not affect the trajectory of ecosystem response for a given climate change scenario and the primary source of uncertainty for Amazon 'dieback' results from the uncertainty among climate projections. Our approach for describing uncertainty is applicable for informing and prioritizing policy options related to mitigation and adaptation where long-term investments are required.
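A minimal sketch of the variance partitioning idea, assuming a simple one-way layout (rows = climate models, columns = parameter sets; all numbers are invented for illustration): by the law of total variance, the between-model and mean within-model components add up to the ensemble variance, so their ratio shows which uncertainty source dominates.

```python
def partition_variance(table):
    """Partition the total variance of an ensemble indexed by
    (climate model, parameter set) into a between-climate-model part
    and a mean within-model (parameter) part (law of total variance)."""
    def var(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals) / len(vals)
    model_means = [sum(row) / len(row) for row in table]
    between = var(model_means)
    within = sum(var(row) for row in table) / len(table)
    flat = [v for row in table for v in row]
    return between, within, var(flat)

# Hypothetical biomass-change projections, rows = climate models,
# columns = ecosystem parameter sets (illustrative numbers only):
table = [[-1.2, -1.0, -1.1],
         [-0.3, -0.2, -0.4],
         [-2.0, -1.9, -2.1]]
b, w, total = partition_variance(table)
print(b, w, total)  # b + w equals the total variance
```

In this toy table the between-model component dominates, echoing the abstract's finding that climate projections, not ecosystem parameters, drive the 'dieback' uncertainty.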

Relevance:

30.00%

Publisher:

Abstract:

A crucial concern in the evaluation of evidence related to a major crime is the formulation of sufficient alternative plausible scenarios that can explain the available evidence. However, software aimed at assisting human crime investigators by automatically constructing crime scenarios from evidence is difficult to develop because of the almost infinite variation of plausible crime scenarios. This paper introduces a novel knowledge-driven methodology for crime scenario construction and presents a decision support system based on it. The approach works by storing the component events of the scenarios instead of entire scenarios, and by providing an algorithm that can instantiate and compose these component events into useful scenarios. The scenario composition approach is highly adaptable to unanticipated cases because it allows component events to match the case under investigation in many different ways. Given a description of the available evidence, it generates a network of plausible scenarios that can then be analysed to devise effective evidence collection strategies. The applicability of the ideas presented here is demonstrated by means of a realistic example and prototype decision support software.
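A much-simplified sketch of the component-event idea (event names, record fields, and the brute-force enumeration are all hypothetical simplifications, not the paper's algorithm): each stored event explains certain evidence and may presuppose another event, and scenarios are composed sets of events that jointly cover the evidence.

```python
# Hypothetical component events: each can explain certain evidence and
# may itself presuppose an earlier event (names are illustrative only).
EVENTS = {
    "break_in": {"explains": {"forced_window"}, "requires": set()},
    "struggle": {"explains": {"overturned_chair"}, "requires": {"break_in"}},
    "staged_scene": {"explains": {"forced_window", "overturned_chair"},
                     "requires": set()},
}

def compose_scenarios(evidence, events):
    """Enumerate minimal sets of component events that jointly explain
    all evidence and whose prerequisites are satisfied within the set."""
    names = list(events)
    scenarios = []
    for mask in range(1, 2 ** len(names)):
        chosen = {names[i] for i in range(len(names)) if mask >> i & 1}
        covered = set().union(*(events[n]["explains"] for n in chosen))
        closed = all(events[n]["requires"] <= chosen for n in chosen)
        if evidence <= covered and closed:
            scenarios.append(chosen)
    # keep only inclusion-minimal scenarios
    return [s for s in scenarios if not any(t < s for t in scenarios)]

found = compose_scenarios({"forced_window", "overturned_chair"}, EVENTS)
print(found)  # two alternative plausible scenarios for the same evidence
```

Even this toy version yields two rival explanations of the same evidence, which is the behaviour the decision support system exploits to plan further evidence collection.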

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC(max) algorithm runs in linear time with respect to the variable M=|C|+|Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M)=O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ‖F_P‖₁, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖₁ when the original weight function w is replaced by w^q. Thus, any algorithm GC(sum) solving the ‖F_P‖₁ minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC(sum) and GC(max), are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖∞-minimization problem (the fact that ‖F_P‖∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce this). An experimental comparison of the performance of the GC(max) and GC(sum) algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of the seeds on the output.
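The w^q reweighting argument and the convergence of ℓ_q minimizers toward the max-norm minimizer can be illustrated numerically (toy boundaries with invented edge weights, not an image segmentation):

```python
def lq_energy(weights, q):
    """||F_P||_q^q of a boundary: sum of its edge weights raised to q."""
    return sum(w ** q for w in weights)

# Two candidate object boundaries with their edge weights:
P1 = [3.0, 1.0, 1.0, 1.0]   # smaller sum, but max edge weight 3
P2 = [2.0, 2.0, 2.0, 2.0]   # larger sum, but max edge weight only 2

# For q = 1, P1 is the cheaper cut; for large q the ordering flips and
# matches the max-norm (GC_max) ordering, illustrating the convergence.
print(lq_energy(P1, 1), lq_energy(P2, 1))     # P1 wins at q = 1
print(lq_energy(P1, 10) > lq_energy(P2, 10))  # P2 wins for large q
```

The reweighting identity also holds exactly here: evaluating `lq_energy(weights, q)` equals evaluating the ℓ₁ energy on the weights raised to the power q, which is why GC(sum) on w^q solves every finite-q problem.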

Relevance:

30.00%

Publisher:

Abstract:

Decomposition-based approaches are recalled from the primal and dual points of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve them. This trade-off is explored for several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. In order to provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is addressed. Here, an undirected graph with edge costs is given together with a discrete set of balance matrices, representing different supply/demand scenarios. The goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible for every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented.
The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
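A minimal sketch of the Frank-Wolfe iteration over the probability simplex (a toy quadratic objective, not the route assignment problem itself): the key point is that the linearized subproblem over the simplex is solved by putting all mass on one coordinate, so each iteration is cheap.

```python
def frank_wolfe(grad, n, iters=500):
    """Frank-Wolfe over the probability simplex: the linear subproblem
    min_s <grad(x), s> is solved by a vertex, i.e. by putting all mass
    on the coordinate with the smallest partial derivative."""
    x = [1.0 / n] * n
    for k in range(iters):
        g = grad(x)
        i = min(range(n), key=lambda j: g[j])
        gamma = 2.0 / (k + 2.0)           # classic diminishing step size
        x = [(1.0 - gamma) * xj for xj in x]
        x[i] += gamma
    return x

# Toy objective: f(x) = 0.5 * ||x - c||^2 with c already on the simplex,
# so the optimum is x = c.
c = [0.1, 0.7, 0.2]
grad = lambda x: [xi - ci for xi, ci in zip(x, c)]
x = frank_wolfe(grad, 3)
print(x)  # close to c
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained for free, which is the property that makes the method attractive for traffic/route assignment.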

Relevance:

30.00%

Publisher:

Abstract:

A sustainable water resources management depends on sound information about the impacts of climate change. This information is, however, not easily derived, because natural runoff variability interferes with the climate change signal. This study presents a procedure that leads to robust estimates of the magnitude and Time Of Emergence (TOE) of climate-induced hydrological change that also account for the natural variability contained in the time series. First, the natural variability of 189 mesoscale catchments in Switzerland is sampled for 10 ENSEMBLES scenarios for the control period (1984-2005) and two scenario periods (near future: 2025-2046, far future: 2074-2095) using a bootstrap procedure. Then, the sampling distributions of mean monthly runoff are tested for significant differences with the Wilcoxon-Mann-Whitney test and for effect size with Cliff's delta d. Finally, the TOE of a climate-change-induced hydrological change is determined when at least eight out of the ten hydrological projections differ significantly from natural variability. The results show that the TOE occurs in the near-future period except for high-elevation catchments in late summer. The significant hydrological projections in the near future correspond, however, to only minor runoff changes. In the far future, hydrological change is statistically significant and runoff changes are substantial. Temperature change is the most important factor determining hydrological change in this mountainous region; hydrological change therefore depends strongly on a catchment's mean elevation. The finding that hydrological changes are already robust in the near future highlights the importance of accounting for them in water resources planning.
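Cliff's delta, the effect-size measure used above, compares all pairs between two samples; a minimal implementation with invented runoff numbers (purely illustrative, not the study's data):

```python
def cliffs_delta(x, y):
    """Cliff's delta effect size: P(X > Y) - P(X < Y) over all pairs,
    ranging from -1 to +1 (|d| = 1 means complete separation)."""
    gt = sum(1 for a in x for b in y if a > b)
    lt = sum(1 for a in x for b in y if a < b)
    return (gt - lt) / (len(x) * len(y))

control = [10, 12, 11, 13, 12]       # e.g. control-period monthly runoff
scenario = [15, 16, 14, 17, 15]      # e.g. projected monthly runoff
print(cliffs_delta(scenario, control))  # 1.0: projections fully above control
print(cliffs_delta(control, control))   # 0.0: no effect against itself
```

Unlike a p-value, this statistic stays interpretable for the large bootstrap samples used in the study, because it measures the degree of overlap rather than mere significance.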