939 results for integrated lot sizing and scheduling models
Abstract:
Space weather effects on technological systems originate with energy carried from the Sun to the terrestrial environment by the solar wind. In this study, we present results of modeling of solar corona-heliosphere processes to predict solar wind conditions at the L1 Lagrangian point upstream of Earth. In particular we calculate performance metrics for (1) empirical, (2) hybrid empirical/physics-based, and (3) full physics-based coupled corona-heliosphere models over an 8-year period (1995–2002). L1 measurements of the radial solar wind speed are the primary basis for validation of the coronal and heliosphere models studied, though other solar wind parameters are also considered. The models are from the Center for Integrated Space-Weather Modeling (CISM) which has developed a coupled model of the whole Sun-to-Earth system, from the solar photosphere to the terrestrial thermosphere. Simple point-by-point analysis techniques, such as mean-square-error and correlation coefficients, indicate that the empirical coronal-heliosphere model currently gives the best forecast of solar wind speed at 1 AU. A more detailed analysis shows that errors in the physics-based models are predominantly the result of small timing offsets to solar wind structures and that the large-scale features of the solar wind are actually well modeled. We suggest that additional “tuning” of the coupling between the coronal and heliosphere models could lead to a significant improvement of their accuracy. Furthermore, we note that the physics-based models accurately capture dynamic effects at solar wind stream interaction regions, such as magnetic field compression, flow deflection, and density buildup, which the empirical scheme cannot.
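A minimal sketch of the point-by-point skill metrics named in this abstract (mean-square error and correlation coefficient) applied to modeled versus observed solar wind speed; the arrays below are hypothetical placeholders, not CISM model output.

```python
import numpy as np

# hypothetical radial solar wind speeds at L1, in km/s
observed = np.array([420., 450., 510., 600., 580., 470., 430.])
modeled  = np.array([400., 460., 520., 650., 560., 450., 425.])

mse = np.mean((modeled - observed) ** 2)       # mean-square error
corr = np.corrcoef(modeled, observed)[0, 1]    # linear correlation coefficient

print(f"MSE = {mse:.1f} (km/s)^2, r = {corr:.3f}")
```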
Abstract:
This is the first of two articles presenting a detailed review of the historical evolution of mathematical models applied in the development of building technology, covering both conventional and intelligent buildings. After presenting the technical differences between conventional and intelligent buildings, this article reviews the existing mathematical models, the abstraction levels of these models, and their links to the literature on intelligent buildings. The advantages and limitations of the applied mathematical models are identified, and the models are classified in terms of their application range and goal. We then describe how the early mathematical models, mainly physical models applied to conventional buildings, have faced new challenges in the design and management of intelligent buildings, leading to the use of models which offer more flexibility to better cope with various uncertainties. In contrast with the early modelling techniques, approaches adopted from neural networks, expert systems, fuzzy logic and genetic models provide a promising way to accommodate these complications, as intelligent buildings now require integrated technologies that involve solving complex, multi-objective and integrated decision problems.
Abstract:
Observations of atmospheric conditions and processes in cities are fundamental to understanding the interactions between the urban surface and weather/climate, improving the performance of urban weather, air quality and climate models, and providing key information for city end-users (e.g. decision-makers, stakeholders, public). In this paper, Shanghai's urban integrated meteorological observation network (SUIMON) and some examples of intended applications are introduced. Its characteristics include being: multi-purpose (e.g. forecast, research, service), multi-function (high impact weather, city climate, special end-users), multi-scale (e.g. macro/meso-, urban-, neighborhood, street canyon), multi-variable (e.g. thermal, dynamic, chemical, bio-meteorological, ecological), and multi-platform (e.g. radar, wind profiler, ground-based, satellite-based, in-situ observation/sampling). Underlying SUIMON is a data management system to facilitate exchange of data and information. The overall aim of the network is to improve coordination strategies and instruments; to identify data gaps based on science- and user-driven requirements; and to intelligently combine observations from a variety of platforms by using a data assimilation system that is tuned to produce the best estimate of the current state of the urban atmosphere.
Abstract:
In recent years, extreme hydrometeorological phenomena have increased in number and intensity, affecting the inhabitants of various regions. An example is the central basins of the Gulf of Mexico (CBGM), 55.2% of which have been affected by floods, especially in the state of Veracruz (1999-2013), leaving economic, social and environmental losses. Mexico currently lacks sufficient hydrological studies for measuring the volumes carried by its rivers, so it is appropriate to create a hydrological model (HM) suited to the quality and quantity of the available geographic and climatic information, one that is reliable and affordable. This research therefore compares a semi-distributed hydrological model (SHM) and a global hydrological model (GHM) with respect to runoff volumes and their ability to predict flood areas. In addition, extreme hydrometeorological phenomena in the CBGM were analyzed by modeling with the Hydrologic Modeling System (HEC-HMS), an SHM, and the Modèle Hydrologique Simplifié à l'Extrême (MOHYSE), a GHM, in order to evaluate the results and determine which model is better suited to tropical conditions, so as to propose public policies for integrated basin management and flood prevention. The temporal and spatial framework of the analyzed basins was determined according to hurricanes and floods. The SHM and GHM models were developed, calibrated and validated, and their results were compared to identify their sensitivity to the real model. It was concluded that both models conform to the tropical conditions of the CBGM, with MOHYSE approximating the real model more closely. It is worth mentioning that there is not enough information in Mexico and there are no records of MOHYSE having been used in Mexico, so it can be a useful tool for determining runoff volumes. Finally, climate change scenarios were generated with the SHM and the GHM in order to develop risk studies, producing a risk map for urban planning, agro-hydrological planning and territorial organization.
Abstract:
The present paper solves the multi-level capacitated lot sizing problem with backlogging (MLCLSPB) by combining a genetic algorithm with the solution of mixed-integer programming models and the fix-and-optimize improvement heuristic. This approach is evaluated on sets of benchmark instances and compared to methods from the literature. Computational results indicate that the proposed method is competitive with other approaches from the literature. © 2013 IEEE.
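A rough illustration of the fix-and-optimize idea named in this abstract, not the authors' implementation: a tiny two-level capacitated lot sizing model with backlogging is built with PuLP, started from a trivial "set up every period" solution, and the binary setup variables are then re-optimized window by window while all others remain fixed. All data are made up, and the genetic algorithm the paper combines with this heuristic is omitted.

```python
from pulp import (LpProblem, LpVariable, LpMinimize, lpSum, LpBinary,
                  PULP_CBC_CMD, value)

T, items = 6, [0, 1]                       # item 1 is a component of item 0
demand = {0: [20, 30, 0, 40, 10, 30], 1: [0, 0, 0, 0, 0, 0]}   # external demand
cap, h, b, s = 60, 1.0, 5.0, 50.0          # capacity, holding, backlog, setup cost
idx = [(i, t) for i in items for t in range(T)]

prob = LpProblem("mlclspb_sketch", LpMinimize)
x = LpVariable.dicts("x", idx, lowBound=0)          # production
I = LpVariable.dicts("I", idx, lowBound=0)          # inventory
B = LpVariable.dicts("B", idx, lowBound=0)          # backlog
y = LpVariable.dicts("y", idx, cat=LpBinary)        # setup

prob += lpSum(h * I[k] + b * B[k] + s * y[k] for k in idx)
bigM = 2 * sum(demand[0])
for t in range(T):
    prob += lpSum(x[i, t] for i in items) <= cap    # shared capacity
    for i in items:
        prev_I = I[i, t - 1] if t else 0
        prev_B = B[i, t - 1] if t else 0
        dep = x[0, t] if i == 1 else 0              # item 1 is consumed by item 0
        prob += prev_I - prev_B + x[i, t] == demand[i][t] + dep + I[i, t] - B[i, t]
        prob += x[i, t] <= bigM * y[i, t]           # setup forcing

for var in y.values():                              # trivial start: always set up
    var.lowBound = var.upBound = 1
prob.solve(PULP_CBC_CMD(msg=0))

for start in range(0, T, 2):                        # fix-and-optimize, window of 2
    window = set(range(start, start + 2))
    for (i, t), var in y.items():
        if t in window:
            var.lowBound, var.upBound = 0, 1        # release setups in the window
        else:
            var.lowBound = var.upBound = round(var.varValue)   # keep the rest fixed
    prob.solve(PULP_CBC_CMD(msg=0))
print("objective after fix-and-optimize:", value(prob.objective))
```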
Abstract:
This thesis investigates combinatorial and robust optimisation models for solving railway problems. Railway applications represent a challenging area for operations research. In fact, most problems in this context can be modelled as combinatorial optimisation problems, in which the number of feasible solutions is finite. Yet, despite the astonishing success in the field of combinatorial optimisation, the current state of algorithmic research faces severe difficulties with highly complex and data-intensive applications such as those dealing with optimisation issues in large-scale transportation networks. One of the main issues concerns imperfect information. The idea of Robust Optimisation, as a way to represent and mathematically handle systems whose data are not precisely known, dates back to the 1970s. Unfortunately, none of those techniques proved to be successfully applicable in one of the most complex and largest-scale (transportation) settings: that of railway systems. Railway optimisation deals with planning and scheduling problems over several time horizons. Disturbances are inevitable and severely affect the planning process. Here we focus on two compelling aspects of planning: robust planning and online (real-time) planning.
Abstract:
Analyzing and modeling relationships between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects in chemical datasets is a challenging task for scientific researchers in the field of cheminformatics. Therefore, (Q)SAR model validation is essential to ensure future model predictivity on unseen compounds. Proper validation is also one of the requirements of regulatory authorities in order to approve its use in real-world scenarios as an alternative testing method. However, at the same time, the question of how to validate a (Q)SAR model is still under discussion. In this work, we empirically compare k-fold cross-validation with external test set validation. The introduced workflow allows the built and validated models to be applied to large amounts of unseen data, and the performance of the different validation approaches to be compared. Our experimental results indicate that cross-validation produces (Q)SAR models with higher predictivity than external test set validation and reduces the variance of the results. Statistical validation is important for evaluating the performance of (Q)SAR models, but it does not support the user in better understanding the properties of the model or the underlying correlations. We present the 3D molecular viewer CheS-Mapper (Chemical Space Mapper), which arranges compounds in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. Comprehensive functionalities including clustering, alignment of compounds according to their 3D structure, and feature highlighting aid the chemist in better understanding patterns and regularities and in relating the observations to established scientific knowledge. Even though visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow for the investigation of model validation results are still lacking. We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. New functionalities in CheS-Mapper 2.0 facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. Our approach reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org.
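An illustrative sketch, not the workflow from this thesis, of the two validation schemes the abstract compares: k-fold cross-validation versus a single external test set split, shown with scikit-learn on a synthetic dataset. The model choice and data are placeholders for a real (Q)SAR endpoint and descriptor set.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)

# k-fold cross-validation: every compound is used for testing exactly once
cv_scores = cross_val_score(model, X, y, cv=5)
print("5-fold CV accuracy: %.3f +/- %.3f" % (cv_scores.mean(), cv_scores.std()))

# external test set validation: a single hold-out split
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
print("external test set accuracy: %.3f" % model.fit(X_tr, y_tr).score(X_te, y_te))
```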
Abstract:
Purpose of review: Overview of integrated care trials focusing on effectiveness and efficiency published from 2011 to 2013. Recent findings: Eight randomized controlled trials (RCTs) and 21 non-RCT studies were published from 2011 to 2013. Studies differed in several methodological aspects such as study population, psychotherapeutic approaches used, outcome parameters, follow-up times, fidelities, implementation of the integrated care model, and the nation-specific healthcare context with different control conditions. This makes it difficult to draw firm conclusions. Most studies demonstrated relevant improvements regarding symptoms (P = 0.001) and functioning (P = 0.01), quality of life (P = 0.01), adherence (P < 0.05) and patient satisfaction (P = 0.01), and reduction of caregiver stress (P < 0.05). Mean total costs were favorable or at least equal, with positive effects on subjective health favoring integrated care models. Summary: There is an increasing interest in the effectiveness and efficiency of integrated care models in patients with mental disorders, specifically in those with severe and persistent mental illness. To increase generalizability, future trials should describe exactly the rationale and content of the integrated care model and the control conditions.
Abstract:
Transport is the foundation of any economy: it boosts economic growth, creates wealth, enhances trade and geographical accessibility, and supports the mobility of people. Transport is also a key ingredient of a high quality of life, making places accessible and bringing people together. The future prosperity of our world will depend on the ability of all of its regions to remain fully and competitively integrated in the world economy, and efficient transport is vital in making this happen. Operations research can help in efficiently planning the design and operation of transport systems. Planning and operational processes are fields rich in combinatorial optimization problems. These problems can be analyzed and solved through the application of mathematical models and optimization techniques, which may lead to an improvement in the performance of the transport system, as well as to a reduction in the time required for solving these problems. The latter aspect is important because it increases the flexibility of the system: the system can adapt more quickly to changes in the environment (e.g. weather conditions, crew illness, failures). These disturbing changes (called disruptions) often force the schedule to be adapted. The direct consequences are delays and cancellations, implying many schedule adjustments and huge costs. Consequently, robust schedules and recovery plans must be developed in order to cope with disruptions. This dissertation makes contributions to two different fields, rail and air applications, presenting robust planning and recovery methods.
In the field of railway transport we develop several mathematical models that address the needs of RENFE (the major railway operator in Spain):
1. We study the rolling stock assignment problem, introducing some robust aspects in order to improve operations that are likely to fail. Once the rolling stock assignment is known, we propose a robust routing model which aims at identifying the train units' sequences while minimizing the expected delays and the human resources needed to perform the sequences.
2. It is widely accepted that the sequential solving approach produces solutions that are not global optima. Therefore, we develop an integrated and robust model to determine the train schedule and rolling stock assignment. We also propose an integrated model to study the rolling stock circulations, which are determined by the rolling stock assignment and the routing of the train units.
3. Although our aim is to develop robust plans, disruptions are still likely to occur and recovery methods will be needed. Therefore, we propose a recovery method which recovers the train schedule and rolling stock assignment in an integrated fashion while considering passenger demand.
In the field of air transport we develop several mathematical models that address the needs of IBERIA (the major airline in Spain):
1. We look at the airline scheduling problem and develop an integrated approach that optimizes schedule design, fleet assignment and passenger use so as to reduce costs and create fewer incompatibilities between decisions. Robust itineraries are created to reduce the number of misconnected passengers.
2. Air transport operators continuously face competition from other air operators and from different modes of transport (e.g., High Speed Rail). Consequently, airline profitability is critically influenced by the airline's ability to estimate passenger demand and construct profitable flight schedules. We consider multi-modal competition including airline and rail, and develop a new approach that estimates the demand associated with a given schedule and generates airline schedules and fleet assignments using an integrated schedule design and fleet assignment optimization model that captures the impacts of schedule decisions on passenger demand.
Abstract:
This paper presents an overview of depth-averaged modelling of fast catastrophic landslides where the coupling of the solid skeleton and the pore fluids (air and water) is important. The first goal is to show how Biot-Zienkiewicz models can be applied to develop depth-integrated, coupled models. The second objective of the paper is to consider the link which can be established between rheological and constitutive models. Perzyna's viscoplasticity can be considered a general framework within which rheological models such as Bingham and cohesive frictional fluids can be derived. Among the several alternative numerical models, we focus here on SPH, which has not been widely applied by engineers to model landslide propagation. We propose an improvement based on combining finite difference meshes associated with the SPH nodes to describe the pore pressure evolution inside the landslide mass. We devote a section to analyzing the performance of the models, considering three sets of tests and examples which allow us to assess the model performance and limitations: (i) problems having an analytical solution, (ii) small-scale laboratory tests, and (iii) real cases for which we have had access to reliable information.
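As a brief sketch of the rheological link the abstract refers to, the standard textbook forms of the two models are recalled below in generic notation (not necessarily that of the paper): Perzyna's overstress viscoplastic flow rule and the one-dimensional Bingham fluid, which can be recovered as a particular case.

```latex
% Perzyna-type viscoplastic flow rule (generic notation):
% viscoplastic strain rate driven by the overstress <Phi(F)>
\dot{\boldsymbol{\varepsilon}}^{vp} \;=\; \frac{\langle \Phi(F) \rangle}{\eta}\,
  \frac{\partial G}{\partial \boldsymbol{\sigma}}

% One-dimensional Bingham fluid:
% no flow below the yield stress, linear viscous response above it
\tau \;=\; \tau_y + \mu\,\dot{\gamma} \quad (\tau > \tau_y), \qquad
\dot{\gamma} = 0 \quad (\tau \le \tau_y)
```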
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-04
Abstract:
Large-scale disasters are constantly occurring around the world, and in many cases the evacuation of regions of a city is needed. Operational Research/Management Science (OR/MS) has been widely used in emergency planning for over five decades. Warning dissemination, evacuee transportation and shelter management are three ‘Evacuation Support Functions’ (ESF) generic to many hazards. This thesis has adopted a case study approach to illustrate the importance of an integrated approach to evacuation planning and particularly the role of OR/MS models. In the warning dissemination phase, uncertainty in the households’ behaviour as ‘warning informants’ has been investigated along with uncertainties in the warning system. An agent-based model (ABM) was developed for ESF-1 with households as agents and ‘warning informant’ behaviour as the agent behaviour. The model was used to study warning dissemination effectiveness under various conditions of the official channel. In the transportation phase, uncertainties in the households’ behaviour such as departure time (a function of ESF-1), means of transport and destination have been investigated. Households could evacuate as pedestrians, by car or by evacuation buses. An ABM was developed to study the evacuation performance (measured in evacuation travel time). In this thesis, a holistic approach for planning the public evacuation shelters, called the ‘Shelter Information Management System’ (SIMS), has been developed. A generic allocation framework was developed to match available shelter capacity to the shelter demand by considering the evacuation travel time; this was formulated using integer programming. In the sheltering phase, the uncertainty in household shelter choices (either nearest/allocated/convenient) has been studied for its impact on allocation policies using sensitivity analyses. Using analyses from the models and a detailed examination of household states from ‘warning to safety’, it was found that the three ESFs, though sequential in time, have many interdependencies from the perspective of evacuation planning. This thesis has illustrated an OR/MS-based integrated approach including and beyond single-ESF preparedness. The developed approach will help in understanding the inter-linkages of the three evacuation phases and in preparing multi-agency-based evacuation planning.
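An illustrative sketch only, not the SIMS formulation from the thesis: an integer program of the kind mentioned above that allocates evacuation demand to shelters subject to capacity while minimizing total evacuation travel time. Zone and shelter data are hypothetical.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpInteger, PULP_CBC_CMD

zones = {"Z1": 800, "Z2": 500, "Z3": 300}          # evacuees per zone
shelters = {"S1": 900, "S2": 700}                   # shelter capacities
travel_min = {("Z1", "S1"): 12, ("Z1", "S2"): 25,   # travel times in minutes
              ("Z2", "S1"): 18, ("Z2", "S2"): 10,
              ("Z3", "S1"): 30, ("Z3", "S2"): 8}

prob = LpProblem("shelter_allocation", LpMinimize)
x = LpVariable.dicts("assign", list(travel_min), lowBound=0, cat=LpInteger)

# minimize person-minutes of evacuation travel
prob += lpSum(travel_min[z, s] * x[z, s] for (z, s) in travel_min)
for z, demand in zones.items():                     # every evacuee is sheltered
    prob += lpSum(x[z, s] for s in shelters) == demand
for s, cap in shelters.items():                     # shelter capacity limits
    prob += lpSum(x[z, s] for z in zones) <= cap

prob.solve(PULP_CBC_CMD(msg=0))
for (z, s), var in x.items():
    if var.varValue:
        print(f"{z} -> {s}: {int(var.varValue)} evacuees")
```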
Abstract:
The U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on the extensive railroad networks is a very complex problem with major cost implications. Currently, the decision making process for track maintenance planning is largely manual and primarily relies on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems on track maintenance. In this dissertation study, we propose a range of mathematical models and solution algorithms for three network-level scheduling problems on track maintenance: track inspection scheduling problem (TISP), production team scheduling problem (PTSP) and job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams which travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem where thousands of tasks are to be scheduled subject to many difficult side constraints such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision making process. Real-world case studies show the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights. PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach is able to bring significant cost savings and operational efficiency improvement. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle routing problem based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints such as mutual exclusion constraints and rounding constraints were considered. 
The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
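A heavily simplified, hypothetical sketch of the "constructive heuristic plus local search" idea described for the track inspection scheduling problem: an inspection route over track segments is built greedily and then improved with a 2-opt local search. Segment names and coordinates are made up, and the periodicity and working-time constraints handled by the real TISP algorithm are omitted.

```python
import math
from itertools import combinations

segments = {"A": (0, 0), "B": (4, 1), "C": (2, 5), "D": (7, 3), "E": (5, 6)}

def dist(a, b):
    (x1, y1), (x2, y2) = segments[a], segments[b]
    return math.hypot(x1 - x2, y1 - y2)

def route_length(route):
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def constructive(start="A"):
    """Nearest-neighbour construction of an inspection route."""
    route, todo = [start], set(segments) - {start}
    while todo:
        nxt = min(todo, key=lambda s: dist(route[-1], s))
        route.append(nxt)
        todo.remove(nxt)
    return route

def two_opt(route):
    """Repeatedly reverse sub-routes while that shortens the total distance."""
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(1, len(route)), 2):
            cand = route[:i] + route[i:j][::-1] + route[j:]
            if route_length(cand) < route_length(route) - 1e-9:
                route, improved = cand, True
    return route

initial = constructive()
improved = two_opt(initial)
print(initial, round(route_length(initial), 2))
print(improved, round(route_length(improved), 2))
```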
Abstract:
This thesis deals with the efficient solution of optimization problems of practical interest. The first part of the thesis deals with bin packing problems. The bin packing problem (BPP) is one of the oldest and most fundamental combinatorial optimization problems. The bin packing problem and its generalizations arise often in real-world applications, from the manufacturing industry, logistics and transportation of goods, to scheduling. After an introductory chapter, I will present two applications of two of the most natural extensions of bin packing: Chapter 2 will be dedicated to an application of bin packing in two dimensions to a problem of scheduling a set of computational tasks on a computer cluster, while Chapter 3 deals with the generalization of BPP in three dimensions that arises frequently in logistics and transportation, often complemented with additional constraints on the placement of items and characteristics of the solution, like, for example, guarantees on the stability of the items, to avoid potential damage to the transported goods, on the distribution of the total weight of the bins, and on compatibility with loading and unloading operations. The second part of the thesis, and in particular Chapter 4, considers the Transmission Expansion Problem (TEP), where an electrical transmission grid must be expanded so as to satisfy future energy demand at the minimum cost, while maintaining some guarantees of robustness to potential line failures. These problems are gaining importance in a world where a shift towards renewable energy can impose a significant geographical reallocation of generation capacities, resulting in the necessity of expanding current power transmission grids.
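A minimal illustrative sketch, not the algorithms developed in the thesis: the classic first-fit-decreasing heuristic for one-dimensional bin packing, the base problem that the thesis generalizes to two and three dimensions with additional constraints.

```python
def first_fit_decreasing(item_sizes, bin_capacity):
    """Pack items into bins greedily; returns a list of bins (lists of sizes)."""
    bins = []
    for size in sorted(item_sizes, reverse=True):
        for b in bins:
            if sum(b) + size <= bin_capacity:   # first bin with enough room
                b.append(size)
                break
        else:
            bins.append([size])                 # open a new bin
    return bins

print(first_fit_decreasing([4, 8, 1, 4, 2, 1, 6, 5], bin_capacity=10))
# [[8, 2], [6, 4], [5, 4, 1], [1]] -> 4 bins
```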