934 results for Branch and bound algorithms
Abstract:
This thesis describes the design principle of an industrial feeding machine. The system is to be installed between two industrial machines. The apparatus must phase and synchronize the incoming products with the downstream machine; it orders the objects using a series of conveyor belts with adjustable speed. The development was carried out at the Liam Laboratory at the request of the company Sitma. Sitma already produced a system of the kind described in this thesis; its goal is therefore to modernize the previous application, since the device that performed the product phasing was a Siemens PLC that is no longer commercially available. The thesis covers the study of the application and its modelling in Matlab-Simulink, and then proceeds to an implementation, albeit not conclusive, in TwinCAT 3.
Abstract:
Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. These problems, with a scenario-based formulation, lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), where Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. This method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD, called multicolumn-multicut CD, which adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD at the first level and DWD at the second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch and bound search. In this approach, LD subproblems and GBD subproblems are systematically solved in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel-based applications in process systems engineering show that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
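A minimal control-flow sketch of the cross decomposition loop described in this abstract is given below, assuming user-supplied solver stubs for the DWD restricted master, the BD primal problem and the BD relaxed master; it only illustrates how the alternating upper and lower bounds would drive termination, and is not the thesis' implementation.

```python
# Hypothetical sketch of the cross decomposition (CD) loop: DWD and BD iterations
# alternate; upper bounds come from the DWD restricted master and BD primal
# problems, lower bounds from the BD relaxed master.  The three solver callables
# are assumed stubs (in practice each would call an LP/MILP solver).

def cross_decomposition(solve_dwd_restricted_master, solve_bd_primal,
                        solve_bd_relaxed_master, tol=1e-6, max_iter=100):
    upper, lower, incumbent = float("inf"), float("-inf"), None
    cuts, columns = [], []
    for _ in range(max_iter):
        # DWD iteration: the restricted master over the current columns
        # yields a first-stage candidate and an upper bound.
        x_dw, ub_dw = solve_dwd_restricted_master(columns)
        # BD iteration: the primal (second-stage) problems at x_dw yield
        # Benders cuts, new columns and another upper bound.
        new_cuts, new_cols, ub_bd = solve_bd_primal(x_dw)
        cuts += new_cuts
        columns += new_cols
        if min(ub_dw, ub_bd) < upper:
            upper, incumbent = min(ub_dw, ub_bd), x_dw
        # BD relaxed master over all cuts collected so far yields a lower bound.
        _, lb = solve_bd_relaxed_master(cuts)
        lower = max(lower, lb)
        if upper - lower <= tol * max(1.0, abs(upper)):
            break
    return incumbent, lower, upper
```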
Abstract:
The aim of this thesis is to present exact and heuristic algorithms for the integrated planning of multi-energy systems. The idea is to disaggregate the energy system, starting with its core, the Central Energy System, and then proceeding towards the decentral part. A mathematical model for generation expansion operations is therefore first proposed to optimize the performance of a Central Energy System. To ensure that the proposed generation operations are compatible with the network, some extensions of the existing network are considered as well. All these decisions are evaluated both from an economic viewpoint and from an environmental perspective, as specific constraints related to greenhouse gas emissions are imposed in the formulation. The thesis then presents an optimization model for a solar organic Rankine cycle in the context of transactive energy trading. This study examines the impact that this technology can have on peer-to-peer trading in renewable-based community microgrids. Here the consumer becomes a prosumer and engages actively in virtual trading with other prosumers at the distribution system level. Moreover, it investigates how different technological parameters of the solar organic Rankine cycle may affect the final solution. Finally, the thesis introduces a tactical optimization model for scheduling the maintenance operations of a Combined Heat and Power plant. Specifically, two types of cleaning operations are considered, i.e., online cleaning and offline cleaning. Furthermore, a piecewise linear representation of the electric efficiency variation curve is included. Given the challenge of solving the tactical management model, a heuristic algorithm is proposed. The heuristic works by solving the daily operational production scheduling problem, based on the final consumer's demand and on electricity prices. The aggregate information from the operational problem is used to derive maintenance decisions at the tactical level.
Abstract:
Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a 'black-box' agent modelled by a set of propositional logic rules, in which only a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent's system provides the expected output for a given input. If the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems, and we propose and develop models with different logics that we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which requires substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound) using a variety of logics to improve an existing efficient tool based on SAT and ILP.
Abstract:
Optimization seeks the best possible value of an objective function; continuous optimization is optimization over real intervals. There are many global and local search techniques. Global search techniques try to find the global optimum of the optimization problem, whereas local search techniques, which are more widely used, only seek a locally optimal solution within a region of the search space. In Continuous Constraint Satisfaction Problems (CCSPs), constraints are viewed as relations between variables, and the computations are supported by interval analysis. The continuous constraint programming framework provides branch-and-prune algorithms for covering the solution sets of the constraints with sets of interval boxes, which are Cartesian products of intervals. These algorithms begin with an initial crude cover of the feasible space (the Cartesian product of the initial variable domains), which is recursively refined by interleaving pruning and branching steps until a stopping criterion is satisfied. In this work, we seek a convenient way to combine the advantages of CCSP branch-and-prune with local search for global optimization, applied locally over each pruned branch of the CCSP. We apply local search techniques from continuous optimization over the pruned boxes output by the CCSP techniques. We mainly use the steepest descent technique with different characteristics, such as penalty calculation and step length, and implement two main local search algorithms. We use "Procure", a constraint reasoning and global optimization framework, to implement our techniques, and we present results over a set of benchmarks.
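As an illustration of the branch-and-prune scheme just described (an initial box cover recursively refined by interleaving pruning and branching), here is a small self-contained Python sketch; the constraint (the unit disc), the interval tests and the stopping width are illustrative assumptions, not the thesis' implementation.

```python
# Minimal branch-and-prune over interval boxes: cover the solutions of
# x^2 + y^2 <= 1 inside an initial box with small boxes.

def interval_add(a, b):          # sum of two intervals [lo, hi]
    return (a[0] + b[0], a[1] + b[1])

def interval_sq(a):              # square of an interval [lo, hi]
    lo, hi = a
    return (0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi), max(lo * lo, hi * hi))

def branch_and_prune(box, eps=0.1):
    x, y = box
    lo, hi = interval_add(interval_sq(x), interval_sq(y))
    if lo > 1.0:                  # prune: no point of the box can satisfy the constraint
        return []
    width = max(x[1] - x[0], y[1] - y[0])
    if hi <= 1.0 or width <= eps: # fully feasible box, or small enough: keep it
        return [box]
    # branch: split the widest variable domain in half
    if x[1] - x[0] >= y[1] - y[0]:
        m = 0.5 * (x[0] + x[1])
        return branch_and_prune(((x[0], m), y), eps) + branch_and_prune(((m, x[1]), y), eps)
    m = 0.5 * (y[0] + y[1])
    return branch_and_prune((x, (y[0], m)), eps) + branch_and_prune((x, (m, y[1])), eps)

boxes = branch_and_prune(((-2.0, 2.0), (-2.0, 2.0)))
print(len(boxes), "boxes cover the unit disc")
```

In the combined scheme the abstract describes, a local descent would then be run inside each kept box.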
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming "branch-and-prune" algorithms produce safe enclosures of all consistent scenarios. Specially proposed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the "branch-and-prune" algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem and provide functionality in real time.
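A minimal sketch of such a hybridization follows, assuming a uniform distribution and a toy box cover (both illustrative, not the thesis' algorithms): Monte Carlo sampling inside each box of a branch-and-prune cover estimates the probability of the consistent scenarios.

```python
import random

def estimate_probability(boxes, consistent, domain_volume, samples_per_box=2000):
    """Monte Carlo estimate of the probability mass of the consistent scenarios,
    given a box cover produced by branch-and-prune and a uniform distribution."""
    mass = 0.0
    for (xlo, xhi), (ylo, yhi) in boxes:
        hits = sum(
            consistent(random.uniform(xlo, xhi), random.uniform(ylo, yhi))
            for _ in range(samples_per_box)
        )
        mass += (xhi - xlo) * (yhi - ylo) * hits / samples_per_box
    return mass / domain_volume

# toy cover of [-1, 1]^2 by four unit boxes; the consistent scenarios are the unit disc
cover = [((a, a + 1.0), (b, b + 1.0)) for a in (-1.0, 0.0) for b in (-1.0, 0.0)]
p = estimate_probability(cover, lambda x, y: x * x + y * y <= 1.0, 4.0)
print(f"estimated probability ≈ {p:.3f} (exact: {3.14159 / 4:.3f})")
```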
Abstract:
This paper presents a simple Optimised Search Heuristic for the Job Shop Scheduling problem that combines a GRASP heuristic with a branch-and-bound algorithm. The proposed method is compared with similar approaches and leads to better results in terms of solution quality and computing times.
Abstract:
We present a new branch and bound algorithm for weighted Max-SAT, called Lazy, which incorporates original data structures and inference rules, as well as a lower bound of better quality. We provide experimental evidence that our solver is very competitive and outperforms some of the best-performing Max-SAT and weighted Max-SAT solvers on a wide range of instances.
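For illustration, a compact depth-first branch and bound for weighted Max-SAT is sketched below (minimising the total weight of falsified clauses). The lower bound used here, the weight of clauses already falsified by the partial assignment, is deliberately simple; Lazy's data structures, inference rules and stronger lower bound are not reproduced.

```python
def weighted_maxsat(clauses, n_vars):
    """clauses: list of (weight, [literals]); literal v or -v for variable v in 1..n_vars.
    Returns the minimum total weight of falsified clauses."""
    best = {"cost": sum(w for w, _ in clauses)}   # trivial upper bound: falsify everything

    def falsified_weight(assign):
        # weight of clauses whose literals are all assigned and all false
        cost = 0
        for w, lits in clauses:
            if all(abs(l) in assign and assign[abs(l)] != (l > 0) for l in lits):
                cost += w
        return cost

    def dfs(var, assign):
        lb = falsified_weight(assign)             # lower bound on any completion
        if lb >= best["cost"]:
            return                                # prune
        if var > n_vars:
            best["cost"] = lb                     # complete assignment improves the incumbent
            return
        for value in (True, False):               # branch on the next variable
            assign[var] = value
            dfs(var + 1, assign)
            del assign[var]

    dfs(1, {})
    return best["cost"]

# toy instance: (x1 or x2, weight 3), (not x1, weight 2), (not x2, weight 1)
print(weighted_maxsat([(3, [1, 2]), (2, [-1]), (1, [-2])], 2))  # -> 1
```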
Abstract:
This work aimed to develop and test an algorithm based on the tabu search (TS) metaheuristic for solving forest management problems with integrality constraints. The problems evaluated had between 93 and 423 decision variables, subject to singularity constraints and to minimum and maximum periodic production constraints. All problems had the maximization of net present value as their objective. The TS algorithm was coded in Delphi 5.0 and the tests were carried out on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. The performance of TS was evaluated according to effectiveness and efficiency measures. The different values or categories of the TS parameters were tested and compared with respect to their effects on the effectiveness of the algorithm. The best parameter configuration was selected with the L&O test, at 1% probability, and the analyses were performed using descriptive statistics. The best parameter configuration gave TS a mean effectiveness of 95.97% of the mathematical optimum, with a minimum of 90.39%, a maximum of 98.84% and a coefficient of variation of 2.48%. For the largest problem, the efficiency of TS was twice that of the exact branch and bound algorithm, making it a very attractive approach for solving important forest management problems.
Abstract:
The objectives of this work were to develop and test a genetic algorithm (GA) for solving forest management problems with integrality constraints. The GA was tested on four problems containing between 93 and 423 decision variables, subject to singularity constraints and to minimum and maximum periodic production constraints. All problems had the maximization of net present value as their objective. The GA was coded in Delphi 5.0 and the tests were carried out on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. The performance of the GA was evaluated according to effectiveness and efficiency measures. The values or categories of the GA parameters were tested and compared with respect to their effects on the effectiveness of the algorithm. The best parameter configuration was selected with the L&O test, at 1% probability, and the analyses were performed using descriptive statistics. The best parameter configuration gave the GA a mean effectiveness of 94.28%, a minimum of 90.01% and a maximum of 98.48%, with a coefficient of variation of 2.08%, relative to the mathematical optimum obtained by the exact branch and bound algorithm. For the largest problem, the efficiency of the GA was five times that of the exact branch and bound algorithm. The GA proved to be a very attractive approach for solving important forest management problems.
Abstract:
The objectives of this work were to develop and test the simulated annealing (SA) metaheuristic for solving forest management problems with integrality constraints. The SA algorithm developed was tested on four problems containing between 93 and 423 decision variables, subject to singularity constraints and to minimum and maximum periodic production constraints. All problems had the maximization of net present value as their objective. The SA algorithm was coded in Delphi 5.0 and the tests were carried out on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. The performance of SA was evaluated according to effectiveness and efficiency measures. The different values or categories of the SA parameters were tested and compared with respect to their effects on the effectiveness of the algorithm. The best parameter configuration was selected with the L&O test, at 1% probability, and the analyses were performed using descriptive statistics. The best parameter configuration gave SA a mean effectiveness of 95.36%, a minimum of 83.66%, a maximum of 100% and a coefficient of variation of 3.18%, relative to the mathematical optimum obtained by the exact branch and bound algorithm. For the largest problem, the efficiency of SA was ten times that of the exact branch and bound algorithm. The good performance of this heuristic reinforces the conclusions, drawn in other studies, about its great potential for solving important forest management problems that are hard to solve with current computational tools.
Abstract:
Many quantitative problems from widely different fields can be described as optimization problems: a measure of solution quality is to be optimized while certain conditions on the solution are satisfied. The quality measure is usually called the objective function and may describe costs (e.g., production, logistics), potential energy (molecular modelling, protein folding), risk (finance, insurance) or some other relevant quantity. This doctoral thesis discusses in particular nonlinear programming, NLP, in finite dimensions. Problems with simple structure, for example some form of convexity, can be solved efficiently. Unfortunately, not all quantitative relationships can be modelled in a convex way. Non-convex problems can be attacked with heuristic methods, algorithms that search for solutions using deterministic or stochastic rules of thumb. Sometimes this works well, but heuristics can rarely guarantee the quality of the solution, or even that a solution is found at all. For some applications this is unacceptable. Instead, so-called global optimization can be applied: by successively dividing the variable domain into smaller parts and computing stronger bounds on the optimal value, a solution within the error tolerance is found. This method is called branch-and-bound. To provide lower bounds (when minimizing), the problem is approximated by simpler problems, for example convex ones, that can be solved efficiently. The thesis studies approaches for approximating differentiable functions with convex underestimators, in particular the so-called alphaBB method. This method adds perturbations of a certain form and guarantees convexity by imposing conditions on the perturbed Hessian matrix. This research highlights a natural extension of the perturbations used in alphaBB. New methods for determining underestimation parameters are described and compared. The summary part discusses global optimization from a broader perspective on optimization and computational algorithms.
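For reference, the classical alphaBB underestimator takes the following standard form from the literature (the thesis studies extensions of these perturbations; the uniform choice of alpha shown last is only one common option):

```latex
% Classical alphaBB convex underestimator of f on the box [x^L, x^U]:
% the quadratic perturbation is nonpositive on the box and vanishes at its corners.
\[
  \breve{f}(x) \;=\; f(x) \;+\; \sum_{i=1}^{n} \alpha_i \,(x_i^{L}-x_i)(x_i^{U}-x_i),
  \qquad \alpha_i \ge 0 .
\]
% Convexity is guaranteed when the perturbed Hessian is positive semidefinite on the box:
\[
  \nabla^2 f(x) + 2\,\mathrm{diag}(\alpha) \succeq 0
  \quad \text{for all } x \in [x^{L},x^{U}],
  \qquad \text{e.g. } \alpha_i \ge \max\Bigl\{0,\, -\tfrac{1}{2}\min_{x\in[x^{L},x^{U}]}\lambda_{\min}\bigl(\nabla^2 f(x)\bigr)\Bigr\}.
\]
```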
Abstract:
The forest industry is a sector that, even though it is in decline, lies at the heart of the debate on globalization and sustainable development. For many countries such as Canada, Sweden and Chile, the objective is to maintain a thriving sector without harming the environment, while recognizing the finite nature of the resources. It becomes important to be competitive and to exploit forest territories efficiently, from harvesting to manufacturing at the mills, including transportation, whose costs are rising rapidly. The objective of this thesis is to develop a tactical/operational planning model that schedules the activities of a harvest year so as to satisfy mill demands, while accounting for the transportation of harvested volumes and the management of mill inventories. The year is divided into 26 two-week periods. We seek the schedules and the assignment of harvest teams to cutting blocks for a full year. The mathematical model developed is a mixed-integer linear program whose structure is based on each stage of the forest supply chain. We choose to solve it with an exact method, branch-and-bound. We were able to assess how difficult the direct solution of our planning problem is for instances with a large number of periods. However, the rolling-horizon approach proved fruitful: with it, the harvesting activities of the blocks for the whole year (26 periods) can be planned within one day.
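A schematic rolling-horizon loop of the kind referred to above is sketched below (a hypothetical Python illustration; `solve_window` stands in for the mixed-integer model solved by branch-and-bound over a limited window of periods, and the window and freeze lengths are assumptions):

```python
def rolling_horizon(n_periods, window, freeze, solve_window, initial_state=None):
    """Plan n_periods periods by repeatedly solving window-period subproblems
    and committing (freezing) the first `freeze` periods of each solution."""
    plan, state = [], initial_state
    start = 0
    while start < n_periods:
        periods = range(start, min(start + window, n_periods))
        # solve the MILP over the current window, starting from the carried-over state
        decisions, state = solve_window(periods, state)
        plan.extend(decisions[:freeze])   # keep only the frozen periods' decisions
        start += freeze                   # roll the window forward
    return plan
```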
Abstract:
Thesis completed under joint supervision (cotutelle) with the Université d'Avignon.
Abstract:
Network design problems have received particular interest and have been widely studied because of their numerous applications in various fields, such as transportation and telecommunications. In this thesis we address the network design problem with capacity expansion costs. The problem consists of installing a set of facilities on a network in order to satisfy demand while respecting capacity constraints, with each arc allowed to carry several facilities. The objective is to minimize the variable costs of transporting commodities and the fixed costs of installing or expanding facility capacity. The method we propose to solve this problem is based on mixed-integer linear programming techniques, in particular column generation and cutting planes. These methods are embedded in a general branch-and-bound algorithm based on the linear relaxation. We tested our method on four groups of instances of different sizes and compared it with CPLEX, one of the best solvers for optimization problems, as well as with a method from the literature combining exact and heuristic methods. Our method outperformed both, especially on very large instances.
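A minimal column generation loop of the kind embedded at each node of such a branch-and-bound is sketched below (hypothetical; `solve_restricted_master` and `solve_pricing` are assumed stubs standing in for the restricted master LP and the pricing problem, and cut generation is omitted):

```python
def column_generation(columns, solve_restricted_master, solve_pricing, max_iter=1000):
    """Solve the LP relaxation by iteratively adding improving columns."""
    for _ in range(max_iter):
        # restricted master over the current columns gives a primal solution,
        # dual prices and the current objective value
        primal, duals, value = solve_restricted_master(columns)
        new_columns = solve_pricing(duals)   # columns with negative reduced cost
        if not new_columns:
            return primal, value             # no improving column: LP optimum reached
        columns.extend(new_columns)
    raise RuntimeError("column generation did not converge within max_iter")
```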