970 results for Branch and Bound


Relevance: 100.00%

Abstract:

Supply chain operations directly affect service levels. Decisions on adding or modifying facilities are generally based on overall cost, leaving out the efficiency of each individual unit. By decomposing the supply chain superstructure, an efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can be carried out easily. With the proposed algorithm, facilities are selected on the basis of service-level maximization rather than cost minimization alone, since the analysis filters all feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are retained, which in turn leads to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm: a Branch and Efficiency (B&E) algorithm is deployed to solve the problem. In this DEA view, each candidate solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is therefore characterized by inputs and outputs. Through additional constraints named "efficiency cuts", the algorithm selects only efficient solutions, providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
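
The DEA filtering step can be made concrete with a small sketch. Assuming SciPy is available, the function below scores each candidate facility as a Decision Making Unit with the standard input-oriented CCR multiplier model and keeps only the candidates that come out efficient; the function names, inputs, outputs and data are invented for illustration and are not the paper's B&E formulation or its efficiency cuts.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """CCR efficiency of DMU o; X is (n x m) inputs, Y is (n x s) outputs."""
        n, m = X.shape
        s = Y.shape[1]
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o (u first, then v)
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # normalisation: v . x_o = 1
        A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0 for every DMU j
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
        return -res.fun

    def efficient_candidates(X, Y, tol=1e-6):
        # Keep only candidates whose efficiency reaches 1 (within tolerance).
        return [o for o in range(len(X)) if ccr_efficiency(X, Y, o) >= 1.0 - tol]

    # Toy data: 4 candidate warehouses, inputs = [cost, lead time], output = [service level].
    X = np.array([[10.0, 2.0], [8.0, 4.0], [12.0, 1.5], [9.0, 3.0]])
    Y = np.array([[0.95], [0.90], [0.97], [0.85]])
    print(efficient_candidates(X, Y))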

Relevance: 100.00%

Abstract:

Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. With a scenario-based formulation, these problems lead to large-scale but well-structured MILPs/MINLPs. The first part of the thesis proposes a new, finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. The method alternates between DWD iterations and BD iterations: DWD restricted master problems and BD primal problems yield a sequence of upper bounds, while BD relaxed master problems yield a sequence of lower bounds. A variant of CD that adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, called multicolumn-multicut CD, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed; in this approach, CD at the first level and DWD at a second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy and over solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for an explicit branch and bound search. In this approach, LD subproblems and GBD subproblems are solved systematically in a single framework, and the relaxed master problem obtained from the reformulation of the original problem is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel based applications in process systems engineering show that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
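
As a concrete illustration of the bounding mechanism such schemes build on, here is a minimal sketch of the classical Benders (L-shaped) loop for a toy two-stage stochastic LP with a binary first stage. The problem data, cut management and stopping tolerance are invented, and SciPy >= 1.9 is assumed for linprog's integrality option; the thesis's CD, ECD and joint-decomposition algorithms are not reproduced here.

    import numpy as np
    from scipy.optimize import linprog

    c = np.array([3.0, 2.0])                    # first-stage costs (x binary)
    q = np.array([1.0, 1.5])                    # second-stage costs
    W = np.array([[1.0, 1.0], [2.0, 0.5]])      # recourse matrix
    T = np.array([[1.0, 0.0], [0.0, 1.0]])      # technology matrix
    scenarios = [(0.5, np.array([4.0, 5.0])),   # (probability, right-hand side h_s)
                 (0.5, np.array([6.0, 3.0]))]

    def scenario_dual(x, h):
        # Dual of  min q.y  s.t.  W y >= h - T x, y >= 0; its solution yields a Benders cut.
        b = h - T @ x
        res = linprog(-b, A_ub=W.T, b_ub=q, bounds=(0, None), method="highs")
        return res.x, b @ res.x                 # dual vector and scenario recourse cost

    cuts, best_ub, best_x, x = [], np.inf, None, np.zeros(2)
    for _ in range(20):
        a, g, recourse = 0.0, np.zeros(2), 0.0
        for p, h in scenarios:                  # BD primal step: evaluate x, collect a cut
            pi, val = scenario_dual(x, h)
            a += p * (h @ pi)
            g -= p * (T.T @ pi)
            recourse += p * val
        if c @ x + recourse < best_ub:          # upper bound from a feasible first stage
            best_ub, best_x = c @ x + recourse, x.copy()
        cuts.append((a, g))
        # BD relaxed master: min c.x + theta  s.t.  theta >= a_k + g_k.x, x binary -> lower bound.
        A_ub = np.array([[*gk, -1.0] for ak, gk in cuts])
        b_ub = np.array([-ak for ak, gk in cuts])
        res = linprog(np.append(c, 1.0), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, 1), (0, 1), (0, None)],   # theta >= 0 is valid since q >= 0
                      integrality=[1, 1, 0], method="highs")
        x, lb = res.x[:2], res.fun
        if best_ub - lb < 1e-6:                 # bounds meet: best_x is optimal
            break
    print(best_x, best_ub)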

Relevance: 100.00%

Abstract:

The flowshop scheduling problem with blocking in-process is addressed in this paper. In this environment there are no buffers between successive machines; therefore, intermediate queues of jobs waiting in the system for their next operation are not allowed. Heuristic approaches are proposed to minimize the total tardiness criterion. A constructive heuristic that exploits specific characteristics of the problem is presented. Moreover, a GRASP-based heuristic is proposed and coupled with a path relinking strategy to search for better outcomes. Computational tests are presented, and comparisons with an adaptation of the NEH algorithm and with a branch-and-bound algorithm indicate that the new approaches are promising. (c) 2007 Elsevier Ltd. All rights reserved.
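
To make the evaluation of a sequence under blocking concrete, here is a minimal sketch of the standard departure-time recursion and the resulting total tardiness; the processing times, due dates and sequence are invented, and the constructive, GRASP and path-relinking heuristics themselves are not reproduced.

    def total_tardiness(seq, p, due):
        """Total tardiness of a job sequence in a blocking flowshop (at least two machines).
        p[j][i] is the processing time of job j on machine i; with no buffers, a job may
        remain on a machine (blocking it) until the next machine becomes free."""
        m = len(p[0])
        prev = [0.0] * m                 # departure times of the previous job in the sequence
        total = 0.0
        for j in seq:
            d = [0.0] * m
            d[0] = max(prev[0] + p[j][0], prev[1])          # blocked until machine 2 is free
            for i in range(1, m - 1):
                d[i] = max(d[i - 1] + p[j][i], prev[i + 1])
            d[m - 1] = d[m - 2] + p[j][m - 1]               # the last machine never blocks
            total += max(0.0, d[m - 1] - due[j])
            prev = d
        return total

    # Toy instance: 4 jobs, 3 machines.
    p = [[3, 2, 4], [1, 4, 2], [2, 2, 3], [4, 1, 1]]
    due = [9, 7, 12, 10]
    print(total_tardiness([1, 0, 2, 3], p, due))            # -> 9.0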

Relevance: 100.00%

Abstract:

Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a 'black-box' agent modelled by a set of propositional logic rules, in which only a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent's system provides the expected output for a given input; if the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnostic problems, and we propose and develop models with different logics, which we formalise and generalise as far as possible. We also prove that all techniques generalise to agents and to multiple faults. The multi-valued logics developed extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency, usually on faults; such logics thus allow an arbitrary number of diagnostic theories to be modelled. Each problem is then solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which require substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, as well as the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive branch-and-bound approach using a variety of logics to improve an existing efficient tool based on SAT and ILP.
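
As a toy illustration of the fault-finding task (not the dissertation's multi-valued logics or CLP machinery), the sketch below brute-forces the single stuck-at faults of a tiny invented circuit that are consistent with an observed input/output pair.

    GATES = {                       # name: (boolean function, input names); toy 3-gate circuit
        "g1": (lambda a, b: a and b, ("x1", "x2")),
        "g2": (lambda a, b: a or b, ("x2", "x3")),
        "out": (lambda a, b: not (a and b), ("g1", "g2")),
    }

    def simulate(inputs, fault=None):
        """Evaluate the circuit; `fault` = (gate, stuck_value) forces that gate's output."""
        values = dict(inputs)
        for name, (fn, ins) in GATES.items():      # dict order doubles as topological order here
            v = fn(*(values[i] for i in ins))
            if fault and fault[0] == name:
                v = fault[1]
            values[name] = v
        return values["out"]

    def diagnose(observations):
        """Single stuck-at faults consistent with every observed (inputs, output) pair."""
        candidates = [(g, v) for g in GATES for v in (False, True)]
        return [f for f in candidates
                if all(simulate(i, f) == o for i, o in observations)]

    # The fault-free model predicts False for this input, but the device shows True.
    print(diagnose([({"x1": 1, "x2": 1, "x3": 0}, True)]))
    # -> [('g1', False), ('g2', False), ('out', True)]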

Relevance: 100.00%

Abstract:

The selection of resource systems plays an important role in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, as this paper points out, resource system selection remains a difficult problem to solve in a D/A/V E. Broadly speaking, the selection problem has been formulated from different perspectives, giving rise to different kinds of models and algorithms to solve it. To support the development of an intelligent and flexible web prototype (broker) tool that integrates all the activities and tools of the selection model and can adapt to each D/A/V E project or instance (the major goal of our overall project), this paper presents a formulation of one kind of resource selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using the simplex and branch and bound algorithms, and we identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, and they define the domain of applicability of the algorithms for the problem studied. The limitations detected indicate the need for other kinds of algorithms (approximate-solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important in order to develop and select, based on the features of each problem, the most suitable algorithm that guarantees good performance.
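
A toy instance of this kind of integer program (each processing task assigned exactly one of its pre-selected resources at minimum cost) can be written down and solved directly. The sketch below uses SciPy's HiGHS-based MILP support (SciPy >= 1.9 assumed) with invented costs; it is not the paper's formulation or broker tool.

    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([[4.0, 2.0, 5.0],    # cost[t, r]: processing task t done by resource r
                     [3.0, 6.0, 1.0],
                     [2.0, 2.0, 4.0]])
    n_tasks, n_res = cost.shape

    c = cost.ravel()                     # one binary variable per (task, resource) pair
    A_eq = np.zeros((n_tasks, c.size))
    for t in range(n_tasks):
        A_eq[t, t * n_res:(t + 1) * n_res] = 1.0   # each task uses exactly one resource
    res = linprog(c, A_eq=A_eq, b_eq=np.ones(n_tasks), bounds=(0, 1),
                  integrality=np.ones(c.size), method="highs")
    assignment = res.x.reshape(cost.shape).round().astype(int)
    print(assignment, res.fun)           # minimum-cost assignment and its total cost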

Relevance: 100.00%

Abstract:

Project work carried out to obtain the degree of Master in Informatics and Computer Engineering.

Relevance: 100.00%

Abstract:

This paper presents a simple Optimised Search Heuristic for the Job Shop Scheduling problem that combines a GRASP heuristic with a branch-and-bound algorithm. The proposed method is compared with similar approaches and leads to better results in terms of solution quality and computing times.
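
The GRASP half of such a hybrid follows a standard pattern: greedy randomised construction followed by local search, repeated for a number of iterations. The sketch below shows that pattern on a toy single-machine total-tardiness instance rather than a job shop; the data, the restricted-candidate-list size and the swap neighbourhood are invented for illustration and are not the paper's method.

    import random

    proc = [4, 2, 6, 3]                 # processing times
    due = [5, 3, 11, 7]                 # due dates

    def tardiness(seq):
        t = total = 0
        for j in seq:
            t += proc[j]
            total += max(0, t - due[j])
        return total

    def construct(alpha=2):
        # Greedy randomised construction: pick among the `alpha` most urgent remaining jobs.
        remaining, seq = list(range(len(proc))), []
        while remaining:
            remaining.sort(key=lambda j: due[j])
            seq.append(remaining.pop(random.randrange(min(alpha, len(remaining)))))
        return seq

    def local_search(seq):
        # First-improvement local search over adjacent swaps.
        improved = True
        while improved:
            improved = False
            for i in range(len(seq) - 1):
                cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
                if tardiness(cand) < tardiness(seq):
                    seq, improved = cand, True
        return seq

    random.seed(0)
    best = min((local_search(construct()) for _ in range(50)), key=tardiness)
    print(best, tardiness(best))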

Relevance: 100.00%

Abstract:

We present a new branch and bound algorithm for weighted Max-SAT, called Lazy, which incorporates original data structures and inference rules as well as a lower bound of better quality. We provide experimental evidence that our solver is very competitive and outperforms some of the best-performing Max-SAT and weighted Max-SAT solvers on a wide range of instances.
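
For readers unfamiliar with the setting, the sketch below is a bare-bones branch and bound for weighted Max-SAT that uses the weight of already-falsified clauses as a weak but valid lower bound; the Lazy solver's data structures, inference rules and stronger bound are not reproduced, and the clause set is invented.

    def falsified_weight(clauses, assignment):
        """Total weight of clauses whose literals are all assigned and all false."""
        w = 0
        for weight, lits in clauses:
            vals = [assignment.get(abs(l)) for l in lits]
            if None not in vals and not any((l > 0) == v for l, v in zip(lits, vals)):
                w += weight
        return w

    def max_sat(clauses, n_vars, assignment=None, best=float("inf")):
        """Minimum total weight of falsified clauses, found by branch and bound."""
        assignment = assignment or {}
        cost = falsified_weight(clauses, assignment)   # falsified weight only grows deeper down
        if cost >= best:                               # prune: bound no better than incumbent
            return best
        if len(assignment) == n_vars:
            return cost
        var = len(assignment) + 1                      # next unassigned variable (1..n_vars)
        for value in (True, False):
            assignment[var] = value
            best = min(best, max_sat(clauses, n_vars, assignment, best))
            del assignment[var]
        return best

    # Clauses as (weight, literals); literal k means variable k, -k its negation.
    clauses = [(3, [1]), (2, [-1]), (1, [-2, 3]), (4, [2, -3])]
    print(max_sat(clauses, n_vars=3))                  # -> 2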

Relevance: 100.00%

Abstract:

The objectives of this work were to develop and test an algorithm based on the tabu search (TS) metaheuristic for solving forest management problems with integrality constraints. The problems evaluated had between 93 and 423 decision variables, subject to singularity constraints and periodic minimum and maximum production constraints. In all problems the objective was to maximize net present value. The TS algorithm was coded in Delphi 5.0 and the tests were run on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. TS performance was evaluated in terms of effectiveness and efficiency measures. Different values or categories of the TS parameters were tested and compared with respect to their effect on the algorithm's effectiveness. The best parameter configuration was selected using the L&O test at the 1% probability level, and the analyses were carried out with descriptive statistics. With the best parameter configuration, TS achieved a mean effectiveness of 95.97% of the mathematical optimum, with a minimum of 90.39%, a maximum of 98.84% and a coefficient of variation of 2.48%. For the largest problem, TS was twice as efficient as the exact branch and bound algorithm, making it a very attractive approach for solving important forest management problems.
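
As a small illustration of the tabu search mechanics (not the paper's Delphi implementation or its forest-management instances), the sketch below runs a bit-flip tabu search, with a fixed tenure and an aspiration criterion, on an invented 0/1 harvest-selection toy where total volume must fall inside a window and net present value is maximized via a penalty function.

    import random

    npv = [12, 7, 9, 15, 4, 11, 8, 6]         # net present value of harvesting each stand
    volume = [5, 3, 4, 7, 2, 5, 4, 3]         # harvested volume of each stand
    vmin, vmax, penalty = 12, 18, 1000        # total-volume window and constraint penalty

    def score(x):
        v = sum(vi for vi, xi in zip(volume, x) if xi)
        p = max(0, vmin - v) + max(0, v - vmax)
        return sum(ni for ni, xi in zip(npv, x) if xi) - penalty * p

    def tabu_search(iters=200, tenure=5, seed=0):
        random.seed(seed)
        x = [random.randint(0, 1) for _ in npv]
        best, best_score = x[:], score(x)
        expires = {}                          # flipped stand -> iteration at which it frees up
        for it in range(iters):
            moves = []
            for i in range(len(x)):
                y = x[:]
                y[i] ^= 1                     # neighbourhood: flip one stand in or out
                s = score(y)
                if expires.get(i, -1) <= it or s > best_score:   # tabu unless it aspirates
                    moves.append((s, i, y))
            s, i, x = max(moves)              # best admissible move, even if it worsens
            expires[i] = it + tenure
            if s > best_score:
                best, best_score = x[:], s
        return best, best_score

    print(tabu_search())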

Relevance: 100.00%

Abstract:

The objectives of this work were to develop and test a genetic algorithm (GA) for solving forest management problems with integrality constraints. The GA was tested on four problems containing between 93 and 423 decision variables, subject to singularity constraints and periodic minimum and maximum production constraints. In all problems the objective was to maximize net present value. The GA was coded in Delphi 5.0 and the tests were run on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. GA performance was evaluated in terms of effectiveness and efficiency measures. The values or categories of the GA parameters were tested and compared with respect to their effect on the algorithm's effectiveness. The best parameter configuration was selected using the L&O test at the 1% probability level, and the analyses were carried out with descriptive statistics. With the best parameter configuration, the GA achieved a mean effectiveness of 94.28% of the mathematical optimum obtained by the exact branch and bound algorithm, with a minimum of 90.01%, a maximum of 98.48% and a coefficient of variation of 2.08%. For the largest problem, the GA was five times more efficient than the exact branch and bound algorithm. The GA proved to be a very attractive approach for solving important forest management problems.
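
For comparison with the tabu search sketch above, here is a plain binary genetic algorithm (tournament selection, one-point crossover, bit-flip mutation, simple elitism) on the same style of invented 0/1 harvest-selection toy; the parameters and data are illustrative, not the paper's tuned configuration.

    import random

    npv = [12, 7, 9, 15, 4, 11, 8, 6]
    volume = [5, 3, 4, 7, 2, 5, 4, 3]
    vmin, vmax, penalty = 12, 18, 1000

    def fitness(x):
        v = sum(vi for vi, xi in zip(volume, x) if xi)
        return (sum(ni for ni, xi in zip(npv, x) if xi)
                - penalty * (max(0, vmin - v) + max(0, v - vmax)))

    def ga(pop_size=30, generations=100, p_mut=0.05, seed=0):
        random.seed(seed)
        pop = [[random.randint(0, 1) for _ in npv] for _ in range(pop_size)]
        for _ in range(generations):
            new = [max(pop, key=fitness)]                     # elitism: keep the best individual
            while len(new) < pop_size:
                p1 = max(random.sample(pop, 2), key=fitness)  # binary tournament selection
                p2 = max(random.sample(pop, 2), key=fitness)
                cut = random.randrange(1, len(npv))           # one-point crossover
                child = p1[:cut] + p2[cut:]
                child = [b ^ (random.random() < p_mut) for b in child]   # bit-flip mutation
                new.append(child)
            pop = new
        best = max(pop, key=fitness)
        return best, fitness(best)

    print(ga())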

Relevance: 100.00%

Abstract:

The objectives of this work were to develop and test the simulated annealing (SA) metaheuristic for solving forest management problems with integrality constraints. The SA algorithm was tested on four problems containing between 93 and 423 decision variables, subject to singularity constraints and periodic minimum and maximum production constraints. In all problems the objective was to maximize net present value. The SA algorithm was coded in Delphi 5.0 and the tests were run on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. SA performance was evaluated in terms of effectiveness and efficiency measures. Different values or categories of the SA parameters were tested and compared with respect to their effect on the algorithm's effectiveness. The best parameter configuration was selected using the L&O test at the 1% probability level, and the analyses were carried out with descriptive statistics. With the best parameter configuration, SA achieved a mean effectiveness of 95.36% of the mathematical optimum obtained by the exact branch and bound algorithm, with a minimum of 83.66%, a maximum of 100% and a coefficient of variation of 3.18%. For the largest problem, SA was ten times more efficient than the exact branch and bound algorithm. The good performance of this heuristic reinforces conclusions drawn in other studies about its great potential for solving important forest management problems that are hard to solve with current computational tools.
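
In the same spirit, the sketch below shows the core of simulated annealing (the Metropolis acceptance rule with geometric cooling) on the same kind of invented 0/1 harvest-selection toy; the cooling schedule and data are illustrative, not the paper's tuned configuration.

    import math
    import random

    npv = [12, 7, 9, 15, 4, 11, 8, 6]
    volume = [5, 3, 4, 7, 2, 5, 4, 3]
    vmin, vmax, penalty = 12, 18, 1000

    def score(x):
        v = sum(vi for vi, xi in zip(volume, x) if xi)
        return (sum(ni for ni, xi in zip(npv, x) if xi)
                - penalty * (max(0, vmin - v) + max(0, v - vmax)))

    def anneal(t0=50.0, cooling=0.95, steps_per_t=40, t_min=0.1, seed=0):
        random.seed(seed)
        x = [random.randint(0, 1) for _ in npv]
        best, t = x[:], t0
        while t > t_min:
            for _ in range(steps_per_t):
                y = x[:]
                y[random.randrange(len(y))] ^= 1              # flip one stand in or out
                delta = score(y) - score(x)
                if delta >= 0 or random.random() < math.exp(delta / t):
                    x = y                                      # Metropolis acceptance rule
                    if score(x) > score(best):
                        best = x[:]
            t *= cooling                                       # geometric cooling schedule
        return best, score(best)

    print(anneal())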

Relevance: 100.00%

Abstract:

Many quantitative problems from widely different fields can be described as optimisation problems: a measure of solution quality is to be optimised while certain conditions on the solution are satisfied. The quality measure is usually called the objective function and may describe costs (for example in production or logistics), potential energy (molecular modelling, protein folding), risk (finance, insurance) or any other relevant quantity. My doctoral thesis focuses in particular on nonlinear programming (NLP) in finite dimensions. Problems with simple structure, for example some form of convexity, can be solved efficiently. Unfortunately, not all quantitative relationships can be modelled in a convex way. Non-convex problems can be attacked with heuristic methods, algorithms that search for solutions using deterministic or stochastic rules of thumb. Sometimes this works well, but heuristics can rarely guarantee the quality of the solution, or even that a solution is found at all. For some applications this is unacceptable. Instead, one can apply so-called global optimisation. By successively dividing the variable domain into smaller parts and computing ever tighter bounds on the optimal value, a solution within the error tolerance is found. This method is called branch-and-bound. To obtain lower bounds (when minimising), the problem is approximated by simpler problems, for example convex ones, that can be solved efficiently. The thesis studies approaches for approximating differentiable functions with convex underestimators, in particular the so-called alphaBB method. This method adds perturbations of a certain form and guarantees convexity by imposing conditions on the perturbed Hessian matrix. My research has brought out a natural extension of the perturbations used in alphaBB, and new methods for determining the underestimation parameters have been described and compared. The summary part of the thesis discusses global optimisation from a broader perspective on optimisation and computational algorithms.
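
The classical alphaBB perturbation that the thesis extends can be stated briefly: on a box [xL, xU], L(x) = f(x) + alpha (xL - x)(xU - x) underestimates f, and in one dimension L is convex whenever alpha >= max(0, -1/2 min f''). The sketch below checks this on a one-dimensional toy function; it illustrates the standard construction only, not the thesis's generalised perturbations or its new parameter-selection methods.

    import numpy as np

    xL, xU = 0.0, 2.0 * np.pi
    f = np.sin                       # nonconvex on [0, 2*pi]; f''(x) = -sin(x) >= -1
    alpha = 0.5                      # alpha >= max(0, -0.5 * min f'') = 0.5 suffices

    def underestimator(x):
        # L(x) = f(x) + alpha * (xL - x) * (xU - x): the added term is <= 0 on the box,
        # and L'' = f'' + 2 * alpha >= 0, so L is a convex underestimator of f there.
        return f(x) + alpha * (xL - x) * (xU - x)

    xs = np.linspace(xL, xU, 1001)
    assert np.all(underestimator(xs) <= f(xs) + 1e-12)        # underestimation on the box
    # Minimising L over the box gives a valid lower bound on min f, which is exactly
    # what a branch-and-bound node needs before splitting the box further.
    print(xs[np.argmin(underestimator(xs))], underestimator(xs).min())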

Relevance: 100.00%

Abstract:

The forest industry is a sector that, even though it is in decline, lies at the heart of the debate on globalisation and sustainable development. For many countries such as Canada, Sweden and Chile, the objective is to maintain a thriving sector without harming the environment, while recognising that the resources are finite. It becomes important to be competitive and to exploit forest territories efficiently, from harvesting to the manufacture of products at the mills, including transport, whose costs are rising rapidly. The objective of this thesis is to develop a tactical/operational planning model that schedules the activities of one harvest year so as to satisfy the mills' demands, while keeping track of the transport of the harvested volumes and the management of mill inventories. The year is divided into 26 two-week periods. We seek the schedules and the assignment of harvest crews to cutting blocks for one year. The mathematical model developed is a mixed-integer linear program whose structure follows each stage of the forest supply chain. We choose to solve it with an exact method, branch-and-bound. We were able to assess how difficult it is to solve our planning problem directly for instances with a large number of periods. The rolling-horizon approach, however, proved fruitful: with it, the harvesting activities of the blocks can be planned for the whole year (26 periods) within one day.
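
The rolling-horizon idea can be illustrated on a much smaller problem than the thesis's harvest-planning model. The sketch below plans a toy single-product production/inventory schedule over 26 periods by solving a short window as a small MILP, committing only the first period, and rolling forward; the data, window length and costs are invented, and SciPy >= 1.9 is assumed for linprog's integrality option.

    import numpy as np
    from scipy.optimize import linprog

    T, W = 26, 6                                   # planning periods and window length
    demand = np.tile([3.0, 5.0, 4.0, 6.0], 7)[:T]  # per-period mill demand
    cap = 8.0                                      # harvest capacity per period
    c_prod, c_inv = 1.0, 0.2                       # production and inventory-holding costs

    plan, inv = [], 0.0
    for t in range(T):
        w = min(W, T - t)
        # Window variables: w integer productions x, then w end-of-period inventories s.
        c = np.concatenate([np.full(w, c_prod), np.full(w, c_inv)])
        # Inventory balance: s_k - sum_{i<=k} x_i = inv - cumulative demand up to period k.
        A_eq = np.hstack([-np.tril(np.ones((w, w))), np.eye(w)])
        b_eq = inv - np.cumsum(demand[t:t + w])
        res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, cap)] * w + [(0, None)] * w,
                      integrality=[1] * w + [0] * w, method="highs")
        x0 = res.x[0]
        plan.append(int(round(x0)))                # commit only the first period of the window
        inv += x0 - demand[t]                      # roll forward with the updated inventory
    print(plan)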

Relevance: 100.00%

Abstract:

A reconciliation between a gene tree and a species tree describes an evolutionary history of homologous genes in terms of gene duplications and losses. To infer a reconciliation for a gene tree and a species tree, parsimony is generally used, based on the number of duplications and/or losses. Reconciliation models rest on probabilistic or combinatorial criteria. The first article defines a simple and general combinatorial model in which duplications and losses are clearly identified and the most parsimonious reconciliation is not the only one considered. A structure over the set of all reconciliations is defined, and efficient algorithms (for counting, random generation and exploration) are developed to study the combinatorial properties of the space of all reconciliations, or only of the most parsimonious ones. Based on the classical birth-and-death process, an algorithm that computes the likelihood of a reconciliation was recently proposed. The second article uses this algorithm together with the combinatorial tools described above to compute efficiently (either approximately or exactly) the posterior probabilities of the reconciliations located in the subspace considered. Based on realistic duplication and loss rates (under a probabilistic model) and on real/simulated data from fungal gene families, our results suggest that the probability mass of the whole reconciliation space is mainly concentrated around the most parsimonious reconciliations. For approximating the probability of a reconciliation, our approach is an interesting alternative to MCMC methods and can be better than a sophisticated, efficient and exact approach for computing the probability of a given reconciliation. The Gene Tree Parsimony (GTP) problem is to infer a species tree that minimises the number of duplications and/or losses for a set of gene trees. Based on an approach that explores the whole space of species trees for the genomes considered and on an efficient computation of reconciliation costs, the third article describes a Branch-and-Bound algorithm that solves the GTP problem exactly. When the number of taxa is too large, our algorithm can easily take predefined relationships between sets of taxa into account. We tested our algorithm on gene families from 29 eukaryotes.
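
The reconciliation cost at the heart of the GTP problem can be computed with the standard LCA mapping. The sketch below counts duplications and losses for one gene tree against a fixed toy species tree ((A,B),C), with gene-tree leaves labelled directly by their species; the Branch-and-Bound search over species trees described above is not reproduced.

    species_parent = {"A": "AB", "B": "AB", "C": "ABC", "AB": "ABC", "ABC": None}
    depth = {"ABC": 0, "AB": 1, "A": 2, "B": 2, "C": 1}

    def lca(u, v):
        """Lowest common ancestor of two species-tree nodes."""
        ancestors = set()
        while u is not None:
            ancestors.add(u)
            u = species_parent[u]
        while v not in ancestors:
            v = species_parent[v]
        return v

    def child_losses(mc, m, dup):
        # Losses along the branch from m down to the child's mapping mc.
        if mc == m:
            return 0
        return depth[mc] - depth[m] - (0 if dup else 1)

    def reconcile(g):
        """Return (mapping, duplications, losses) for a binary gene tree given as
        nested tuples whose leaves are species names, e.g. (("A", "C"), "B")."""
        if isinstance(g, str):
            return g, 0, 0
        (ml, dl, ll), (mr, dr, lr) = reconcile(g[0]), reconcile(g[1])
        m = lca(ml, mr)
        dup = m in (ml, mr)              # duplication iff a child maps to the same node
        losses = child_losses(ml, m, dup) + child_losses(mr, m, dup)
        return m, dl + dr + int(dup), ll + lr + losses

    # A gene tree discordant with the species tree ((A,B),C): 1 duplication, 3 losses.
    print(reconcile((("A", "C"), "B")))  # -> ('ABC', 1, 3)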