926 results for Combinatorial enumeration problems


Relevance:

40.00%

Publisher:

Abstract:

The quality of a heuristic solution to an NP-hard combinatorial problem is hard to assess. A few studies have advocated and tested statistical bounds as a method for assessment. These studies indicate that statistical bounds are superior to the more widely known and used deterministic bounds. However, the previous studies have been limited to a few metaheuristics and combinatorial problems, and hence the general performance of statistical bounds in combinatorial optimization remains an open question. This work complements the existing literature on statistical bounds by testing them on the metaheuristic Greedy Randomized Adaptive Search Procedures (GRASP) and four combinatorial problems. Our findings confirm previous results that statistical bounds are reliable for the p-median problem, and we note that they also seem reliable for the set covering problem. For the quadratic assignment problem, statistical bounds have previously been found reliable when obtained from a genetic algorithm, whereas in this work they were found to be less reliable. Finally, we provide statistical bounds for four 2-path network design problem instances for which the optimum is currently unknown.
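
One well-known flavor of statistical bound fits an extreme-value (Weibull) distribution to the best objective values returned by independent heuristic runs and reads the fitted location parameter as a point estimate of the unknown optimum. The sketch below illustrates that idea on synthetic data standing in for GRASP runs; the use of scipy.stats.weibull_min and all numbers are illustrative assumptions, not the exact procedure of the paper.

```python
import numpy as np
from scipy import stats

def weibull_point_estimate(run_values):
    """Point estimate of the unknown minimum of a minimization problem:
    fit a three-parameter Weibull distribution to the best objective values
    of independent heuristic runs and return the fitted location parameter
    (the left end-point of the fitted distribution)."""
    shape, loc, scale = stats.weibull_min.fit(np.asarray(run_values, dtype=float))
    return loc

# Synthetic stand-in for the best values found by 30 independent GRASP runs.
rng = np.random.default_rng(0)
runs = 1000.0 + 25.0 * rng.weibull(2.0, size=30)
print("best value found     :", round(runs.min(), 1))
print("statistical estimate :", round(weibull_point_estimate(runs), 1))
```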

Relevance:

30.00%

Publisher:

Abstract:

Given a prime power q, define c(q) as the minimum cardinality of a subset H of F_q^3 which satisfies the following property: every vector in this space differs in at most one coordinate from a multiple of a vector in H. In this work, we introduce two extremal problems in combinatorial number theory aiming to discuss a known connection between the corresponding coverings and sum-free sets. We also provide several bounds on these maps which yield new classes of coverings, improving the previous upper bound on c(q).
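
Written out in full, the quantity defined above is (a direct transcription of the definition, with d denoting Hamming distance on coordinate vectors):

```latex
c(q) \;=\; \min\Bigl\{\, |H| : H \subseteq \mathbb{F}_q^{3},\
  \forall v \in \mathbb{F}_q^{3}\ \exists\, h \in H,\ \lambda \in \mathbb{F}_q
  \ \text{such that}\ d(v, \lambda h) \le 1 \,\Bigr\}.
```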

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Computational Logic.

Relevance:

30.00%

Publisher:

Abstract:

The Closest Vector Problem (CVP) and the Shortest Vector Problem (SVP) are central problems in lattice-based cryptanalysis, since they underpin the security of many lattice-based cryptosystems. Despite the importance of these problems, only a few CVP-solvers are publicly available, and their scalability has never been studied. This paper presents a scalable implementation of an enumeration-based CVP-solver for multi-cores, which can easily be adapted to solve the SVP. In particular, it achieves super-linear speedups in some instances on up to 8 cores and almost linear speedups on 16 cores when solving the CVP on a 50-dimensional lattice. Our results show that enumeration-based CVP-solvers can be parallelized as effectively as enumeration-based SVP-solvers, based on a comparison with a state-of-the-art SVP-solver. In addition, we show that we can optimize the SVP variant of our solver in such a way that it becomes 35%-60% faster than the fastest enumeration-based SVP-solver to date.
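
As a rough illustration of the parallelization pattern (splitting the top level of the enumeration tree across workers), the toy sketch below enumerates small integer coefficient vectors exhaustively and explores each top-level branch in a separate thread. It is a didactic stand-in under stated assumptions: real enumeration-based solvers prune the tree using Gram-Schmidt data and projected norms, which is omitted here, and the basis, bounds and thread count are arbitrary.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def closest_in_branch(basis, target, top_coeff, bound):
    """Best lattice vector sum_i x_i * b_i with x_0 fixed to top_coeff and the
    remaining coefficients ranging over [-bound, bound] (one enumeration subtree)."""
    best_vec, best_dist = None, float("inf")
    for rest in itertools.product(range(-bound, bound + 1), repeat=basis.shape[0] - 1):
        x = np.array((top_coeff,) + rest)
        v = x @ basis
        d = np.linalg.norm(v - target)
        if d < best_dist:
            best_vec, best_dist = v, d
    return best_vec, best_dist

def toy_parallel_cvp(basis, target, bound=3, workers=4):
    """Naive CVP by exhaustive enumeration, with the outermost coefficient
    distributed across a thread pool (one subtree per task)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(
            lambda c: closest_in_branch(basis, target, c, bound),
            range(-bound, bound + 1),
        )
    return min(results, key=lambda r: r[1])

basis = np.array([[7, 0, 1], [1, 9, 2], [3, 1, 11]], dtype=float)  # rows are basis vectors
target = np.array([4.3, 12.1, 7.7])
vec, dist = toy_parallel_cvp(basis, target)
print("closest lattice vector:", vec, "distance:", round(dist, 3))
```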

Relevance:

30.00%

Publisher:

Abstract:

Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distance among points in the data space matters. Taking advantage of this distance structure, we exploit the capability of clustering techniques to partition the data space and thereby convert a single large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to both the clustered and the non-clustered data in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
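
A minimal sketch of the cluster-then-solve pattern described above, under illustrative assumptions: the demand points are partitioned with k-means, and a simple simulated-annealing facility selection (swap one open site for a closed one) is run independently inside each cluster. The move operator, parameter values and data are not taken from the paper.

```python
import math
import random

import numpy as np
from sklearn.cluster import KMeans

def total_cost(points, facilities):
    """Sum over demand points of the distance to the nearest open facility."""
    d = np.linalg.norm(points[:, None, :] - facilities[None, :, :], axis=2)
    return d.min(axis=1).sum()

def anneal_facilities(points, p, iters=2000, t0=1.0, alpha=0.999):
    """Choose p facility sites among the demand points by simulated annealing."""
    idx = list(range(len(points)))
    current = random.sample(idx, p)
    cost = total_cost(points, points[current])
    temp = t0
    for _ in range(iters):
        cand = current.copy()
        cand[random.randrange(p)] = random.choice([i for i in idx if i not in cand])
        c = total_cost(points, points[cand])
        if c < cost or random.random() < math.exp((cost - c) / temp):
            current, cost = cand, c
        temp *= alpha
    return current, cost

# Cluster first, then solve one small LA problem per cluster.
rng = np.random.default_rng(1)
points = rng.uniform(0, 100, size=(300, 2))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(points)
total = sum(anneal_facilities(points[labels == k], p=3)[1] for k in range(4))
print("total allocation cost after clustering:", round(total, 1))
```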

Relevance:

30.00%

Publisher:

Abstract:

Black-box optimization problems (BBOP) are defined as those optimization problems in which the objective function does not have an algebraic expression, but is the output of a system (usually a computer program). This paper is focused on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions used to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal output of the reinsurance. Traditional optimization approaches cannot be applied to BBOPs, so new computational paradigms must be applied to solve these problems. In this paper we show the performance of two evolutionary-based techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, where the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
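
To make the setting concrete, here is a bare-bones Particle Swarm Optimization loop driving a black-box objective. The objective function is just a placeholder for "the output of a system", and the inertia and acceleration coefficients are conventional textbook values rather than those used in the paper.

```python
import numpy as np

def black_box(x):
    # Placeholder for the system being optimized (e.g. a reinsurance simulator):
    # the optimizer only ever sees input/output pairs, never an algebraic form.
    return np.sum((x - 1.5) ** 2) + np.sin(5 * x).sum()

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5, 5)):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best_x, best_val = pso(black_box, dim=4)
print("best input found:", np.round(best_x, 3), "objective:", round(best_val, 3))
```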

Relevance:

30.00%

Publisher:

Abstract:

The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze search space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances that have largely different behavior. For instances with high fitness-distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.
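
Fitness-distance correlation is the sample correlation between a solution's fitness and its distance to the nearest global (or best-known) optimum, computed over a set of sampled solutions. The sketch below computes it for bit-vector solutions under Hamming distance; the representation and the toy data are assumptions made purely for illustration.

```python
import numpy as np

def fitness_distance_correlation(solutions, fitnesses, optima):
    """FDC: Pearson correlation between fitness and distance to the nearest
    optimum, over a sample of solutions (rows of 0/1 arrays)."""
    solutions = np.asarray(solutions)
    optima = np.asarray(optima)
    # Hamming distance from each sampled solution to its nearest optimum.
    dists = np.array([
        min(np.count_nonzero(s != o) for o in optima) for s in solutions
    ])
    return np.corrcoef(fitnesses, dists)[0, 1]

# Toy usage: 200 random 0/1 vectors of length 40, fitness = number of columns
# used, single "optimum" = the all-zeros pattern. By construction fitness equals
# the Hamming distance to that optimum, so the FDC comes out as 1.0.
rng = np.random.default_rng(0)
sample = rng.integers(0, 2, size=(200, 40))
fit = sample.sum(axis=1)
print("FDC:", round(fitness_distance_correlation(sample, fit, [np.zeros(40, int)]), 3))
```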

Relevance:

30.00%

Publisher:

Abstract:

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday problems are of this kind. However, the number of options grows exponentially with the size of the problem, so that an exhaustive search for the best solution becomes practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtaining an approximate solution to the problem at hand is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance therefore depends on structural aspects of the search space, which in turn depend on the move operator used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a neighborhood relation defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function; the concept is indeed similar to that of potential energy surfaces in physical chemistry.

Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, the energy minima are the local optima of the combinatorial problem, and we explore several definitions of the network edges. First, we perform an exhaustive sampling of local optima basins of attraction and define weighted transitions between basins by accounting for all the possible ways of crossing a basin frontier via one random move. Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectory of simple search heuristics, mining the frequency and inter-arrival time with which the heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape and that we can characterize using the tools of complex network science.

We argue that this network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured by the performance of trajectory-based local search heuristics.
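
A minimal sketch of the first of the three edge definitions above (exhaustive basins of attraction, with transitions between basins weighted by counting all one-move boundary crossings), on a tiny random landscape over bit strings; the landscape, move operator and sizes are illustrative assumptions, not instances from the thesis.

```python
from collections import defaultdict
from itertools import product

import numpy as np

rng = np.random.default_rng(7)
N = 10
# Rugged toy landscape: an independent random fitness value per bit string.
fitness = {bits: rng.random() for bits in product((0, 1), repeat=N)}

def neighbors(bits):
    """All solutions reachable by flipping exactly one bit (the move operator)."""
    return [bits[:i] + (1 - bits[i],) + bits[i + 1:] for i in range(N)]

def local_optimum(bits):
    """Best-improvement hill climbing; returns the local optimum whose basin contains bits."""
    while True:
        best = max(neighbors(bits), key=fitness.get)
        if fitness[best] <= fitness[bits]:
            return bits
        bits = best

# Basin of attraction of every solution (exhaustive, only feasible for tiny N).
basin = {bits: local_optimum(bits) for bits in fitness}

# Edge weight lo1 -> lo2: number of one-flip moves leaving basin(lo1) and landing in basin(lo2).
edges = defaultdict(int)
for bits, lo in basin.items():
    for nb in neighbors(bits):
        if basin[nb] != lo:
            edges[(lo, basin[nb])] += 1

optima = set(basin.values())
print(f"{len(optima)} local optima, {len(edges)} directed edges between basins")
```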

Relevance:

30.00%

Publisher:

Abstract:

This work contains a series of studies on the optimization of three real-world scheduling problems: school timetabling, sports scheduling and staff scheduling. These challenging problems are solved to customer satisfaction using the proposed PEAST algorithm; customer satisfaction here refers to the fact that implementations of the algorithm are in industrial use. The PEAST algorithm is a product of long-term research and development. The first version of it was introduced in 1998, and this thesis is the result of a five-year development of the algorithm. One of the most valuable characteristics of the algorithm has proven to be its ability to solve a wide range of scheduling problems, and it is likely that it can be tuned to tackle a range of other combinatorial problems as well. The algorithm uses features from numerous different metaheuristics, which is the main reason for its success. In addition, the implementation of the algorithm is fast enough for real-world use.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis is to investigate some open problems in the area of combinatorial number theory referred to as zero-sum theory. A zero-sequence in a finite cyclic group G is said to have the basic property if it is equivalent under group automorphism to one whose sum is precisely |G| when this sum is viewed as an integer. This thesis investigates two major problems, the first of which is referred to as the basic pair problem. This problem seeks to determine conditions under which every zero-sequence of a given length in a finite abelian group has the basic property. We resolve an open problem regarding basic pairs in cyclic groups by demonstrating that every sequence of length four in Zp has the basic property, and we conjecture on the complete solution of this problem. The second problem is a 1988 conjecture of Kleitman and Lemke, part of which claims that every sequence of length n in Zn has a subsequence with the basic property. If one considers the special case where n is an odd integer, we believe this conjecture to hold true. We verify that this is the case for all prime integers less than 40 and all odd integers less than 26. In addition, we resolve the Kleitman-Lemke conjecture for general n in the negative; that is, we demonstrate a sequence in any finite abelian group isomorphic to Z2p (for p ≥ 11 a prime) containing no subsequence with the basic property. These results, as well as the results found along the way, contribute to many other problems in zero-sum theory.
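
Reading the definition literally, a zero-sum sequence over Z_n has the basic property if some automorphism (multiplication by a unit u with gcd(u, n) = 1) sends it to a sequence whose integer representatives sum to exactly n. The check below assumes representatives are taken in {1, ..., n}, which is the usual convention in this area but an assumption here rather than something stated in the abstract.

```python
from math import gcd

def has_basic_property(seq, n):
    """True if some automorphism x -> u*x (gcd(u, n) = 1) maps the zero-sum
    sequence seq over Z_n to one whose representatives in {1, ..., n} sum to n."""
    assert sum(seq) % n == 0, "not a zero-sum sequence"
    units = [u for u in range(1, n) if gcd(u, n) == 1]
    for u in units:
        # Representative of u*a in {1, ..., n}.
        rep_sum = sum(((u * a - 1) % n) + 1 for a in seq)
        if rep_sum == n:
            return True
    return False

# Example in Z_7: the zero-sum sequence (1, 2, 4) already sums to 7 as integers.
print(has_basic_property((1, 2, 4), 7))   # True
# Example in Z_7: (0, 0, 0) has every representative equal to 7 under every unit.
print(has_basic_property((0, 0, 0), 7))   # False
```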

Relevance:

30.00%

Publisher:

Abstract:

In settings where resources are perishable and resource allocation is repeated over time with the same, or a very similar, set of agents, recurrent auctions can be used. A recurrent auction is a sequence of auctions in which the outcome of one auction can influence the following ones. However, this type of auction suffers from particular problems when the agents' wealth is unbalanced and the resources are perishable. In this thesis we propose some fair mechanisms to minimize the effects of these problems. In a recurrent auction, a fair solution means that, in the long run, all participants achieve their goals to the same degree, or to as similar a degree as possible, regardless of their wealth. We have shown experimentally that introducing fairness gives bidders an incentive to remain in the auction, minimizing the problems of recurrent auctions.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA behaves as well as other methods on the standard set of medium-size problems taken from Beasley's benchmark, while it produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
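
For reference, the p-median objective and a bare-bones genetic algorithm over sets of medians are sketched below. This is a generic GA skeleton for illustration only, not the impGA variant evaluated in the paper; the operators, parameters and toy instance are assumptions.

```python
import random

import numpy as np

def pmedian_cost(dist, medians):
    """Sum over demand nodes of the distance to the closest selected median."""
    return dist[:, list(medians)].min(axis=1).sum()

def simple_ga(dist, p, pop_size=40, generations=200, mutation_rate=0.2):
    n = dist.shape[0]
    pop = [frozenset(random.sample(range(n), p)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: pmedian_cost(dist, s))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            pool = list(a | b)                       # crossover: recombine parents' medians
            child = set(random.sample(pool, p))
            if random.random() < mutation_rate:      # mutation: swap one median out
                child.remove(random.choice(list(child)))
                child.add(random.choice([m for m in range(n) if m not in child]))
            children.append(frozenset(child))
        pop = survivors + children
    best = min(pop, key=lambda s: pmedian_cost(dist, s))
    return best, pmedian_cost(dist, best)

# Toy instance: 60 random points, candidate medians = demand nodes, p = 5.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(60, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
medians, cost = simple_ga(dist, p=5)
print("selected medians:", sorted(medians), "cost:", round(cost, 1))
```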

Relevance:

30.00%

Publisher:

Abstract:

Neural networks consist of highly interconnected and parallel nonlinear processing elements that have been shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
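
A schematic sketch of the general idea, not the authors' exact construction: the network state repeatedly takes a step that decreases an energy term and is then projected back onto the affine "valid subspace" defined by the problem's linear constraints (here a single sum-to-one constraint, standing in for assignment-type constraints). The objective, constraint matrix, step size and iteration count are all illustrative assumptions.

```python
import numpy as np

def project_to_valid(v, A, b):
    """Orthogonal projection of v onto the affine subspace {x : A x = b}."""
    correction = A.T @ np.linalg.solve(A @ A.T, A @ v - b)
    return v - correction

def modified_hopfield(grad_energy, A, b, dim, steps=500, lr=0.05, seed=0):
    """Gradient-style Hopfield dynamics confined to the valid subspace."""
    rng = np.random.default_rng(seed)
    v = project_to_valid(rng.random(dim), A, b)
    for _ in range(steps):
        v = np.clip(v - lr * grad_energy(v), 0.0, 1.0)  # descend the energy, keep outputs in [0, 1]
        v = project_to_valid(v, A, b)                    # re-impose the linear constraints
    return v

# Toy problem: 4 neuron outputs that must sum to 1 while minimizing 0.5 * x^T Q x.
Q = np.array([[2.0, 0.5, 0.0, 0.0],
              [0.5, 1.0, 0.2, 0.0],
              [0.0, 0.2, 3.0, 0.1],
              [0.0, 0.0, 0.1, 0.5]])
A, b = np.ones((1, 4)), np.array([1.0])
v = modified_hopfield(lambda x: Q @ x, A, b, dim=4)
print("equilibrium state:", np.round(v, 3), "constraint sum:", round(v.sum(), 3))
```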