900 results for Hard combinatorial scheduling


Relevance:

20.00%

Publisher:

Abstract:

In this thesis we study three combinatorial optimization problems belonging to the classes of Network Design and Vehicle Routing problems that are strongly linked in the context of the design and management of transportation networks: the Non-Bifurcated Capacitated Network Design Problem (NBP), the Period Vehicle Routing Problem (PVRP) and the Pickup and Delivery Problem with Time Windows (PDPTW). These problems are NP-hard and contain as special cases some well-known difficult problems such as the Traveling Salesman Problem and the Steiner Tree Problem. Moreover, they model the core structure of many practical problems arising in logistics and telecommunications. The NBP is the problem of designing the optimal network to satisfy a given set of traffic demands. Given a set of nodes, a set of potential links and a set of point-to-point demands called commodities, the objective is to select the links to install and dimension their capacities so that all the demands can be routed between their respective endpoints, and the sum of link fixed costs and commodity routing costs is minimized. The problem is called non-bifurcated because the solution network must allow each demand to follow a single path, i.e., the flow of each demand cannot be split. Although this is the case in many real applications, the NBP has received significantly less attention in the literature than other capacitated network design problems that allow bifurcation. We describe an exact algorithm for the NBP, based on solving with an integer programming solver a formulation of the problem strengthened by simple valid inequalities, together with four new heuristic algorithms. One of these heuristics is an adaptive memory metaheuristic, based on partial enumeration, that could be applied to a wider class of structured combinatorial optimization problems. In the PVRP a fleet of vehicles of identical capacity must be used to service a set of customers over a planning period of several days.
Each customer specifies a service frequency, a set of allowable day-combinations and a quantity of product to be delivered at each visit. For example, a customer may require two visits during a 5-day period, imposing that these visits take place on Monday-Thursday, Monday-Friday or Tuesday-Friday. The problem consists in simultaneously assigning a day-combination to each customer and designing the vehicle routes for each day so that each customer is visited the required number of times, the number of routes on each day does not exceed the number of vehicles available, and the total cost of the routes over the period is minimized. We also consider a tactical variant of this problem, called the Tactical Planning Vehicle Routing Problem, where customers require a visit on a specific day of the period but a penalty cost, called service cost, can be paid to postpone the visit to a later day. To our knowledge, all the algorithms proposed in the literature for the PVRP are heuristics. In this thesis we present for the first time an exact algorithm for the PVRP, based on different relaxations of a set partitioning-like formulation. The effectiveness of the proposed algorithm is tested on a set of instances from the literature and on a new set of instances. Finally, the PDPTW consists in servicing a set of transportation requests using a fleet of identical vehicles of limited capacity located at a central depot. Each request specifies a pickup location and a delivery location and requires that a given quantity of load be transported from the pickup location to the delivery location. Moreover, each location can be visited only within an associated time window.
Each vehicle can perform at most one route, and the problem is to satisfy all the requests using the available vehicles so that each request is serviced by a single vehicle, the load on each vehicle never exceeds its capacity, and all locations are visited within their time windows. We formulate the PDPTW as a set partitioning-like problem with additional cuts and propose an exact algorithm based on different relaxations of the mathematical formulation, as well as a branch-and-cut-and-price algorithm. The new algorithm is tested on two classes of problems from the literature and compared with a recent branch-and-cut-and-price algorithm from the literature.
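The non-bifurcation constraint above can be illustrated on a toy instance: since each commodity must follow a single path, a tiny NBP can be solved exactly by enumerating one path choice per commodity. This is only a brute-force sketch on invented data, not the thesis's exact algorithm (which relies on a strengthened integer programming formulation).

```python
from itertools import product

# Toy instance, invented for illustration:
# arc -> (fixed installation cost, per-unit routing cost, capacity)
arcs = {
    ("s", "a"): (10, 1, 5),
    ("a", "t"): (10, 1, 5),
    ("s", "t"): (25, 2, 5),
}
# commodity -> (candidate s-t paths given as arc lists, demand)
commodities = [
    ([[("s", "a"), ("a", "t")], [("s", "t")]], 3),
    ([[("s", "a"), ("a", "t")], [("s", "t")]], 3),
]

def solve_nbp():
    """Exhaustively pick one path per commodity (the non-bifurcation
    constraint) and keep the cheapest capacity-feasible combination."""
    best_cost, best_choice = None, None
    for choice in product(*(range(len(paths)) for paths, _ in commodities)):
        load = {a: 0 for a in arcs}
        for (paths, demand), k in zip(commodities, choice):
            for a in paths[k]:
                load[a] += demand          # the whole demand follows one path
        if any(load[a] > arcs[a][2] for a in arcs):
            continue                       # capacity violated: infeasible
        cost = sum(fix + unit * load[a]
                   for a, (fix, unit, _) in arcs.items() if load[a] > 0)
        if best_cost is None or cost < best_cost:
            best_cost, best_choice = cost, choice
    return best_cost, best_choice
```

Here routing both commodities over the same path would exceed the capacity of 5, so the optimum sends one commodity via s-a-t and the other via the direct link, at total cost 57.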

Abstract:

Crew scheduling and crew rostering are similar, related problems that can be solved by similar procedures. So far, existing solution methods usually create one model for each of these problems (scheduling and rostering), and when they are solved together, in some cases an interaction between the models is considered in order to obtain a better solution. Here we present a single set-covering model that solves both problems simultaneously, in which the total number of drivers needed is directly considered and optimized. This integration makes it possible to optimize all depots at the same time, whereas traditional approaches had to work depot by depot; it also exposes and lets us manage the relationship between scheduling and rostering, which was known to some degree but, until this model, not easy to quantify. Recent research in the area of crew scheduling and rostering has identified as a current challenge the construction of schedules that reduce crew fatigue, which depends mainly on the quality of the rosters created. In this approach, rosters are constructed in such a way that stable working hours are used in every week of work, and a change to a different shift happens only with free days in between, making adaptation to the new working hours easier. Computational results for real-world-based instances are presented. The instances are geographically diverse, to test the performance of the procedures and the model in different scenarios.
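The column structure of such a set-covering model can be sketched with a toy greedy heuristic: each column is one driver's complete roster, so choosing a minimum number of columns that cover all duties minimizes the number of drivers. The tasks and rosters below are invented, and the greedy rule is only an approximation of the exact optimization described above.

```python
# Toy data: duties to cover and candidate driver rosters (the columns).
tasks = {1, 2, 3, 4, 5}
rosters = {            # roster name -> set of duties it covers
    "A": {1, 2},
    "B": {2, 3, 4},
    "C": {4, 5},
    "D": {1, 5},
}

def greedy_cover(tasks, rosters):
    """Repeatedly pick the roster covering the most still-uncovered duties;
    the number of chosen rosters equals the number of drivers needed."""
    uncovered, chosen = set(tasks), []
    while uncovered:
        name = max(rosters, key=lambda r: len(rosters[r] & uncovered))
        if not rosters[name] & uncovered:
            raise ValueError("instance is infeasible")
        chosen.append(name)
        uncovered -= rosters[name]
    return chosen
```

On this instance the greedy picks roster B (covering duties 2, 3, 4) and then roster D (covering 1 and 5), so two drivers suffice.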

Abstract:

In this thesis we present several combinatorial optimization problems and suggest models and algorithms for their effective solution. For each problem, we give its description, followed by a short literature review, provide methods to solve it and, finally, present computational results and comparisons with previous works to show the effectiveness of the proposed approaches. The problems considered are: the Generalized Traveling Salesman Problem (GTSP), the Bin Packing Problem with Conflicts (BPPC) and the Fair Layout Problem (FLOP).

Abstract:

In this thesis we provide a basic MIP model and several of its variants, built in order to understand its behavior and possibly improve its efficiency. The variants were constructed by acting in particular on the definition of some constraints, on the bounds of the variables, or by forcing the solver to focus on certain decisions or specific variables. Some of the typical problems from the literature were tested, and the various results were suitably evaluated and compared. Among the references for this comparison we also considered the results obtainable with a Constraint Programming model, which is well known to produce good results in scheduling. A further aim of the thesis is, in fact, to compare the Mathematical Programming and Constraint Programming approaches, identifying their respective strengths and weaknesses and testing their transferability to the compared model.

Abstract:

In a large number of problems, the high dimensionality of the search space, the vast number of variables and economic constraints limit the ability of classical techniques to reach the optimum of a function, known or unknown. In this thesis we investigate the possibility of combining approaches from advanced statistics and optimization algorithms in such a way as to better explore the combinatorial search space and to increase the performance of the approaches. To this purpose we propose two methods: (i) Model Based Ant Colony Design and (ii) Naïve Bayes Ant Colony Optimization. We test the performance of the two proposed solutions in a simulation study and apply the novel techniques to an application in the field of Enzyme Engineering and Design.
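For readers unfamiliar with the family of methods this thesis builds on, a minimal generic Ant Colony Optimization loop can be sketched on a toy 4-city TSP. This is the textbook pheromone/heuristic scheme, not the Model Based or Naïve Bayes variants proposed in the thesis; the distance matrix and parameters are invented.

```python
import random

# Symmetric distance matrix for a toy 4-city TSP (invented data).
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
N = len(D)

def tour_length(t):
    return sum(D[t[i]][t[(i + 1) % N]] for i in range(N))

def aco(iters=50, ants=10, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Basic ACO: ants build tours edge by edge with probability
    proportional to pheromone^alpha * (1/distance)^beta; pheromone
    evaporates at rate rho and is deposited on visited edges."""
    rng = random.Random(seed)
    tau = [[1.0] * N for _ in range(N)]      # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            tour = [rng.randrange(N)]
            while len(tour) < N:
                i = tour[-1]
                cand = [j for j in range(N) if j not in tour]
                w = [tau[i][j] ** alpha * (1.0 / D[i][j]) ** beta
                     for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            tours.append(tour)
            L = tour_length(tour)
            if L < best_len:
                best_tour, best_len = tour, L
        # evaporate, then reinforce edges of good tours
        tau = [[(1 - rho) * x for x in row] for row in tau]
        for t in tours:
            L = tour_length(t)
            for i in range(N):
                a, b = t[i], t[(i + 1) % N]
                tau[a][b] += 1.0 / L
                tau[b][a] += 1.0 / L
    return best_tour, best_len
```

The statistical ideas in the thesis replace the hand-tuned heuristic weights in this loop with model-based estimates of promising variable values.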

Abstract:

The hard X-ray band (10-100 keV) has so far been observed only by collimated and coded-aperture-mask instruments, with sensitivity and angular resolution roughly two orders of magnitude worse than those of the current X-ray focusing telescopes operating below 10-15 keV. Technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new, deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 μCrab in the 10-40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed, applicable to any high-energy mission for which shielding and instrument performance estimates are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft-proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the obtained results can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO.
All the results are original contributions to the assessment studies of the cited missions, carried out as part of the background groups' activities.

Abstract:

Non-Equilibrium Statistical Mechanics is a broad subject. Roughly speaking, it deals with systems that have not yet relaxed to an equilibrium state, with systems in a steady non-equilibrium state, or with more general situations. Such systems are characterized by external forcing and internal fluxes, resulting in a net production of entropy that quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks entirely analogous to electrical networks. We define suitable observables and derive their linear-regime relationships, we discuss a duality between external and internal observables that reverses the roles of the system and of the environment, and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects regarding linear response determinants, which are related to spanning-tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains, give a physical interpretation of observables in terms of locally detailed balanced rates, prove many variants of the fluctuation theorem, and show that a well-known expression for the entropy production due to Schnakenberg descends from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution.
As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of nonequilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time-reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of nonequilibrium phenomena.
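For reference, the Schnakenberg entropy production mentioned above has a standard textbook form for a continuous-time Markov chain with stationary probabilities p_i and transition rates w_{ij} from state i to state j; this is the general expression, not the thesis's gauge-theoretic derivation of it:

```latex
\dot{S}_{\mathrm{tot}}
  \;=\; \frac{1}{2}\sum_{i,j}\bigl(p_i\,w_{ij}-p_j\,w_{ji}\bigr)
        \,\ln\frac{p_i\,w_{ij}}{p_j\,w_{ji}}
  \;\ge\; 0 .
```

Each summand has the form (x - y)ln(x/y) and is therefore nonnegative; the total vanishes exactly when detailed balance p_i w_{ij} = p_j w_{ji} holds on every edge, i.e. at equilibrium.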

Abstract:

We have performed Monte Carlo and molecular dynamics simulations of suspensions of monodisperse, hard ellipsoids of revolution. Hard-particle models play a key role in statistical mechanics. They are conceptually and computationally simple, and they offer insight into systems in which particle shape is important, including atomic, molecular, colloidal, and granular systems. In the high-density phase diagram of prolate hard ellipsoids we have found a new crystal, which is more stable than the stretched-FCC structure proposed previously. The new phase, SM2, has a simple monoclinic unit cell containing a basis of two ellipsoids with unequal orientations. The angle of inclination is very soft for length-to-width (aspect) ratio l/w=3, while the other angles are not. A symmetric state of the unit cell exists, related to the densest-known packings of ellipsoids; it is not always the stable one. Our results remove the stretched-FCC structure for aspect ratio l/w=3 from the phase diagram of hard, uniaxial ellipsoids. We provide evidence that this holds between aspect ratios 3 and 6, and possibly beyond. Finally, ellipsoids in SM2 at l/w=1.55 exhibit end-over-end flipping, warranting studies of the cross-over to where this dynamics is not possible. Secondly, we studied the dynamics of nearly spherical ellipsoids. In equilibrium, they show a first-order transition from an isotropic phase to a rotator phase, where positions are crystalline but orientations are free. When over-compressing the isotropic phase into the rotator regime, we observed super-Arrhenius slowing down of diffusion and relaxation, and signatures of the cage effect. These features of glassy dynamics are sufficiently strong that asymptotic scaling laws of the Mode-Coupling Theory of the glass transition (MCT) could be tested, and were found to apply. We found strong coupling of positional and orientational degrees of freedom, leading to a common value for the MCT glass-transition volume fraction.
Flipping modes were not slowed down significantly. We demonstrated that the results are independent of simulation method, as predicted by MCT. Further, we determined that even intra-cage motion is cooperative. We confirmed the presence of dynamical heterogeneities associated with the cage effect. The transit between cages was seen to occur on short time scales, compared to the time spent in cages; but the transit was shown not to involve displacements distinguishable in character from intra-cage motion. The presence of glassy dynamics was predicted by molecular MCT (MMCT). However, as MMCT disregards crystallization, a test by simulation was required. Glassy dynamics is unusual in monodisperse systems. Crystallization typically intervenes unless polydispersity, network-forming bonds or other asymmetries are introduced. We argue that particle anisometry acts as a sufficient source of disorder to prevent crystallization. This sheds new light on the question of which ingredients are required for glass formation.

Abstract:

When a liquid crystal is confined to a cavity, its director field becomes subject to competing forces: on the one hand, the surface of the cavity orients the director field ("surface anchoring"); on the other hand, deformations of the director field cost elastic energy. Hence the equilibrium director field is determined by a compromise between surface anchoring and elasticity. One example of a confined liquid crystal that has attracted particular interest from physicists is the nematic droplet. In this thesis a system of hard rods is considered as the simplest model for nematic liquid crystals consisting of elongated molecules. First, systems of hard spherocylinders in a spherical geometry are investigated by means of canonical Monte Carlo simulations. In contrast to previous simulation work on this problem, a continuum model is used. In particular, the effects of ordering near hard curved walls are studied for the low-density regime. With increasing density, first a uniaxial surface film forms and then a biaxial surface film, which eventually fills the entire cavity. We study how the surface order, the adsorption and the shape of the director field depend on the curvature of the wall. We find that orientational ordering at a curved wall in a cavity is stronger than at a flat wall, while adsorption is weaker. For densities above the isotropic-nematic transition, we always find bipolar configurations. As a next step, an extension of the Asakura-Oosawa-Vrij model for colloid-polymer mixtures to anisotropic colloids is considered. By means of computer simulations we study how droplets of hard, rod-like particles optimize their shape and structure under the influence of the osmotic compression caused by the presence of spherical particles that act as depletion agents. At sufficiently high osmotic pressures the rods that make up the drops spontaneously align to turn them into uniaxial nematic liquid crystalline droplets.
The nematic droplets or "tactoids" that thus form are not spherical but elongated, resulting from the competition between the anisotropic surface tension and the elastic deformation of the director field. In agreement with recent theoretical predictions we find that sufficiently small tactoids have a uniform director field, whilst large ones are characterized by a bipolar director field. From the shape and director-field transformation of the droplets we estimate the surface anchoring strength.

Abstract:

In Stream Processing platforms it is often necessary to process the input streams in differentiated ways. The goal of this thesis is to build a scheduler capable of assigning different execution priorities to the operators in charge of processing the streams.
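The idea of giving stream operators different execution priorities can be sketched with a minimal heap-based scheduler. This is a generic illustration on invented operator names, not the thesis's implementation: a lower number means higher priority, and equal-priority operators run in submission order.

```python
import heapq

class PriorityScheduler:
    """Executes submitted operators strictly in priority order,
    FIFO among operators sharing the same priority."""
    def __init__(self):
        self._queue = []
        self._counter = 0          # tie-breaker: preserves submission order

    def submit(self, priority, operator):
        heapq.heappush(self._queue, (priority, self._counter, operator))
        self._counter += 1

    def run(self):
        executed = []
        while self._queue:
            _, _, op = heapq.heappop(self._queue)
            executed.append(op())
        return executed

sched = PriorityScheduler()
sched.submit(2, lambda: "best-effort filter")
sched.submit(0, lambda: "real-time alerting")
sched.submit(1, lambda: "windowed aggregate")
```

Calling sched.run() drains the queue highest-priority first, so the alerting operator executes before the aggregate and the best-effort filter.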

Abstract:

Optical networks, thanks to their high capacity, have become increasingly important in recent years, both because of the growing volume of exchanged data, driven above all by the wide diffusion of the Internet, and because of the need for real-time communication. Given its (relatively) long set-up times, this technology is natively not the optimal one for transporting bursty traffic, which is typical of today's telecommunications. Hybrid networks therefore try to combine as well as possible the strengths of optical circuit switching and optical packet switching. In this work we focus in particular on a hybrid network architecture called 3LIHON (3-Level Integrated Hybrid Optical Network). It provides three distinct quality-of-service (QoS) levels to meet different needs:
- Guaranteed Service Type (GST): similar to a circuit-switched service; no data loss is allowed.
- Statistically Multiplexed Real Time (SM/RT): similar to a packet-switched service; it guarantees zero or very low delay within the network, allows a small data-loss rate and admits contention for bandwidth.
- Statistically Multiplexed Best Effort (SM/BE): similar to a packet-switched service; it guarantees no delay bound between nodes and admits a low data-loss rate.
In a 3LIHON node, SM/BE traffic that cannot be served, e.g. because it is preempted by packets with a higher-priority QoS level, is irretrievably lost. This also wastes the time and resources spent transmitting an SM/BE packet up to the point of its interruption. In the present work we tried to limit this undesirable behavior as far as possible, adopting and comparing three strategies, which led to modifications of the standard 3LIHON node and to three variants of it.

Abstract:

This work deals with the car sequencing (CS) problem, a combinatorial optimization problem for sequencing mixed-model assembly lines. The aim is to find a production sequence for different variants of a common base product such that work overload of the respective line operators is avoided or minimized. The variants are distinguished by certain options (e.g., sun roof yes/no) and therefore require different processing times at the stations of the line. CS introduces a so-called sequencing rule H:N for each option, which restricts the occurrence of this option to at most H in any N consecutive variants. It seeks a sequence that leads to no, or a minimum number of, sequencing-rule violations. In this work, the suitability of CS for workload-oriented sequencing is analyzed; its solution quality is therefore compared in experiments with that of the related mixed-model sequencing problem. A new sequencing-rule generation approach as well as a new lower bound for the problem are presented. Different exact and heuristic solution methods for CS are developed and their efficiency is shown in experiments. Furthermore, CS is adjusted and applied to a resequencing problem with pull-off tables.
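The H:N sequencing rules described above are easy to make concrete: counting violations means sliding a window of length N over the sequence and checking that the option occurs at most H times in each window. The sequence and rule below are toy data, and this naive counter only illustrates the objective, not the thesis's solution methods.

```python
def rule_violations(sequence, rules):
    """sequence: list of option sets, one per production slot;
    rules: {option: (H, N)} meaning at most H occurrences in any
    N consecutive variants; returns the number of violated windows."""
    violations = 0
    for option, (h, n) in rules.items():
        for start in range(len(sequence) - n + 1):
            window = sequence[start:start + n]
            if sum(option in car for car in window) > h:
                violations += 1
    return violations

# Toy sequence of five variants and a 1:2 rule for the sunroof option.
seq = [{"sunroof"}, {"sunroof"}, set(), {"sunroof"}, set()]
rules = {"sunroof": (1, 2)}   # at most 1 sunroof in any 2 consecutive cars
```

Only the first window (positions 1-2, two sunroofs in a row) violates the 1:2 rule here, so the count is 1; a CS solver would search for a permutation of the variants driving this count to zero.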

Abstract:

Antemortem tooth loss (AMTL) can occur as a consequence of dental disease, trauma, tooth extraction or extreme continuous eruption, as well as accompanying advanced stages of scurvy or leprosy. After the loss of a tooth, wound healing sets in as secondary healing, during which the alveolus fills with blood and a coagulum forms. The coagulum is then converted into bone tissue, and finally the alveolus levels out to the point where it can no longer be recognized macroscopically. The time frame of the bony consolidation of the alveolar ridge has been little studied in detail. Given the frequent occurrence of AMTL in human populations, establishing a time window with which the time since tooth loss (TSL) can be estimated from macroscopic observation of the bone would be extremely valuable, especially in an archaeological context. Such a scheme, with information on the variability of the timing of the healing processes, can be usefully applied not only in osteology but also in forensics, general dentistry and implantology.

After the loss of a tooth, the socket is normally filled by a coagulum. The forming tissue is rapidly converted into still-immature bone, which stabilizes the jawbone and the adjacent teeth. After maturing, the tissue finally adapts to the surrounding bone. The appearance of the socket during this process passes through several stages, which in the present study were identified on clinical radiographs of recent patients and through examinations of archaeological skeletal series.
The healing processes in the socket can be divided into a pre-osseous phase (within one week after tooth loss), an ossification phase (about 14 weeks after tooth loss) and an ossified, i.e. completely healed, phase (at least 29 weeks after tooth loss). Several factors, such as resorption of the interdental septum, the condition of the alveolar bone or the individual's sex, can significantly accelerate or inhibit the normal healing process, causing differences of up to 19 weeks. Other variables had no significant effect on the timing of the healing process. Relevant dependencies between different variables were also tested irrespective of socket filling. Groups of independent variables were examined in multivariable models with respect to degree of filling and TSL. With these results, a rough estimate of the time since tooth loss in weeks is possible, and including further parameters allows higher precision.

Although various dental pathologies were taken into account in this study, future investigations should examine more closely their potential influence on the alveolar healing process. The causal role of some variables (e.g. the presence of neighboring teeth or dental treatments) that influence the healing rate would be important for future studies of oral bone tissue. Comparative clinical studies on forensic series with known TSL, or on a clinical series observed from the beginning of the healing process, could corroborate these results.