944 results for Mixed Binary Linear Programming


Relevance: 100.00%

Abstract:

In the domain of safety-critical embedded systems, the design process for applications is highly complex. Within a given hardware architecture, electronic control units can be upgraded so that all existing processes and signals execute on time. The timing requirements are strict and must be satisfied in every periodic recurrence of the processes, since guaranteeing their parallel execution is of utmost importance. Existing approaches can compute design alternatives quickly, but they do not guarantee that the cost of the necessary hardware changes is minimal. We present an approach that computes cost-minimal solutions to this problem while satisfying all timing constraints. Our algorithm uses linear programming with column generation, embedded in a tree structure, to provide lower and upper bounds during the optimization process. Through a decomposition of the master problem, the complex constraints that guarantee periodic execution are shifted into independent subproblems, which are formulated as integer linear programs. Both the analyses of process execution and the methods of signal transmission are examined, and linearized representations are given. Furthermore, we present a new formulation for fixed-priority execution that additionally computes worst-case process response times, which are needed in scenarios where timing constraints are imposed on subsets of processes and signals. We demonstrate the applicability of our methods by analyzing instances that contain process structures from real applications. Our results show that lower bounds can be computed quickly to prove the optimality of heuristic solutions. When delivering optimal solutions with response times, our new formulation compares favorably with other approaches in terms of runtime. The best results are obtained with a hybrid approach that combines heuristic initial solutions, preprocessing, and a heuristic computation phase followed by a short exact one.
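The abstract above centers on a column-generation scheme embedded in a branch-and-bound tree. As a hedged illustration of the bare restricted-master/pricing loop only, the sketch below uses Python with PuLP on a classic cutting-stock instance; the roll width, piece sizes, demands, and knapsack pricing problem are invented stand-ins, not the thesis's embedded-systems model.

```python
# Minimal column-generation sketch (cutting-stock flavour): solve a restricted
# LP master, read off duals, and let a knapsack pricing problem propose new
# columns until none has negative reduced cost.  All data are hypothetical.
import pulp

roll_width  = 10.0
piece_sizes = [3.0, 4.0, 5.0]   # piece types
demand      = [30, 20, 15]      # demand per piece type

# Start with trivial patterns: each cuts only one piece type.
patterns = [[roll_width // s if i == j else 0 for j, s in enumerate(piece_sizes)]
            for i in range(len(piece_sizes))]

while True:
    # Restricted master: minimise the number of rolls (LP relaxation).
    master = pulp.LpProblem("master", pulp.LpMinimize)
    x = [pulp.LpVariable(f"x{k}", lowBound=0) for k in range(len(patterns))]
    master += pulp.lpSum(x)
    for i, d in enumerate(demand):
        master += (pulp.lpSum(patterns[k][i] * x[k]
                              for k in range(len(patterns))) >= d), f"dem{i}"
    master.solve(pulp.PULP_CBC_CMD(msg=False))
    duals = [master.constraints[f"dem{i}"].pi for i in range(len(demand))]

    # Pricing: integer knapsack searching for a column with reduced cost < 0.
    price = pulp.LpProblem("pricing", pulp.LpMaximize)
    a = [pulp.LpVariable(f"a{i}", lowBound=0, cat="Integer")
         for i in range(len(piece_sizes))]
    price += pulp.lpSum(duals[i] * a[i] for i in range(len(piece_sizes)))
    price += pulp.lpSum(piece_sizes[i] * a[i]
                        for i in range(len(piece_sizes))) <= roll_width
    price.solve(pulp.PULP_CBC_CMD(msg=False))
    if pulp.value(price.objective) <= 1.0 + 1e-6:   # no improving column left
        break
    patterns.append([int(v.value()) for v in a])

print("LP lower bound on rolls used:", pulp.value(master.objective))
```

In a branch-and-price method such as the one described above, this LP bound is the lower bound strengthened at each node of the tree.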

Relevance: 100.00%

Abstract:

Khutoretsky dealt with the problem of maximising a linear utility function (MUF) over the set of short-term equilibria in a housing market by reducing it to a linear programming problem, and suggested a combinatorial algorithm for it. Two approaches to market adjustment were considered: the funding of housing construction and the granting of housing allowances. In both cases, locally optimal regulatory measures can be derived using the corresponding dual prices. The optimal effects (with regulation expenditure restricted to an amount K) can be found using specialised models based on the MUF: a model M1 for choosing the optimal structure of investment in housing construction, and a model M2 for the optimal distribution of housing allowances. The linear integer optimisation problems corresponding to these models are initially difficult but become solvable after slight modifications of the parameters. In particular, the necessary modification of K does not exceed the maximum construction cost of one dwelling (for M1) or the maximum size of one housing allowance (for M2). The result is particularly useful since such a slight modification of K is immaterial in practice.
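To fix the shape of a budget-constrained model like M1, here is a hedged sketch in Python with PuLP: select construction projects to maximise a linear utility under a regulation budget K. The costs, utilities, and K are invented, and the model is a drastic simplification of Khutoretsky's M1.

```python
# Hypothetical M1-style selection: binary choice of construction projects that
# maximises utility subject to a regulation budget K.  All data are invented.
import pulp

cost    = [120, 80, 150, 60]   # construction cost per candidate project
utility = [10, 7, 12, 5]       # utility contribution per project
K = 250                        # regulation budget

m = pulp.LpProblem("M1_sketch", pulp.LpMaximize)
y = [pulp.LpVariable(f"y{j}", cat="Binary") for j in range(len(cost))]
m += pulp.lpSum(utility[j] * y[j] for j in range(len(cost)))      # total utility
m += pulp.lpSum(cost[j] * y[j] for j in range(len(cost))) <= K    # budget

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([int(v.value()) for v in y], "utility:", pulp.value(m.objective))
```

Per the abstract, when an instance of this kind is awkward at a given K, relaxing K by at most the largest single construction cost suffices to make it tractable.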

Relevance: 100.00%

Abstract:

BACKGROUND: Despite recent algorithmic and conceptual progress, the stoichiometric network analysis of large metabolic models remains a computationally challenging problem. RESULTS: SNA is an interactive, high-performance toolbox for analysing the possible steady-state behaviour of metabolic networks by computing the generating and elementary vectors of their flux and conversion cones. It also supports analysing the steady states by linear programming. The toolbox is implemented mainly in Mathematica and returns numerically exact results. It is available under an open-source license from: http://bioinformatics.org/project/?group_id=546. CONCLUSION: Thanks to its performance and modular design, SNA is demonstrably useful in analysing genome-scale metabolic networks. Furthermore, the integration into Mathematica provides a very flexible environment for the subsequent analysis and interpretation of the results.
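As a hedged, language-shifted illustration of the steady-state LP analysis the toolbox supports (Python with SciPy rather than Mathematica, and without SNA's exact arithmetic), the sketch below maximises one flux subject to the steady-state condition S v = 0 and capacity bounds; the 3-metabolite, 4-reaction network is invented.

```python
# Steady-state flux analysis sketch: maximise the flux v4 subject to S v = 0
# and per-reaction capacity bounds.  The stoichiometry here is hypothetical.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0,  0],    # rows: metabolites, columns: reactions
              [0,  1, -1, -1],
              [0,  0,  1, -1]])
bounds = [(0, 10)] * 4            # irreversible reactions, capacity 10 each

c = [0, 0, 0, -1]                 # maximise v4 (linprog minimises, so negate)
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("flux distribution:", res.x, "max v4:", -res.fun)
```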

Relevance: 100.00%

Abstract:

Small-scale farmers in the Chipata District of Zambia rely on their farm fields to grow maize and groundnuts for food security. Cotton production and surplus food-security crops are used to generate income to provide for their families. With increasing population pressure, available land has decreased, and farmers struggle to meet their families' food requirements and income needs. The purpose of the study was to determine how a farmer can best allocate his land among maize, groundnuts, and cotton, when constrained by labor and capital resources, to generate the highest potential food security and financial gain. Data from the 2008-2009 growing season were compiled and analyzed using a linear programming model. The study determined that farmers make the most profit by allocating all additional land and resources to cotton after meeting their minimum food-security requirements. The study suggests that growing cotton is a beneficial practice for small-scale subsistence farmers to generate income when restricted by limited resources.
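The study's model can be pictured as profit maximisation over acreage with land, labor, and capital ceilings plus food-security floors. The following sketch with scipy.optimize.linprog uses invented coefficients, not the 2008-2009 survey data; with these numbers it reproduces the qualitative finding that slack resources flow to cotton.

```python
# Land-allocation LP sketch: maximise net revenue over acres of maize,
# groundnuts, and cotton.  All coefficients and limits are hypothetical.
from scipy.optimize import linprog

profit  = [90, 110, 150]    # net revenue per acre (maize, groundnut, cotton)
labour  = [20, 25, 35]      # person-days per acre
capital = [30, 40, 60]      # input cost per acre

c = [-p for p in profit]                 # linprog minimises, so negate profit
A_ub = [labour, capital, [1, 1, 1]]      # labour, capital, and land ceilings
b_ub = [120, 260, 5]
# Food-security floors on maize and groundnuts (>= floors become <= rows).
A_ub += [[-1, 0, 0], [0, -1, 0]]
b_ub += [-1.5, -0.5]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("acres (maize, groundnut, cotton):", res.x, "profit:", -res.fun)
```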

Relevance: 100.00%

Abstract:

In the daily operation of a less-than-truckload freight terminal, the terminal manager or dispatcher first decides at which gates the vehicles should dock for loading and unloading. In addition, for each tour he must designate a time window during which it occupies its gate. This spatial and temporal vehicle-to-gate assignment determines the resources required for the internal transshipment process, measured in travel distances or forklift hours. One goal of the planning task is therefore to assign the vehicles to gates in such a way that the internal travel distances, and hence the number of handling resources required, are minimal. Beyond that, it can also be expedient to dock the vehicles at the gates as early as possible. Each tour has an individual schedule that specifies its arrival and departure times at the terminal, and only within this time window may the dispatcher assign the tour to one of the gates. If the assignment does not take place immediately upon arrival, the vehicle must wait in a parking area. Minimizing waiting times is desirable so that the terminal premises are not congested by too many vehicles at once; above all, early dispatching can also be useful for keeping gates free for time-critical tours. At the Chair of Transport Systems and Logistics (VSL) of the University of Dortmund, this decision situation was modeled, within a research project funded by the Stiftung Industrieforschung, as a time-discrete multicommodity flow problem with unsplittable-flow conditions, with the two objectives integrated into a single one-dimensional objective function. The resulting mixed-integer linear program (MILP) was implemented and, for medium-sized scenarios, solved with the exact branch-and-cut method of the optimization solver CPLEX. In parallel, in a cooperation between the VSL chair and hafa Docking Systems, one of the world's leading manufacturers of industrial doors and dock levelers, a heuristic scheduling procedure and a dispatching control station named LoadDock Navigation were developed for the same planning task. The control station supports the optimal management of gate assignments in logistics facilities; it combines planning intelligence in the form of the heuristic scheduling procedure, technical innovations in dock equipment in the form of sensors, and the dispatcher's expert knowledge in a single tool. This article presents the mathematical model as well as the prototype with its integrated heuristic.
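As a hedged, much-reduced sketch of the time-indexed assignment idea (a single handling period per tour, invented distances and schedules, and none of the multicommodity-flow structure of the actual MILP), consider the following PuLP model combining the distance and earliness objectives in one weighted function.

```python
# Toy truck-to-gate assignment: each tour gets one gate and one period inside
# its arrival/departure window; gates hold one vehicle per period.  The
# distance per gate, the 0.1 earliness weight, and all schedules are invented.
import pulp

tours   = {"T1": (0, 3), "T2": (1, 4), "T3": (2, 5)}  # (arrival, departure)
gates   = ["G1", "G2"]
dist    = {"G1": 1.0, "G2": 2.5}   # internal transfer distance from each gate
periods = range(6)

m = pulp.LpProblem("dock_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (tours, gates, periods), cat="Binary")

# Weighted objective: transfer distance plus a small penalty for late handling.
m += pulp.lpSum(x[t][g][p] * (dist[g] + 0.1 * p)
                for t in tours for g in gates for p in periods)

for t, (arr, dep) in tours.items():
    # Handle each tour exactly once, at one gate, within its time window.
    m += pulp.lpSum(x[t][g][p] for g in gates for p in range(arr, dep)) == 1
    for g in gates:
        for p in periods:
            if p < arr or p >= dep:
                m += x[t][g][p] == 0
for g in gates:
    for p in periods:
        m += pulp.lpSum(x[t][g][p] for t in tours) <= 1   # one vehicle per gate

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([(t, g, p) for t in tours for g in gates for p in periods
       if x[t][g][p].value() == 1])
```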

Relevance: 100.00%

Abstract:

While revenue management (RM) is traditionally considered a tool of service operations, it shows considerable potential for application in manufacturing operations. The typical challenges in make-to-order manufacturing are fixed manufacturing capacities and a great variety of offered products, combined with pronounced fluctuations in demand and profitability. Since Harris and Pinder's work in the mid-1990s, numerous papers have furthered the understanding of RM theory in this environment. Nevertheless, the results to be expected from applying the developed methods in a practical industry setting have yet to be reported. To this end, this paper investigates a possible application of RM at ThyssenKrupp VDM, leading to considerable improvements in several areas.

Relevance: 100.00%

Abstract:

The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem, which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated equals the prescribed work content of each activity, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
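The distinctive modelling device here, a fixed total work content whose per-period usage may vary between bounds, can be sketched as follows: a one-activity toy in Python with PuLP, with invented numbers and without the precedence and cross-activity capacity constraints of the paper's full MILP.

```python
# Work-content sketch: the activity must receive exactly W resource-periods,
# each active period using between lo and hi units; minimise its finish time.
import pulp

T  = range(8)       # time periods (hypothetical horizon)
W  = 10             # prescribed work content (resource units x periods)
lo, hi = 1, 3       # per-period usage bounds while the activity is active

m = pulp.LpProblem("work_content", pulp.LpMinimize)
run = pulp.LpVariable.dicts("run", T, cat="Binary")   # activity active in t?
use = pulp.LpVariable.dicts("use", T, lowBound=0)     # resource units in t
end = pulp.LpVariable("end", lowBound=0)              # finish-time proxy

m += end
for t in T:
    m += use[t] >= lo * run[t]           # usage bounds apply only when active
    m += use[t] <= hi * run[t]
    m += end >= (t + 1) * run[t]         # finish time covers active periods
m += pulp.lpSum(use[t] for t in T) == W  # total allocation = work content

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(end), [use[t].value() for t in T])
```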

Relevance: 100.00%

Abstract:

A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of pre-shift and post-shift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for the allocation of staff.

The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. The supply constraints were: (1) the total availability of each type of staff and the value of that staff member, determined relative to that type of staff's ability to perform the job function of an RN (e.g., the value of eight hours of an RN is 8 points, of an LVN 6 points); (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
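A hedged sketch of such a penalty-weighted distribution model in Python with PuLP; the units, availabilities, acuity demands, and penalty weights are invented, and the float-pool constraint is omitted.

```python
# Nursing-distribution sketch: integer staff assignment covering each unit's
# acuity points (RN shift = 8 points, LVN shift = 6 points, as in the text)
# with minimum-RN floors, minimising penalty-weighted staff usage.
import pulp

units   = ["ICU", "MedSurg"]
points  = {"RN": 8, "LVN": 6}         # value of one 8-hour shift, in points
avail   = {"RN": 6, "LVN": 4}         # available personnel per type
need    = {"ICU": 40, "MedSurg": 30}  # acuity points demanded per unit
min_rn  = {"ICU": 2, "MedSurg": 1}    # minimum RNs per unit
penalty = {"RN": 1.0, "LVN": 0.8}     # allocation-priority weights

m = pulp.LpProblem("nurse_distribution", pulp.LpMinimize)
n = pulp.LpVariable.dicts("n", (points, units), lowBound=0, cat="Integer")

m += pulp.lpSum(penalty[s] * n[s][u] for s in points for u in units)
for u in units:
    m += pulp.lpSum(points[s] * n[s][u] for s in points) >= need[u]  # acuity
    m += n["RN"][u] >= min_rn[u]                                     # RN floor
for s in points:
    m += pulp.lpSum(n[s][u] for u in units) <= avail[s]              # supply

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({(s, u): int(n[s][u].value()) for s in points for u in units})
```

Adding a dollar coefficient to the objective, as the study's final extension does, would only change the penalty vector.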

Relevance: 100.00%

Abstract:

This paper deals with an event-bus tour booked by Bollywood film fans. During the tour, the participants visit selected locations of famous Bollywood films at various sites in Switzerland. Moreover, the tour includes stops for lunch and shopping. Each day, up to five buses operate the tour; for organizational reasons, two or more buses cannot stay at the same location simultaneously. The planning problem is how to compute a feasible schedule for each bus such that the total waiting time (primary objective) and the total travel time (secondary objective) are minimized. We formulate this problem as a mixed-integer linear program, and we report on computational results obtained with the Gurobi solver.
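A hedged gurobipy sketch of the central no-overlap scheduling idea, cut down to two buses and two locations with invented durations and travel times; the secondary travel-time objective and the real tour data are omitted.

```python
# Tour scheduling with pairwise no-overlap at each location: start times plus
# waiting slacks, big-M disjunctions ordering any two buses at a location, and
# total waiting time as the (primary) objective.  All data are hypothetical.
import gurobipy as gp
from gurobipy import GRB

buses, locs = range(2), range(2)
dur     = [2.0, 1.5]    # visit duration at each location
travel  = [1.0]         # travel time between consecutive locations
release = [0.0, 0.5]    # earliest tour start per bus
M = 100.0               # big-M for the disjunctive constraints

m = gp.Model("bus_tour")
m.Params.OutputFlag = 0
s = m.addVars(buses, locs, lb=0.0, name="start")  # visit start times
w = m.addVars(buses, locs, lb=0.0, name="wait")   # waiting before each visit

for b in buses:
    m.addConstr(s[b, 0] == release[b] + w[b, 0])
    for l in range(1, len(locs)):
        m.addConstr(s[b, l] == s[b, l - 1] + dur[l - 1] + travel[l - 1] + w[b, l])

# No two buses at the same location at the same time.
y = m.addVars([(b1, b2, l) for b1 in buses for b2 in buses if b1 < b2
               for l in locs], vtype=GRB.BINARY, name="order")
for b1, b2, l in y:
    m.addConstr(s[b1, l] + dur[l] <= s[b2, l] + M * (1 - y[b1, b2, l]))
    m.addConstr(s[b2, l] + dur[l] <= s[b1, l] + M * y[b1, b2, l])

m.setObjective(w.sum(), GRB.MINIMIZE)   # primary objective: total waiting
m.optimize()
print([(b, l, s[b, l].X) for b in buses for l in locs])
```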

Relevance: 100.00%

Abstract:

We present a real-world problem that arises in security threat detection applications. The problem consists of deploying mobile detectors on moving units that follow predefined routes; examples of such units are buses, coaches, and trolleys. Due to a limited budget, not all available units can be equipped with a detector. The goal is to equip a subset of units such that the utility of the resulting coverage is maximized. Existing methods for detector deployment are designed to place detectors in fixed locations and are therefore not applicable to the problem considered here. We formulate the planning problem as a binary linear program and present a coverage heuristic for generating effective deployments in short CPU time. The heuristic has theoretical performance guarantees for important special cases of the problem. The effectiveness of the coverage heuristic is demonstrated in a computational analysis based on 28 instances that we derived from real-world data.
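The budgeted maximum-coverage structure of the problem invites a greedy rule: repeatedly equip the unit with the best uncovered-utility-per-cost ratio. The sketch below implements that classic rule on invented routes; it is in the same family as, but not necessarily identical to, the paper's coverage heuristic.

```python
# Greedy budgeted-coverage sketch: pick, while budget remains, the unit whose
# route adds the most uncovered utility per unit of cost.  Data are invented.
routes = {                      # unit -> zones its route covers
    "bus1":    {"A", "B", "C"},
    "bus2":    {"B", "D"},
    "coach1":  {"C", "D", "E"},
    "trolley": {"E", "F"},
}
utility = {"A": 3, "B": 1, "C": 2, "D": 4, "E": 2, "F": 1}
cost    = {"bus1": 2, "bus2": 1, "coach1": 2, "trolley": 1}
budget  = 3

covered, chosen, spent = set(), [], 0
while True:
    best, best_ratio = None, 0.0
    for u, zones in routes.items():
        if u in chosen or spent + cost[u] > budget:
            continue
        ratio = sum(utility[z] for z in zones - covered) / cost[u]
        if ratio > best_ratio:
            best, best_ratio = u, ratio
    if best is None:            # nothing affordable improves coverage
        break
    chosen.append(best)
    covered |= routes[best]
    spent += cost[best]

print(chosen, "covered utility:", sum(utility[z] for z in covered))
```

For submodular objectives of this kind, greedy rules of this form are what typically yield constant-factor guarantees of the sort the abstract alludes to.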

Relevance: 100.00%

Abstract:

Data compiled within the IMPENSO project. The Impact of ENSO on Sustainable Water Management and the Decision-Making Community at a Rainforest Margin in Indonesia (IMPENSO), http://www.gwdg.de/~impenso, was a German-Indonesian research project (2003-2007) that studied the impact of ENSO (El Niño-Southern Oscillation) on the water resources and the agricultural production in the Palu River watershed in Central Sulawesi. ENSO is a mode of climate variability that causes serious droughts in Indonesia and other countries of South-East Asia. The last ENSO event occurred in 1997. As in other regions, many farmers in Central Sulawesi suffered from reduced crop yields and lost their livestock. A better prediction of ENSO and the development of coping strategies would help local communities mitigate its impact on rural livelihoods and food security. The project consists of three interrelated sub-projects: Sub-project A studies the local and regional manifestation of ENSO using the regional climate models REMO and GESIMA; Sub-project B quantifies the impact of ENSO on the availability of water for agriculture and other uses, using the distributed hydrological model WaSiM-ETH; and Sub-project C analyzes the socio-economic impact and the policy implications of ENSO on the basis of a production function analysis, a household vulnerability analysis, and a linear programming model. The models used in the three sub-projects are integrated to simulate joint scenarios that are defined in collaboration with local stakeholders and are relevant for the design of coping strategies.

Relevance: 100.00%

Abstract:

Koopman et al. (2014) developed a method to consistently decompose gross exports in value-added terms that accommodates the infinite repercussions of international and inter-sector transactions. This provides a better understanding of trade in value added in global value chains than the conventional gross-exports accounting, which suffers from double-counting problems. However, the framework is based on monetary input-output (IO) tables and cannot distinguish prices from quantities; thus, it is unable to consider financial adjustments through the exchange market. In this paper, we propose a framework based on a physical IO system, characterized by its linear programming equivalent, that can clarify the various complexities relevant to the existing indicators and is proved to be consistent with Koopman's results when the physical decompositions are evaluated in monetary terms. While international monetary tables are typically described in current U.S. dollars, the physical framework can elucidate the impact of price adjustments through the exchange market. An iterative procedure to calculate the exchange rates is proposed, and we show that the physical framework is also convenient for considering indicators associated with greenhouse gas (GHG) emissions.
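As a minimal numerical aside on the monetary IO machinery that the paper generalises, the numpy sketch below computes the value added embodied in gross exports through the Leontief inverse for an invented two-sector table; the paper's physical system and exchange-rate iteration are beyond this toy.

```python
# Value added embodied in exports, monetary IO version: gross output needed
# for the exports via (I - A)^-1, then value-added coefficients applied.
import numpy as np

A = np.array([[0.2, 0.3],        # input coefficients (invented 2-sector table)
              [0.1, 0.4]])
exports = np.array([100.0, 50.0])

L = np.linalg.inv(np.eye(2) - A)          # Leontief inverse
va_coeff = 1.0 - A.sum(axis=0)            # value-added share per unit output
gross_output = L @ exports                # output required by the exports
va_in_exports = va_coeff * gross_output   # embodied value added, by sector

print(va_in_exports, va_in_exports.sum())  # the sum equals total export value
```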

Relevance: 100.00%

Abstract:

The safety assessment of historic masonry structures is an open problem: the material is heterogeneous and anisotropic, the pre-existing state of stress is hard to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis is applicable to this kind of structure, and it has been considered a suitable tool ever since. In cases where no sliding occurs, the standard limit analysis theorems constitute an excellent tool owing to their simplicity and robustness: it is enough to find any equilibrium solution that satisfies the limit conditions of the material, in the certainty that its load will be less than or equal to the actual load at the onset of collapse, so the actual stress state need not be known. Furthermore, this onset-of-collapse load is unique (uniqueness theorem), and it can be obtained as the optimum of either one of a pair of dual convex mathematical programs.

However, if the onset-of-collapse mechanisms may involve sliding, any solution must satisfy both the static and the kinematic constraints, as well as a special kind of disjunctive constraints linking the two, which can be formulated as complementarity constraints. In this latter case the existence of a unique solution is not guaranteed, so other methods are needed to treat the uncertainty associated with this multiplicity. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This method is easy to state mathematically but computationally intractable, owing to the complementarity constraints 0 ≤ y ⊥ z ≥ 0, which are neither convex nor smooth: the resulting decision problem is NP-complete, and the corresponding global optimization problem is NP-hard. Nevertheless, obtaining one solution (with no guarantee of success) is an affordable problem. This thesis proposes to solve the problem by sequential linear programming, taking advantage of the special structure of the complementarity constraints, which in bilinear form read y·z = 0, y ≥ 0, z ≥ 0, and of the fact that the complementarity error in this bilinear form is an exact penalty function.

When it comes to finding the worst solution, however, the equivalent global optimization problem is intractable (NP-hard). Moreover, as long as no maximum or minimum principle has been proved, it is doubtful that the effort spent on approximating this minimum is justified. Chapter 5 therefore determines, on a simple example, the frequency distribution of the load factor over all possible onset-of-collapse solutions, by Monte Carlo sampling of solutions checked against an exact polytope-computation method. The ultimate goal is to establish to what extent the search for the global minimum is justified, and to propose an alternative, probability-based approach to safety assessment. The frequency distributions of the load factors obtained for the case studied show that both the maximum and the minimum load factors are very infrequent, and the more so the more perfect and continuous the contact is.

These results confirm the interest of developing new probabilistic methods. Chapter 6 proposes such a method, based on obtaining multiple solutions from random starting points and qualifying the results through order statistics; the purpose is to determine, for each solution, the probability of the onset of collapse. The method is applied, following the expectation reduction proposed by ordinal optimization, to obtain a solution that lies within a given percentage of the worst ones. Finally, Chapter 7 proposes hybrid methods incorporating metaheuristics for the cases in which the search for the global minimum is justified.
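The Chapter 6 procedure (multiple random starts qualified by order statistics) rests on a simple probabilistic fact that the sketch below illustrates with a stand-in sampler: the beta-distributed "load factors" are invented, and in the real method each sample would come from solving the complementarity-constrained program from a random starting point.

```python
# Order-statistics sketch: after N independent samples, the sample minimum lies
# within the worst alpha-fraction of solutions with probability 1-(1-alpha)^N.
import numpy as np

rng = np.random.default_rng(0)

def sample_load_factor():
    # Stand-in for "solve the onset-of-collapse problem from a random start";
    # the distribution is purely hypothetical.
    return 1.0 + rng.beta(2.0, 5.0)

N = 200
samples = np.array([sample_load_factor() for _ in range(N)])
alpha = 0.05
confidence = 1.0 - (1.0 - alpha) ** N
print(f"sample minimum: {samples.min():.3f}")
print(f"P(minimum lies in the worst {alpha:.0%} of solutions) >= {confidence:.6f}")
```

This is the sense in which a heavy global search can be replaced by a quantified probabilistic statement about how deep into the worst tail a sampled solution sits.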

Relevance: 100.00%

Abstract:

The agricultural areas of the Arroyos Menores basins (La Colacha), west and south of Río Cuarto (Province of Córdoba, Argentina), are very fertile but suffer high soil losses. Extreme rainfall events, floods, and other severe gully-forming erosion urgently demand action in this area to avoid soil degradation while sustaining good levels of agricultural production. The authors first improved the hydrologic data on La Colacha, evaluated the systems of soil use and the actions that could be recommended considering the relevant aspects of the study area, and applied decision support systems (DSS) with mathematical tools for planning the defences and soil uses in these areas. This was done using multi-criteria decision making (MCDM) models: first discrete MCDM to choose among global types of soil use, and then continuous MCDM to evaluate and optimise combined actions, including the repartition of soil uses and the necessary levels of soil-conservation and hydraulic-management works to protect these basins against erosion. Relatively global solutions for the La Colacha area have been defined and were optimised by Linear Programming in Goal Programming forms, presented as Weighted or Lexicographic Goal Programming and as Compromise Programming. The decision methods used are described, the algorithms employed are indicated, and examples are given for some representative scenarios of the La Colacha area.
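A hedged weighted-goal-programming sketch in Python with PuLP, just to fix the form of the deviation-variable models mentioned above; the two goals (soil loss and income), the coefficients, and the targets are all invented.

```python
# Weighted goal programming sketch: minimise weighted deviations from a soil
# loss target (overshoot) and an income target (shortfall).  Data are invented.
import pulp

uses    = ["pasture", "cereal"]
erosion = {"pasture": 1.0, "cereal": 4.0}   # soil loss per hectare
income  = {"pasture": 2.0, "cereal": 5.0}   # net income per hectare
land = 100.0
erosion_target, income_target = 200.0, 380.0
w_erosion, w_income = 2.0, 1.0              # goal weights

m = pulp.LpProblem("goal_programming", pulp.LpMinimize)
x = pulp.LpVariable.dicts("ha", uses, lowBound=0)
over  = pulp.LpVariable("erosion_over", lowBound=0)   # erosion above target
under = pulp.LpVariable("income_under", lowBound=0)   # income below target

m += w_erosion * over + w_income * under
m += pulp.lpSum(x[u] for u in uses) <= land
m += pulp.lpSum(erosion[u] * x[u] for u in uses) - over <= erosion_target
m += pulp.lpSum(income[u] * x[u] for u in uses) + under >= income_target

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({u: x[u].value() for u in uses}, over.value(), under.value())
```

Lexicographic goal programming would instead optimise the deviations in priority order, and compromise programming would minimise a distance to the ideal point; both reuse the same deviation variables.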

Relevance: 100.00%

Abstract:

We introduce a dominance intensity measuring method to derive a ranking of alternatives in multi-criteria decision-making problems with incomplete information, on the basis of multi-attribute utility theory (MAUT) and fuzzy set theory. We consider the situation where there is imprecision concerning the decision-makers' preferences, and imprecise weights are represented by trapezoidal fuzzy weights. The proposed method is based on the dominance values between pairs of alternatives. These values can be computed by linear programming, as an additive multi-attribute utility model is used to rate the alternatives. Dominance values are then transformed into dominance intensity measures, which are used to rank the alternatives under consideration. Distances between fuzzy numbers based on the generalization of the left and right fuzzy numbers are utilized to account for fuzzy weights. An example concerning the selection of intervention strategies to restore an aquatic ecosystem contaminated by radionuclides illustrates the approach. Monte Carlo simulation techniques have been used to show that the proposed method performs well for different imprecision levels in terms of a hit ratio and a rank-order correlation measure.
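Since the dominance value of one alternative over another is obtained by linear programming over the imprecise weights, here is a hedged sketch with scipy.optimize.linprog that uses simple interval weights (a simplification of the paper's trapezoidal fuzzy weights); the utilities and weight bounds are invented.

```python
# Pairwise dominance value sketch: minimise u(A_i) - u(A_k) over additive
# weights that sum to one and lie in given intervals.  Data are hypothetical.
from scipy.optimize import linprog

u_i  = [0.8, 0.4, 0.6]    # attribute utilities of alternative A_i
u_k  = [0.5, 0.7, 0.5]    # attribute utilities of alternative A_k
w_lo = [0.2, 0.2, 0.2]    # lower weight bounds
w_hi = [0.6, 0.6, 0.6]    # upper weight bounds

c = [ui - uk for ui, uk in zip(u_i, u_k)]   # minimise sum_j w_j (u_ij - u_kj)
res = linprog(c, A_eq=[[1, 1, 1]], b_eq=[1.0], bounds=list(zip(w_lo, w_hi)))
print("dominance value of A_i over A_k:", res.fun)  # >= 0 would mean dominance
```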