938 results for Integer programming, Constraint programming, Sugarcane rail, Job shop


Relevance:

100.00%

Publisher:

Abstract:

In today’s competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need for efficient methods to solve complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems and are usually associated with cleaning operations and with changing tools and shapes on machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times and delivery times, where the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
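To make the first problem concrete, the minimal Python sketch below simulates one such priority dispatching rule on a single machine with release dates, sequence-dependent setups and delivery times; the rule (largest delivery time first among released jobs), the job data and the setup matrix are illustrative assumptions, not the rules studied in the paper.

```python
# Minimal sketch of a priority dispatching rule for 1 | r_j, s_ij, q_j | Lmax.
# Jobs: (release date r, processing time p, delivery time q); setup[i][j] is the
# sequence-dependent setup incurred when job j follows job i (row 0 = idle machine).
jobs = [(0, 5, 7), (2, 3, 9), (4, 4, 2)]          # illustrative data
setup = [[0, 1, 2, 1],
         [0, 0, 2, 3],
         [0, 1, 0, 2],
         [0, 2, 1, 0]]                            # setup[i+1][j+1] for job i -> job j

def dispatch_longest_delivery(jobs, setup):
    """Greedy dispatch: among released jobs, pick the largest delivery time q."""
    t, prev, unscheduled, worst = 0, 0, set(range(len(jobs))), float("-inf")
    while unscheduled:
        released = [j for j in unscheduled if jobs[j][0] <= t]
        if not released:                          # machine idles until the next release
            t = min(jobs[j][0] for j in unscheduled)
            continue
        j = max(released, key=lambda k: jobs[k][2])
        r, p, q = jobs[j]
        t = max(t, r) + setup[prev][j + 1] + p    # setup depends on the previous job
        worst = max(worst, t + q)                 # objective proxy: completion + delivery time
        prev, unscheduled = j + 1, unscheduled - {j}
    return worst

print("max(C_j + q_j) under the rule:", dispatch_longest_delivery(jobs, setup))
```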

Relevance:

100.00%

Publisher:

Abstract:

Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations, forming production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, which aims to establish fixed planning, production and delivery cycles for the whole production unit. PFA is traditionally applied to job shops with functional layouts, and after reorganization into groups, lead times fall, quality improves and personnel motivation increases. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, through real cases, that PFA can be applied not only to job-shop and assembly operations but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks and diminishes process variability, all of which contribute to efficient operations management.
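The cluster-finding step that PFA relies on can be illustrated with rank order clustering on a small machine-part incidence matrix; the sketch below uses made-up routing data and is only meant to show how reordering rows and columns exposes block-diagonal cells.

```python
# Rank order clustering (ROC) sketch on a machine-part incidence matrix:
# rows = machines, columns = parts, 1 = the part's routing visits that machine.
matrix = [
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
]

def rank_order_clustering(m):
    rows, cols = list(range(len(m))), list(range(len(m[0])))
    while True:
        # Sort rows by the binary word formed by their entries (descending).
        new_rows = sorted(rows, key=lambda r: [m[r][c] for c in cols], reverse=True)
        # Then sort columns by the binary word read top-to-bottom (descending).
        new_cols = sorted(cols, key=lambda c: [m[r][c] for r in new_rows], reverse=True)
        if new_rows == rows and new_cols == cols:      # stable ordering reached
            return new_rows, new_cols
        rows, cols = new_rows, new_cols

machine_order, part_order = rank_order_clustering(matrix)
print("machine order:", machine_order, "part order:", part_order)
for r in machine_order:                                # block-diagonal cells appear
    print([matrix[r][c] for c in part_order])
```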

Relevance:

100.00%

Publisher:

Abstract:

This literature and theory review, carried out between 2006 and 2010 on behalf of a systems-supplier company in the machine-shop industry operating in Central Finland, aimed to form an overall picture of the broad field of production planning and control. The basic research questions concerned the so-called MPC system, meaning that production planning and control issues must always take into account the whole formed by people, the organization, technologies and processes. The task of operations management is to balance demand for and supply of the company's products so that as few resources as possible are used and needed in meeting demand while taking customer requirements into account. On the basis of the production strategy it must be possible to build an MPC system with which, and by developing which, production can reach the performance targets set for it in terms of cost, quality, speed, reliability and productivity development. Through a general three-level framework, the work examined, among the "traditional basic MPC solutions", hierarchical, planning- and computation-intensive MRP-based methods as well as JIT/Lean methods based on simplification and speed. This framework comprises: 1) demand and resource management, 2) more detailed capacity and materials management, and 3) more detailed production and procurement control together with the shop-floor level of production. As "new waves and perspectives" in management and in the development of MPC systems, the report also discussed different schools of management and the information systems required on the basis of the above framework. The most essential conclusion was that, in addition to MRP-based solutions, companies in the discrete manufacturing industry, especially those making complex products to order, may also need to make use of more advanced planning and control systems. It was also noted that, alongside "traditional strategies", companies must elevate information and communication technology strategies. It is important to understand that the perfect MPC system has not yet been invented: it remains each company's task and responsibility to form "its own truth" and to build its system on that basis.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this work is to develop the production planning and control of a small and medium-sized machine-shop company and to carry out a controlled implementation of a production control system. The goal is to improve the company's capacity management and delivery-date reliability, and to develop the daily control and coordination of production. The work applies process-management mapping methods and process improvement tools to the development of the production control process and to the implementation of the production control system. In addition, based on the literature, the suitability of different production control principles for customer-driven, flexible machine-shop production is examined. The work was carried out as qualitative action research. Implementing a production control system makes it possible to develop capacity control and the coordination of production. This, however, requires describing the production control process and clearly defining the roles and responsibilities of the people involved in control. It is particularly critical to get the whole organization to take a positive attitude toward the change.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

This work aims to define and measure technological complexity in order to build tools that support everyone involved in the development and manufacture of an industrial product, such as product designers and production managers. The research was developed through the phases described below. Analysis of the state of the art on definitions and measures of complexity in the industrial field, through the identification and study of more than a hundred publications on the subject. Classification of the methods proposed in the literature for measuring complexity into five categories, and critical analysis of the strengths and weaknesses of the different methods in order to guide the development of a new method. The main Artificial Intelligence methods were also analysed as potential tools for computing complexity. Investigation of topics related to complexity, such as indicators, technology transfer and innovation. Complexity is measured in terms of an index belonging to the category of indicators, which are used in many industrial fields, in particular for measuring production performance. In particular, the meaning and use of OEE (Overall Equipment Effectiveness) was examined in depth; OEE is especially widespread among small and medium-sized enterprises in Emilia-Romagna and, in general, among companies using a job-shop production system. An effective OEE calculation system was implemented at a local mechanical company. The complexity index finds one of its most interesting applications in technology transfer operations: introducing an innovation generally means increasing the complexity of the system, so the two concepts are connected. Several company cases of technology transfer and of production performance measurement were examined, highlighting the links and the influence of technological complexity on companies' choices. Development of a new method for computing a product technological complexity index, starting from the hybrid entropy-based methodology proposed by Prof. ElMaraghy and Urbanic in 2003; attention was focused on replacing, in the original formula, values obtained through operator interviews, and therefore subjective, with objective values. Experimental verification of the validity of the new methodology through application of the formula to some mechanical components, thanks to the collaboration of a mechanical manufacturing company. Considerations and conclusions on the results obtained, on the proposed methodology and on the applications of the new index, outlining the objectives of the next stage of the research. Throughout the work, connections and convergences among the different sources were highlighted, and concepts and theories providing important insights into the subject of complexity were identified in several fields. Particular attention was devoted to the complete bibliography of Profs. ElMaraghy, currently recognized internationally as the most authoritative scholars on complexity in the industrial field.
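Since the work leans on OEE as its reference indicator, a minimal sketch of the standard OEE decomposition (availability × performance × quality) may help; the figures below are invented for illustration and do not come from the company case mentioned in the abstract.

```python
# Standard OEE decomposition: OEE = Availability x Performance x Quality.
planned_time = 480.0   # planned production time in minutes (illustrative)
downtime     = 60.0    # unplanned stops
ideal_cycle  = 0.5     # ideal minutes per piece
total_pieces = 700     # pieces produced
good_pieces  = 665     # pieces passing quality control

run_time     = planned_time - downtime
availability = run_time / planned_time                   # share of planned time actually running
performance  = (ideal_cycle * total_pieces) / run_time   # actual vs. ideal speed
quality      = good_pieces / total_pieces                 # first-pass yield
oee          = availability * performance * quality

print(f"Availability {availability:.2%}, Performance {performance:.2%}, "
      f"Quality {quality:.2%}, OEE {oee:.2%}")
```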

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE. The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Decision makers recurrently face the dichotomous question of following a traditional sequential decision-making process, where the output of one decision is used as the input of the next stage, or following a joint decision-making approach where several decisions are taken simultaneously. The chosen decision-making process affects different players in the organization, and the right approach remains difficult to identify even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal for academics and practitioners. Management scientists use different techniques and approaches to improve different types of decisions, with the purpose of using the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts may help solve the managerial problems faced every day in different companies. As a result of this research, different decision models are presented as a contribution to the body of knowledge of management science. The first models focus on the manufacturing industry and the second group on the health care industry. Although these models are case specific, they serve to exemplify that different approaches to the same problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems; the same model may deliver good results for some data and poor results for other data. A framework for analysing the data before selecting the model to be used is therefore presented and tested on the models developed to exemplify these ideas.

METHODOLOGY. As the first step of the research, a systematic literature review on joint decision making is presented, together with the opinions and suggestions of different scholars. In the next stage of the thesis, the decision-making process of more than 50 companies from different sectors was analysed in the production planning area at the job-shop level; the data were obtained through surveys and face-to-face interviews. The following part of the research was carried out in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of car assembly, combining the vehicle routing problem with inventory. The next step was to add the car production scheduling (car sequencing) decision and to use metaheuristics such as ant colony optimization and genetic algorithms to measure whether the behaviour holds for problem instances of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to process the work and a job schedule has to be built; for this problem, however, simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the scheduling of operating rooms was analysed. The first part of this research was carried out in a teaching hospital, and in the second part the interaction with uncertainty was added. Once the previous problems had been analysed, a general framework to characterize problem instances was built. The final chapter presents a general conclusion.

FINDINGS AND PRACTICAL IMPLICATIONS. The first contribution is an update of the decision-making literature review, together with an analysis of the possible savings resulting from a change in the decision process. The survey results reveal a lack of consistency between what managers believe and the actual degree of integration of their decisions. The next stage of the thesis contributes to the body of knowledge of operations research with a joint solution of the replenishment, sequencing and inventory problem on the assembly line, together with parallel work on operating room scheduling where different solution approaches are presented. Beyond the contribution of the solution methods themselves, the main contribution is the framework proposed to pre-evaluate a problem before deciding on the techniques to solve it. There is, however, no straightforward answer as to whether joint or sequential solutions are better. Following the proposed framework and evaluating factors such as the flexibility of the answer, the number of actors and the tightness of the data gives important hints about the most suitable direction for tackling the problem.

RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH. In the first part of the work it was very difficult to calculate the possible savings of different projects, since many papers do not report these quantities or base the impact on non-quantifiable benefits; another issue is the confidentiality of many projects whose data cannot be presented. For the car assembly line problem, more computational power would allow bigger instances to be solved. For the operating room problem there was a lack of historical data to perform a parallel analysis in the teaching hospital. To keep testing the decision framework it is necessary to apply it to more case studies, in order to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities: despite the recent awareness of the need to improve the decision-making process, there is still much room for improvement. Another big difference with respect to the automotive industry is that the latest improvements are not spread among all the actors. Therefore, future research will focus more on collaboration between academia and the health care sector.

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes an investigation by the author into the spares operation of CompAir BroomWade Ltd. While the complete system, including the warehousing and distribution functions, was investigated, the thesis concentrates on the provisioning aspect of the spares supply problem. Analysis of the historical data showed the presence of significant fluctuations in all the measures of system performance. Two Industrial Dynamics simulation models were developed to study this phenomenon. The models showed that any fluctuation in end-customer demand would be amplified as it passed through the distributor and warehouse stock control systems, and the available historical data supported this view of the system's operation. The models were used to determine which parts of the total system could be expected to exert a critical influence on its performance. The lead time parameters of the supply sector were found to be critical, and further study showed that the manner in which the lead time changed with work-in-progress levels was also an important factor. The problem therefore resolved into the design of a spares manufacturing system which exhibited the appropriate dynamic performance characteristics. The gross level of entity representation inherent in the Industrial Dynamics methodology was found to limit the value of these models in developing detailed design proposals. Accordingly, an interacting job-shop simulation package was developed to allow detailed evaluation of the effect of organisational factors on the performance characteristics of a manufacturing system. The package was used to develop a design for a pilot spares production unit. The need for a manufacturing system to perform successfully under conditions of fluctuating demand is not limited to the spares field; thus, although the spares exercise provides an example of the approach, the concepts and techniques developed can be considered to have broad application throughout the batch manufacturing industry.
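The demand-amplification effect that the models revealed can be reproduced with a very small discrete-time stock-control simulation; the order-up-to rule, parameters and two-echelon structure below are illustrative assumptions, not the thesis's actual Industrial Dynamics models.

```python
# Toy two-echelon stock-control simulation: a step in end-customer demand is
# amplified in the orders placed upstream by the distributor and the warehouse.
def simulate(periods=20, step_at=5):
    demand = [10 if t < step_at else 14 for t in range(periods)]   # end-customer demand
    echelons = [{"stock": 40, "target_cover": 3, "orders": []} for _ in range(2)]
    upstream_demand = demand
    for e in echelons:            # echelon 0 = distributor, echelon 1 = warehouse
        orders = []
        for d in upstream_demand:
            e["stock"] -= d                               # ship what is demanded
            target = e["target_cover"] * d                # cover rule: 3 periods of current demand
            order = max(0, d + target - e["stock"])       # order-up-to policy
            e["stock"] += order                           # simplification: immediate delivery
            orders.append(order)
        e["orders"] = orders
        upstream_demand = orders                          # next echelon sees these orders
    return demand, echelons[0]["orders"], echelons[1]["orders"]

customer, distributor, warehouse = simulate()
print("peak customer demand   :", max(customer))
print("peak distributor orders:", max(distributor))
print("peak warehouse orders  :", max(warehouse))
```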

Relevance:

100.00%

Publisher:

Abstract:

This work sets out to demonstrate how the PHC Manufactor system suits the company under study, Ciclo Fapril, presenting the planning options it offers, the difficulties the company will face and, where possible, what to do to overcome the obstacles posed by the system. In a second part, several heuristics are studied, namely FIFO, Processing Time, EDD, MOR and LOR, to understand which one best suits the company so that agreed deadlines can be met. The heuristic with the best results was then used, and some changes were made to the processing times of the work centres to improve their capacity to respond to orders. At the end of this study it was concluded that scheduling by EDD was the best fit for the company. It was also found that the work centres AS and AT have the lowest productivity, and for this reason their productivity should be increased in order to raise overall productivity.
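A rough illustration of how such heuristics can be compared is sketched below on a single work centre with invented job data; "Processing Time" is taken here as shortest processing time first, which is an assumption, and MOR/LOR are omitted because they would additionally need routing information.

```python
# Compare simple dispatching heuristics on one work centre by maximum lateness.
# Each job: (arrival order, processing time, due date) -- illustrative data only.
jobs = [(0, 6, 8), (1, 2, 20), (2, 8, 16), (3, 4, 28), (4, 3, 11)]

rules = {
    "FIFO": lambda j: j[0],              # first in, first out
    "SPT":  lambda j: j[1],              # shortest processing time first
    "EDD":  lambda j: j[2],              # earliest due date first
}

def max_lateness(sequence):
    t, worst = 0, float("-inf")
    for _, p, due in sequence:
        t += p
        worst = max(worst, t - due)      # lateness = completion time - due date
    return worst

for name, key in rules.items():
    print(f"{name}: max lateness = {max_lateness(sorted(jobs, key=key))}")
```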

Relevance:

100.00%

Publisher:

Abstract:

Part 4: Transition Towards Product-Service Systems

Relevance:

80.00%

Publisher:

Abstract:

A High-Performance Computing (HPC) job dispatcher is a critical piece of software that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time they require cannot exceed a certain threshold without affecting normal system functioning. In addition, a job dispatcher must deal with considerable uncertainty in submission times, in the number of requested resources and in job durations. Heuristic-based techniques have been widely used in HPC systems, at the cost of achieving (sub-)optimal solutions in a short time; however, their scheduling and resource allocation components are separate, which produces decoupled decisions that may cause a performance loss. Optimization-based techniques are less used for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are used for modern applications, such as big data analytics and predictive model building, that in general submit many short jobs. This information is unknown at dispatching time, however, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to job dispatching problems, but state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions within a short period and integrating current and past information about the hosting system. For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications, generating on-line dispatching decisions within an appropriate time and making effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
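As a hint of what a CP formulation of the dispatching problem can look like, the sketch below builds a tiny cumulative-resource model with Google OR-Tools CP-SAT; the solver choice, job data, objective and time budget are assumptions of this illustration, not the dispatchers proposed in the work.

```python
# Tiny CP model of job dispatching: jobs request cores for a duration, the
# system has a fixed core capacity, and we minimize the sum of start times.
from ortools.sat.python import cp_model

jobs = [(3, 4), (2, 2), (5, 3), (1, 6)]   # (duration, requested cores) -- illustrative
capacity = 8
horizon = sum(d for d, _ in jobs)

model = cp_model.CpModel()
starts, intervals, demands = [], [], []
for i, (dur, cores) in enumerate(jobs):
    s = model.NewIntVar(0, horizon, f"start_{i}")
    e = model.NewIntVar(0, horizon, f"end_{i}")
    intervals.append(model.NewIntervalVar(s, dur, e, f"job_{i}"))
    starts.append(s)
    demands.append(cores)

model.AddCumulative(intervals, demands, capacity)   # never exceed available cores
model.Minimize(sum(starts))                         # proxy for total waiting time

solver = cp_model.CpSolver()
solver.parameters.max_time_in_seconds = 1.0         # on-line setting: hard time budget
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i, s in enumerate(starts):
        print(f"job {i} starts at t={solver.Value(s)}")
```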

Relevance:

70.00%

Publisher:

Abstract:

This thesis investigates Decomposition and Reformulation for solving Integer Linear Programming problems. The method is often very successful computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median and generalized assignment. Until now, however, the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, applicable as a black-box solution algorithm, and works as a complement and alternative to the usual solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe Decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (the slaves) and to work on the partially convexified polyhedron(s) obtained. For a given MIP several decompositions can be defined, depending on which sets of constraints we want to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (the master) and checking (pricing) whether a variable with negative reduced cost exists; if so, it is added to the master, which is solved again (column generation), otherwise the procedure stops. The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a branch-and-bound scheme (branch-and-price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a strong speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
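A compact way to see the master/pricing loop described above is the classic cutting-stock toy below; the sketch assumes PuLP with its bundled CBC solver is available (including dual values on the LP master), uses invented data, and is only a schematic of column generation, not the generic framework of the thesis.

```python
# Column generation for a toy cutting-stock problem: the master chooses how often
# each cutting pattern is used; pricing is a knapsack looking for a pattern with
# negative reduced cost, i.e. a new column to add to the master.
import pulp

roll_width = 10
sizes, demand = [3, 4, 5], [30, 20, 15]
patterns = [[roll_width // s if i == j else 0 for j, s in enumerate(sizes)]
            for i, _ in enumerate(sizes)]          # start with one trivial pattern per size

while True:
    # Restricted master LP: minimize the number of rolls used.
    master = pulp.LpProblem("master", pulp.LpMinimize)
    use = [pulp.LpVariable(f"x_{p}", lowBound=0) for p in range(len(patterns))]
    master += pulp.lpSum(use)
    for i, d in enumerate(demand):
        master += (pulp.lpSum(patterns[p][i] * use[p] for p in range(len(patterns)))
                   >= d), f"demand_{i}"
    master.solve(pulp.PULP_CBC_CMD(msg=0))
    duals = [master.constraints[f"demand_{i}"].pi for i in range(len(sizes))]

    # Pricing: knapsack maximizing the dual value packed into one roll.
    pricing = pulp.LpProblem("pricing", pulp.LpMaximize)
    a = [pulp.LpVariable(f"a_{i}", lowBound=0, cat="Integer") for i in range(len(sizes))]
    pricing += pulp.lpSum(duals[i] * a[i] for i in range(len(sizes)))
    pricing += pulp.lpSum(sizes[i] * a[i] for i in range(len(sizes))) <= roll_width
    pricing.solve(pulp.PULP_CBC_CMD(msg=0))

    if 1 - pulp.value(pricing.objective) >= -1e-6:     # no negative reduced cost: stop
        break
    patterns.append([int(round(v.value())) for v in a])   # add the new column

print("rolls needed (LP bound):", pulp.value(master.objective))
print("patterns:", patterns)
```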

Relevance:

60.00%

Publisher:

Abstract:

This paper addresses the problem of short-term hydro scheduling, particularly for head-dependent cascaded hydro systems. We propose a novel mixed-integer quadratic programming approach that considers not only head dependency but also discontinuous operating regions and discharge ramping constraints. An enhanced short-term hydro scheduling is thus provided, due to the more realistic modeling presented in this paper. Numerical results from two case studies, based on Portuguese cascaded hydro systems, illustrate the proficiency of the proposed approach.

Relevance:

60.00%

Publisher:

Abstract:

One of the most difficult problems faced by researchers experimenting with complex systems in real-world applications is the Facility Layout Design Problem. It concerns the design and location of production lines, machinery and equipment, inventory storage and shipping facilities. In this work the problem is addressed through the use of Constraint Logic Programming (CLP) technology. The use of Genetic Algorithms (GA) as an optimisation technique within the CLP environment is also addressed. The approach aims at implementing genetic algorithm operators following the CLP paradigm.
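As a hint of the GA side of such a hybrid, the sketch below evolves a single-row layout minimizing flow-weighted distance between facilities; the flow data, the operators and the absence of any CLP constraint propagation are simplifying assumptions of this illustration.

```python
# Tiny genetic algorithm for a single-row facility layout: a chromosome is a
# permutation of facilities, fitness is the flow-weighted sum of distances.
import random

random.seed(1)
flow = [[0, 5, 2, 4],        # material flow between facility pairs (illustrative)
        [5, 0, 3, 0],
        [2, 3, 0, 6],
        [4, 0, 6, 0]]
n = len(flow)

def cost(layout):
    # Distance between positions i and j in a single row is simply |i - j|.
    return sum(flow[layout[i]][layout[j]] * abs(i - j)
               for i in range(n) for j in range(n))

def crossover(p1, p2):
    # Order crossover: copy a slice from p1, fill the rest in p2's order.
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(layout):
    i, j = random.sample(range(n), 2)     # swap two facilities
    layout[i], layout[j] = layout[j], layout[i]

population = [random.sample(range(n), n) for _ in range(20)]
for _ in range(50):                        # generations
    population.sort(key=cost)
    parents = population[:10]              # simple truncation selection
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(10)]
    for c in children:
        if random.random() < 0.3:
            mutate(c)
    population = parents + children

best = min(population, key=cost)
print("best layout:", best, "cost:", cost(best))
```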

Relevance:

60.00%

Publisher:

Abstract:

In the energy management of the isolated operation of a small power system, the economic scheduling of the generation units is a crucial problem, and the right timing can maximize the performance of the supply. The optimal operation of a wind turbine, a solar unit, a fuel cell and a storage battery is determined by a mixed-integer linear programming model implemented in the General Algebraic Modeling System (GAMS). A Virtual Power Producer (VPP) can operate the generation units optimally while ensuring the proper functioning of the equipment, including maintenance, operation costs, and generation measurement and control. A central control system allows the VPP to manage the optimal generation and the associated load control. The application of the methodology to a real case study at Budapest Tech demonstrates the effectiveness of this method in solving the optimal isolated dispatch of the DC micro-grid renewable energy park. The problem converged in 0.09 s and 30 iterations.
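A much-reduced version of such a dispatch model can be written in a few lines; the sketch below uses PuLP rather than GAMS, with invented profiles and costs, only to show the shape of the mixed-integer formulation (renewable availability bounds, a battery energy balance, and a binary on/off decision for the fuel cell).

```python
# Minimal isolated micro-grid dispatch: wind, PV, fuel cell and a battery must
# cover the load in each hour at minimum fuel-cell cost.
import pulp

hours = range(4)
load       = [5.0, 7.0, 6.0, 8.0]     # kW, illustrative
wind_avail = [3.0, 2.0, 4.0, 1.0]     # maximum available renewable power per hour
pv_avail   = [0.0, 3.0, 2.0, 0.0]
fc_cost, fc_max, batt_cap = 0.4, 6.0, 5.0

m = pulp.LpProblem("microgrid_dispatch", pulp.LpMinimize)
wind = [pulp.LpVariable(f"wind_{t}", 0, wind_avail[t]) for t in hours]
pv   = [pulp.LpVariable(f"pv_{t}", 0, pv_avail[t]) for t in hours]
fc   = [pulp.LpVariable(f"fc_{t}", 0, fc_max) for t in hours]
on   = [pulp.LpVariable(f"on_{t}", cat="Binary") for t in hours]   # fuel-cell commitment
dis  = [pulp.LpVariable(f"dis_{t}", 0, batt_cap) for t in hours]   # battery discharge
chg  = [pulp.LpVariable(f"chg_{t}", 0, batt_cap) for t in hours]   # battery charge
soc  = [pulp.LpVariable(f"soc_{t}", 0, batt_cap) for t in hours]   # state of charge

m += pulp.lpSum(fc_cost * fc[t] for t in hours)                    # minimize fuel cost
for t in hours:
    m += wind[t] + pv[t] + fc[t] + dis[t] == load[t] + chg[t]      # hourly power balance
    m += fc[t] <= fc_max * on[t]                                   # fuel cell runs only if on
    prev = batt_cap / 2 if t == 0 else soc[t - 1]
    m += soc[t] == prev + chg[t] - dis[t]                          # battery energy balance

m.solve(pulp.PULP_CBC_CMD(msg=0))
for t in hours:
    print(f"h{t}: wind={wind[t].value():.1f} pv={pv[t].value():.1f} "
          f"fc={fc[t].value():.1f} batt={dis[t].value() - chg[t].value():+.1f}")
print("fuel-cell cost:", pulp.value(m.objective))
```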