879 results for priority dispatching rules
Abstract:
In today's competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need for efficient methods to solve complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems and are usually associated with cleaning operations and changing tools and shapes in machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times, and delivery times; the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
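The abstract does not list the specific dispatching rules studied, so the following is only a minimal sketch of one plausible rule for the first problem (single machine with release dates, sequence-dependent setups, and delivery times): whenever the machine becomes free, dispatch the released job with the largest delivery time, breaking ties by the smallest setup from the previous job. The job names, data layout, and setup table are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    release: float    # release date r_j
    proc: float       # processing time p_j
    delivery: float   # delivery time q_j

def dispatch_max_delivery(jobs, setup):
    """Greedy dispatching sketch: whenever the machine is free, pick the
    released job with the largest delivery time, breaking ties by the
    smaller setup time from the previously sequenced job.
    setup[(a, b)] is the setup time when job b follows job a; the key
    (None, b) gives the initial setup.  Returns the sequence and
    max_j (C_j + q_j), the delivery-completion objective."""
    pending = list(jobs)
    t, prev, seq, obj = 0.0, None, [], 0.0
    while pending:
        released = [j for j in pending if j.release <= t]
        if not released:                    # machine idles until the next release
            t = min(j.release for j in pending)
            continue
        nxt = max(released,
                  key=lambda j: (j.delivery, -setup.get((prev, j.name), 0.0)))
        t += setup.get((prev, nxt.name), 0.0) + nxt.proc
        obj = max(obj, t + nxt.delivery)
        seq.append(nxt.name)
        prev = nxt.name
        pending.remove(nxt)
    return seq, obj

# Hypothetical 3-job instance.
jobs = [Job("A", 0, 3, 5), Job("B", 1, 2, 8), Job("C", 0, 4, 2)]
setup = {(None, "A"): 1, (None, "C"): 0, ("A", "B"): 2, ("B", "C"): 3}
print(dispatch_max_delivery(jobs, setup))   # (['A', 'B', 'C'], 17.0)
```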
Abstract:
In the contemporary business environment, adherence to customer needs has caused the shift from mass production to mass customization, which requires the supply chain (SC) to be both effective and flexible. The purpose of this paper is to seek flexibility through the adoption of family-based dispatching rules under the influence of the inventory system implemented at the downstream echelons of an industrial supply chain network. We compare the family-based dispatching rules in the existing literature with respect to the inventory system and information sharing within a supply chain network. The dispatching rules are compared on average flow time, averaged over three product families, and performance is measured through extensive discrete-event simulation. Given the various inventory-related operational factors at the downstream echelons, the paper highlights the importance of strategically adopting an appropriate family-based dispatching rule at the manufacturing end. In a mass-customization environment, it becomes imperative to adopt the family-based dispatching rule from a system-wide SC perspective, which warrants both intra- and inter-echelon information coordination. The holonic paradigm emerges in this research stream, combining a holistic approach with the vital systemic approach. The novelty of the present research is threefold. First, it gives managers leverage to strategically adopt a dispatching rule from the inventory-system perspective. Second, the findings provide direction for attenuating the adverse impact of demand amplification (the bullwhip effect) on inventory levels by appropriately adopting a family-based dispatching rule. Third, the information environment is conceptualized under the paradigm of Koestler's holonic theory.
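The abstract compares family-based dispatching rules without naming them here, so the snippet below is only a hedged illustration of the family-based idea: prefer jobs from the family the machine is already set up for, and switch families only when that queue runs dry. The rule, data layout, and names are assumptions, not the rules evaluated in the study.

```python
def family_based_dispatch(queue, current_family):
    """Hedged illustration of a family-based dispatching decision (not the
    specific rules compared in the study): keep serving the product family
    the machine is currently set up for, to avoid an extra family setup,
    and switch only when that family's queue is empty -- then move to the
    family with the most waiting jobs.  `queue` maps family -> list of
    (arrival_time, job_id) tuples."""
    if queue.get(current_family):
        # Serve the earliest-arrived job of the current family (no new setup).
        return current_family, min(queue[current_family])[1]
    non_empty = [(len(jobs), fam) for fam, jobs in queue.items() if jobs]
    if not non_empty:
        return None, None                   # nothing waiting anywhere
    _, fam = max(non_empty)                 # the largest queue wins the setup
    return fam, min(queue[fam])[1]

# Hypothetical state: machine currently set up for family "F2".
queue = {"F1": [(2.0, "J3")], "F2": [(0.5, "J1"), (1.0, "J2")]}
print(family_based_dispatch(queue, "F2"))   # ('F2', 'J1'): stay on F2, no setup
```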
Abstract:
A job shop with one batch-processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch-processing machine can process a batch of jobs as long as the machine capacity is not violated, and the batch processing time is equal to that of the longest job in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, it would reduce to the classical job shop scheduling problem (i.e., Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are the mathematical formulation, a new network representation, and several solution approaches. The problem is observed widely in metalworking and other industries but has received limited or no attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, and a mathematical formulation, are proposed to minimize the makespan. In addition, several algorithms, including batch-forming heuristics, dispatching rules, a Modified Shifting Bottleneck procedure, Tabu Search (TS), and Simulated Annealing (SA), were developed and implemented. An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (i.e., CPLEX). TS and SA, combined with MWKR-FF as the initial solution, gave the best solutions among all the heuristics proposed. Their results were close to those of CPLEX, and for some larger instances with more than 225 total operations they were competitive in terms of solution quality and runtime. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours. Between TS and SA, the experimental study indicated that SA produced a better average Cmax across all instances. The proposed solution approaches will help practitioners schedule a job shop (with both discrete and batch-processing machines) more efficiently; they are easy to implement and require short run times to solve large problem instances.
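As one concrete illustration of the batching constraint described above (capacity-limited batches whose processing time equals the longest job), here is a hedged first-fit batch-forming sketch; it is one plausible reading of the "FF" paired with MWKR in the abstract, not the authors' implementation.

```python
def form_batches_first_fit(jobs, capacity):
    """Illustrative first-fit batch forming for the batch-processing machine.
    Each job is a dict with 'id', 'size' and 'proc'.  A job goes into the
    first open batch with enough remaining capacity; otherwise a new batch
    is opened.  The processing time of a batch equals that of its longest
    job, as stated in the abstract."""
    batches = []                                         # each: {'jobs': [...], 'load': total size}
    for job in sorted(jobs, key=lambda j: -j["proc"]):   # place longest jobs first
        for b in batches:
            if b["load"] + job["size"] <= capacity:
                b["jobs"].append(job)
                b["load"] += job["size"]
                break
        else:
            batches.append({"jobs": [job], "load": job["size"]})
    return [([j["id"] for j in b["jobs"]], max(j["proc"] for j in b["jobs"]))
            for b in batches]

# Hypothetical instance: batch machine capacity 10.
jobs = [{"id": 1, "size": 6, "proc": 8}, {"id": 2, "size": 4, "proc": 3},
        {"id": 3, "size": 5, "proc": 5}]
print(form_batches_first_fit(jobs, 10))   # [([1, 2], 8), ([3], 5)]
```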
Abstract:
This research studies the hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted as FF: batch1,sj:Cmax and is formulated as a mixed-integer linear program. The commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for small problems; for example, a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational time encountered with the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic: it decomposes the entire problem into three sub-problems and schedules them one by one. The heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. A designed experiment is conducted to evaluate the effectiveness of the proposed BFD, which is further compared against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all of these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study; the bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average within negligible time when the problem size is less than 50 jobs.
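For the parallel-batching (single-stage) sub-problem mentioned above, a simple illustrative heuristic might look as follows; it respects job arrival times, machine capacity, and the longest-job batching rule, but it is only a sketch under assumed data structures, not the algorithm proposed in the study.

```python
def schedule_parallel_batch_machines(jobs, capacity, n_machines):
    """Illustrative heuristic for a single-stage parallel-batching problem:
    sort jobs by arrival time, fill batches up to the machine capacity, and
    start each batch on the earliest-available machine once all of its jobs
    have arrived.  Each job is a tuple (job_id, arrival, size, proc)."""
    jobs = sorted(jobs, key=lambda j: j[1])            # earliest arrival first
    machine_free = [0.0] * n_machines
    schedule, batch, load = [], [], 0.0

    def dispatch(batch):
        m = min(range(n_machines), key=lambda i: machine_free[i])
        start = max(machine_free[m], max(j[1] for j in batch))
        finish = start + max(j[3] for j in batch)      # batch time = longest job
        machine_free[m] = finish
        schedule.append((m, [j[0] for j in batch], start, finish))

    for job in jobs:
        if batch and load + job[2] > capacity:         # batch is full: release it
            dispatch(batch)
            batch, load = [], 0.0
        batch.append(job)
        load += job[2]
    if batch:
        dispatch(batch)
    makespan = max(f for _, _, _, f in schedule)
    return schedule, makespan

# Hypothetical instance: 2 batch machines of capacity 8.
jobs = [("J1", 0, 5, 4), ("J2", 0, 3, 6), ("J3", 1, 6, 2), ("J4", 2, 2, 5)]
print(schedule_parallel_batch_machines(jobs, capacity=8, n_machines=2))
```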
Abstract:
In this paper we address the real-time capabilities of P-NET, which is a multi-master fieldbus standard based on a virtual token passing scheme. We show how P-NET’s medium access control (MAC) protocol is able to guarantee a bounded access time to message requests. We then propose a model for implementing fixed prioritybased dispatching mechanisms at each master’s application level. In this way, we diminish the impact of the first-come-first-served (FCFS) policy that P-NET uses at the data link layer. The proposed model rises several issues well known within the real-time systems community: message release jitter; pre-run-time schedulability analysis in non pre-emptive contexts; non-independence of tasks at the application level. We identify these issues in the proposed model and show how results available for priority-based task dispatching can be adapted to encompass priority-based message dispatching in P-NET networks.
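A minimal sketch of the application-level idea, assuming a simple priority-queue front end placed before the FCFS data-link queue (the paper's actual model, timing analysis, and P-NET protocol details are not reproduced here):

```python
import heapq

class PriorityMessageDispatcher:
    """Hedged illustration: the application layer keeps its own priority
    queue of outgoing messages and hands only the highest-priority one to
    the FCFS data-link queue at each dispatch opportunity, so the effective
    service order follows message priority rather than arrival order."""
    def __init__(self):
        self._heap = []      # entries: (priority, arrival_order, message)
        self._order = 0      # tie-breaker that preserves FIFO within a priority

    def submit(self, priority, message):
        heapq.heappush(self._heap, (priority, self._order, message))
        self._order += 1

    def on_dispatch_opportunity(self):
        """Called when this master may queue a request at the data link
        layer (e.g., when the virtual token arrives): release the pending
        message with the lowest priority number, if any."""
        if self._heap:
            return heapq.heappop(self._heap)[2]
        return None

# Hypothetical usage: an alarm overtakes an earlier, lower-priority message.
d = PriorityMessageDispatcher()
d.submit(3, "periodic status")
d.submit(1, "alarm")
print(d.on_dispatch_opportunity())   # "alarm" goes out first despite arriving later
```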
Abstract:
We study the assignment of indivisible objects with quotas (houses, jobs, or offices) to a set of agents (students, job applicants, or professors). Each agent receives at most one object and monetary compensations are not possible. We characterize efficient priority rules by efficiency, strategy-proofness, and renegotiation-proofness. Such a rule respects an acyclical priority structure and the allocations can be determined using the deferred acceptance algorithm.
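For readers unfamiliar with the mechanism, the following is a compact, illustrative implementation of agent-proposing deferred acceptance for objects with quotas; the agent and object names are hypothetical, and the acyclicity condition characterized in the paper is not checked here.

```python
def deferred_acceptance(preferences, priorities, quotas):
    """Agent-proposing deferred acceptance for indivisible objects with quotas.
    preferences: agent -> ordered list of acceptable objects
    priorities:  object -> ordered list of all agents (best first)
    quotas:      object -> capacity
    Returns agent -> assigned object (unassigned agents are omitted)."""
    rank = {o: {a: i for i, a in enumerate(priorities[o])} for o in priorities}
    next_choice = {a: 0 for a in preferences}
    held = {o: [] for o in priorities}          # tentatively accepted agents
    free = list(preferences)
    while free:
        a = free.pop()
        if next_choice[a] >= len(preferences[a]):
            continue                            # agent has exhausted their list
        o = preferences[a][next_choice[a]]
        next_choice[a] += 1
        held[o].append(a)
        held[o].sort(key=lambda x: rank[o][x])  # keep the highest-priority agents
        if len(held[o]) > quotas[o]:
            free.append(held[o].pop())          # lowest-priority proposer is rejected
    return {a: o for o, agents in held.items() for a in agents}

# Hypothetical agents and offices.
prefs = {"ann": ["office1", "office2"], "bob": ["office1"], "eve": ["office1", "office2"]}
prios = {"office1": ["bob", "ann", "eve"], "office2": ["eve", "ann", "bob"]}
print(deferred_acceptance(prefs, prios, {"office1": 1, "office2": 1}))
# -> {'bob': 'office1', 'eve': 'office2'} (ann ends up unassigned)
```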
Abstract:
One of the assumptions of the Capacitated Facility Location Problem (CFLP) is that demand is known and fixed. Most often, this is not the case when managers take some strategic decisions such as locating facilities and assigning demand points to those facilities. In this paper we consider demand as stochastic and we model each of the facilities as an independent queue. Stochastic models of manufacturing systems and deterministic location models are put together in order to obtain a formula for the backlogging probability at a potential facility location. Several solution techniques have been proposed to solve the CFLP. One of the most recently proposed heuristics, a Reactive Greedy Adaptive Search Procedure, is implemented in order to solve the model formulated. We present some computational experiments in order to evaluate the heuristic's performance and to illustrate the use of this new formulation for the CFLP. The paper finishes with a simple simulation exercise.
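The abstract does not state which queueing model is used for each facility, so the following assumes an M/M/1 queue purely to illustrate a backlogging-probability formula of the kind described:

```python
def backlog_probability_mm1(arrival_rate, service_rate, threshold):
    """Steady-state probability that more than `threshold` orders are in
    the system at a facility modeled as an M/M/1 queue (an assumption made
    here for illustration): P(N > K) = rho**(K + 1), rho = lambda / mu < 1."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        return 1.0                      # unstable queue: the backlog grows without bound
    return rho ** (threshold + 1)

# Hypothetical facility: 8 orders/day of demand against 10 orders/day of capacity.
print(backlog_probability_mm1(8.0, 10.0, threshold=5))   # 0.8**6 ≈ 0.262
```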
Abstract:
We study the assignment of indivisible objects with quotas (houses, jobs, or offices) to a set of agents (students, job applicants, or professors). Each agent receives at most one object and monetary compensations are not possible. We characterize efficient priority rules by efficiency, strategy-proofness, and reallocation-consistency. Such a rule respects an acyclical priority structure and the allocations can be determined using the deferred acceptance algorithm.
Abstract:
Various software packages for project management include a procedure for resource-constrained scheduling. In several packages, the user can influence this procedure by selecting a priority rule. However, the resource-allocation methods implemented in these procedures are proprietary, which raises the question of how the choice of priority rule affects their performance. We experimentally evaluate the resource-allocation methods of eight recent software packages using the 600 instances of the PSPLIB J120 test set. The results of our analysis indicate that applying the default rule tends to outperform a randomly selected rule, whereas applying two randomly selected rules tends to outperform the default rule. Applying a small set of more than two rules further improves the project durations considerably. However, a large number of rules must be applied to obtain the best possible project durations.
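Since the packages' allocation methods are proprietary, the sketch below only illustrates the general mechanism being evaluated: a serial schedule-generation scheme with a pluggable priority rule, run once per rule so that the best resulting project duration can be kept. The activity data, rule names, and single-resource simplification are assumptions.

```python
def serial_sgs(durations, demands, capacity, predecessors, priority_key):
    """Serial schedule-generation scheme with a pluggable priority rule
    (illustrative only).  One renewable resource; activities are scheduled
    one at a time, in priority order, at the earliest precedence- and
    resource-feasible start time."""
    start, finish, usage, scheduled = {}, {}, {}, set()
    while len(scheduled) < len(durations):
        eligible = [a for a in durations if a not in scheduled
                    and all(p in scheduled for p in predecessors.get(a, []))]
        a = min(eligible, key=priority_key)            # the priority rule decides
        t = max((finish[p] for p in predecessors.get(a, [])), default=0)
        while any(usage.get(tau, 0) + demands[a] > capacity
                  for tau in range(t, t + durations[a])):
            t += 1                                     # shift right until the resource fits
        for tau in range(t, t + durations[a]):
            usage[tau] = usage.get(tau, 0) + demands[a]
        start[a], finish[a] = t, t + durations[a]
        scheduled.add(a)
    return start, max(finish.values(), default=0)

# Hypothetical 4-activity project with resource capacity 4, tried with two rules.
durations = {"A": 3, "B": 2, "C": 2, "D": 4}
demands   = {"A": 2, "B": 3, "C": 2, "D": 1}
preds     = {"C": ["A"], "D": ["B"]}
rules = {"SPT": lambda a: durations[a],                # shortest processing time first
         "MaxDemand": lambda a: -demands[a]}           # hypothetical alternative rule
best = min((serial_sgs(durations, demands, 4, preds, rule)[1], name)
           for name, rule in rules.items())
print(best)   # (makespan, rule) of the best rule tried, as the study suggests doing
```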
Abstract:
Despite an extensive market segmentation literature, applied academic studies which bridge segmentation theory and practice remain a priority for researchers. The need for studies which examine the segmentation implementation barriers faced by organisations is particularly acute. We explore segmentation implementation through the eyes of a European utilities business, by following its progress through a major segmentation project. The study reveals the character and impact of implementation barriers occurring at different stages in the segmentation process. By classifying the barriers, we develop implementation "rules" for practitioners which are designed to minimise their occurrence and impact. We further contribute to the literature by developing a deeper understanding of the mechanisms through which these implementation rules can be applied.
Abstract:
Enforcement of copyright online and fighting online “piracy” is a high priority on the EU agenda. Private international law questions have recently become some of the most challenging issues in this area. Internet service providers are still uncertain how the Brussels I Regulation (Recast) provisions would apply in EU-wide copyright infringement cases and in which country they can be sued for copyright violations. Meanwhile, because of the territorial approach that still underlies EU copyright law, right holders are unable to acquire EU-wide relief for copyright infringements online. This article first discusses the recent CJEU rulings in the Pinckney and Hejduk cases and argues that the “access approach” that the Court adopted for solving jurisdiction questions could be quite reasonable if it is applied with additional legal measures at the level of substantive law, such as the targeting doctrine. Secondly, the article explores the alternatives to the currently established lex loci protectionis rule that would enable right holders to get EU-wide remedies under a single applicable law. In particular, the analysis focuses on the special applicable law rule for ubiquitous copyright infringements, as suggested by the CLIP Group, and other international proposals.
Abstract:
We consider collective decision problems given by a profile of single-peaked preferences defined over the real line and a set of pure public facilities to be located on the line. In this context, Bochet and Gordon (2012) provide a large class of priority rules based on efficiency, object-population monotonicity and sovereignty. Each such rule is described by a fixed priority ordering among interest groups. We show that any priority rule which treats agents symmetrically (anonymity), respects some form of coherence across collective decision problems (reinforcement), and only depends on peak information (peak-only) is a weighted majoritarian rule. Each such rule defines priorities based on the relative size of the interest groups and specific weights attached to locations. We give an explicit account of the richness of this class of rules.
Abstract:
The Internet revolution and the digital environment have spurred a significant amount of innovative activity with spillover effects on many sectors of the economy. For a growing group of countries, both developed and developing, digital goods and services have become an important engine of economic growth and a clear priority in their future-oriented economic strategies. Neither the rapid technological developments associated with digitization nor their increased societal significance have so far been reflected in international economic law in a comprehensive manner. The law of the World Trade Organization (WTO), in particular, has not reacted in any proactive manner. A pertinent question is whether the WTO rules are still useful and able to accommodate the new digital economy, or whether they have been rendered outdated and incapable of dealing with this important development. The present think-piece seeks answers to these questions and maps the key issues and challenges the WTO faces. Appraising the current state of affairs, developments in venues other than the WTO, and proposals tabled by stakeholders, it makes some recommendations for the way forward.