950 results for Management | Industrial engineering | Operations research


Relevance: 100.00%

Abstract:

Many organizations currently face inventory management problems such as distributing inventory on time and maintaining the correct inventory levels to satisfy the customer or end user. Organizations understand the need to maintain accurate inventory levels but sometimes fall short, leading to a wide performance gap in inventory accuracy. Inventory inaccuracy can consume much of the investment in purchasing inventory and often leads to excess inventory. The research objective of this thesis is to provide decision-making criteria that help management close or maintain a warehouse based on basic purchasing and holding cost information. The specific objectives provide information regarding the impact of inventory carrying cost, obsolete inventory, and inventory turns. The methodology section explains the carrying cost ratio, which helps inventory managers adopt best practices to avoid obsolete inventory and reduce excess inventory levels. The research model provides decision-making criteria based on the performance metric developed. The model and performance metric were validated by analysis of warehouse data, and the results indicated a shift from a two-echelon inventory supply chain to a one-echelon, Just-in-Time (JIT) based inventory supply chain. The recommendations from the case study were used by a health care organization to reorganize its supply chain, resulting in a reduction of excess inventory.
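
The abstract does not give the exact form of the carrying cost ratio; the Python sketch below shows one plausible way such a metric and decision rule could be computed, assuming the ratio is annual holding cost divided by annual purchase spend and that a hypothetical threshold triggers the close-or-maintain decision.

def carrying_cost_ratio(avg_inventory_value, holding_rate, annual_purchases):
    """Annual inventory holding cost relative to annual purchase spend.
    avg_inventory_value: average value of inventory held ($)
    holding_rate: annual carrying cost rate (e.g., 0.25 = 25% of value per year)
    annual_purchases: annual purchasing spend ($)"""
    return (avg_inventory_value * holding_rate) / annual_purchases

def warehouse_decision(ratio, threshold=0.20):
    """Hypothetical decision rule: shift toward a one-echelon / JIT supply chain
    when carrying cost is too large a share of purchase spend."""
    return "close warehouse (shift to JIT)" if ratio > threshold else "maintain warehouse"

# Illustrative numbers: $500k average inventory, 25% carrying rate, $4M annual purchases
r = carrying_cost_ratio(500_000, 0.25, 4_000_000)
print(round(r, 3), warehouse_decision(r))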

Relevance: 100.00%

Abstract:

This research is based on the premise that teams can be designed to optimize their performance, and that appropriate team coordination is a significant factor in team outcome performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job's structural characteristics. The TCM can be used to determine the team design characteristics that most likely lead the team to optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. These team member agents use decision making and explicit and implicit mechanisms to coordinate the job. The model validation included comparison of the TCM's results with statistics from a real team and with results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The results of the ANOVA were used to recommend the combination of levels of the experimental factors that optimizes completion time for a team that runs sailboat races. The main contribution of this research to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models. In a stochastic job structure, the tasks required to complete the job change during the team's execution of the job. This research proposes three new types of dependencies between tasks required to model a job as a stochastic structure: conditional sequential, single-conditional sequential, and merge dependencies.
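
The abstract cites a 2^(6-1) fractional factorial experiment; the Python sketch below generates such a half-fraction design, assuming the common defining relation I = ABCDEF (the thesis does not state which relation or factor names were actually used).

import itertools

# Five base factors at two levels (-1, +1); the sixth is aliased as F = A*B*C*D*E,
# giving a 2^(6-1) half-fraction (assumed defining relation I = ABCDEF).
base_factors = ["A", "B", "C", "D", "E"]
runs = []
for levels in itertools.product([-1, 1], repeat=len(base_factors)):
    f = 1
    for v in levels:
        f *= v
    runs.append(dict(zip(base_factors, levels), F=f))

print(len(runs))   # 32 runs instead of the 64 of the full 2^6 design
print(runs[0])     # e.g. {'A': -1, 'B': -1, 'C': -1, 'D': -1, 'E': -1, 'F': -1}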

Relevance: 100.00%

Abstract:

INVESTIGATION INTO CURRENT EFFICIENCY FOR PULSE ELECTROCHEMICAL MACHINING OF NICKEL ALLOY Yu Zhang, M.S. University of Nebraska, 2010 Adviser: Kamlakar P. Rajurkar. Electrochemical machining (ECM) is a nontraditional manufacturing process that can machine difficult-to-cut materials. In ECM, material is removed by controlled electrochemical dissolution of an anodic workpiece in an electrochemical cell. ECM has extensive applications in the automotive, petroleum, aerospace, textile, medical, and electronics industries. Improving current efficiency is a challenging task for any electro-physical or electrochemical machining process. Current efficiency is defined as the ratio of the observed amount of metal dissolved to the theoretical amount predicted from Faraday's law, for the same specified conditions of electrochemical equivalent, current, etc. [1]. In macro ECM, electrolyte conductivity greatly influences the current efficiency of the process. Since there is a limit to how far the conductivity of the electrolyte can be enhanced, a process innovation is needed for further improvement in current efficiency in ECM. Pulse electrochemical machining (PECM) is one such approach, in which the electrolyte conductivity is improved by electrolyte flushing during the pulse off-time. The aim of this research is to study the influence of major factors on current efficiency in a macro-scale pulse electrochemical machining process and to develop a linear regression model for predicting the current efficiency of the process. An in-house designed electrochemical cell was used for machining nickel alloy (ASTM B435) by PECM. The effects of current density, type of electrolyte, and electrolyte flow rate on current efficiency under different experimental conditions were studied. Results indicated that current efficiency is dependent on the electrolyte, the electrolyte flow rate, and the current density. Linear regression models of current efficiency were compared with twenty new data points graphically and quantitatively. The models developed were close enough to the actual results to be reliable. In addition, an attempt has been made in this work to consider factors in PECM that have not been investigated in earlier works. This was done by simulating the process using COMSOL software. However, the results from this attempt were not substantially different from the earlier reported studies.
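
Current efficiency as defined above has a standard form; a minimal Python sketch is given below, assuming dissolution of a single metal under Faraday's law (the atomic mass and valence shown are illustrative values for nickel, not the exact electrochemical equivalent used for the ASTM B435 alloy).

FARADAY = 96485.0  # Faraday constant, C/mol

def theoretical_mass_removed(current_a, time_s, atomic_mass_g_mol, valence):
    """Mass predicted by Faraday's law: m = I*t*M / (z*F), in grams."""
    return current_a * time_s * atomic_mass_g_mol / (valence * FARADAY)

def current_efficiency(observed_mass_g, current_a, time_s, atomic_mass_g_mol, valence):
    """Ratio of observed to theoretically predicted dissolved mass (often reported in %)."""
    return observed_mass_g / theoretical_mass_removed(current_a, time_s, atomic_mass_g_mol, valence)

# Illustrative numbers only: 50 A applied for 60 s, nickel (M ~ 58.69 g/mol, z = 2)
eff = current_efficiency(observed_mass_g=0.80, current_a=50, time_s=60,
                         atomic_mass_g_mol=58.69, valence=2)
print(f"current efficiency ~ {eff:.0%}")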

Relevance: 100.00%

Abstract:

Real Options Analysis (ROA) has become a complementary tool for engineering economics. It has become popular due to the limitations of conventional engineering valuation methods, specifically their assumptions about uncertainty. Industry is seeking to quantify the value of engineering investments under uncertainty. One problem with conventional tools is that they may assume cash flows are certain, thereby minimizing the possibility of accounting for the uncertainty of future values. Real options analysis provides a solution to this problem, but has been used sparingly by practitioners. This paper seeks to provide a new model, referred to as the Beta Distribution Real Options Pricing Model (BDROP), which addresses these limitations and can be easily used by practitioners. The positive attributes of this new model include unconstrained market assumptions, robust representation of the underlying asset's uncertainty, and an uncomplicated methodology. This research demonstrates the use of the model to evaluate the use of automation for inventory control.
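
The abstract does not reproduce the BDROP formulation; the Python sketch below only illustrates the general idea of valuing a deferral option by Monte Carlo with a beta-distributed underlying asset value. The distribution parameters, discounting, and payoff are assumptions for illustration, not the paper's model.

import random

def option_value_beta_mc(invest_cost, v_min, v_max, alpha, beta_p,
                         discount_rate, years, n=100_000, seed=1):
    """Monte Carlo estimate of E[max(V - I, 0)], discounted to today.
    The future asset value V is drawn from a beta distribution scaled to
    [v_min, v_max]; all parameter values here are illustrative assumptions."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n):
        v = v_min + (v_max - v_min) * rng.betavariate(alpha, beta_p)
        payoff_sum += max(v - invest_cost, 0.0)
    return (payoff_sum / n) / (1 + discount_rate) ** years

# Example: option to automate inventory control, exercisable in 2 years
print(round(option_value_beta_mc(invest_cost=1.0e6, v_min=0.5e6, v_max=2.0e6,
                                 alpha=2.0, beta_p=3.0,
                                 discount_rate=0.08, years=2), 2))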

Relevance: 100.00%

Abstract:

Factor analysis was used to develop a more detailed description of the human hand to be used in the creation of glove sizes; currently, glove sizes are small, medium, and large. The created glove sizes give glove designers the ability to create designs that fit the majority of hand variations in both the male and female populations. The research used the American National Survey (ANSUR) data collected in 1988. These data contain eighty-six length, width, height, and circumference measurements of the human hand for one thousand male subjects and thirteen hundred female subjects. Eliminating redundant measurements reduced the data to forty-six essential measurements. Factor analysis grouped the variables into three factors. The factors were used to generate hand sizes by using percentiles along each factor axis. Two different sizing systems were created. The first system contains 125 sizes each for males and females. The second system contains 7 sizes for males and 14 sizes for females. The sizing systems were compared to another hand sizing system created from the ANSUR database, and the comparison indicated that the systems created using factor analysis provide a better fit.
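
A minimal Python sketch of the general approach described above (factor analysis followed by percentile cuts along each factor axis) is shown below; the random data, the 46-column placeholder, and the choice of four cuts per axis (giving 5^3 = 125 size cells, matching the first sizing system) stand in for the actual ANSUR measurements and design decisions.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 46))          # placeholder for 46 hand measurements per subject

fa = FactorAnalysis(n_components=3)      # three factors, as in the abstract
scores = fa.fit_transform(X)             # each subject's position on the 3 factor axes

# Cut each factor axis at percentiles to define size cells (4 cuts -> 5 bins -> 5^3 = 125 sizes)
cuts = [np.percentile(scores[:, j], [20, 40, 60, 80]) for j in range(3)]
size_index = np.stack([np.digitize(scores[:, j], cuts[j]) for j in range(3)], axis=1)

print(size_index[:5])                    # each row is a subject's (i, j, k) size cell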

Relevance: 100.00%

Abstract:

Offset printing is a common method to produce large amounts of printed matter. We consider a real-world offset printing process that is used to imprint customer-specific designs on napkin pouches. The printing technology used yields a number of specific constraints. The planning problem consists of allocating designs to printing-plate slots such that the given customer demand for each design is fulfilled, all technological and organizational constraints are met, and the total overproduction and setup costs are minimized. We formulate this planning problem as a mixed-binary linear program, and we develop a multi-pass matching-based savings heuristic. We report computational results for a set of problem instances devised from real-world data.
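
The abstract does not reproduce the mixed-binary formulation; the PuLP sketch below only conveys the flavor of such a model (assign whole plate slots to designs, cover each design's demand, minimize total overproduction). The single fixed run length and the absence of setup costs are simplifying assumptions of mine, not the paper's constraints.

import pulp  # pip install pulp

demand = {"A": 9000, "B": 6000, "C": 2000}   # impressions needed per design (placeholder)
slots = 8                                     # printing-plate slots available (placeholder)
run_length = 2500                             # impressions in the print run (assumed fixed)

m = pulp.LpProblem("plate_slot_allocation", pulp.LpMinimize)
x = {d: pulp.LpVariable(f"slots_{d}", lowBound=0, cat="Integer") for d in demand}

m += pulp.lpSum(x[d] * run_length - demand[d] for d in demand)   # total overproduction
for d in demand:
    m += x[d] * run_length >= demand[d]                          # fulfil each design's demand
m += pulp.lpSum(x.values()) <= slots                             # respect the plate's slot count

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({d: int(x[d].value()) for d in demand}, "overproduction:", pulp.value(m.objective))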

Relevance: 100.00%

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete-event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied successfully to services. Bottlenecks in services must be defined as those processes for which the process rates and the amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine the degree to which a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, versus the local operational optimum of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of current expenses for misconnecting passengers, with only a modest increase in worker utilization achieved through a more efficient heuristic for deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to managing service factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
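
The abstract introduces a bottleneck ratio without giving its formula; the Python sketch below shows one plausible form, assuming the ratio compares a process's remaining work with what its current rate can complete in the time remaining (a value above 1 would mark the process as a constraint).

def bottleneck_ratio(work_remaining, process_rate, time_remaining):
    """Assumed form: remaining work divided by the work the process can finish
    at its current rate before the deadline; > 1 means the process cannot
    finish in time without a rate increase, i.e. it is a constraint."""
    capacity_left = process_rate * time_remaining
    return float("inf") if capacity_left == 0 else work_remaining / capacity_left

# Turnaround processes for one aircraft, 25 minutes to scheduled departure (illustrative)
processes = {
    "bag_unload":  (400, 20.0, 25),   # (units remaining, units per minute, minutes left)
    "cabin_clean": (30, 1.0, 25),
    "fueling":     (5000, 250.0, 25),
}
ratios = {p: bottleneck_ratio(*v) for p, v in processes.items()}
print(ratios, "-> constraint:", max(ratios, key=ratios.get))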

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes, as well as job circulation due to random part failures. In addition, appropriate correction terms obtained via regression analysis were added to the approximations in order to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems, with multiple product classes, job circulation due to failures, and fork-join structures to model parallel processing, were studied. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors are used: on average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case and from 26.39% to 7.23% in the general case. All the equations stated in the analytical formulations were implemented as a set of Matlab scripts. Using this set, operations managers of web server assembly lines, manufacturing systems, or other service systems with similar characteristics can estimate different system performance measures and make judicious decisions, especially about setting delivery due dates, capacity planning, and bottleneck mitigation.
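
The thesis's own approximation formulas are not reproduced in the abstract; the Python sketch below shows a standard building block for this kind of queueing-network decomposition, a Sakasegawa/Kingman-style G/G/m flow time approximation for a single station, with illustrative parameter values.

import math

def station_flow_time_ggm(arrival_rate, service_time, cv_arrival, cv_service, servers):
    """Approximate flow time (waiting + processing) at a G/G/m station.
    Uses the Sakasegawa-style approximation: Wq ~ ((ca^2 + cs^2)/2) *
    rho^(sqrt(2(m+1)) - 1) / (m (1 - rho)) * ts. Not the thesis's own formulas."""
    rho = arrival_rate * service_time / servers          # station utilization
    if rho >= 1:
        raise ValueError("station is overloaded (rho >= 1)")
    wait = ((cv_arrival**2 + cv_service**2) / 2) \
           * (rho ** (math.sqrt(2 * (servers + 1)) - 1) / (servers * (1 - rho))) \
           * service_time
    return wait + service_time

# Example: 0.9 jobs/hour arriving, 1-hour mean service, exponential variability, 1 server
print(round(station_flow_time_ggm(0.9, 1.0, 1.0, 1.0, 1), 2), "hours")  # ~10 hours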

Relevance: 100.00%

Abstract:

A job shop with one batch processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch processing machine can process a batch of jobs as long as the machine capacity is not violated. The batch processing time is equal to the longest processing time among the jobs in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, the scheduling problem reduces to the classical job shop scheduling problem (i.e., Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are the mathematical formulation, a new network representation, and several solution approaches. The problem under study is observed widely in metalworking and other industries, but has received limited or no attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, and a mathematical formulation to minimize the makespan, are proposed. In addition, several algorithms, including batch-forming heuristics, dispatching rules, a Modified Shifting Bottleneck heuristic, Tabu Search (TS), and Simulated Annealing (SA), were developed and implemented. An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (CPLEX). TS and SA, with MWKR-FF as the initial solution, gave the best solutions among all the heuristics proposed. Their results were close to those of CPLEX, and for some larger instances with more than 225 total operations they were competitive in terms of solution quality and runtime. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours. Between TS and SA, the experimental study indicated that SA produced a better average Cmax across all instances. The solution approaches proposed will help practitioners schedule a job shop (with both discrete and batch processing machines) more efficiently. They are easy to implement and require short run times to solve large problem instances.
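
As a companion to the problem definition above, the Python sketch below shows a simple first-fit batch-forming step consistent with it (batch size limited by machine capacity, batch processing time equal to the longest job in the batch). The longest-job-first ordering and the data are illustrative; this is not the thesis's MWKR-FF heuristic.

def form_batches(jobs, capacity):
    """First-fit batching. jobs is a list of (name, size, proc_time) tuples.
    A batch's total size may not exceed capacity; its processing time is the
    maximum proc_time of the jobs it contains."""
    batches = []
    for name, size, time in sorted(jobs, key=lambda j: -j[2]):   # longest jobs first
        for b in batches:
            if b["size"] + size <= capacity:
                b["jobs"].append(name)
                b["size"] += size
                b["time"] = max(b["time"], time)
                break
        else:
            batches.append({"jobs": [name], "size": size, "time": time})
    return batches

jobs = [("J1", 4, 10), ("J2", 3, 7), ("J3", 5, 9), ("J4", 2, 3)]
for b in form_batches(jobs, capacity=8):
    print(b)    # two batches of size 7, with processing times 10 and 9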

Relevance: 100.00%

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operations model used in an MTO setting, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model this problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model on small instances in order to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of this study focuses on developing an effective solution approach to the large-scale version of this problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation as the tool, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most profit rule performs best. The shifting bottleneck and earliest operation finish time rules are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio. The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show that it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for solving the problem at industrial scale.
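
The lot-sizing heuristics above build on the Wagner-Whitin algorithm; a minimal Python sketch of the classic dynamic program (single item, uncapacitated) follows, with demand, setup, and holding cost values chosen only for illustration.

def wagner_whitin(demand, setup_cost, holding_cost):
    """Classic Wagner-Whitin DP: an order placed in period t covers demand for
    periods t..j-1; returns (minimum total cost, ordering plan)."""
    T = len(demand)
    best = [float("inf")] * (T + 1)    # best[j] = min cost to cover the first j periods
    best[0] = 0.0
    choice = [0] * (T + 1)
    for j in range(1, T + 1):
        for t in range(j):             # last order is placed in period t
            cost = best[t] + setup_cost
            for k in range(t, j):      # holding cost for carrying demand[k] from t to k
                cost += holding_cost * (k - t) * demand[k]
            if cost < best[j]:
                best[j], choice[j] = cost, t
    plan, j = [], T                    # recover the ordering plan
    while j > 0:
        t = choice[j]
        plan.append((t, list(range(t, j))))
        j = t
    return best[T], plan[::-1]

cost, plan = wagner_whitin(demand=[20, 50, 10, 50, 50], setup_cost=100, holding_cost=1)
print(cost, plan)   # 320.0, orders placed in periods 0 and 3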

Relevance: 100.00%

Abstract:

ENEGI 2013: Proceedings of the 2nd Encontro Nacional de Engenharia e Gestão Industrial (National Meeting on Industrial Engineering and Management), Universidade de Aveiro, 17 and 18 May 2013, Aveiro, Portugal.