962 results for 010206 Operations Research
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting a single item (the m-th) after every m produced items and deciding, at each inspection, whether the fraction of conforming items has decreased. If the inspected item is non-conforming, production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, we develop a probabilistic model in which the examined item is classified repeatedly until either a conforming or b non-conforming classifications are observed; whichever event occurs first determines the final classification of the examined item. Properties of an ergodic Markov chain were used to obtain an expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m); the number of repeated conforming classifications (a); and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first is a simple preventive policy: the production system is adjusted after every n produced items and no inspection is performed. The second classifies the examined item repeatedly a fixed number of times, r, and declares it conforming if the majority of the classifications are conforming. Results indicate that the current proposal performs better than this fixed-r majority-vote procedure. On the other hand, depending on the magnitude of the errors and the costs, the preventive policy can be, on average, the most economical alternative, outperforming the policies that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
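As a concrete illustration of the repeated-classification rule described above, the sketch below computes the probability that an item receives a final "conforming" verdict, given the probability p that a single classification reads "conforming" (p embeds the diagnosis errors). This is a minimal sketch, not the paper's Markov-chain cost model; the values of p, a and b are assumptions for illustration.

```python
from functools import lru_cache

def prob_final_conforming(p: float, a: int, b: int) -> float:
    """Probability that the repeated-classification rule ends with a
    'conforming' verdict, when each single classification reads
    'conforming' with probability p (p embeds the diagnosis errors)."""
    @lru_cache(maxsize=None)
    def f(i: int, j: int) -> float:
        if i == a:      # a conforming classifications reached first
            return 1.0
        if j == b:      # b non-conforming classifications reached first
            return 0.0
        return p * f(i + 1, j) + (1.0 - p) * f(i, j + 1)
    return f(0, 0)

# Example: a truly conforming item inspected with error probability
# alpha = 0.05 (chance a single classification reads 'non-conforming'):
alpha = 0.05
print(prob_final_conforming(1.0 - alpha, a=2, b=2))
```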
Abstract:
The procedure for online process control by attributes consists of inspecting a single item after every m produced items. On the basis of the inspection result, it is decided whether the process is in control (the conforming fraction is stable) or out of control (the conforming fraction has decreased, for example). Most articles on online process control assume that production is stopped for adjustment when the inspected item is non-conforming (production then restarts in control; this is denominated here a corrective adjustment). Moreover, the articles related to this subject do not present semi-economical designs (purely economic designs may yield high quantities of non-conforming items), as they do not include a policy of preventive adjustments (in which case no item is inspected), which can be more economical, mainly if the inspected item can be misclassified. In this article, the choice between a preventive and a corrective adjustment of the process is made after every m produced items. If a preventive adjustment is decided upon, no item is inspected. Otherwise, the m-th item is inspected; if it conforms, production goes on, otherwise an adjustment takes place and the process restarts in control. This approach is economically feasible for some practical situations, and the parameters of the proposed procedure are determined by minimizing an average cost function subject to statistical restrictions (for example, to assure a minimal level, fixed in advance, of conforming items in the production process). Numerical examples illustrate the proposal.
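To make the constrained design step concrete, here is a toy grid search over the sampling interval m, minimizing an illustrative average-cost-per-item function subject to a minimum conforming-fraction restriction. Every quantity below (the geometric shift probability, the in/out-of-control conforming fractions, and the costs) is an assumption for illustration; the paper's actual cost function and restrictions differ.

```python
# Toy stand-in for the design step: choose the sampling interval m by grid
# search on an illustrative average-cost-per-item function, subject to a
# minimum conforming-fraction constraint fixed in advance.
pi_shift = 0.01              # assumed per-item probability of a shift
p_in, p_out = 0.99, 0.90     # assumed conforming fractions in/out of control
c_insp, c_adj, c_nc = 1.0, 50.0, 20.0  # inspection, adjustment, defect costs
min_conforming = 0.97        # statistical restriction fixed in advance

def cycle_metrics(m: int):
    # expected number of out-of-control items in an interval of m items
    exp_out = sum(1 - (1 - pi_shift) ** k for k in range(1, m + 1))
    frac_conf = ((m - exp_out) * p_in + exp_out * p_out) / m
    cost = (c_insp + c_adj * (1 - (1 - pi_shift) ** m)
            + c_nc * ((m - exp_out) * (1 - p_in) + exp_out * (1 - p_out))) / m
    return cost, frac_conf

best = min((m for m in range(1, 200)
            if cycle_metrics(m)[1] >= min_conforming),
           key=lambda m: cycle_metrics(m)[0])
print(best, cycle_metrics(best))
```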
Abstract:
In this paper, we consider a real-life heterogeneous fleet vehicle routing problem with time windows and split deliveries that arises in a major Brazilian retail group. A single depot serves 519 stores of the group, distributed across 11 Brazilian states. To find good solutions to this problem, we propose constructive heuristics for building initial solutions and a scatter search (SS) approach. The resulting solutions are then compared with the routes actually covered by the company. Our results show that the total distribution cost can be reduced significantly when such methods are used. Experimental testing with benchmark instances is used to assess the merit of our proposed procedure. (C) 2008 Published by Elsevier B.V.
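A bare-bones scatter search skeleton on a permutation encoding is sketched below: a pool of locally improved solutions seeds a reference set, pairs of reference solutions are combined, and improved children replace the worst member. The toy distance matrix, the 2-opt improvement, and the crossover-style combination are all illustrative assumptions; the paper's method additionally handles the heterogeneous fleet, time windows, and split deliveries.

```python
import random

random.seed(1)
N = 12                      # illustrative number of stores (plus depot 0)
D = [[abs(i - j) for j in range(N)] for i in range(N)]  # toy distance matrix

def cost(perm):             # tour length through all stores from/to depot 0
    tour = [0] + list(perm) + [0]
    return sum(D[tour[k]][tour[k + 1]] for k in range(len(tour) - 1))

def improve(perm):          # simple 2-opt-style local search
    perm = list(perm)
    better = True
    while better:
        better = False
        for i in range(len(perm) - 1):
            for j in range(i + 1, len(perm)):
                cand = perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]
                if cost(cand) < cost(perm):
                    perm, better = cand, True
    return perm

def combine(p1, p2):        # order-crossover-style combination of two parents
    head = p1[:len(p1) // 2]
    return head + [c for c in p2 if c not in head]

pool = [improve(random.sample(range(1, N), N - 1)) for _ in range(10)]
refset = sorted(pool, key=cost)[:5]         # reference set of elite solutions
for _ in range(20):                         # a few scatter search iterations
    a, b = random.sample(refset, 2)
    child = improve(combine(a, b))
    if cost(child) < cost(refset[-1]) and child not in refset:
        refset[-1] = child
        refset.sort(key=cost)
print(cost(refset[0]), refset[0])
```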
Abstract:
The study of Information Technology (IT) outsourcing is relevant because companies are outsourcing their activities more than ever. An important IT outsourcing research area is the decision-making process; in other words, understanding how companies decide about outsourcing their IT operations is relevant from a research point of view. The objective of this study is therefore to understand the decision-making process used by Brazilian companies when outsourcing their IT operations. An analysis of the literature on this subject showed that six aspects are usually considered by companies when evaluating IT outsourcing service alternatives. This research verified how these six aspects are considered by Brazilian companies in IT outsourcing decisions. The survey showed that Brazilian companies consider all six aspects, but each has a different level of importance. The research also grouped the aspects according to their level of importance and interdependency, using factor analysis to understand the logic behind the IT outsourcing decision process. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This paper analyzes the internationalization of new multinationals from emerging countries. It also focuses on Production's role in firm internationalization, a subject seldom addressed because the discipline of International Manufacturing is still embryonic, while International Business tends to overlook production. The authors integrate International Business and International Manufacturing concepts and frameworks in order to analyze new multinationals from emerging countries, using the empirical evidence of a survey plus case studies of Brazilian multinationals to understand late-movers' strategies and competences, with emphasis on production. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
A great deal of attention in the supply chain management literature is devoted to studying material and demand information flows and their coordination. But in many situations, supply chains may convey information of a different nature: they may be an important channel through which companies deliver knowledge, or, specifically, technical information to the market. This paper studies this technical flow and highlights its particular requirements. Drawing upon qualitative field research, it studies pharmaceutical companies, since those companies face a very specific challenge: consumers do not have discretion over their choices, as ethical drugs must be prescribed by physicians to be bought and used by final consumers. The technical information flow is rich, and must be redundant and delivered early, at multiple points. Thus, apart from the regular material channel through which products and order information flow, those companies build a specialized information channel, developed to communicate with those who need the information to create demand. The conclusions can be extended to supply chains where products and services are complex and decision makers must be clearly informed about technology-related information. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The paper proposes a methodology and design rules for organisational structures that increasingly need to reconfigure themselves rapidly to cope with unpredictable situations: new markets, new products, a changing production mix, problems in the production process or flows, etc. Such situations imply changing, and often conflicting, criteria for production goals and for the allocation of work. The methodology was developed on the basis of extensive field action research and consulting. Its core is the design of self-reconfigurable working groups, or groups with variable geometry, which adapt depending on the events they face. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Hub-and-spoke networks are widely studied in the area of location theory. They arise in several contexts, including passenger airlines, postal and parcel delivery, and computer and telecommunication networks. Hub location problems usually involve three simultaneous decisions: the optimal number of hub nodes, their locations, and the allocation of the non-hub nodes to the hubs. In the uncapacitated single allocation hub location problem (USAHLP), hub nodes have no capacity constraints and each non-hub node must be assigned to exactly one hub. In this paper, we propose three variants of a simple and efficient multi-start tabu search heuristic, as well as a two-stage integrated tabu search heuristic, to solve this problem. In the multi-start heuristics, several different initial solutions are constructed and then improved by tabu search, while in the two-stage integrated heuristic tabu search is applied to improve both the location and the allocation parts of the problem. Computational experiments using typical benchmark problems (the Civil Aeronautics Board (CAB) and Australian Post (AP) data sets) as well as new and modified instances show that our approaches consistently return the optimal or best-known results in very short CPU times, thus allowing larger instances of the USAHLP than those found in the literature to be solved efficiently. We also report the integer optimal solutions for all 80 CAB data set instances and the 12 AP instances with up to 100 nodes, as well as for the corresponding newly generated AP instances with reduced fixed costs. Published by Elsevier Ltd.
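The core of any USAHLP heuristic is evaluating a candidate solution: given the open hubs, allocate each node to its cheapest hub and sum the collection, transfer, and distribution flow costs plus fixed hub costs. The sketch below does exactly that on a toy instance and, because the instance is tiny, verifies the best hub set exhaustively rather than by tabu search. The flow matrix, distances, fixed costs, and the CAB-style discount factors chi, alpha, delta are illustrative assumptions.

```python
import itertools
import random

random.seed(0)
n = 8
W = [[random.randint(1, 9) for _ in range(n)] for _ in range(n)]  # flows
C = [[abs(i - j) for j in range(n)] for i in range(n)]            # distances
F = [40] * n                         # fixed cost of opening each hub
chi, alpha, delta = 3.0, 0.75, 2.0   # collection, transfer, distribution

def total_cost(hubs):
    # single allocation: each node goes to its nearest open hub
    alloc = [min(hubs, key=lambda h: C[i][h]) for i in range(n)]
    flow = sum(W[i][j] * (chi * C[i][alloc[i]]
                          + alpha * C[alloc[i]][alloc[j]]
                          + delta * C[alloc[j]][j])
               for i in range(n) for j in range(n))
    return flow + sum(F[h] for h in hubs)

# exhaustive check on this toy instance (the paper uses tabu search instead)
best = min((set(s) for r in range(1, n + 1)
            for s in itertools.combinations(range(n), r)), key=total_cost)
print(sorted(best), total_cost(best))
```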
Model for facility or vendor location on a global scale considering several echelons in the chain
Abstract:
The facility location problem for companies with global operations is very complex and not well explored in the literature. This work proposes an MILP model that solves the problem through minimization of the total logistic cost. The model's main contributions are its pioneering carrying-cost calculation and its treatment of take-or-pay costs and of international tax benefits such as drawback and value-added taxes in Brazil. The model was successfully applied to a real case of a chemical company with industrial plants and sales all over the world, and its application recommended a totally new sourcing model for the company.
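A minimal facility-location MILP in the spirit of the model described above is sketched below using the open-source PuLP modeller (an assumption; the paper does not specify a tool). It minimizes fixed opening costs plus shipping costs subject to demand and capacity constraints; the carrying-cost, take-or-pay, and tax terms of the actual model are omitted, and all data are illustrative.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

plants, markets = ["BR", "US"], ["EU", "ASIA", "LATAM"]
fixed = {"BR": 900, "US": 1200}          # illustrative fixed opening costs
cap = {"BR": 80, "US": 100}              # plant capacities
dem = {"EU": 40, "ASIA": 50, "LATAM": 30}
ship = {("BR", "EU"): 7, ("BR", "ASIA"): 9, ("BR", "LATAM"): 3,
        ("US", "EU"): 6, ("US", "ASIA"): 8, ("US", "LATAM"): 5}

y = {p: LpVariable(f"open_{p}", cat=LpBinary) for p in plants}
x = {(p, m): LpVariable(f"x_{p}_{m}", lowBound=0)
     for p in plants for m in markets}

prob = LpProblem("global_location", LpMinimize)
prob += lpSum(fixed[p] * y[p] for p in plants) + \
        lpSum(ship[p, m] * x[p, m] for p in plants for m in markets)
for m in markets:                        # meet every market's demand
    prob += lpSum(x[p, m] for p in plants) == dem[m]
for p in plants:                         # capacity available only if open
    prob += lpSum(x[p, m] for m in markets) <= cap[p] * y[p]

prob.solve()
print({p: value(y[p]) for p in plants}, value(prob.objective))
```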
Abstract:
Recently, the development of industrial processes has brought about the emergence of technologically complex systems. This development has generated the need for research on mathematical techniques capable of dealing with design complexity and validation. Fuzzy models have been receiving particular attention in the area of nonlinear system identification and analysis due to their capacity to approximate nonlinear behavior and deal with uncertainty. A fuzzy rule-based model suitable for the approximation of many systems and functions is the Takagi-Sugeno (TS) fuzzy model. TS fuzzy models are nonlinear systems described by a set of if-then rules that give local linear representations of an underlying system; such models can approximate a wide class of nonlinear systems. In this paper, a performance analysis of a system based on a TS fuzzy inference system for the calibration of electronic compass devices is considered. The contribution of the evaluated TS fuzzy inference system is to reduce the error obtained in data acquisition from a digital electronic compass. For reliable operation of the TS fuzzy inference system, adequate error measurements must be taken, and the noise in the error measurements must be filtered before the TS fuzzy inference system is applied. The proposed method demonstrated an effectiveness of 57% in reducing the total error, based on the tests considered. (C) 2011 Elsevier Ltd. All rights reserved.
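The sketch below shows the basic TS inference mechanism the abstract relies on: Gaussian membership functions fire a set of rules, each rule contributes a local linear model, and the output is their normalized weighted average. The rule centers, widths, and local model coefficients are illustrative assumptions, not the calibration rules identified in the paper.

```python
import numpy as np

# Minimal first-order Takagi-Sugeno inference: Gaussian memberships weight
# local linear models y_k = a_k * x + b_k; the output is their normalized
# weighted average. All parameter values below are illustrative.
centers = np.array([0.0, 90.0, 180.0, 270.0])   # rule centers (degrees)
sigma = 45.0                                     # shared membership width
coef = np.array([[1.0, 2.0],                     # rows are [a_k, b_k]
                 [1.0, -1.5],
                 [1.0, 0.5],
                 [1.0, 3.0]])

def ts_output(x: float) -> float:
    w = np.exp(-0.5 * ((x - centers) / sigma) ** 2)   # rule firing strengths
    local = coef[:, 0] * x + coef[:, 1]               # local linear models
    return float(np.dot(w, local) / w.sum())          # weighted average

print(ts_output(120.0))   # corrected heading for a raw reading of 120
```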
Abstract:
The aim of this paper is to present an economic design of an X̄ chart for short-run production. The process mean starts at μ0 (in control, State I) and at a random time shifts to μ1 > μ0 (out of control, State II). The monitoring procedure consists of inspecting a single item after every m produced items. If the measurement of the quality characteristic falls outside the control limits, the process is stopped and adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
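A direct search over the design parameters can be as simple as enumerating a grid and keeping the cheapest design, as sketched below for the sampling interval m and the control-limit width k. The cost expression here is a rough stand-in (geometric time to shift, simple false-alarm and detection-power terms), not the paper's expected cost function; all numerical values are assumptions.

```python
import math

mu0, mu1, sigma = 0.0, 1.0, 1.0       # in/out-of-control means, std. dev.
pi_shift = 0.005                      # assumed per-item shift probability
c_insp, c_adj, c_false, c_nc = 1.0, 100.0, 30.0, 5.0  # illustrative costs

def Phi(z: float) -> float:           # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def avg_cost(m: int, k: float) -> float:
    alpha = 2.0 * (1.0 - Phi(k))                  # false-alarm probability
    power = 1.0 - (Phi(k - (mu1 - mu0) / sigma)   # detection power after shift
                   - Phi(-k - (mu1 - mu0) / sigma))
    inspections_to_signal = 1.0 / power
    p_shift_cycle = 1.0 - (1.0 - pi_shift) ** m
    out_items = m * inspections_to_signal * p_shift_cycle  # rough O-O-C run
    return (c_insp / m + c_false * alpha / m
            + (c_adj + c_nc * out_items) * pi_shift)

best = min(((m, k) for m in range(5, 101, 5)
            for k in (2.0, 2.5, 3.0, 3.5)), key=lambda mk: avg_cost(*mk))
print(best, avg_cost(*best))
```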
Abstract:
The flowshop scheduling problem with blocking in-process is addressed in this paper. In this environment there are no buffers between successive machines; therefore, intermediate queues of jobs waiting in the system for their next operations are not allowed. Heuristic approaches are proposed to minimize the total tardiness criterion. A constructive heuristic that explores specific characteristics of the problem is presented. Moreover, a GRASP-based heuristic is proposed and coupled with a path relinking strategy to search for better outcomes. Computational tests are presented, and comparisons with an adaptation of the NEH algorithm and with a branch-and-bound algorithm indicate that the new approaches are promising. (c) 2007 Elsevier Ltd. All rights reserved.
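The defining feature of this environment is that a finished job stays on its machine, blocking it, until the next machine is free. The sketch below implements the standard departure-time recursion for a permutation schedule under blocking and reads total tardiness off the last machine; the processing times and due dates are toy assumptions, and the heuristics of the paper would call such an evaluator inside their search.

```python
# p[job][machine] processing times and due dates: toy data for illustration.
p = [[3, 2, 4], [1, 4, 2], [3, 1, 3], [2, 3, 1]]
due = [9, 12, 16, 18]

def total_tardiness(seq):
    m = len(p[0])
    prev = [0.0] * m                  # departure times of the previous job
    tard = 0.0
    for j in seq:
        d = [0.0] * m                 # departure times of job j per machine
        for i in range(m):
            arrive = d[i - 1] if i > 0 else 0.0
            start = max(arrive, prev[i])            # machine must be free
            finish = start + p[j][i]
            # blocked until the next machine releases the previous job
            d[i] = finish if i == m - 1 else max(finish, prev[i + 1])
        prev = d
        tard += max(0.0, d[m - 1] - due[j])
    return tard

print(total_tardiness([0, 1, 2, 3]), total_tardiness([1, 0, 3, 2]))
```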
Abstract:
A large number of models have been derived from the two-parameter Weibull distribution and are referred to as Weibull models. They exhibit a wide range of shapes for the density and hazard functions, which makes them suitable for modelling complex failure data sets. WPP (Weibull probability paper) and IWPP (inverse Weibull probability paper) plots allow one to determine in a systematic manner whether one or more of these models are suitable for modelling a given data set. This paper deals with this topic.
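The idea behind the WPP plot is that, under a two-parameter Weibull with shape β and scale η, y = ln(-ln(1 - F(t))) is linear in x = ln t with slope β and intercept -β ln η, so curvature on the plot suggests one of the extended Weibull models instead. The sketch below builds the WPP coordinates from simulated data using median ranks and recovers the parameters by least squares; the sample size and parameter values are illustrative assumptions.

```python
import math
import random

random.seed(0)
beta, eta = 1.8, 100.0
# simulate failure times via the Weibull inverse CDF
t = sorted(eta * (-math.log(1.0 - random.random())) ** (1.0 / beta)
           for _ in range(50))
xs = [math.log(ti) for ti in t]
ys = [math.log(-math.log(1.0 - (i - 0.3) / (len(t) + 0.4)))  # median ranks
      for i, _ in enumerate(t, start=1)]

# least-squares fit: slope estimates beta, intercept is -beta*ln(eta)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(slope, math.exp(mx - my / slope))   # estimates of beta and eta
```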
Abstract:
Quasi-birth-and-death (QBD) processes with infinite “phase spaces” can exhibit unusual and interesting behavior. One of the simplest examples of such a process is the two-node tandem Jackson network, with the “phase” giving the state of the first queue and the “level” giving the state of the second queue. In this paper, we undertake an extensive analysis of the properties of this QBD. In particular, we investigate the spectral properties of Neuts’s R-matrix and show that the decay rate of the stationary distribution of the “level” process is not always equal to the convergence norm of R. In fact, we show that we can obtain any decay rate from a certain range by controlling only the transition structure at level zero, which is independent of R. We also consider the sequence of tandem queues that is constructed by restricting the waiting room of the first queue to some finite capacity, and then allowing this capacity to increase to infinity. We show that the decay rates for the finite truncations converge to a value, which is not necessarily the decay rate in the infinite waiting room case. Finally, we show that the probability that the process hits level n before level 0 given that it starts in level 1 decays at a rate which is not necessarily the same as the decay rate for the stationary distribution.
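For readers unfamiliar with the machinery, the sketch below computes Neuts's R-matrix by the standard successive-substitution iteration R ← A0 + R·A1 + R²·A2 for a discrete-time QBD obtained by uniformizing a tandem queue whose first buffer is truncated at capacity K, exactly the kind of finite truncation the abstract discusses. The rates and truncation level are illustrative assumptions, and the finite truncation only approximates the infinite-phase process whose subtleties the paper analyzes.

```python
import numpy as np

lam, mu1, mu2, K = 0.4, 1.0, 1.0, 30   # rates and first-queue capacity
u = lam + mu1 + mu2                     # uniformization constant

# phase = queue-1 length (0..K); level = queue-2 length.
A0 = np.zeros((K + 1, K + 1))           # level up: service at queue 1
A1 = np.zeros((K + 1, K + 1))           # level unchanged
A2 = np.zeros((K + 1, K + 1))           # level down: service at queue 2
for i in range(K + 1):
    if i > 0:
        A0[i, i - 1] = mu1 / u          # job moves from queue 1 to queue 2
    if i < K:
        A1[i, i + 1] = lam / u          # external arrival to queue 1
    A2[i, i] = mu2 / u                  # departure from queue 2
    A1[i, i] = 1.0 - A0[i].sum() - A2[i].sum() - (lam / u if i < K else 0.0)

R = np.zeros_like(A0)
for _ in range(5000):                   # successive substitution to fixpoint
    R = A0 + R @ A1 + R @ R @ A2
print(np.max(np.abs(np.linalg.eigvals(R))))   # spectral radius of R
```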
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
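As background for the final estimation stage, the sketch below estimates the M/M/1 overflow probability P(queue reaches B before emptying, starting from one customer) by importance sampling on the embedded random walk, using the classic exponential tilt that swaps the arrival and service rates. The paper's contribution is to *estimate* the tilting parameter by minimum cross-entropy in stages; here the known-optimal tilt for the M/M/1 queue is hard-coded, and the rates, buffer level, and run count are illustrative assumptions.

```python
import random

random.seed(42)
lam, mu, B, runs = 0.3, 1.0, 20, 20_000
p = lam / (lam + mu)            # original up-step probability
q = mu / (lam + mu)             # tilted up-step probability (rates swapped)

est = 0.0
for _ in range(runs):
    level, lr = 1, 1.0
    while 0 < level < B:
        if random.random() < q:           # sample under the tilted measure
            level += 1
            lr *= p / q                   # likelihood ratio, up step
        else:
            level -= 1
            lr *= (1.0 - p) / (1.0 - q)   # likelihood ratio, down step
    if level == B:
        est += lr                         # count only overflow paths

r = mu / lam
print(est / runs, (r - 1) / (r ** B - 1))  # IS estimate vs. exact value
```

The exact reference value is the gambler's-ruin probability of hitting B before 0 from state 1, which makes it easy to check that the tilted estimator is unbiased and has far lower variance than crude Monte Carlo at this rarity level.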