941 results for Operations Research, Systems Engineering and Industrial Engineering


Relevance: 100.00%

Abstract:

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. An example of this configuration is observed at a manufacturing facility equipped to assemble and test web servers. A typical web server assembly line is characterized by multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes and job circulation due to random part failures. In addition, appropriate correction terms, obtained via regression analysis, were added to the approximations to minimize the error between the analytical approximations and the simulation models. Markovian and general manufacturing systems with multiple product classes, job circulation due to failures, and fork-join subsystems to model parallel processing were studied. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors are used: on average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case and from 26.39% to 7.23% in the general case. All the equations in the analytical formulations were implemented as a set of Matlab scripts. Using this set, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate system performance measures and make judicious decisions, especially when setting delivery due dates, planning capacity, and mitigating bottlenecks.
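As a rough illustration of the kind of approximation described above (the thesis's actual formulations live in Matlab scripts and are not reproduced here), the sketch below computes a single-station mean flow time from a standard two-moment queueing approximation and then applies a hypothetical linear regression correction; the function names and the form of the correction, FT_corr = b0 + b1*FT + b2*rho, are assumptions for illustration only.

```python
def station_flow_time(lam, mu, ca2, cs2):
    """Two-moment (Allen-Cunneen style) mean flow time at a single-server
    station: expected waiting time plus mean service time.
    lam = arrival rate, mu = service rate, ca2/cs2 = squared coefficients
    of variation of interarrival and service times."""
    rho = lam / mu
    assert rho < 1.0, "station must be stable (rho < 1)"
    wq = (rho / (1.0 - rho)) * ((ca2 + cs2) / 2.0) * (1.0 / mu)
    return wq + 1.0 / mu


def corrected_flow_time(lam, mu, ca2, cs2, beta=(0.0, 1.0, 0.0)):
    """Hypothetical regression correction in the spirit of the thesis:
    FT_corr = b0 + b1 * FT_approx + b2 * rho. Coefficients would be fit
    against simulation output; the defaults leave the value unchanged."""
    b0, b1, b2 = beta
    return b0 + b1 * station_flow_time(lam, mu, ca2, cs2) + b2 * (lam / mu)


# M/M/1 sanity check (lam=0.8, mu=1.0, ca2=cs2=1): exact flow time is 5.0
print(corrected_flow_time(0.8, 1.0, 1.0, 1.0))
```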

Relevance: 100.00%

Abstract:

This is the promotional brochure from the March 2004 national conference, Making Learning Visible: Peer Review and the Scholarship of Teaching. This conference was hosted by the UNL Peer Review of Teaching project and the University of Nebraska-Lincoln.

Relevance: 100.00%

Abstract:

Hazardous materials are substances that, if not regulated, can pose a threat to human populations and their environmental health, safety, or property when transported in commerce. About 1.5 million tons of hazardous material shipments are transported by truck in the US annually, with a steady increase of approximately 5% per year. The objective of this study was to develop a routing tool for hazardous material transport that reduces environmental impacts and transportation difficulties while still producing paths that remain compelling to shipping carriers in terms of trucking cost. The study started with identification of inhalation-hazard impact zones and explosion protective areas around the locations of hypothetical hazardous material releases, considering different parameters (i.e., chemical characteristics, release quantities, atmospheric conditions, etc.). Results showed that depending on the release quantity, the chemical, and the atmospheric stability (a function of wind speed, meteorology, sky cover, and the time and location of the accident, among other factors), the consequences of these incidents can differ. The study was then extended to other evaluation criteria, because health risk would not be the only concern in route selection. Transportation difficulties (i.e., road blockage and congestion) were incorporated as an important factor due to their indirect impact and cost on the users of transportation networks. Trucking costs were also considered one of the primary criteria in selecting hazardous material paths; otherwise, the suggested routes would not be convincing to shipping companies. The final criterion was the proximity of public places to the routes. The approach evolved from a simple framework into a comprehensive and efficient GIS-based tool able to investigate the transportation network of any given study area and capable of generating the best routing options for cargos. The suggested tool uses a multi-criteria decision-making method that considers the priorities of the decision makers in choosing cargo routes. Comparison of the routing options based on each criterion, and on the overall suitability of each path with regard to all criteria combined, showed that tools similar to the one proposed by this study can give decision makers insight into hazardous material transport. The tool presents the probable consequences of each candidate path in an easily understandable way, as maps and tables, which makes the tradeoffs of costs and risks considerably simpler to evaluate: in some cases, slightly compromising on trucking cost may drastically decrease the probable health risk and/or traffic difficulties. This will not only reward the community by making cities safer places to live, but can also benefit shipping companies by allowing them to advertise as environmentally friendly carriers.
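The abstract does not name the specific multi-criteria decision-making method used, so the sketch below shows one of the simplest possibilities: a min-max normalized weighted sum over the four criteria discussed above. The route data, criterion names, and weights are purely illustrative.

```python
def score_routes(routes, weights):
    """Min-max normalized weighted-sum scoring. Each route is a dict of
    raw criterion values where lower is better; each criterion is
    normalized across routes and combined with the decision maker's
    weights. Lower total score = more suitable route."""
    criteria = weights.keys()
    lo = {c: min(r[c] for r in routes) for c in criteria}
    hi = {c: max(r[c] for r in routes) for c in criteria}

    def norm(r, c):
        return 0.0 if hi[c] == lo[c] else (r[c] - lo[c]) / (hi[c] - lo[c])

    return [sum(weights[c] * norm(r, c) for c in criteria) for r in routes]


routes = [  # illustrative candidate paths
    {"health_risk": 0.8, "congestion": 0.3, "trucking_cost": 120.0, "proximity": 0.5},
    {"health_risk": 0.2, "congestion": 0.6, "trucking_cost": 135.0, "proximity": 0.2},
]
weights = {"health_risk": 0.4, "congestion": 0.2, "trucking_cost": 0.3, "proximity": 0.1}
print(score_routes(routes, weights))
```

Shifting weight toward health_risk reproduces the tradeoff noted above, where a slightly costlier route can sharply reduce the probable health risk.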

Relevance: 100.00%

Abstract:

INVESTIGATION INTO CURRENT EFFICIENCY FOR PULSE ELECTROCHEMICAL MACHINING OF NICKEL ALLOY

Yu Zhang, M.S., University of Nebraska, 2010. Adviser: Kamlakar P. Rajurkar

Electrochemical machining (ECM) is a nontraditional manufacturing process that can machine difficult-to-cut materials. In ECM, material is removed by controlled electrochemical dissolution of an anodic workpiece in an electrochemical cell. ECM has extensive applications in the automotive, petroleum, aerospace, textile, medical, and electronics industries. Improving current efficiency is a challenging task for any electro-physical or electrochemical machining process. Current efficiency is defined as the ratio of the observed amount of metal dissolved to the theoretical amount predicted from Faraday's law, for the same specified conditions of electrochemical equivalent, current, etc. [1]. In macro ECM, electrolyte conductivity greatly influences the current efficiency of the process. Since the conductivity of the electrolyte can be enhanced only up to a certain limit, a process innovation is needed for further improvement in current efficiency. Pulse electrochemical machining (PECM) is one such approach, in which the effective electrolyte conductivity is improved by flushing the electrolyte during the pulse off-time. The aim of this research was to study the influence of the major factors on current efficiency in a macro-scale pulse electrochemical machining process and to develop a linear regression model for predicting the current efficiency of the process. An in-house designed electrochemical cell was used for machining nickel alloy (ASTM B435) by PECM. The effects of current density, type of electrolyte, and electrolyte flow rate on current efficiency were studied under different experimental conditions. Results indicated that current efficiency depends on the electrolyte, the electrolyte flow rate, and the current density. The linear regression models of current efficiency were compared with twenty new data points graphically and quantitatively, and the models were close enough to the actual results to be considered reliable. In addition, an attempt was made in this work to consider factors in PECM that had not been investigated in earlier works, by simulating the process in COMSOL software. However, the results from this attempt were not substantially different from the earlier reported studies.
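Because the abstract defines current efficiency directly from Faraday's law, the calculation can be sketched in a few lines; the mass, current, and time values below are illustrative, not the thesis's measurements.

```python
F = 96485.0  # Faraday constant, C/mol


def faraday_mass(current_a, time_s, molar_mass, valence):
    """Theoretical dissolved mass (g) from Faraday's law: m = I*t*M/(z*F)."""
    return current_a * time_s * molar_mass / (valence * F)


def current_efficiency(measured_mass_g, current_a, time_s, molar_mass, valence):
    """Observed dissolved mass over the Faraday prediction."""
    return measured_mass_g / faraday_mass(current_a, time_s, molar_mass, valence)


# Illustrative only: nickel (M = 58.69 g/mol, z = 2), 10 A for 15 minutes,
# 2.2 g of metal actually removed -> efficiency of roughly 0.80
print(current_efficiency(2.2, 10.0, 900.0, 58.69, 2))
```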

Relevance: 100.00%

Abstract:

PREPARATION OF COATED MICROTOOLS FOR ELECTROCHEMICAL MACHINING APPLICATIONS

Ajaya K. Swain, M.S., University of Nebraska, 2010. Advisor: K.P. Rajurkar

Coated tools have improved the performance of both traditional and nontraditional machining processes, resulting in higher material removal, better surface finish, and increased wear resistance. However, the performance of coated tools in micromachining has not yet been adequately studied. One possible reason is the difficulty of preparing coated microtools. Besides the technical requirements, the economic and environmental aspects of the material and the coating technique also play a significant role in coating microtools, which in fact restricts the range of coating materials and the type of coating process. Handling is another major issue for microtools, purely because of their miniature size. This research focuses on the preparation, by electrodeposition, of coated microtools for pulse electrochemical machining. The motivation for this research is that, although improved machining has been reported using insulating coatings on ECM tools, particularly in ECM drilling operations, little literature was found on the use of metallic coating materials in other ECM process types. An ideal ECM tool should be a good thermal and electrical conductor, corrosion resistant, electrochemically stable, and stiff enough to withstand the electrolyte pressure. Tungsten has almost all the properties desired in an ECM tool material except electrochemical stability: tungsten can oxidize during machining, resulting in poor machining quality. The electrochemical stability of a tungsten ECM tool can be improved by electroplating it with nickel, which has superior electrochemical resistance. Moreover, a tungsten tool can be coated in situ, reducing tool handling and breakage frequency. The tungsten microtool was electroplated with nickel using both direct and pulsed current. The effect of the various input parameters on the coating characteristics was studied, and the performance of the coated microtool was evaluated in pulse ECM. The coated tool removed about 28% more material than the uncoated tool under similar conditions and was more electrochemically stable. It was concluded that a nickel-coated tungsten microtool can improve pulse ECM performance.
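Electrodeposition is likewise governed by Faraday's law, so a back-of-the-envelope estimate of nickel coating thickness can be sketched as follows; the current, plating time, plated area, and cathode efficiency are assumed values for illustration, not the settings used in this work.

```python
F = 96485.0  # Faraday constant, C/mol


def plating_thickness_um(current_a, time_s, molar_mass=58.69, valence=2,
                         density=8.908, area_cm2=0.05, efficiency=0.95):
    """Estimated nickel coating thickness in micrometres. Deposited mass
    follows Faraday's law (m = eff*I*t*M/(z*F)); thickness is
    mass / (density * plated area)."""
    mass_g = efficiency * current_a * time_s * molar_mass / (valence * F)
    return mass_g / (density * area_cm2) * 1e4  # cm -> micrometres


# Illustrative: 20 mA for 60 s over 0.05 cm^2 gives roughly 8 micrometres
print(plating_thickness_um(current_a=0.02, time_s=60.0))
```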

Relevance: 100.00%

Abstract:

Real Options Analysis (ROA) has become a complementary tool for engineering economics. It has become popular due to the limitations of conventional engineering valuation methods, specifically their assumptions about uncertainty. Industry is seeking to quantify the value of engineering investments under uncertainty. One problem with conventional tools is that they may assume cash flows are certain, thereby minimizing the possibility of uncertainty in future values. Real options analysis provides a solution to this problem, but it has been used sparingly by practitioners. This paper provides a new model, referred to as the Beta Distribution Real Options Pricing Model (BDROP), which addresses these limitations and can be easily used by practitioners. The positive attributes of this new model include unconstrained market assumptions, robust representation of the underlying asset's uncertainty, and an uncomplicated methodology. This research demonstrates the use of the model to evaluate the use of automation for inventory control.
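The paper's exact BDROP formulation is not reproduced in the abstract, so the sketch below only conveys the general idea as a guess: model the underlying asset's terminal value with a scaled Beta distribution and value the option by Monte Carlo. All parameter values are illustrative.

```python
import numpy as np


def beta_real_option_value(a, b, lo, hi, strike, rate, horizon,
                           n=100_000, seed=42):
    """Monte Carlo value of the option to invest: the project's terminal
    value V is modeled as lo + (hi - lo) * Beta(a, b), the payoff is
    max(V - strike, 0), and the mean payoff is discounted to today."""
    rng = np.random.default_rng(seed)
    v = lo + (hi - lo) * rng.beta(a, b, size=n)
    payoff = np.maximum(v - strike, 0.0)
    return np.exp(-rate * horizon) * payoff.mean()


# Illustrative: option to spend 1.0 (e.g., $1M on inventory automation)
# on a project worth between 0.5 and 2.0 a year from now
print(beta_real_option_value(a=2.0, b=3.0, lo=0.5, hi=2.0,
                             strike=1.0, rate=0.05, horizon=1.0))
```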

Relevance: 100.00%

Abstract:

Factor analysis was used to develop a more detailed description of the human hand for use in creating glove sizes; current glove sizes are simply small, medium, and large. The created glove sizes give glove designers the ability to produce designs that fit the majority of hand variations in both the male and female populations. The research used the 1988 Anthropometric Survey of U.S. Army Personnel (ANSUR) data, which contain eighty-six length, width, height, and circumference measurements of the human hand for one thousand male subjects and thirteen hundred female subjects. Eliminating redundant measurements reduced the data to forty-six essential measurements. Factor analysis grouped the variables into three factors, which were used to generate hand sizes by taking percentiles along each factor axis. Two different sizing systems were created: the first contains 125 sizes each for males and females; the second contains 7 sizes for males and 14 sizes for females. The sizing systems were compared to another hand sizing system created from the ANSUR database, and the comparison indicated that the systems created using factor analysis provide better fit.
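A minimal sketch of the pipeline described above, using synthetic stand-in data in place of the ANSUR measurements: standardize the measurements, extract three factors, and cut each factor axis at percentiles (five levels per factor yields the 5^3 = 125 sizes of the first system).

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 46 retained hand measurements (1000 subjects)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 46))

# Standardize, then extract three factors and per-subject factor scores
scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(
    StandardScaler().fit_transform(X))

# Cut each factor axis at the 20/40/60/80th percentiles: five levels per
# factor, so 5**3 = 125 candidate sizes, as in the first sizing system
cuts = np.percentile(scores, [20, 40, 60, 80], axis=0)
size_code = np.stack([np.searchsorted(cuts[:, j], scores[:, j])
                      for j in range(3)], axis=1)
print(size_code[:5])  # each row: (level on factor 1, factor 2, factor 3)
```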

Relevance: 100.00%

Abstract:

Product miniaturization for applications in fields such as biotechnology, medical devices, aerospace, optics, and communications has made the advancement of micromachining techniques essential. Machining hard and brittle materials such as ceramics, glass, and silicon is a formidable task. Rotary ultrasonic machining (RUM) is capable of machining these materials. RUM is a hybrid machining process that combines the material removal mechanisms of conventional grinding and ultrasonic machining. Downscaling RUM to the micro scale is essential for generating miniature features or parts from hard and brittle materials. The goal of this thesis is to conduct a feasibility study and develop a knowledge base for micro rotary ultrasonic machining (MRUM). The positive outcome of the feasibility study led to a comprehensive investigation of the effect of process parameters. The effects of spindle speed, grit size, vibration amplitude, tool geometry, static load, and coolant on the material removal rate (MRR) of MRUM were studied. In general, MRR was found to increase with spindle speed, vibration amplitude, and static load. MRR was also noted to depend on the abrasive grit size and tool geometry. The behavior of the cutting forces was modeled using time series analysis. Being a vibration-assisted machining process, MRUM generates little heat, which is essential for bone machining; the capability of the MRUM process for machining bone tissue was therefore investigated. Finally, a predictive model was proposed to estimate the MRR. The experimental and theoretical results exhibited matching trends.
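The thesis's predictive MRR model is not given in the abstract; the sketch below merely encodes the reported qualitative trend (MRR rising with spindle speed, vibration amplitude, and static load) as a hypothetical power law with illustrative exponents.

```python
def mrr_trend(spindle_speed, amplitude, static_load,
              k=1.0, a=0.5, b=0.7, c=0.6):
    """Hypothetical power-law trend, MRR = k * N**a * A**b * F**c,
    encoding only the direction of the reported effects; k and the
    exponents are illustrative, not fitted values from the thesis."""
    return k * spindle_speed**a * amplitude**b * static_load**c


# Doubling the static load raises the predicted MRR by 2**0.6 ~ 1.52x
print(mrr_trend(3000, 10e-6, 4.0), mrr_trend(3000, 10e-6, 8.0))
```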

Relevance: 100.00%

Abstract:

Many organizations currently face inventory management problems, such as distributing inventory on time and maintaining the inventory levels needed to satisfy customers or end users. Organizations understand the need to maintain accurate inventory levels but sometimes fall short, leading to a wide performance gap in inventory accuracy. Inventory inaccuracy can consume much of the investment in purchasing inventory and often leads to excess inventory. The research objective of this thesis is to provide management with a decision-making criterion for closing or maintaining a warehouse based on basic purchasing and holding cost information. The specific objectives provide information on the impact of inventory carrying cost, obsolete inventory, and inventory turns. The methodology section explains the carrying cost ratio, which can help inventory managers adopt best practices to avoid obsolete inventory and reduce excess inventory levels. The research model was helpful in providing a decision-making criterion based on the performance metric developed. This research model and performance metric were validated by analysis of warehouse data, and the results indicated a shift from a two-echelon inventory supply chain to a one-echelon, Just-In-Time (JIT) based inventory supply chain. The recommendations from the case study were used by a health care organization to reorganize its supply chain, resulting in a reduction of excess inventory.
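A minimal sketch of how such a criterion might be applied, assuming the carrying cost ratio is defined as annual holding cost over inventory purchase value (the thesis's exact definition and threshold may differ):

```python
def carrying_cost_ratio(annual_holding_cost, inventory_purchase_value):
    """Annual holding cost as a fraction of inventory purchase value."""
    return annual_holding_cost / inventory_purchase_value


def keep_warehouse(annual_holding_cost, inventory_purchase_value,
                   threshold=0.25):
    """Toy decision rule: retain the warehouse while the carrying cost
    ratio stays below a management-chosen threshold; above it, favor
    consolidating toward a one-echelon / JIT supply chain."""
    ratio = carrying_cost_ratio(annual_holding_cost, inventory_purchase_value)
    return ratio < threshold


print(keep_warehouse(180_000, 600_000))  # ratio 0.30 -> False (consolidate)
```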

Relevance: 100.00%

Abstract:

A central design challenge facing network planners is how to select a cost-effective network configuration that can provide uninterrupted service despite edge failures. In this paper, we study the Survivable Network Design (SND) problem, a core model underlying the design of such resilient networks that incorporates complex cost and connectivity trade-offs. Given an undirected graph with specified edge costs and (integer) connectivity requirements between pairs of nodes, the SND problem seeks the minimum cost set of edges that interconnects each node pair with at least as many edge-disjoint paths as the connectivity requirement of the nodes. We develop a hierarchical approach for solving the problem that integrates ideas from decomposition, tabu search, randomization, and optimization. The approach decomposes the SND problem into two subproblems, Backbone design and Access design, and uses an iterative multi-stage method for solving the SND problem in a hierarchical fashion. Since both subproblems are NP-hard, we develop effective optimization-based tabu search strategies that balance intensification and diversification to identify near-optimal solutions. To initiate this method, we develop two heuristic procedures that can yield good starting points. We test the combined approach on large-scale SND instances, and empirically assess the quality of the solutions vis-à-vis optimal values or lower bounds. On average, our hierarchical solution approach generates solutions within 2.7% of optimality even for very large problems (that cannot be solved using exact methods), and our results demonstrate that the performance of the method is robust for a variety of problems with different size and connectivity characteristics.
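The connectivity requirement can be verified mechanically: by Menger's theorem, the number of edge-disjoint paths between a node pair equals the unit-capacity max-flow between them. The snippet below checks feasibility of a candidate edge set for one pair; it illustrates the constraint only, not the paper's tabu search method.

```python
import networkx as nx


def meets_requirement(edges, s, t, r):
    """Feasibility of a candidate edge set for one node pair: it must
    provide at least r edge-disjoint s-t paths, i.e. local edge
    connectivity (unit-capacity max-flow) of at least r."""
    G = nx.Graph()
    G.add_edges_from(edges)
    if s not in G or t not in G:
        return False
    return nx.edge_connectivity(G, s, t) >= r


# 1-2-4 and 1-3-4 are edge-disjoint, so a requirement of 2 is met
edges = [(1, 2), (2, 3), (1, 3), (3, 4), (2, 4)]
print(meets_requirement(edges, 1, 4, 2))  # True
```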

Relevance: 100.00%

Abstract:

The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others much less so. This study assesses the performance of eighteen of the available tests for different sample sizes, significance levels, and a number of symmetric and asymmetric distributions by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred; for such distributions, the kurtosis test is the most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of the alternative distribution.
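The power comparison rests on a simple Monte Carlo recipe: draw repeated samples from a non-normal alternative and record how often the test rejects at the chosen significance level. A minimal sketch for the Shapiro-Wilk test against a skewed alternative (illustrative settings, not the study's full design):

```python
import numpy as np
from scipy import stats


def power(sampler, test, n, alpha=0.05, reps=2000, seed=0):
    """Estimated power: fraction of samples drawn from a non-normal
    alternative that the test rejects at significance level alpha."""
    rng = np.random.default_rng(seed)
    rejections = sum(test(sampler(rng, n)).pvalue < alpha
                     for _ in range(reps))
    return rejections / reps


# Shapiro-Wilk power against a skewed (exponential) alternative at n = 50
skewed = lambda rng, n: rng.exponential(size=n)
print(power(skewed, stats.shapiro, n=50))
```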

Relevance: 100.00%

Abstract:

In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The model is intended to model, analyze, and improve the supply chain management operations of an enterprise. It provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, and supplies tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated using a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain that integrates multiple perspectives and provides a systematic procedure for improving a company's supply chain operations.

Relevance: 100.00%

Abstract:

The contemporary world is crowded with large, interdisciplinary, complex systems made of other systems, personnel, hardware, software, information, processes, and facilities. The Systems Engineering (SE) field proposes an integrated, holistic approach to tackling these socio-technical systems that is crucial for taking proper account of their multifaceted nature and numerous interrelationships, providing the means to enable their successful realization. Model-Based Systems Engineering (MBSE) is an emerging paradigm in the SE field and can be described as the formalized application of modelling principles, methods, languages, and tools to the entire lifecycle of those systems, enhancing communication and knowledge capture, shared understanding, design precision and integrity, and development traceability, while reducing development risks. This thesis is devoted to the application of the MBSE paradigm to the Urban Traffic & Environment domain. The proposed system, GUILTE (Guiding Urban Intelligent Traffic & Environment), deals with a challenging present-day problem on the agenda of world leaders, national governments, local authorities, research agencies, academia, and the general public. The main purposes of the system are to provide an integrated development framework for municipalities and to support the (short-term and real-time) operation of urban traffic through Intelligent Transportation Systems, highlighting two fundamental aspects: the evaluation of the related environmental impacts (in particular, air pollution and noise) and the dissemination of information to citizens, endorsing their involvement and participation. These objectives relate to the high-level challenge of developing sustainable urban transportation networks. The development process of the GUILTE system is supported by a new methodology, LITHE (Agile Systems Modelling Engineering), which aims to lighten the complexity and burden of existing methodologies by emphasizing agile principles such as continuous communication, feedback, stakeholder involvement, short iterations, and rapid response. These principles are realized through a universal and intuitive SE process, the SIMILAR process model (redefined in light of modern international standards), a lean MBSE method, and a coherent System Model developed with the benchmark graphical modeling languages SysML and OPDs/OPL. The main contributions of the work are, in essence, models, and can be summarized as: a revised process model for the SE field, an agile methodology for MBSE development environments, a graphical tool to support the proposed methodology, and a System Model for the GUILTE system. The comprehensive literature reviews provided for the main scientific field of this research (SE/MBSE) and for the application domain (Traffic & Environment) can also be seen as a relevant contribution.

Relevance: 100.00%

Abstract:

This book provides the latest in a series of books growing out of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering. It includes chapters in the most advanced areas of computing, informatics, systems sciences, and engineering, and is accessible to a wide readership, including professors, researchers, practitioners, and students. The book comprises a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, and Systems Sciences and Engineering. It includes selected papers from the proceedings of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation; Telecommunications and Networking; Systems, Computing Sciences and Software Engineering; and Engineering Education, Instructional Technology, Assessment, and E-learning.

Relevance: 100.00%

Abstract:

Various kinds of algorithms can be chosen to compute elementary functions. Among them, the shift-and-add algorithms are worth mentioning because they were specifically designed to be very simple and to save computer resources: almost the only operations they involve are additions and shifts, which can be performed easily and efficiently by a digital processor. Shift-and-add algorithms achieve fairly good precision with low-cost iterations. The most famous algorithm of this type is CORDIC, which can approximate a wide variety of functions with only a slight change in its iterations. In this paper, we analyze the requirements of some engineering and industrial problems in terms of the types of operands and the functions to approximate. We then propose applying CORDIC-based shift-and-add algorithms to these problems, and we compare the different methods in terms of the precision of the results and the number of iterations required.
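A minimal rotation-mode CORDIC sketch for sine and cosine illustrates the shift-and-add idea: each iteration uses only additions, subtractions, and multiplications by 2^-i (bit shifts in fixed-point hardware), plus a small table of arctangents.

```python
import math


def cordic_sin_cos(theta, n=32):
    """Rotation-mode CORDIC for theta in [-pi/2, pi/2]: returns
    (cos(theta), sin(theta)). Each iteration only adds/subtracts and
    scales by 2**-i, which is a bit shift in fixed-point hardware."""
    angles = [math.atan(2.0 ** -i) for i in range(n)]        # small LUT
    k = 1.0
    for i in range(n):                                       # CORDIC gain
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = k, 0.0, theta                                  # pre-scaled
    for i, a in enumerate(angles):
        d = 1.0 if z >= 0.0 else -1.0                        # rotation direction
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return x, y


print(cordic_sin_cos(math.pi / 6))  # ~ (0.8660, 0.5000)
```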