65 results for design or documentation process


Relevance: 40.00%

Abstract:

Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18-month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but has also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.

Relevance: 40.00%

Abstract:

To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, it is necessary for companies to equip themselves with intelligent tools, thereby enabling managerial levels to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, to make the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.

Relevance: 40.00%

Abstract:

In designing new products the ability to retrieve drawings of existing components is important if costs are to be controlled by preventing unnecessary duplication of parts. Component coding and classification systems have been used successfully for these purposes but suffer from high operational costs and poor usability arising directly from the manual nature of the coding process itself. A new version of an existing coding system (CAMAC) has been developed to reduce costs by automatically coding engineering drawings. Usability is improved by supporting searches based on a drawing or sketch of the desired component. Test results from a database of several thousand drawings are presented.
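To make the retrieval idea concrete, here is a minimal sketch with entirely hypothetical component codes (CAMAC's actual coding scheme is not reproduced in this abstract): each drawing is reduced to a fixed-length code, and a query sketch is matched against the database by nearest code.

```python
# A minimal sketch of code-based retrieval; the codes below are hypothetical,
# not CAMAC's real coding scheme.
def hamming_distance(code_a, code_b):
    """Count positions at which two fixed-length component codes differ."""
    return sum(a != b for a, b in zip(code_a, code_b))

database = {
    "bracket_07": "A3F2C1",   # codes assumed to be generated automatically
    "flange_12":  "A3F2D4",
    "shaft_03":   "B1E0C1",
}

query_code = "A3F2C4"  # code derived from the user's sketch of the component
best = min(database, key=lambda name: hamming_distance(database[name], query_code))
print(best)  # nearest existing component drawing
```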

Relevance: 40.00%

Abstract:

This paper presents a greedy Bayesian experimental design criterion for heteroscedastic Gaussian process models. The criterion is based on the Fisher information and is optimal in the sense of minimizing parameter uncertainty for likelihood-based estimators. We demonstrate the validity of the criterion under different noise regimes and present experimental results from a rabies simulator to demonstrate the effectiveness of the resulting approximately optimal designs.
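For orientation, the standard Fisher-information form of such a greedy criterion is sketched below; this generic D-optimal-style construction is assumed here for illustration and is not necessarily the authors' exact formulation:

```latex
\mathcal{I}(\theta) = -\,\mathbb{E}\!\left[\frac{\partial^{2}\log L(\theta)}{\partial\theta\,\partial\theta^{\mathsf{T}}}\right],
\qquad
x_{n+1} = \arg\max_{x}\ \log\det \mathcal{I}\big(\theta;\,x_{1},\dots,x_{n},x\big)
```

Each greedy step adds the design point that most increases the (log-determinant of the) information about the parameters, and hence most reduces their posterior uncertainty for likelihood-based estimation.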

Relevance: 40.00%

Abstract:

As levels of investment in advanced manufacturing systems increase, effective project management becomes ever more critical. This paper demonstrates how the model proposed by Mintzberg, Raisinghani and Theoret in 1976, which structures complicated strategic decision processes, can be applied to the design of new production systems for both descriptive and analytical research purposes. The paper places a detailed case study of the design and development of an advanced manufacturing system within the Mintzberg decision model, breaking the decision sequence down into its constituent parts. It thus shows how a structured model can provide a framework for the researcher who wishes to study decision episodes in the design of manufacturing facilities in greater depth.

Relevance: 40.00%

Abstract:

A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument; it allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed and mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.

Relevance: 40.00%

Abstract:

National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less-demanding requirements do not call for exceptional computing power and can be met by a modern desk-top system which monitors site-specific ground conditions (such as temperature, pressure, and wind speed and direction) augmented with above-ground information from satellite images to produce 'nowcasts'. The emphasis in this thesis is on the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809-based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produces a time series of measurements at a specific location which represents the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an 'area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique for its implementation in automatically classifying a cloud feature over the 'area of interest' for nowcasting using the multi-dimensional signals.
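As a hedged illustration of the ARIMA modelling of ground data described above, the following minimal sketch uses the modern statsmodels library with hypothetical hourly temperatures; the thesis's own implementation ran on custom 6809/PC hardware, and the (1, 1, 1) model order is a placeholder rather than the order identified in the thesis.

```python
# Illustrative sketch only: statsmodels stands in for the thesis's
# ARIMA-based ground-data modelling. Series values are hypothetical.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Past-to-present site temperatures sampled hourly (hypothetical data).
temps = pd.Series(
    [11.2, 11.0, 10.7, 10.9, 11.4, 12.1, 13.0, 13.6, 14.1, 14.3, 14.0, 13.5],
    index=pd.date_range("2024-06-01 00:00", periods=12, freq="h"),
)

# Fit an ARIMA(p, d, q) model; the (1, 1, 1) order is a placeholder.
fitted = ARIMA(temps, order=(1, 1, 1)).fit()

# Nowcast: extrapolate the local series a few hours ahead.
print(fitted.forecast(steps=3))
```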

Relevance: 40.00%

Abstract:

In this paper we present a novel method for emulating a stochastic, or random-output, computer model and show its application to a complex rabies model. The method is evaluated in terms of both accuracy and computational efficiency on synthetic data and on the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian-process-based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and a better understanding of the complex behaviour of the rabies model.
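For reference, the standard form of the Mahalanobis error measure named above is given below, assuming held-out simulator outputs y and emulator predictive mean μ and covariance Σ; the paper's exact validation procedure may differ in detail:

```latex
D_{M}^{2} = (\mathbf{y}-\boldsymbol{\mu})^{\mathsf{T}}\,\Sigma^{-1}\,(\mathbf{y}-\boldsymbol{\mu})
```

For a well-calibrated emulator this statistic should be comparable to its theoretical expectation, which equals the number of validation points; values far above it suggest overconfident predictive (co)variances.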

Relevance: 40.00%

Abstract:

Gas absorption, the removal of one or more constituents from a gas mixture, is widely used in chemical processes. In many gas absorption processes the gas mixture is already at high pressure, and in recent years organic solvents have been developed for the process of physical absorption at high pressure followed by low-pressure regeneration of the solvent and recovery of the absorbed gases. Until now the discovery of new solvents has usually been by expensive and time-consuming trial-and-error laboratory tests. This work describes a new approach, whereby a solvent is selected from considerations of its molecular structure by applying recently published methods of predicting gas solubility from the molecular groups which make up the solvent molecule. The removal of the acid gases carbon dioxide and hydrogen sulfide from methane or hydrogen was used as a commercially important example. After a preliminary assessment to identify promising molecular groups, more than eighty new solvent molecules were designed and evaluated by predicting gas solubility. The other important physical properties were also predicted by appropriate theoretical procedures, and a commercially promising new solvent was chosen to have a high solubility for acid gases, a low solubility for methane and hydrogen, a low vapour pressure, and a low viscosity. The solvent chosen, of molecular structure CH3-CO-CH2-CH2-CO-CH3, was tested in the laboratory and shown to have physical properties, except for vapour pressure, close to those predicted: gas solubilities were within 10% of prediction but lower, viscosity was within 10% but higher, and the vapour pressure was significantly lower than predicted. A computer program was written to predict gas solubility in the new solvent at the high pressures (25 bar) used in practice, based on the group contribution method of Skold-Jorgensen (1984). Before using this with the new solvent, acetonyl acetone, the method was shown to be sufficiently accurate by comparing predicted values of gas solubility with experimental solubilities from the literature for 14 systems at up to 50 bar. A test of the commercial potential of the new solvent was made by means of two design studies which compared the size of plant and the approximate relative costs of absorbing acid gases with the new solvent against two commonly used solvents: refrigerated methanol (Rectisol process) and the dimethyl ether of polyethylene glycol (Selexol process). Both studies showed, in terms of capital and operating cost, a significant advantage for plant designed for the new solvent process.
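A minimal sketch of the group-contribution idea described above follows; the group values are hypothetical placeholders, not data from the thesis or from Skold-Jorgensen (1984).

```python
# Group-contribution sketch: solvent properties are estimated by summing
# contributions of the molecular groups making up the solvent molecule.
# All contribution values below are hypothetical placeholders.
group_contribution = {"CH3": -0.10, "CH2": -0.05, "CO": 0.40}

def estimate_log_solubility(groups):
    """Estimate a log-solubility index as the sum of group contributions."""
    return sum(group_contribution[g] * n for g, n in groups.items())

# Acetonyl acetone, CH3-CO-CH2-CH2-CO-CH3: two CH3, two CH2, two CO groups.
acetonyl_acetone = {"CH3": 2, "CH2": 2, "CO": 2}
print(estimate_log_solubility(acetonyl_acetone))
```

Screening the eighty-odd candidate molecules then amounts to evaluating such sums (one per gas of interest) for each candidate and ranking them against the selection criteria listed above.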

Relevance: 40.00%

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, others more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
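A minimal sketch of how a Hierarchical Task Analysis could record operator information requirements is given below; the tree representation and field names are illustrative assumptions, not the thesis's notation.

```python
# Sketch of an HTA-style task hierarchy annotated with the information an
# operator needs at each step; walking the tree yields display requirements.
from dataclasses import dataclass, field

@dataclass
class Task:
    goal: str
    info_required: list[str] = field(default_factory=list)
    subtasks: list["Task"] = field(default_factory=list)

def collect_info_requirements(task):
    """Walk the task hierarchy and gather all operator information needs."""
    needs = list(task.info_required)
    for sub in task.subtasks:
        needs.extend(collect_info_requirements(sub))
    return needs

# Hypothetical fragment of a process-control HTA.
startup = Task(
    goal="Start up reactor",
    subtasks=[
        Task("Establish coolant flow", info_required=["coolant flow rate"]),
        Task("Raise temperature",
             info_required=["vessel temperature", "heater status"]),
    ],
)
print(collect_info_requirements(startup))
```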

Relevance: 40.00%

Abstract:

The concept of an Expert System (ES) has been acknowledged as a very useful tool, but few studies have been carried out on its application to the design of cold rolled sections. This study primarily involves the use of an ES as a tool to improve the design process and to capture the draughtsman's knowledge. Its main purpose is to reduce substantially the time taken to produce a section drawing, thereby facilitating speedy feedback to the customer. In order to communicate with a draughtsman, it is necessary to use sketches, symbolic representations and numerical data. This increases the complexity of programming an ES, as it is necessary to use a combination of languages so that decisions, calculations, graphical drawings and control of the system can be effected. A production system approach is used, and a further step has been taken by introducing an Activator, an auto-execute operation set up by the ES to run an external program automatically. To speed up the absorption of new knowledge into the knowledge base, a new Learning System has been constructed. In addition to developing the ES, other software has been written to assist the design process. The section properties software has been introduced to improve the speed and consistency of calculating section properties. A method of selecting or comparing the most appropriate section for a given specification is also implemented. Simple loading facilities have been introduced to guide the designer as to the loading capacity of the section. This research concludes that the application of an ES is beneficial and that, with the activator approach, automated design can be achieved. On average a complex drawing can be displayed on the screen in about 100 seconds, and over 95% of the initial section design time for a repetitive or similar profile can be saved.
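The Activator idea lends itself to a short sketch; the rule, working-memory keys and external program name below are hypothetical, since the abstract does not specify them.

```python
# Sketch of a production-system-with-Activator: when a rule's condition
# holds over working memory, the Activator runs an external program (here a
# hypothetical section-properties calculator) without user intervention.
import subprocess

def activator(command):
    """Auto-execute an external program on behalf of the expert system."""
    subprocess.run(command, check=True)

# Each production rule: (condition over working memory, action).
rules = [
    (lambda wm: wm.get("profile_complete"),
     lambda wm: activator(["section_props", wm["drawing_file"]])),  # hypothetical tool
]

working_memory = {"profile_complete": True, "drawing_file": "section_42.dwg"}

for condition, action in rules:
    if condition(working_memory):
        action(working_memory)
```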

Relevance: 40.00%

Abstract:

Case studies in copper-alloy rolling mill companies showed that existing planning systems suffer from numerous shortcomings. Where computerised systems are in use, these tend simply to emulate older manual systems and still rely heavily on modification by experienced planners on the shopfloor. As the size and number of orders increase, the task of process planners, who must optimise the manufacturing objectives while keeping within the production constraints, becomes extremely complicated because of the number of options for mixing or splitting the orders into batches. This thesis develops a modular approach to the computerisation of the production management and planning functions. The full functional specification of each module is discussed, together with the practical problems associated with their phased implementation. By adapting the Distributed Bill of Material concept from Material Requirements Planning (MRP) philosophy, the system breaks down the production routes generated by the planning function to identify the rolling stages required. Then, to optimise the use of material at each rolling stage, the system generates an optimal cutting pattern using a new algorithm that produces practical solutions to the cutting stock problem. It is shown that the proposed system can be accommodated on a micro-computer, which brings it within the reach of typical companies in the copper-alloy rolling industry, where profit margins are traditionally low and the cost of widespread use of mainframe computers would be prohibitive.
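The abstract does not specify the new cutting algorithm, so as a hedged stand-in the classic first-fit-decreasing heuristic below illustrates generating practical cutting patterns for the one-dimensional cutting stock problem; the widths are hypothetical.

```python
# First-fit-decreasing heuristic for the 1-D cutting stock problem: a
# standard stand-in for the thesis's (unspecified) pattern algorithm.
def first_fit_decreasing(order_widths, stock_width):
    """Assign ordered widths to stock coils, opening a new coil when needed."""
    patterns = []  # each pattern: list of widths cut from one coil
    for width in sorted(order_widths, reverse=True):
        for pattern in patterns:
            if sum(pattern) + width <= stock_width:
                pattern.append(width)
                break
        else:
            patterns.append([width])
    return patterns

# Hypothetical example: order widths in mm, cut from 1250 mm wide coil.
print(first_fit_decreasing([600, 450, 400, 350, 300, 250], 1250))
# -> [[600, 450], [400, 350, 300], [250]]
```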

Relevance: 40.00%

Abstract:

Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine-tools or machining centres under conditions of limited manpower or unmanned operation. This research investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control; two main research objectives are fulfilled. The first objective is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters; the definition of the problems associated with the low strength of these tools; and the study of the mechanisms of catastrophic failure which manifest themselves well before, and along with, the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and finally the issuing of alarms and diagnostic messages.
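A minimal sketch of the monitor-and-adapt loop described above follows, with hypothetical limits and sensor callbacks; the thesis's actual thresholds and machine-tool interfaces are not given in the abstract.

```python
# One adaptive control cycle: thrust and torque are sampled in real time and
# the feed rate is cut back before the drill is endangered. All limits and
# sensor callbacks are hypothetical.
THRUST_LIMIT_N = 180.0    # assumed safe thrust for a small twist drill
TORQUE_LIMIT_NCM = 25.0   # assumed safe torque

def adaptive_feed_step(read_thrust, read_torque, feed_rate):
    """One control cycle: return the adjusted feed rate in mm/rev."""
    if read_thrust() > THRUST_LIMIT_N or read_torque() > TORQUE_LIMIT_NCM:
        return max(feed_rate * 0.5, 0.005)  # back off towards a safe floor
    return min(feed_rate * 1.05, 0.05)      # cautiously restore the feed

# Example with stub sensors standing in for the machine-tool interface.
print(adaptive_feed_step(lambda: 200.0, lambda: 12.0, feed_rate=0.03))
```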

Relevance: 40.00%

Abstract:

This work attempts to create a systemic design framework for man-machine interfaces which is self-consistent, compatible with other concepts, and applicable to real situations. This is tackled by examining the current architecture of computer applications packages. The treatment in the main is philosophical and theoretical, and analyses the origins, assumptions and current practice of the design of applications packages. It proposes that the present form of packages is fundamentally contradictory to the notion of packaging itself, because, as an indivisible ready-to-implement solution, current package architecture displays the following major disadvantages. First, it creates problems as a result of user-package interactions, in which the designer tries to mould all potential individual users, no matter how diverse they are, into one model; this is worsened by the minute provision, if any, of important properties such as flexibility, independence and impartiality. Second, it displays a rigid structure that reduces the variety and/or multi-use of the component parts of such a package. Third, it dictates specific hardware and software configurations, which probably reduces the number of degrees of freedom of its user. Fourth, it increases the dependence of its user upon its supplier through inadequate documentation and understanding of the package. Fifth, it tends to cause a degeneration of the expertise of design among data processing practitioners. In view of this understanding, an alternative methodological design framework is proposed which is consistent both with the systems approach and with the role of a package in its likely context. The proposition is based upon an extension of the identified concept of the hierarchy of holons, which facilitates the examination of the complex relationships of a package with its two principal environments: first, the user's characteristics and decision-making practices and procedures, implying an examination of the user's M.I.S. network; second, the software environment and its influence upon a package regarding support, control and operation of the package. The framework is built gradually as the discussion advances around the central theme of a compatible M.I.S., software and model design. This leads to the formation of an alternative package architecture based upon the design of a number of independent, self-contained small parts. This is believed to constitute the nucleus around which not only packages can be more effectively designed, but which is also applicable to the design of many man-machine systems.