41 results for Process control applications

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, others more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspect of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
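For readers unfamiliar with Hierarchical Task Analysis, a minimal sketch of the underlying structure may help: a goal is decomposed into subtasks governed by a plan, and operator information needs can be attached to, and collected from, each level. The task names and fields below are purely illustrative and not taken from the thesis.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """One node of a Hierarchical Task Analysis: a goal, a plan, and its subtasks."""
    goal: str
    plan: str = ""                                          # how/when subtasks are carried out
    information: List[str] = field(default_factory=list)    # operator information needs
    subtasks: List["Task"] = field(default_factory=list)

def information_requirements(task: Task) -> List[str]:
    """Collect operator information requirements from every level of the hierarchy."""
    needs = list(task.information)
    for sub in task.subtasks:
        needs.extend(information_requirements(sub))
    return needs

# Hypothetical fragment of an HTA for starting up a process unit.
startup = Task(
    goal="Start up reactor",
    plan="Do 1, then 2; repeat 2 until the temperature is stable",
    subtasks=[
        Task(goal="1. Establish coolant flow", information=["coolant flow rate"]),
        Task(goal="2. Raise temperature to setpoint",
             information=["reactor temperature", "temperature setpoint", "heater status"]),
    ],
)
print(information_requirements(startup))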

Relevance: 100.00%

Abstract:

A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analysed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
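As an illustration of the data-driven run-time idea described above - sketched in Python rather than CORAL 66, with invented block and signal names - each functional block reads its inputs from a shared data area and writes its outputs back, so the configuration can change without altering the executor:

from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Block:
    """A functional block: named inputs are read from the shared database, outputs written back."""
    name: str
    inputs: List[str]
    outputs: List[str]
    func: Callable[..., Tuple[float, ...]]

def scan(blocks: List[Block], db: Dict[str, float]) -> None:
    """One run-time scan: execute each block in its configured order, driven only by the data."""
    for b in blocks:
        results = b.func(*(db[name] for name in b.inputs))
        for key, value in zip(b.outputs, results):
            db[key] = value

# Hypothetical configuration: an error block feeding a proportional-control block.
blocks = [
    Block("error",  ["setpoint", "measurement"], ["e"],     lambda sp, pv: (sp - pv,)),
    Block("p_ctrl", ["e"],                       ["valve"],  lambda e: (2.0 * e,)),  # gain 2.0
]
db = {"setpoint": 50.0, "measurement": 47.5}
scan(blocks, db)
print(db["valve"])  # 5.0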

Relevance: 100.00%

Abstract:

A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process system community have approached the problems of modelling and analysis of such systems, there is still a lack of standardised software development formalisms for the system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at their early stages of development, with a particular bias towards an application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in the SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real world systems. Thus a standardised approach to the development of real-time process control systems is required such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
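For illustration only (not the thesis's formal definition), the basic Petri net firing rule against which SFC semantics are usually compared can be stated compactly: a transition is enabled when every one of its input places holds a token, and firing moves tokens from input places to output places. The step names below are hypothetical.

from typing import Dict, List, Tuple

# A transition is a pair (input places, output places); a marking maps each place to a token count.
Transition = Tuple[List[str], List[str]]

def enabled(t: Transition, marking: Dict[str, int]) -> bool:
    """A transition is enabled when all of its input places carry at least one token."""
    return all(marking.get(p, 0) >= 1 for p in t[0])

def fire(t: Transition, marking: Dict[str, int]) -> Dict[str, int]:
    """Fire an enabled transition: consume one token per input place, produce one per output place."""
    assert enabled(t, marking), "transition is not enabled"
    m = dict(marking)
    for p in t[0]:
        m[p] -= 1
    for p in t[1]:
        m[p] = m.get(p, 0) + 1
    return m

# Hypothetical fragment: the step 'filling' ends and 'heating' begins when transition t1 fires.
t1: Transition = (["filling"], ["heating"])
print(fire(t1, {"filling": 1, "heating": 0}))  # {'filling': 0, 'heating': 1}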

Relevance: 100.00%

Abstract:

The work presented in this thesis describes an investigation into the production and properties of thin amorphous C films, with and without Cr doping, as a low-wear / low-friction coating applicable to MEMS and other micro- and nano-engineering applications. Firstly, an assessment was made of the available testing techniques. Secondly, the optimised test methods were applied to a series of sputtered films of thickness 10 - 2000 nm in order to: (i) investigate the effect of thickness on the properties of the coatings/coating process, (ii) investigate fundamental tribology at the nano-scale and (iii) provide a starting point for nanotribological coating optimisation at ultra-low thickness. The use of XPS was investigated for the determination of sp3/sp2 carbon bonding. Under C 1s peak analysis, significant errors were identified and this was attributed to the absence of sufficient instrument resolution to guide the component peak structure (even with a high resolution instrument). A simple peak width analysis and correlation work with the C KLL D value confirmed the errors. The use of XPS for sp3/sp2 determination was therefore limited to initial tentative estimations. Nanoindentation was shown to provide consistent hardness and reduced modulus results with depth (to < 7 nm) when replicate data were suitably statistically processed. No significant pile-up or cracking of the films was identified under nanoindentation. Nanowear experimentation by multiple nanoscratching provided some useful information; however, the test conditions were very different to those expected for MEMS and micro- / nano-engineering systems. A novel 'sample oscillated nanoindentation' system was developed for testing nanowear under more relevant conditions. The films were produced in an industrial production coating line. In order to maximise the available information and to take account of uncontrolled process variation, a statistical design of experiment procedure was used to investigate the effect of four key process control parameters. Cr doping was the most significant control parameter at all thicknesses tested and produced a softening effect and thus increased nanowear. Substrate bias voltage was also a significant parameter and produced hardening and a wear-reducing effect at all thicknesses tested. The use of a Cr adhesion layer produced beneficial results at 150 nm thickness, but was ineffective at 50 nm. Argon flow to the coating chamber produced a complex effect. All effects reduced significantly with reducing film thickness. Classic fretting wear was produced at low amplitude under nanowear testing. Reciprocating sliding was produced at higher amplitude, which generated three-body abrasive wear, and this was generally consistent with the Archard model. Specific wear rates were very low (typically 10⁻¹⁶ - 10⁻¹⁸ m³N⁻¹m⁻¹). Wear rates reduced exponentially with reduced film thickness and below (approx.) 20 nm, thickness was identified as the most important control of wear.
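The Archard-type specific wear rate quoted above relates worn volume V to normal load F and sliding distance s through V = k·F·s, which is why k carries units of m³N⁻¹m⁻¹. The numbers in the short sketch below are illustrative only, not data from the thesis.

def specific_wear_rate(worn_volume_m3: float, load_N: float, sliding_distance_m: float) -> float:
    """Archard-type specific wear rate k = V / (F * s), in m^3 N^-1 m^-1."""
    return worn_volume_m3 / (load_N * sliding_distance_m)

# Illustrative values: a 1e-21 m^3 wear scar (a fraction of a cubic micrometre)
# after 1 mN of normal load over 0.1 m of total sliding.
k = specific_wear_rate(worn_volume_m3=1e-21, load_N=1e-3, sliding_distance_m=0.1)
print(f"k = {k:.1e} m^3 N^-1 m^-1")  # 1.0e-17, within the range reported above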

Relevance: 90.00%

Abstract:

We consider the direct adaptive inverse control of nonlinear multivariable systems with different delays between every input-output pair. In direct adaptive inverse control, the inverse mapping is learned from examples of input-output pairs. This makes the obtained controller suboptimal, since the network may have to learn the response of the plant over a larger operational range than necessary. Moreover, in certain applications, the control problem can be redundant, implying that the inverse problem is ill-posed. In this paper we propose a new algorithm which allows uncertainty in nonlinear multivariable control systems to be estimated and exploited. This approach allows us to model strongly non-Gaussian distributions of control signals as well as processes with hysteresis. The proposed algorithm circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider.
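A minimal sketch of the general idea - not the authors' algorithm - assuming the trained network can return a predictive mean and variance for each candidate control action: candidates whose predicted plant output is close to the target are preferred, and candidates the model is uncertain about are penalised, which localises the control solutions considered.

import numpy as np

def select_control(candidates, predict_mean, predict_var, target, risk_weight=1.0):
    """Choose the candidate control whose predicted output best matches the target,
    penalised by the model's predictive variance at that candidate."""
    costs = [
        float(np.sum((predict_mean(u) - target) ** 2) + risk_weight * np.sum(predict_var(u)))
        for u in candidates
    ]
    return candidates[int(np.argmin(costs))]

# Hypothetical scalar plant: y = 0.5 * u, with uncertainty growing away from the trained region u ~ 1.
predict_mean = lambda u: 0.5 * u
predict_var = lambda u: 0.01 + 0.1 * (u - 1.0) ** 2
candidates = np.linspace(0.0, 2.0, 21)
print(select_control(candidates, predict_mean, predict_var, target=0.6))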

Relevance: 90.00%

Abstract:

With the increasing use of digital computers for data acquisition and digital process control, frequency domain transducers have become very attractive due to their virtually digital output. Essentially they are electrically maintained oscillators where the sensor is the controlling resonator. They are designed to make the frequency a function of the physical parameter being measured. Because of their high quality factor, mechanical resonators give very good frequency stability and are widely used as sensors. For this work symmetrical mechanical resonators such as the tuning fork were considered to be the most promising. These are dynamically clamped and can be designed to have extensive regions where no vibrations occur. This enables the resonators to be robustly mounted in a way convenient for various applications. Designs for the measurement of fluid density and tension have been produced. The principle of the design of the resonator for fluid density measurement is a thin gap (trapping a lamina of fluid) between its two members, which vibrate in antiphase. An analysis of the interaction between this resonator and the fluid lamina has been carried out. In gases, narrow gaps are needed for good sensitivity, and fused quartz, because of its low density and very low temperature coefficient, is ideally suited. In liquids an adequate sensitivity is achieved even with a wide lamina gap. Practical designs of such transducers have been evolved. The accuracy for liquid measurements is better than 1%. For gases it was found that, in air, a change of atmospheric pressure of 0.3% could be detected. In constructing a tension transducer using such a mechanical sensor as a wire or a beam, major difficulties are encountered in making an efficient clamping arrangement for the sensor. The use of dynamically clamped beams has been found to overcome the problem and this is the basis of the transducer investigated.
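As an illustrative calculation only (the figures and the simple added-mass model below are assumptions, not the thesis's design equations): for a resonator of effective stiffness k and effective mass m, fluid coupled to the vibrating members adds mass in proportion to its density, lowering the resonant frequency, so density can be inferred from the measured frequency.

import math

def resonant_frequency(stiffness_N_per_m: float, effective_mass_kg: float) -> float:
    """Natural frequency of a simple resonator, f = sqrt(k / m) / (2 * pi)."""
    return math.sqrt(stiffness_N_per_m / effective_mass_kg) / (2.0 * math.pi)

# Illustrative numbers: stiffness 1e4 N/m, vacuum effective mass 1e-4 kg,
# and an assumed coupled fluid volume of 1e-8 m^3 trapped in the lamina gap.
k, m, coupled_volume = 1e4, 1e-4, 1e-8
for fluid, density in [("vacuum", 0.0), ("air", 1.2), ("water", 1000.0)]:
    f = resonant_frequency(k, m + density * coupled_volume)
    print(f"{fluid:6s}: {f:7.1f} Hz")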

Relevance: 90.00%

Abstract:

This thesis addresses the viability of automatic speech recognition for control room systems; with careful system design, automatic speech recognition (ASR) devices can be a useful means of human computer interaction in specific types of task. These tasks can be defined as complex verbal activities, such as command and control, and can be paired with spatial tasks, such as monitoring, without detriment. It is suggested that ASR use be confined to routine plant operation, as opposed to critical incidents, due to possible problems of stress on the operators' speech. It is proposed that using ASR will require operators to adapt a commonly used skill to cater for a novel use of speech. Before using the ASR device, new operators will require some form of training. It is shown that a demonstration by an experienced user of the device can lead to superior performance compared with written instructions. Thus, a relatively cheap and very efficient form of operator training can be supplied by demonstration by experienced ASR operators. From a series of studies into speech based interaction with computers, it is concluded that the interaction should be designed to capitalise upon the tendency of operators to use short, succinct, task specific styles of speech. From studies comparing different types of feedback, it is concluded that operators should be given screen based feedback, rather than auditory feedback, for control room operation. Feedback will take two forms: the use of the ASR device will require recognition feedback, which will be best supplied using text; the performance of a process control task will require task feedback integrated into the mimic display. This latter feedback can be either textual or symbolic, but it is suggested that symbolic feedback will be more beneficial. Related to both interaction style and feedback is the issue of handling recognition errors. These should be corrected by simple command repetition practices, rather than by error handling dialogues. This method of error correction is held to be non-intrusive to primary command and control operations. This thesis also addresses some of the problems of user error in ASR use, and provides a number of recommendations for its reduction.

Relevance: 90.00%

Abstract:

A refractive index sensing system has been demonstrated, which is based upon an in-line fibre long period grating (LPG) Mach-Zehnder interferometer with a heterodyne interrogation technique. This sensing system has comparable accuracy to laboratory-based techniques used in industry such as high performance liquid chromatography and UV spectroscopy. The advantage of this system is that measurements can be made in-situ for applications in continuous process control. Compared to other refractive index sensing schemes using LPGs, this approach has two main advantages. Firstly, the system relies on a simple optical interrogation system and therefore has real potential for being low cost, and secondly, so far as we are aware, it provides the highest refractive index resolution reported for any fibre LPG device.
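As background (an approximation commonly used for cascaded-LPG interferometers, not a result from the paper): the channelled transmission spectrum of an in-line LPG Mach-Zehnder has a fringe spacing of roughly λ²/(Δn·L), where Δn is the core/cladding effective index difference and L the grating separation, and it is the movement of these fringes with surrounding refractive index that is interrogated. The values below are invented.

def fringe_spacing_nm(wavelength_nm: float, delta_n_eff: float, separation_m: float) -> float:
    """Approximate fringe spacing of a cascaded-LPG Mach-Zehnder interferometer:
    delta_lambda ~ lambda^2 / (delta_n_eff * L)."""
    wavelength_m = wavelength_nm * 1e-9
    return (wavelength_m ** 2) / (delta_n_eff * separation_m) * 1e9

# Illustrative values: operation near 1550 nm, core/cladding effective index difference 5e-3,
# and a 50 mm separation between the two gratings.
print(f"{fringe_spacing_nm(1550.0, 5e-3, 0.05):.1f} nm")  # ~9.6 nm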

Relevance: 90.00%

Abstract:

In today's manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium size enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variations in manufacturing processes are inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper is focused on the re-engineering of the manufacturing and verification procedure for discrete parts production with the aim of enhancing process control and product verification. The philosophies of the 'Push' and 'Pull' approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies always try to make the right product by following customer procedures that attempt to verify against specifications. This approach can result in significant quality control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for the successful deployment of process re-engineering. This paper presents the findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process. The findings from the case study indicate there are several advantages to implementing the re-engineering method outlined in this paper.

Relevance: 90.00%

Abstract:

It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately, and with what degree of flexibility, parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling their product and process verification systems. The aim of this research is to develop a system for capturing the company's knowledge and expertise and then reflect it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict the trend in the manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability, and reducing this variation via an automated machine probing system helps to avoid defects in future products. Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their material composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product. As in every process, material variations have a significant impact on machining quality. The main causes of variation originate from chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on application. In order to take corrective actions, a study on the material aspects of superalloys has been conducted. In this study, samples from different batches of material have been analysed. This involved material preparation for microscopy analysis, and an examination of the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
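As a brief illustration of the SPC element mentioned above, an X-bar and R control chart computed from in-line probe measurements keeps the manufacturing trend visible. The constants used below are the standard values for subgroups of five, and the measurement values are invented.

import statistics

def xbar_r_limits(subgroups, A2=0.577, D3=0.0, D4=2.114):
    """X-bar and R control-chart limits; the constants shown are the standard values for n = 5."""
    xbars = [statistics.mean(group) for group in subgroups]
    ranges = [max(group) - min(group) for group in subgroups]
    xbarbar, rbar = statistics.mean(xbars), statistics.mean(ranges)
    return {
        "xbar_chart": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),  # LCL, centre, UCL
        "range_chart": (D3 * rbar, rbar, D4 * rbar),
    }

# Invented in-line probe measurements (mm), five parts per subgroup.
subgroups = [
    [10.01, 10.02, 9.99, 10.00, 10.03],
    [10.00, 9.98, 10.01, 10.02, 10.00],
    [10.02, 10.01, 10.00, 9.99, 10.01],
]
print(xbar_r_limits(subgroups))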

Relevance: 80.00%

Abstract:

The internationally accepted Wolfson Heat Treatment Centre Engineering Group test was used to evaluate the cooling characteristics of the most popular commercial polymer quenchants: polyalkylene glycols, polyvinylpyrrolidones and polyacrylates. Prototype solutions containing poly(ethyloxazoline) were also examined. Each class of polymer was capable of providing a wide range of cooling rates depending on the product formulation, concentration, temperature, agitation, ageing and contamination. Cooling rates for synthetic quenchants were generally intermediate between those of water and oil. Control techniques, drag-out losses and response to quenching, in terms of hardness and residual stress for a plain carbon steel, were also considered. A laboratory scale method for providing a controllable level of forced convection was developed. Test reproducibility was improved by positioning the preheated Wolfson probe 25 mm above the geometric centre of a 25 mm diameter orifice through which the quenchant was pumped at a velocity of 0.5 m/s. On examination, all polymer quenchants were found to operate by the same fundamental mechanism, associated with their viscosity and ability to form an insulating polymer-rich film. The nature of this film, which formed at the vapour/liquid interface during boiling, was dependent on the polymer's solubility characteristics. High molecular weight polymers and high concentration solutions produced thicker, more stable insulating films. Agitation produced thinner, more uniform films. Higher molecular weight polymers were more susceptible to degradation, and increased cooling rates, with usage. Polyvinylpyrrolidones can become cross-linked, resulting in erratic performance, whilst the anionic character of polyacrylates can lead to control problems. Volatile contaminants tend to decrease the rate of cooling and salts to increase it. Drag-out increases upon raising the molecular weight of the polymer and its solution viscosity. Kinematic viscosity measurements are more effective than refractometer readings for concentration control, although a quench test is the most satisfactory process control method.
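For illustration (invented readings, not Wolfson test data): the quench test records probe temperature against time, and the cooling characteristic is commonly reduced to rates between successive readings, from which figures such as the maximum cooling rate are taken.

def cooling_rates(times_s, temps_C):
    """Approximate cooling rate in degrees C per second between successive probe readings."""
    return [
        (temps_C[i] - temps_C[i + 1]) / (times_s[i + 1] - times_s[i])
        for i in range(len(times_s) - 1)
    ]

# Invented probe readings for a quench into a polymer solution.
times = [0.0, 2.0, 4.0, 6.0, 8.0]
temps = [850.0, 700.0, 480.0, 320.0, 240.0]
print(f"maximum cooling rate: {max(cooling_rates(times, temps)):.0f} C/s")  # 110 C/s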

Relevance: 80.00%

Abstract:

This thesis is based upon a case study of the adoption of digital, electronic, microprocessor-based control systems by Albright & Wilson Limited - a UK chemical producer. It offers an explanation of the company's changing technology policy between 1978 and 1981, by examining its past development, internal features and industrial environment. Part One of the thesis gives an industry-level analysis which relates the development of process control technology to changes in the economic requirements of production. The rapid diffusion of microcomputers and other microelectronic equipment in the chemical industry is found to be a response to the general need to raise the efficiency of all processes, imposed by the economic recession following 1973. Part Two examines the impact of these technical and economic changes upon Albright & Wilson Limited. The company's slowness in adopting new control technology is explained by its long history, in which trends are identified which produced the conservatism of the 1970s. By contrast, a study of Tenneco Incorporated, a much more successful adopter of automating technology, is offered, with an analysis of the new technology policy of adoption of such equipment which it imposed upon Albright & Wilson following the latter's takeover by Tenneco in 1978. Some indications of the consequences of this new policy of widespread adoption of microprocessor-based control equipment are derived from a study of the first Albright & Wilson plant to use such equipment. The thesis concludes that companies which fail to adopt the new control technology rapidly may not survive in the recessionary environment, that long-established British companies may lack the flexibility to make such necessary changes, and that multi-national companies may have an important role in the planned transfer and adoption of new production technology through their subsidiaries in the UK.

Relevance: 80.00%

Abstract:

Apoptosis is an important cell death mechanism by which multicellular organisms remove unwanted cells. It culminates in a rapid, controlled removal of cell corpses by neighboring or recruited viable cells. Whilst many of the molecular mechanisms that mediate corpse clearance are components of the innate immune system, clearance of apoptotic cells is an anti-inflammatory process. Control of cell death is dependent on competing pro-apoptotic and anti-apoptotic signals. Evidence now suggests a similar balance of competing signals is central to the effective removal of cells, through so-called 'eat me' and 'don't eat me' signals. Competing signals are also important for the controlled recruitment of phagocytes to sites of cell death. Consequently, recruitment of phagocytes to and from sites of cell death can underlie the resolution, or inappropriate propagation, of cell death and inflammation. This article highlights our understanding of the mechanisms mediating clearance of dying cells and discusses those mechanisms controlling phagocyte migration and how inappropriate control may promote important pathologies. © the authors, publisher and licensee Libertas Academica Limited.

Relevance: 80.00%

Abstract:

Microfluidics has recently emerged as a new method of manufacturing liposomes, which allows for reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters in a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiment and multivariate data analysis were used for increased process understanding and development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, with a strong correlation with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidics process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential of high throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modelling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics.
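For illustration, and assuming FRR denotes the aqueous-to-solvent volumetric flow ratio (the usual convention, though not spelled out in the abstract), the two pump settings follow directly from the chosen TFR and FRR:

def pump_flow_rates(tfr_ml_min: float, frr: float):
    """Split a total flow rate (TFR) into aqueous and solvent streams for a given
    flow rate ratio FRR = aqueous : solvent (assumed convention)."""
    solvent = tfr_ml_min / (frr + 1.0)
    aqueous = tfr_ml_min - solvent
    return aqueous, solvent

# Illustrative settings: TFR of 12 mL/min at an FRR of 3:1.
aqueous, solvent = pump_flow_rates(12.0, 3.0)
print(f"aqueous {aqueous:.1f} mL/min, solvent {solvent:.1f} mL/min")  # 9.0 and 3.0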

Relevance: 80.00%

Abstract:

Commercial process simulators are attracting increasing interest in chemical engineering education. In this paper, the use of commercial dynamic simulation software, D-SPICE® and K-Spice®, for three different chemical engineering courses is described and discussed. The courses cover the following topics: basic chemical engineering, operability and safety analysis, and process control. User experiences from both teachers and students are presented. The benefits of dynamic simulation as an additional teaching tool are discussed and summarized. The experiences confirm that commercial dynamic simulators provide realistic training and can be successfully integrated into undergraduate and graduate teaching, laboratory courses and research. © 2012 The Institution of Chemical Engineers.