926 results for Statistical Process Control (SPC)


Relevance: 100.00%

Abstract:

A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analysed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high-level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
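To make the data-driven run-time idea concrete, below is a minimal sketch in Python rather than CORAL 66: a configuration of functional blocks executed against a shared signal table, so parameters and configuration can change at run time. The block and signal names are hypothetical; the thesis's actual data structures are not reproduced.

# Minimal sketch (Python, not CORAL 66) of a data-driven block run-time;
# all block and signal names are hypothetical.

class Block:
    """A functional block: reads named input signals, writes named outputs."""
    def __init__(self, name, func, inputs, outputs):
        self.name, self.func = name, func
        self.inputs, self.outputs = inputs, outputs

def run_cycle(blocks, signals):
    """One scan: each block reads the shared signal table and writes back,
    so the configuration can be changed at run time by editing the block
    list or the signal table."""
    for b in blocks:
        results = b.func(*[signals[i] for i in b.inputs])
        for o, v in zip(b.outputs, results):
            signals[o] = v
    return signals

# Example configuration: a gain block feeding a limiter.
blocks = [
    Block("gain", lambda x: (2.0 * x,), ["pv"], ["scaled"]),
    Block("limit", lambda x: (min(x, 10.0),), ["scaled"], ["out"]),
]
print(run_cycle(blocks, {"pv": 7.0}))  # {'pv': 7.0, 'scaled': 14.0, 'out': 10.0}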

Relevance: 100.00%

Abstract:

A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process systems community have approached the problems of modelling and analysing such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at their early stages of development, with a particular bias towards application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
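As a rough illustration of the kind of firing rule the thesis formalises, the sketch below encodes the usual SFC/Grafcet convention: a transition fires when all of its preceding steps are active and its condition (receptivity) is true, deactivating the preceding steps and activating the succeeding ones. The function and event names are hypothetical; this is not the thesis's formal definition.

# Minimal sketch of an SFC-style firing rule (illustrative reading of the
# convention, not the thesis's formal semantics).

def fire_enabled(active_steps, transitions, conditions):
    """active_steps: set of step names; transitions: list of
    (pre_steps, condition_name, post_steps); conditions: dict name -> bool."""
    for pre, cond, post in transitions:
        if set(pre) <= active_steps and conditions.get(cond, False):
            active_steps -= set(pre)   # deactivate preceding steps
            active_steps |= set(post)  # activate succeeding steps
    return active_steps

# Two steps synchronising on one transition (a Petri-net-like join).
transitions = [(("S1", "S2"), "part_present", ("S3",))]
print(fire_enabled({"S1", "S2"}, transitions, {"part_present": True}))  # {'S3'}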

Relevance: 100.00%

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s and have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications; others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular its use to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to it. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context, representing a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.

Relevance: 100.00%

Abstract:

It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately, and with what degree of flexibility, parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling their product and process verification systems. The aim of this research is to develop a system for capturing the company's knowledge and expertise and then reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict the trend in the manufacture of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability, and reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, they must in many cases be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have distinctive features such as low thermal conductivity, owing to the high nickel content in their composition. This causes high surface temperatures on the workpiece during machining, which lead to deformation in the final product.

As in every process, material variations have a significant impact on machining quality. The main sources of variation are chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on the application. In order to take corrective actions, a study on the material aspects of superalloys has been conducted, in which samples from different batches of material were analysed. This involved preparing material for microscopy analysis and examining the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
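As a concrete example of the SPC element described above, the following sketch computes X-bar and R control limits for subgrouped in-line measurements, using the standard tabulated constants for subgroups of five. The data are invented and this is not the company's actual system.

# Minimal sketch of X-bar/R control limits for in-line measurements of a
# machined feature (illustrative data; constants A2 = 0.577, D3 = 0,
# D4 = 2.114 are the standard values for subgroup size n = 5).

import statistics

def xbar_r_limits(subgroups, a2=0.577, d3=0.0, d4=2.114):
    xbars = [statistics.mean(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbar_bar, r_bar = statistics.mean(xbars), statistics.mean(ranges)
    return {
        "xbar": (xbar_bar - a2 * r_bar, xbar_bar, xbar_bar + a2 * r_bar),
        "range": (d3 * r_bar, r_bar, d4 * r_bar),
    }

# Five subgroups of five simulated diameter measurements (mm).
subgroups = [
    [25.01, 25.03, 24.99, 25.02, 25.00],
    [25.02, 25.00, 25.01, 24.98, 25.03],
    [24.99, 25.01, 25.02, 25.00, 25.01],
    [25.00, 25.02, 24.98, 25.01, 25.02],
    [25.01, 24.99, 25.00, 25.03, 25.00],
]
print(xbar_r_limits(subgroups))  # lower limit, centre line, upper limit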

Relevance: 100.00%

Abstract:

Modelling industrial systems offers organisations a strategic advantage in the study of their production processes. Through modelling it becomes possible to increase knowledge about these systems, enabling, where possible, improvements in production management and planning. This knowledge may also allow an increase in the efficiency of production processes, through the improvement or elimination of the main losses detected in the process. The main objective of this work is the development and validation of a modelling, forecasting and analysis tool for industrial production systems, with a view to increasing knowledge about them. For the execution and development of this work, several tools, concepts, methodologies and theoretical foundations known from the literature were used and developed, such as OEE (Overall Equipment Effectiveness), Petri nets, time series, k-means and SPC (Statistical Process Control). The modelling, forecasting and analysis tool developed and described in this work proved capable of assisting in the detection and interpretation of the causes that influence the results of the production system and give rise to losses, demonstrating the expected advantages. These results were based on real data from a production system.
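For reference, here is a minimal sketch of the standard OEE calculation used as one of the building blocks above (availability x performance x quality). The figures are illustrative and the thesis's own tool is not reproduced.

# Standard OEE figure from planned time, run time, ideal cycle time and counts.

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# 8 h planned, 7 h running, 30 s ideal cycle, 800 parts produced, 780 good.
print(round(oee(480 * 60, 420 * 60, 30, 800, 780), 3))  # about 0.81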

Relevance: 100.00%

Abstract:

Continuous-flow generation of α-diazosulfoxides results in a two- to three-fold increase in yields and decreased reaction times compared to standard batch synthesis methods. These high-yielding reactions are enabled by flowing through a bed of polystyrene-supported base (PS-DBU or PS-NMe2) with highly controlled residence times. This engineered solution allows the α-diazosulfoxides to be synthesized rapidly while limiting exposure of the products to the basic reaction conditions, which have been found to cause rapid decomposition. In addition to improved yields, this work offers the added advantages of easier processing, an improved safety profile, and scale-up potential.

Relevance: 100.00%

Abstract:

Metalorganic chemical vapor deposition is examined as a technique for growing compound semiconductor structures. Material analysis techniques for characterizing the quality and properties of compound semiconductor material are explained, and data from recent commissioning work on a newly installed reactor at the University of Illinois are presented.

Relevance: 100.00%

Abstract:

Purpose: In profile monitoring, a growing research area in the field of statistical process control, the relationship between response and explanatory variables is monitored over time. The purpose of this paper is to focus on the process capability analysis of linear profiles. Process capability indices give a quick indication of the capability of a manufacturing process. Design/methodology/approach: In this paper, the proportion of non-conformance is employed to estimate the process capability index. The paper considers cases where the specification limits are constant and cases where they are a function of the explanatory variable X. Moreover, both fixed and random design schemes for the explanatory variable in profile data acquisition are considered. Profiles with deterministic design points are usually used in calibration applications; in other applications, the design points within a profile are i.i.d. random variables from a given distribution. Findings: Simulation studies using simple linear profile processes, for both fixed and random explanatory variables with constant and functional specification limits, are used to assess the efficacy of the proposed method. Originality/value: In many industries, such as the semiconductor industry, quality characteristics take the form of profiles. Although quite a few methods for monitoring profiles have recently been presented, there is no method in the literature for analysing the process capability of these processes. The proposed methods provide a framework for quality engineers and production engineers to evaluate and analyse the capability of profile processes. © Emerald Group Publishing Limited.
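To illustrate the core quantity the paper works with, the sketch below estimates by simulation the proportion of non-conformance of a simple linear profile with functional specification limits, and converts it to an equivalent index using one common convention. The paper's own estimator and index definition are not reproduced; all parameters are illustrative.

# Proportion of non-conformance of a simple linear profile y = b0 + b1*x + e,
# estimated by Monte Carlo with functional specification limits.

import random
from statistics import NormalDist

def nonconformance_proportion(b0, b1, sigma, x_values, lsl, usl, n=20000):
    bad, total = 0, 0
    for _ in range(n):
        for x in x_values:
            y = b0 + b1 * x + random.gauss(0.0, sigma)
            total += 1
            if y < lsl(x) or y > usl(x):
                bad += 1
    return bad / total

# Functional specification limits around the nominal line 3 + 2x.
p = nonconformance_proportion(
    b0=3.0, b1=2.0, sigma=0.1,
    x_values=[2, 4, 6, 8],
    lsl=lambda x: 3.0 + 2.0 * x - 0.35,
    usl=lambda x: 3.0 + 2.0 * x + 0.35,
)
# One common convention: equivalent index = inv_Phi(1 - p/2) / 3.
print(p, NormalDist().inv_cdf(1 - p / 2) / 3)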

Relevance: 100.00%

Abstract:

As software permeates every sector and level of society, it is increasingly delivered as a service to organisations and even to the general public; software systems keep growing in scale, user requirements multiply, and functional and performance demands become ever more complex. Consequently, quality requirements such as usability, reliability and trustworthiness continue to rise. With the development of the software industry, software process technology has gradually been applied to software product development, and the idea that "quality is built into the product during its production process" has become widely accepted by software organisations. Its core idea is to assure product quality, and thereby improve an organisation's business performance, by planning, controlling and improving the software process. Software process measurement, as a key activity in process management and process improvement, is receiving more and more attention from software organisations.

Through process management, an organisation can characterise how well project or process goals are met, find the root causes of major deviations in processes or products, and then carry out process improvement. However, when implementing process measurement, software organisations face diverse development processes, numerous process-performance indicators and complex statistical analysis methods; they must weigh the soundness and complexity of quantitative management methods against their implementation cost, which makes effective process measurement challenging. Based on empirical software engineering methods, this thesis proposes a multi-granularity, multi-dimensional software process measurement framework, together with the key technique needed to realise it: a method for establishing and maintaining software process performance baselines. It also describes a quantitative schedule-control model for software projects under this framework, supporting effective process management and improvement in software organisations.

The main contributions are as follows. First, a Multi-granularity Multi-dimensional software Process Measurement Framework (M2-PMF) is proposed. By jointly considering the feature dimensions of the information needed for process management and improvement and the organisation's process-management granularity, the framework guides organisations, bottom-up through an entity layer, a measurement-and-analysis layer and a goal layer, in building an open, improvement-oriented system of indicators and models covering the whole software life cycle. It supports tailoring and customising the measurement system for a given environment, gives organisations a clear picture of their process capability and performance, improves their control over the software process, and safeguards the quality of both the development process and the product.

Second, a statistically based method for establishing and refining process performance baselines (Baseline - Statistic - Refinement, BSR) is proposed, which effectively builds and maintains baselines and helps organisations move from qualitative to quantitative management. While a process is still unstable and data samples are scarce, the method uses run charts to extract as much improvement information as possible, identify improvement opportunities, determine improvement paths and address the most obvious weaknesses efficiently. Once the process stabilises, statistical tools such as control charts, Pareto charts, cause-and-effect diagrams and scatter plots are used to analyse process performance and to establish and progressively refine process performance baselines.

Third, within the M2-PMF framework, a quantitative project schedule-control model, SEVM, is proposed based on Statistical Process Control (SPC) and Earned Value Management (EVM). It applies statistical control to project schedule indices to analyse their stability and, using an estimation model, extrapolates the total schedule deviation from the project's current earned-value data and brings it under control, thereby helping organisations control schedules quantitatively and raising the likelihood of on-time delivery.

Finally, the thesis reports the application of the proposed measurement framework and quantitative management methods in several software organisations in China. The case studies show that the methods and models are widely applicable and highly operable: applying them enables effective estimation, measurement and control of projects, thereby improving product quality and customer satisfaction.
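As a generic illustration of coupling EVM schedule indices with SPC, the sketch below places an individuals control chart on the SPI series. SEVM itself is not reproduced and the data are invented.

# Individuals chart on SPI_t = EV_t / PV_t, with limits SPI_bar +/- 2.66 * MR_bar
# (2.66 is the usual individuals/moving-range chart constant).

def spi_control_chart(earned_values, planned_values):
    spi = [ev / pv for ev, pv in zip(earned_values, planned_values)]
    mr = [abs(a - b) for a, b in zip(spi[1:], spi[:-1])]  # moving ranges
    centre = sum(spi) / len(spi)
    mr_bar = sum(mr) / len(mr)
    lcl, ucl = centre - 2.66 * mr_bar, centre + 2.66 * mr_bar
    flags = [not (lcl <= s <= ucl) for s in spi]  # out-of-control points
    return spi, (lcl, centre, ucl), flags

ev = [95, 190, 270, 355, 430]   # cumulative earned value per reporting period
pv = [100, 200, 300, 400, 500]  # cumulative planned value
spi, limits, flags = spi_control_chart(ev, pv)
print(spi, limits, flags)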

Relevance: 100.00%

Abstract:

A digitised production model is proposed that uses control charts, fault tree analysis and expert knowledge to perform real-time monitoring and diagnosis of manufacturing processes. The model improves the reliability of the fault-diagnosis system and provides a practical visual modelling tool. The online statistical process control system developed on this basis can respond dynamically to changes in the manufacturing process according to monitored production events. Using the visual modelling tool, fault trees are built from expert experience, and the diagnostic rule base of the expert system is generated automatically from the fault trees, so that diagnostic knowledge is acquired automatically. The system was applied to the inspection and fault diagnosis of an automotive gearbox assembly process, verifying the effectiveness of the method.
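A minimal sketch of the fault-tree-to-rules step described above; the tree structure, gate handling and event names are hypothetical.

# Flatten a small fault tree (event -> (gate, causes)) into IF-THEN rules.

fault_tree = {
    "torque_out_of_control": ("OR", ["wrong_bolt_batch", "tool_calibration_drift"]),
    "tool_calibration_drift": ("AND", ["calibration_overdue", "torque_trend_up"]),
}

def tree_to_rules(tree):
    """Turn each gate of the fault tree into an IF-THEN diagnosis rule."""
    rules = []
    for effect, (gate, causes) in tree.items():
        joiner = " OR " if gate == "OR" else " AND "
        rules.append(f"IF {joiner.join(causes)} THEN suspect {effect}")
    return rules

for rule in tree_to_rules(fault_tree):
    print(rule)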

Relevance: 100.00%

Abstract:

This brief examines the application of nonlinear statistical process control to the detection and diagnosis of faults in automotive engines. In this statistical framework, the computed score variables may have a complicated nonparametric distribution function, which hampers statistical inference, notably for fault detection and diagnosis. This brief shows that introducing the statistical local approach into nonlinear statistical process control produces statistics that follow a normal distribution, thereby enabling a simple statistical inference for fault detection. Further, for fault diagnosis, this brief introduces a compensation scheme that approximates the fault condition signature. Experimental results from a Volkswagen 1.9-L turbo-charged diesel engine are included.
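The sketch below illustrates the central point of the statistical local approach invoked here: the normalised sum of zero-mean primary residuals is approximately Gaussian even when the individual scores are not, so a simple threshold test can be applied. It is a generic illustration, not the brief's engine-specific implementation.

# Normalised sum of primary residuals is approximately Gaussian (central limit
# theorem), so a standardised threshold test detects a mean shift.

import math, random

def improved_residual(scores):
    """zeta_N = (1/sqrt(N)) * sum of zero-mean primary residuals."""
    return sum(scores) / math.sqrt(len(scores))

def detect(scores, sigma, threshold=3.0):
    """Flag a fault if the standardised improved residual exceeds the threshold."""
    z = improved_residual(scores) / sigma
    return abs(z) > threshold, z

# Primary residuals from a skewed (non-Gaussian) score, centred at zero.
random.seed(0)
healthy = [random.expovariate(1.0) - 1.0 for _ in range(200)]
faulty = [random.expovariate(1.0) - 1.0 + 0.5 for _ in range(200)]  # mean shift
print(detect(healthy, sigma=1.0))  # expected: no fault flag
print(detect(faulty, sigma=1.0))   # expected: fault flag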

Relevance: 100.00%

Abstract:

Treasure et al. (2004) recently proposed a new subspace-monitoring technique, based on the N4SID algorithm, within the multivariate statistical process control framework. This dynamic-monitoring method requires considerably fewer variables to be analysed when compared with dynamic principal component analysis (PCA). The contribution charts and variable reconstruction, traditionally employed for static PCA, are analysed in a dynamic context. The contribution charts and variable reconstruction may be affected by the ratio of the number of retained components to the total number of analysed variables. Particular problems arise if this ratio is large, and a new reconstruction chart is introduced to overcome these. The utility of such a dynamic contribution chart and variable reconstruction is shown in a simulation and by application to industrial data from a distillation unit.
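For orientation, the sketch below computes per-variable contributions to the SPE (Q) statistic of a static PCA model, the kind of contribution analysis discussed above. The paper's dynamic/subspace variant and its new reconstruction chart are not reproduced; data and settings are illustrative.

# Per-variable contributions to the PCA squared prediction error (SPE/Q).

import numpy as np

def pca_spe_contributions(X_ref, x_new, n_components):
    """Contribution of each variable to the SPE of x_new, relative to a PCA
    model fitted on standardised reference data X_ref."""
    mu, sd = X_ref.mean(axis=0), X_ref.std(axis=0)
    Z = (X_ref - mu) / sd
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                 # retained loading vectors
    z = (x_new - mu) / sd
    residual = z - P @ (P.T @ z)            # part not explained by the model
    return residual ** 2                    # per-variable SPE contributions

rng = np.random.default_rng(1)
X_ref = rng.normal(size=(500, 4))
X_ref[:, 1] = 0.8 * X_ref[:, 0] + 0.2 * X_ref[:, 1]   # correlated pair
x_new = np.array([0.0, 3.0, 0.0, 0.0])                # fault in variable 2
print(pca_spe_contributions(X_ref, x_new, n_components=2))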