893 results for Statistical process control
Abstract:
An approach to reconfiguring control systems in the event of major failures is advocated. The approach relies on the convergence of several technologies which are currently emerging: constrained predictive control, high-fidelity modelling of complex systems, fault detection and identification, and model approximation and simplification. Much work is needed, both theoretical and algorithmic, to make this approach practical, but we believe that there is enough evidence, especially from existing industrial practice, for the scheme to be considered realistic. After outlining the problem and proposed solution, the paper briefly reviews constrained predictive control and object-oriented modelling, which are the essential ingredients for practical implementation. The prospects for automatic model simplification are also reviewed briefly. The paper emphasizes some emerging trends in industrial practice, especially as regards modelling and control of complex systems. Examples from process control and flight control are used to illustrate some of the ideas.
Abstract:
This paper reports the application of Advanced Process Control (APC) techniques for improving the thermal energy efficiency of a paperboard-making process by regulating the Machine Direction (MD) profile of the basis weight and moisture content of the paperboard. A Model Predictive Controller (MPC) is designed so that the sheet moisture and basis weight tracking errors, along with variations of the sheet moisture and basis weight, are reduced. Drainage is also maximised through improved wet-end stability, which can facilitate driving the sheet moisture set-point closer to its upper specification limit over time. It is shown that the proposed strategy can reduce steam usage by 8-10%. A simulation study based on a UK board machine is presented to show the effectiveness of the proposed technique. © 2011 Intl Journal of Adv Mechatr.
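The receding-horizon idea behind MPC can be sketched in a few lines. This is a minimal, unconstrained illustration with an invented first-order process model, not the paper's board-machine model: at each step the controller solves for the future input sequence that tracks the set-point over the horizon, applies only the first move, and repeats.

```python
# Minimal unconstrained MPC sketch. The process model y+ = a*y + b*u and all
# numbers are hypothetical, chosen only to illustrate the receding horizon.
import numpy as np

a, b = 0.8, 0.5          # first-order process: y_{k+1} = a*y_k + b*u_k
N = 10                   # prediction horizon
setpoint = 1.0

# Stacked prediction over the horizon: y_future = F*y0 + G*u_future
F = np.array([a ** (i + 1) for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a ** (i - j) * b

y = 0.0
for k in range(30):
    target = setpoint - F * y                       # desired predicted outputs
    u = np.linalg.lstsq(G, target, rcond=None)[0][0]  # apply only the first move
    y = a * y + b * u

print(round(y, 3))  # -> 1.0 (output settles at the set-point)
```

A real MPC would add input/output constraints and a control-effort penalty to the least-squares objective; the receding-horizon structure is unchanged.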
Abstract:
A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised using Reduced Order Models (ROMs), which are generated from the results of a mathematical model of the Focused Ion Beam together with Design of Experiments (DoE) methods. Two ion beam sources, Argon and Gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluation of the process performance takes into account the uncertainties and variations of the process variables and is used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task then identifies the optimal process conditions, by varying the process variables, so that the quality objectives and requirements are achieved and the imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
Abstract:
This is the first paper to show and theoretically analyse that the presence of auto-correlation can produce considerable alterations in the Type I and Type II errors of univariate and multivariate statistical control charts. To remove this undesired effect, linear inverse ARMA filters are employed, and the application studies in this paper show that both false alarms (increased Type I errors) and insensitive monitoring statistics (increased Type II errors) were eliminated.
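The Type I inflation can be reproduced in a few lines. The sketch below uses a hypothetical AR(1) process and an individuals chart whose sigma is estimated from the average moving range (the standard chart construction, which underestimates sigma under positive auto-correlation); the inverse filter shown is the AR(1) special case of the inverse ARMA filters used in the paper.

```python
# Illustration (invented AR(1) data): auto-correlation inflates the false-alarm
# rate of an individuals chart; charting inverse-filtered residuals restores it.
import numpy as np

rng = np.random.default_rng(0)
phi, n = 0.9, 50000                 # strong positive auto-correlation
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def alarm_rate(series):
    """Individuals chart: sigma estimated from the average moving range."""
    mr = np.abs(np.diff(series))
    sigma = mr.mean() / 1.128       # d2 constant for subgroups of size 2
    return np.mean(np.abs(series - series.mean()) > 3 * sigma)

naive = alarm_rate(x)               # limits far too tight under positive phi
e = x[1:] - phi * x[:-1]            # inverse AR(1) filter: approx. i.i.d. residuals
whitened = alarm_rate(e)
print(f"false-alarm rate: raw {naive:.3f}, whitened {whitened:.4f}")
```

The raw in-control series triggers alarms at a rate of roughly a third of all points, against a nominal 0.27%; the whitened residuals recover the nominal rate.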
Abstract:
The work in this paper is of particular significance since it considers the problem of modelling cross- and auto-correlation in statistical process monitoring. The presence of both types of correlation can lead to fault insensitivity or false alarms, although in published literature to date, only autocorrelation has been broadly considered. The proposed method, which uses a Kalman innovation model, effectively removes both correlations. The paper (and Part 2 [2]) has emerged from work supported by EPSRC grant GR/S84354/01 and is of direct relevance to problems in several application areas including chemical, electrical, and mechanical process monitoring.
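A minimal sketch of the innovation idea, with a known, invented state-space model rather than one identified from data as in the paper: when the model matches the process, the Kalman filter innovations are serially uncorrelated, so monitoring the innovations instead of the raw measurements removes both cross- and auto-correlation.

```python
# Illustration (hypothetical model): Kalman filter innovations whiten
# measurements that are both auto-correlated and cross-correlated.
import numpy as np

rng = np.random.default_rng(3)
A = np.array([[0.95, 0.1], [0.0, 0.9]])   # state dynamics (auto-correlation)
C = np.array([[1.0, 0.0], [1.0, 1.0]])    # shared states (cross-correlation)
Q, R = 0.1 * np.eye(2), 0.05 * np.eye(2)

n = 5000
x = np.zeros(2)
Y = np.empty((n, 2))
for t in range(n):
    Y[t] = C @ x + rng.multivariate_normal(np.zeros(2), R)
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)

# Kalman filter: innovations e_t = y_t - C x_hat_t should be white
xh, P = np.zeros(2), np.eye(2)
E = np.empty_like(Y)
for t in range(n):
    S = C @ P @ C.T + R                   # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)        # Kalman gain
    E[t] = Y[t] - C @ xh                  # innovation (monitored statistic)
    xh = xh + K @ E[t]                    # measurement update
    P = P - K @ C @ P
    xh = A @ xh                           # time update
    P = A @ P @ A.T + Q

def lag1_corr(z):
    return np.corrcoef(z[:-1], z[1:])[0, 1]

print("raw sensor lag-1 corr: ", lag1_corr(Y[:, 0]))
print("innovation lag-1 corr:", lag1_corr(E[:, 0]))
```

The raw measurement is strongly auto-correlated while the innovation sequence is close to white, making it a suitable input for conventional control charts.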
Abstract:
This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm can be found in this paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. 1975, IEEE Trans. Aero. Elect. Sys., AES-14R, 465-473. The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.
Abstract:
In polymer extrusion, the delivery of a melt which is homogeneous in composition and temperature is paramount for achieving high quality extruded products. However, advancements in process control are required to reduce temperature variations across the melt flow which can result in poor product quality. The majority of thermal monitoring methods provide only low-accuracy point or bulk melt temperature measurements, causing poor controller performance. Furthermore, the most common conventional proportional-integral-derivative controllers seem to be incapable of performing well over the nonlinear operating region. This paper presents a model-based fuzzy control approach to reduce the die melt temperature variations across the melt flow while achieving the desired average die melt temperature. Simulation results confirm the efficacy of the proposed controller.
Abstract:
The work presented in this paper takes advantage of newly developed instrumentation suitable for in-process monitoring of an industrial stretch blow moulding machine. The instrumentation provides blowing pressure and stretch rod force histories along with the kinematics of polymer contact with the mould wall. A Design of Experiments pattern was used to qualitatively relate machine inputs to these process parameters and to the thickness distribution of stretch blow moulded PET (polyethylene terephthalate) bottles. Material slippage at the mould wall and the thickness distribution are also discussed in relation to machine inputs. The key process indicators defined have great potential for use in a closed-loop process control system and for validation of process simulations.
Abstract:
In collaboration with Airbus-UK, the dimensional growth of aircraft panels while being riveted with stiffeners is investigated. Small panels are used in this investigation. The stiffeners have been fastened to the panels with rivets, and it has been observed that during this operation the panels expand in the longitudinal and transverse directions. The growth is variable, and the challenge is to control the riveting process to minimize this variability. In this investigation, the assembly of the small panels and longitudinal stiffeners has been simulated using static stress and nonlinear explicit finite element models. The models have been validated against a limited set of experimental measurements; it was found that more accurate predictions of the riveting process are achieved using explicit finite element models. Yet the static stress finite element model is more time-efficient and more practical for simulating hundreds of rivets and the stochastic nature of the process. Furthermore, through a series of numerical simulations and probabilistic analyses, the manufacturing process control parameters that influence panel growth have been identified. Alternative fastening approaches were examined, and it was found that dimensional growth can be controlled by changing the design of the dies used for forming the rivets.
Abstract:
In collaboration with Airbus-UK, the dimensional growth of small panels while being riveted with stiffeners is investigated. The stiffeners have been fastened to the panels with rivets and it has been observed that during this operation the panels expand in the longitudinal and transverse directions. It has been observed that the growth is variable and the challenge is to control the riveting process to minimize this variability. In this investigation, the assembly of the small panels and longitudinal stiffeners has been simulated using low and high fidelity nonlinear finite element models. The models have been validated against a limited set of experimental measurements; it was found that more accurate predictions of the riveting process are achieved using high fidelity explicit finite element models. Furthermore, through a series of numerical simulations and probabilistic analyses, the manufacturing process control parameters that influence panel growth have been identified. Alternative fastening approaches were examined and it was found that dimensional growth can be controlled by changing the design of the dies used for forming the rivets.
Abstract:
The main objective of this work was to monitor a set of physical-chemical properties of heavy oil process streams through nuclear magnetic resonance spectroscopy, in order to propose an analysis procedure and online data processing for process control. Different statistical methods that relate the results obtained by nuclear magnetic resonance spectroscopy to those obtained by the conventional standard methods during the characterization of the different streams have been implemented in order to develop models for predicting these same properties. Real-time knowledge of these physical-chemical properties of petroleum fractions is very important for enhancing refinery operations, ensuring technically, economically, and environmentally sound operation. The first part of this work involved the determination of many physical-chemical properties at the Matosinhos refinery, following standard methods used to evaluate and characterize light vacuum gas oil, heavy vacuum gas oil, and fuel oil fractions. Kinematic viscosity, density, sulfur content, flash point, carbon residue, P-value, and atmospheric and vacuum distillations were the properties analysed. Besides the analysis using the standard methods, the same samples were analysed by nuclear magnetic resonance spectroscopy. The second part of this work concerned the application of multivariate statistical methods, which correlate the physical-chemical properties with the quantitative information acquired by nuclear magnetic resonance spectroscopy. Several methods were applied, including principal component analysis, principal component regression, partial least squares, and artificial neural networks. Principal component analysis was used to reduce the number of predictive variables and to transform them into new variables, the principal components. These principal components were used as inputs to the principal component regression and artificial neural network models.
For the partial least squares model, the original data were used as input. Taking into account the performance of the developed models, assessed through selected statistical performance indexes, it was possible to conclude that principal component regression led to the worst performance. Better results were achieved with the partial least squares and artificial neural network models. However, it was the artificial neural network model that gave the best predictions for almost all of the properties analysed. From the results obtained, it was possible to conclude that nuclear magnetic resonance spectroscopy combined with multivariate statistical methods can be used to predict physical-chemical properties of petroleum fractions. It has been shown that this technique can be considered a potential alternative to the conventional standard methods, given the very promising results obtained.
Abstract:
Doctoral thesis, Biology (Marine Biology and Aquaculture), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
In recent years the semiconductor industry, particularly memory production, has evolved considerably. The need to lower production costs, as well as to produce more complex systems with greater capacity, led to the creation of WLP (Wafer Level Packaging) technology. This technology enables the production of smaller systems, simplifies the process flow, and provides a significant reduction in the final cost of the product. WLP is a technology for packaging integrated circuits while they are still part of wafers, in contrast with the traditional method in which the systems are singulated before being packaged. With the development of this technology came the need to better understand the mechanical behaviour of the mold compound (MC, the encapsulating polymer), more specifically the warpage of molded wafers. Warpage is a characteristic of this product and is due to the difference in the coefficient of thermal expansion between the silicon and the mold compound. The problem is observable in the product as bowing of the molded wafers. The warpage of molded wafers has a great impact on manufacturing. Depending on the magnitude and orientation of the warpage, the transport, handling, and processing of the wafers can become difficult or even impossible, which translates into reduced production volume and lower product quality. This dissertation was developed at Nanium S.A., a Portuguese company and world leader in WLP technology on 300 mm wafers, and addresses the use of the Taguchi methodology to study the variability of the debond process for product X. The choice of process and product was based on a statistical analysis of the variation and impact of warpage throughout the production process.
The Taguchi methodology is a quality control methodology that enables a systematic approach to a given process, combining control charts, process/product control, and process design to achieve a robust process. When correctly implemented, this method can yield significant savings in the processes, with a considerable financial impact. This project made it possible to study and quantify warpage along the production line and to reduce the impact of this characteristic on the debond process. It also enabled discussion and alignment between the different production areas with respect to process control and improvement. It was demonstrated that the Taguchi method is an efficient method for studying the variability of a process and optimising its parameters. Its application to the debond process improved both the reliability of the process, in terms of assuring product quality, and the production throughput.
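A core computation in the Taguchi method is the signal-to-noise (S/N) ratio, used to pick the factor level that is both on target and insensitive to noise. The sketch below uses invented warpage numbers, purely to illustrate the calculation; for a "smaller-the-better" response such as warpage, S/N = -10·log10(mean(y²)), and the preferred level is the one with the highest S/N.

```python
# Hypothetical numbers: Taguchi "smaller-the-better" S/N ratio for two levels
# of a single debond-process parameter, four replicates each.
import numpy as np

runs = {"level_1": [210.0, 230.0, 205.0, 225.0],   # warpage (um)
        "level_2": [150.0, 160.0, 145.0, 155.0]}

def sn_smaller_is_better(y):
    """S/N = -10*log10(mean(y^2)); higher (less negative) is better."""
    y = np.asarray(y)
    return -10 * np.log10(np.mean(y ** 2))

sn = {level: sn_smaller_is_better(y) for level, y in runs.items()}
best = max(sn, key=sn.get)
print(sn, "->", best)  # level_2 (smaller warpage) has the higher S/N
```

In a full Taguchi study the same S/N statistic is computed for every column of an orthogonal array, and the level effects are read off from the per-level S/N averages.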
Abstract:
A promising technique for the large-scale manufacture of micro-fluidic devices and photonic devices is hot embossing of polymers such as PMMA. Micro-embossing is a deformation process in which the workpiece material is heated to permit easier material flow and then forced over a planar patterned tool. While considerable attention has been paid to process feasibility, very little effort has been put into production issues such as process capability and eventual process control. In this paper, we present initial studies aimed at identifying the origins and magnitude of variability for embossing features at the micron scale in PMMA. Test parts with features ranging from 3.5-630 µm wide and 0.9 µm deep were formed. Measurements at this scale proved very difficult, and only atomic force microscopy was able to provide resolution sufficient to identify process variations. It was found that standard deviations of widths at the 3-4 µm scale were on the order of 0.5 µm, leading to a coefficient of variation as high as 13%. Clearly, the transition from test to manufacturing for this process will require understanding the causes of this variation and devising control methods to minimize its magnitude over all types of parts.