58 results for Process control -- Statistical methods

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

This paper points out a serious flaw in dynamic multivariate statistical process control (MSPC). The principal component analysis of a linear time-series model, employed to capture auto- and cross-correlation in recorded data, may produce a considerable number of variables to be analysed. To give a dynamic representation of the data (based on variable correlation) and avoid the construction of a large time-series structure, a linear state space model is used here instead. The paper demonstrates that, by incorporating a state space model, the number of variables to be analysed dynamically can be considerably reduced compared to conventional dynamic MSPC techniques.
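A minimal numpy sketch (illustrative, not taken from the paper) of the variable-count problem the abstract describes: stacking time lags before PCA multiplies the number of columns to analyse, which is exactly what a low-order state-space representation avoids.

```python
import numpy as np

rng = np.random.default_rng(0)
m, d, T = 10, 5, 500                      # 10 variables, 5 lags, 500 samples
x = rng.normal(size=(T, m))

# Conventional dynamic MSPC: augment each sample with its d lagged copies,
# giving m * (d + 1) columns for the PCA to analyse.
lagged = np.hstack([x[d - k: T - k] for k in range(d + 1)])

# PCA on the augmented matrix now works in a 60-dimensional space, whereas a
# state-space model with n << m states would summarise the same dynamics
# with only n monitored variables.
centred = lagged - lagged.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
```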

Abstract:

Anti-islanding protection is becoming increasingly important due to the rapid installation of distributed generation from renewable resources such as wind, tidal and wave, solar PV and bio-fuels, as well as from other resources such as diesel. Unintentional islanding presents a potential risk of damage to utility plant and to equipment connected on the demand side, as well as a risk to the public and to personnel in utility plants. This paper investigates automatic islanding detection, achieved by deploying a statistical process control approach for fault detection with real-time data acquired through a wide area measurement system based on Phasor Measurement Unit (PMU) technology. In particular, principal component analysis (PCA) is used to project the data into a principal component subspace and a residual space, and two statistics are used to detect the occurrence of a fault. A fault reconstruction method is then used to identify the fault and its development over time. The proposed scheme has been applied to a real system, and the results confirm that the method can correctly identify the fault and the islanding site.
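The abstract names two detection statistics without defining them; in PCA-based monitoring these are commonly Hotelling's T² (principal component subspace) and the squared prediction error, SPE (residual space). A sketch under that assumption, with synthetic data standing in for PMU measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for PMU measurements under normal operation
X = rng.normal(size=(200, 6))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD; keep k principal components
_, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 3
P = Vt[:k].T                              # loadings spanning the PC subspace
lam = s[:k] ** 2 / (len(X) - 1)           # variance of each component

def t2_spe(x):
    """Hotelling's T^2 in the PC subspace and SPE in the residual space."""
    t = P.T @ x
    resid = x - P @ t
    return float(np.sum(t ** 2 / lam)), float(resid @ resid)

# A gross deviation on one sensor inflates the residual-space statistic
_, normal_spe = t2_spe(X[0])
fault = X[0].copy()
fault[2] += 8.0
_, fault_spe = t2_spe(fault)
```

In practice each statistic is compared against a control limit estimated from normal-operation data; exceeding either limit flags a potential fault for reconstruction.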

Abstract:

A new algorithm is presented for training nonlinear optimal neuro-controllers (in the form of the model-free, action-dependent, adaptive critic paradigm). It overcomes problems with existing stochastic backpropagation training, namely the need for data storage, parameter shadowing and poor convergence, offering significant benefits for online applications.

Abstract:

Polymer extrusion is a complex process, and the availability of good dynamic models is key to improved system operation. Previous modelling attempts have either failed to adequately capture the non-linearities of the process or proved too complex for control applications. This work presents a novel approach to the problem by modelling extrusion viscosity and pressure with a grey-box technique that combines mechanistic knowledge with empirical data using a genetic algorithm. The models are shown to outperform those of a much higher order generated by a conventional black-box technique, while providing insight into the underlying processes at work within the extruder.
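The grey-box idea can be sketched in a few lines: a mechanistic model fixes the structure, and a genetic algorithm fits the empirical parameters. The model form and numbers below are hypothetical (a power-law viscosity curve, not the paper's extruder model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical grey-box setup: the mechanistic part fixes the structure
# (power-law melt viscosity, eta = K * shear**(n - 1)), and a simple
# genetic algorithm tunes the empirical parameters K and n to data.
shear = np.logspace(0, 3, 30)
eta_obs = 5000.0 * shear ** (0.4 - 1.0)   # synthetic "measured" viscosity

def fitness(pop):
    K, n = pop[:, 0:1], pop[:, 1:2]
    pred = K * shear ** (n - 1.0)
    return -np.mean((np.log(pred) - np.log(eta_obs)) ** 2, axis=1)

pop = np.column_stack([rng.uniform(1e3, 1e4, 60), rng.uniform(0.1, 1.0, 60)])
start_best = fitness(pop).max()
for _ in range(80):
    parents = pop[np.argsort(fitness(pop))[-20:]]             # selection
    children = parents[rng.integers(0, 20, 40)]               # reproduction
    children = children + rng.normal(0.0, [100.0, 0.01], size=(40, 2))  # mutation
    pop = np.vstack([parents, children])                      # elitism
best = pop[np.argmax(fitness(pop))]
```

Because the parents survive each generation (elitism), the best fitness can only improve; the payoff of the grey-box structure is that `best` remains physically interpretable as a consistency coefficient and power-law index.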

Abstract:

Chemical Imaging (CI) is an emerging platform technology that integrates conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Vibrational spectroscopic methods, such as Near Infrared (NIR) and Raman spectroscopy, combined with imaging are particularly useful for the analysis of biological/pharmaceutical forms. The rapid, non-destructive and non-invasive features of CI mark its potential suitability as a process analytical tool for the pharmaceutical industry, for both process monitoring and quality control in the many stages of drug production. This paper provides an overview of CI principles, instrumentation and analysis. Recent applications of Raman and NIR-CI to pharmaceutical quality and process control are presented; challenges facing CI implementation and likely future developments in the technology are also discussed.
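The spatial-plus-spectral structure the abstract describes is a hypercube: two spatial axes and one spectral axis. A small illustrative sketch (invented dimensions, not from the paper) of how such data is sliced:

```python
import numpy as np

rng = np.random.default_rng(5)

# A chemical image is a hypercube: two spatial axes plus one spectral axis
cube = rng.random((64, 64, 100))          # 64 x 64 pixels, 100 wavelengths

pixel_spectrum = cube[10, 20, :]          # full spectrum at one pixel
band_image = cube[:, :, 40]               # spatial image at one wavelength

# A simple chemical map: integrate a spectral band of interest per pixel
chem_map = cube[:, :, 35:45].mean(axis=2)
```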

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources. A simple process for calculating these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDFs). The ECDF method was the most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was affected by the presence of skewed distributions and outliers. In contrast, the ULBL methodology calculated more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each element poses in these areas to determine the potential risk to receptors.
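The abstract does not give the NBC or ULBL formulas, but the ECDF-per-domain idea can be illustrated: build an empirical CDF for each domain and read a typical threshold value off an upper percentile. The distributions, percentile and element labels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hypothetical geochemical domains with different baseline levels
domain_a = rng.lognormal(mean=2.0, sigma=0.4, size=500)   # e.g. low-baseline soils
domain_b = rng.lognormal(mean=3.0, sigma=0.4, size=500)   # e.g. elevated-baseline soils

def ecdf(sample):
    """Empirical cumulative distribution function: sorted values and their ranks."""
    xs = np.sort(sample)
    ps = np.arange(1, len(xs) + 1) / len(xs)
    return xs, ps

def upper_threshold(sample, p=0.95):
    """Illustrative threshold: the concentration below which fraction p of the domain lies."""
    xs, ps = ecdf(sample)
    return xs[np.searchsorted(ps, p)]

# Thresholds computed per domain reflect each domain's own baseline
tt_a = upper_threshold(domain_a)
tt_b = upper_threshold(domain_b)
```

Computing the threshold within each domain, rather than across the whole region, is what stops a naturally elevated domain from being flagged as contaminated.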

Abstract:

Thermoforming processes generally employ sheet temperature monitoring as the primary means of process control. This paper describes the development of an alternative system that monitors plug force. Tests using a prototype device have shown that the force record over a forming cycle creates a unique map of the process operation. Key process features such as sheet modulus, sheet sag and the timing of the process stages can be readily observed, and the effects of changes in all of the major processing parameters are easily distinguished. Continuous cycle-to-cycle tests show that the output is consistent and repeatable over a longer time frame, providing the opportunity to develop an on-line process control system. Further testing of the system is proposed.

Abstract:

This paper describes the development of neural model-based control strategies for the optimisation of an industrial aluminium substrate disk grinding process. The grindstone removal rate varies considerably over a stone's life and is a highly nonlinear function of the process variables. Using historical grindstone performance data, a NARX-based neural network model is developed. This model is then used to implement a direct inverse controller and an internal model controller based on the process settings and previous removal rates. Preliminary plant investigations show that thickness defects can be reduced by 50% or more compared to other schemes employed.
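A NARX model predicts the next output from lagged outputs and lagged inputs. The sketch below shows the regressor construction on synthetic data, with ordinary least squares standing in for the paper's neural network (the dynamics and lag orders are invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic removal-rate-like series driven by a process setting u
T = 300
u = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.8 * u[t - 1] + 0.01 * rng.normal()

def narx_features(y, u, na=2, nb=2):
    """Regressor rows [y(t-1)..y(t-na), u(t-1)..u(t-nb)] and targets y(t)."""
    start = max(na, nb)
    rows = [np.r_[y[t - na:t][::-1], u[t - nb:t][::-1]] for t in range(start, len(y))]
    return np.array(rows), y[start:]

X, target = narx_features(y, u)
theta, *_ = np.linalg.lstsq(X, target, rcond=None)  # linear stand-in for the NN
pred = X @ theta
```

In the paper's setting the same regressor feeds a neural network, whose inverse is then used directly as a controller or wrapped in an internal model control loop.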

Abstract:

This case study examines how the lean ideas behind the Toyota production system can be applied to software project management. It is a detailed investigation of the performance of a nine-person software development team employed by BBC Worldwide, based in London. The data, collected in 2009, involved direct observations of the development team, the kanban boards and the daily stand-up meetings, semi-structured interviews with a wide variety of staff, and statistical analysis. The evidence shows that over the 12-month period, lead time to deliver software improved by 37%, consistency of delivery rose by 47% and defects reported by customers fell by 24%. The significance of this work is in showing that the use of lean methods, including visual management, team-based problem solving, smaller batch sizes and statistical process control, can improve software development. It also summarizes key differences between agile and lean approaches to software development. The conclusion is that the performance of the software development team was improved by adopting a lean approach. The faster delivery, with a focus on creating the highest value for the customer, also reduced both technical and market risks. The main drawback is that the approach may not fit well with existing corporate standards.
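Statistical process control applied to delivery data typically means a control chart over lead times. A minimal individuals (XmR) chart sketch, with invented numbers rather than the BBC Worldwide data from the study:

```python
import numpy as np

# Illustrative delivery lead times in days (invented, not the study's data);
# the last delivery is unusually slow.
lead_times = np.array([12, 10, 11, 13, 9, 12, 11, 10, 14, 11, 12, 25], dtype=float)

# XmR chart: estimate short-term variation from the moving range of the
# baseline period (here, all but the last point).
baseline = lead_times[:-1]
mr = np.abs(np.diff(baseline))
centre = baseline.mean()
sigma_hat = mr.mean() / 1.128             # d2 constant for subgroups of 2
ucl = centre + 3 * sigma_hat              # upper control limit
lcl = max(centre - 3 * sigma_hat, 0.0)    # lower control limit (floored at 0)

# Points outside the limits signal a special cause worth investigating
out_of_control = (lead_times > ucl) | (lead_times < lcl)
```

On a team board this makes "consistency of delivery" concrete: a shrinking moving range narrows the limits, and only deliveries outside them trigger problem-solving.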