905 results for Statistical quality control


Relevance:

100.00%

Publisher:

Abstract:

This document is the Argo quality control manual for dissolved oxygen concentration. It describes two levels of quality control:

• The first level is the real-time system, which performs a set of agreed automatic checks. Adjustments can also be made in real time, and the real-time system can assign quality flags to adjusted fields.
• The second level is the delayed-mode quality control system.
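An automatic real-time check of the kind described above can be sketched as a simple range test. The limits and flag values below are illustrative assumptions, not the official thresholds from the Argo manual:

```python
# Minimal sketch of an Argo-style real-time global range test.
# GOOD/BAD flag values follow the common 1 = good, 4 = bad convention;
# the 0-600 umol/kg limits are an assumed, illustrative range.

GOOD, BAD = 1, 4

def global_range_check(values, lower=0.0, upper=600.0):
    """Flag each dissolved-oxygen value against a fixed acceptance range."""
    return [GOOD if lower <= v <= upper else BAD for v in values]

flags = global_range_check([250.3, 612.8, -3.1, 180.0])
# -> [1, 4, 4, 1]
```

In a real-time system a profile would typically pass through several such tests, and the worst flag per level would be retained.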

Relevance:

100.00%

Publisher:

Abstract:

In April 2017, CMEMS plans to launch the WAVES NRT products. This document focuses on the automatic real-time quality control (RTQC) of the collected wave data. The validation procedure, which includes the delayed-mode quality control of the data, will be specified in another guideline. Before applying any kind of quality control to wave data, it is necessary to understand the nature of the measurements and the analysis performed on them to obtain the wave parameters. For that reason, the next chapter is dedicated to the usual wave analysis and the different parameters and estimators obtained.
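The wave parameters mentioned above are conventionally derived from spectral moments of the variance density spectrum. A minimal sketch, using a made-up Gaussian spectrum rather than real CMEMS data:

```python
import numpy as np

# Standard spectral wave parameters from a variance density spectrum S(f):
# significant wave height Hm0 = 4*sqrt(m0) and mean zero-crossing period
# Tm02 = sqrt(m0/m2). The spectrum below is a toy example.

def spectral_moment(f, S, n):
    """n-th spectral moment m_n = integral of f^n * S(f) df (uniform grid)."""
    df = f[1] - f[0]
    return float(np.sum(f**n * S) * df)

f = np.linspace(0.05, 0.5, 200)            # frequency [Hz]
S = np.exp(-((f - 0.1) / 0.03) ** 2)       # toy spectrum [m^2/Hz]

m0 = spectral_moment(f, S, 0)
m2 = spectral_moment(f, S, 2)
Hm0 = 4.0 * np.sqrt(m0)                    # significant wave height [m]
Tm02 = np.sqrt(m0 / m2)                    # mean zero-crossing period [s]
```

Estimators like these are what an RTQC procedure ultimately screens, e.g. by range and spike tests on Hm0 and the periods.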

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm, which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm can be found in this paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. (1975, IEEE Trans. Aero. Elect. Sys., AES-14R, 465-473). The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.
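The soft-sensor idea can be sketched with a minimal single-response PLS (PLS1, NIPALS algorithm) in plain numpy. The synthetic low-rank data below merely stands in for the Tennessee Eastman measurements; this is not the paper's implementation, and the choice of two components is arbitrary:

```python
import numpy as np

def pls1(X, y, n_components):
    """Return coefficients B so that y_hat ~= X @ B (X, y already centered)."""
    Xk, yk = X.copy(), y.astype(float).copy()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)          # weight vector
        t = Xk @ w                      # score
        tt = t @ t
        p = Xk.T @ t / tt               # X loading
        q = (yk @ t) / tt               # y loading
        Xk -= np.outer(t, p)            # deflate X
        yk -= q * t                     # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(Q))

rng = np.random.default_rng(0)
T = rng.normal(size=(200, 2))                       # two latent directions
X = T @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(200, 10))
y = T @ np.array([1.0, -0.5]) + 0.05 * rng.normal(size=200)

Xc, yc = X - X.mean(axis=0), y - y.mean()
B = pls1(Xc, yc, n_components=2)
y_hat = Xc @ B + y.mean()       # on-line inferential estimate of the delayed y
residual = y - y_hat            # the residual is what drives fault detection
```

In a dedicated-estimator arrangement, one such model per sensor would be built from the remaining sensors, and a persistent residual on one model would isolate the faulty sensor.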

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

An economic model including the labor resource and the process stage configuration is proposed to design g charts, allowing all the design parameters to vary adaptively. A random shift size is considered during the economic design selection. The results obtained for a benchmark of 64 process stage scenarios show that the activities configuration and some process operating parameters influence the selection of the best control chart strategy. To model the random shift size, its exact distribution can be approximately fitted by a discrete distribution obtained from a relatively small sample of historical data. However, an accurate estimation of the inspection costs associated with the SPC activities is far from being achieved. An illustrative example shows the implementation of the proposed economic model in a real industrial case.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

During mechanized sugarcane harvesting, the wear of the base cutting knives is directly correlated with the quality of the cut made by the machines, and quality control tools are important for monitoring this process. Thus, the present study, conducted in the Ribeirão Preto region, aimed to assess the baseline cut of the knives and the damage caused to the stumps by knife wear in the mechanized harvesting of raw sugarcane, from the viewpoint of statistical quality control (SQC). The wear of the knives was quantified by mass loss and by their dimensions, while cutting quality was assessed by cutting height and by damage to the stumps, visually classified according to the level of damage caused. The results showed that the wear of the knives was more pronounced in certain periods of use, but still within control standards. The cutting height was not affected by the wear of the knives, remaining within the limits of desirable quality for the operation. The damage to the stumps varied depending on the face of the cutting knives evaluated, with a predominance of certain damage classes on each cutting face, but the process always remained in statistical control.

Relevance:

100.00%

Publisher:

Abstract:

Control charts are very important tools in the statistical quality control of industrial processes, and their use began in the last century. Since their development, the charts have usually assumed independent processes, i.e. no correlation between samples. Nowadays, however, with the high level of automation in the industrial environment, autocorrelation between samples is noticeable. The main charts used to monitor quality characteristics represented by continuous variables are the mean (X̄), range (R) and variance (S²) charts. Therefore, this work aims to analyze the performance of X̄ and R charts and of X̄ and S² charts with different sample sizes (4 and 5) for monitoring autocorrelated processes. Through computer simulations in Fortran and the use of mathematical expressions, it was possible to obtain data and analyze the detection power of the charts for independent observations and for autocorrelated observations following an AR(1) model. The results show that autocorrelation reduces the monitoring ability of the control charts and that, the greater this effect, the slower the chart becomes in signaling process shifts.
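The effect of AR(1) autocorrelation on an X̄ chart can be reproduced with a short simulation. This is a sketch, not the thesis's Fortran code; the autocorrelation parameter, subgroup size and 3-sigma limits below are illustrative choices:

```python
import numpy as np

def ar1(n, phi, sigma=1.0, rng=None):
    """Simulate an AR(1) series x[i] = phi*x[i-1] + N(0, sigma)."""
    rng = rng or np.random.default_rng(0)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(scale=sigma)
    return x

def xbar_signal_rate(data, n_sub=4, sigma=1.0, mu=0.0):
    """Fraction of subgroup means outside 3-sigma limits built for iid data."""
    means = data[: len(data) // n_sub * n_sub].reshape(-1, n_sub).mean(axis=1)
    limit = 3 * sigma / np.sqrt(n_sub)
    return float(np.mean(np.abs(means - mu) > limit))

iid = ar1(40_000, phi=0.0)     # independent observations
corr = ar1(40_000, phi=0.8)    # strongly autocorrelated observations

# Positive autocorrelation widens the subgroup-mean spread, so the chart
# alarms far more often even though the process mean is on target.
```

Conversely, when limits are inflated to account for the autocorrelation, the chart becomes slower to detect real shifts, which is the performance loss the work quantifies.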

Relevance:

100.00%

Publisher:

Abstract:

Postgraduate program in Agronomy (Plant Production), FCAV

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.

Relevance:

100.00%

Publisher:

Abstract:

There is remarkable and growing concern about quality control at present, which has led to the search for methods capable of effectively addressing reliability analysis as part of Statistics. Managers, researchers and engineers must understand that 'statistical thinking' is not just a set of statistical tools. They should start approaching 'statistical thinking' from a 'system' perspective, which means developing systems that combine specific statistical tools with other methodologies for an activity. The aim of this article is to encourage them (engineers, researchers and managers) to develop this new way of thinking.