961 results for Statistical Process Control (SPC)
Abstract:
Where statistical process control is concerned, the analysis of univariate cases is not sufficient for many types of company, and it becomes necessary to resort to the multivariate case. Moreover, it is usually assumed that the observations are independent; the violation of this assumption indicates the presence of autocorrelation in the process. In this work, through a basic quantitative approach to exploratory and experimental research, the objects of study are multivariate autocorrelated control charts based on Hotelling's T². ARL values were collected through simulations run by a computer program written in FORTRAN, with the objective of studying the charts' properties and comparing them with the
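The two ingredients of the study, the Hotelling T² statistic and a simulated average run length (ARL), can be sketched as follows. This is a minimal Python illustration rather than the FORTRAN program used in the work: the bivariate in-control model, the identity covariance and the UCL value are assumptions, and autocorrelation is ignored for brevity.

```python
import random
import statistics

def hotelling_t2(x, mean, inv_cov):
    # T^2 = (x - mu)' S^{-1} (x - mu) for a single bivariate observation.
    d = [x[0] - mean[0], x[1] - mean[1]]
    return (d[0] * (inv_cov[0][0] * d[0] + inv_cov[0][1] * d[1])
            + d[1] * (inv_cov[1][0] * d[0] + inv_cov[1][1] * d[1]))

def simulated_arl(ucl, shift, n_runs=2000, seed=1):
    # Average Run Length: mean number of samples until the first alarm,
    # estimated by Monte Carlo for independent bivariate normal data.
    rng = random.Random(seed)
    identity = [[1.0, 0.0], [0.0, 1.0]]
    run_lengths = []
    for _ in range(n_runs):
        n = 0
        while True:
            n += 1
            x = (rng.gauss(shift[0], 1.0), rng.gauss(shift[1], 1.0))
            if hotelling_t2(x, (0.0, 0.0), identity) > ucl:
                run_lengths.append(n)
                break
    return statistics.mean(run_lengths)
```

With a UCL of 10.597 (the upper 0.5% point of a chi-square with 2 degrees of freedom) the in-control ARL is close to 200; a shift in the mean vector shortens it sharply, which is the property such simulation studies quantify.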
Abstract:
Data visualization techniques are powerful tools for handling and analysing multivariate systems. One such technique, known as parallel coordinates, was used to support the diagnosis of an event, detected by a neural network-based monitoring system, in a boiler at a Brazilian Kraft pulp mill. Its attractiveness lies in the possibility of visualizing several variables simultaneously. The diagnostic procedure was carried out step by step, going through exploratory, explanatory, confirmatory, and communicative goals. This tool allowed the boiler dynamics to be visualized more easily than with the commonly used univariate trend plots. In addition, it facilitated the analysis of other aspects, namely relationships among process variables, distinct modes of operation, and discrepant data. The whole analysis revealed, firstly, that the period involving the detected event was associated with a transition between two distinct normal modes of operation and, secondly, the presence of unusual changes in process variables at that time.
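The core of the technique is simple: every variable is rescaled to a common vertical range so that each observation becomes a polyline across parallel axes. A minimal Python sketch of that rescaling step (the variable names and values are hypothetical, and no plotting library is used):

```python
def parallel_coordinates(rows, names):
    # Rescale each variable (column) to [0, 1] so all axes share one
    # vertical range; each row then becomes one polyline across the axes.
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    scaled = [[(v - l) / (h - l) if h > l else 0.5
               for v, l, h in zip(row, lo, hi)] for row in rows]
    return names, scaled

# Three hypothetical boiler observations over two process variables.
axes, lines = parallel_coordinates(
    [[480.0, 3.1], [520.0, 3.9], [500.0, 3.5]],
    ["steam_temp", "o2_pct"])
```

Observations with similar polylines belong to the same operating mode; lines that cut across the bundle are the discrepant data the abstract refers to.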
Abstract:
Surgery and other invasive therapies are complex interventions, the assessment of which is challenged by factors that depend on operator, team, and setting, such as learning curves, quality variations, and perception of equipoise. We propose recommendations for the assessment of surgery based on a five-stage description of the surgical development process. We also encourage the widespread use of prospective databases and registries. Reports of new techniques should be registered as a professional duty, anonymously if necessary when outcomes are adverse. Case series studies should be replaced by prospective development studies for early technical modifications and by prospective research databases for later pre-trial evaluation. Protocols for these studies should be registered publicly. Statistical process control techniques can be useful in both early and late assessment. Randomised trials should be used whenever possible to investigate efficacy, but adequate pre-trial data are essential to allow power calculations, clarify the definition and indications of the intervention, and develop quality measures. Difficulties in doing randomised clinical trials should be addressed by measures to evaluate learning curves and alleviate equipoise problems. Alternative prospective designs, such as interrupted time series studies, should be used when randomised trials are not feasible. Established procedures should be monitored with prospective databases to analyse outcome variations and to identify late and rare events. Achievement of improved design, conduct, and reporting of surgical research will need concerted action by editors, funders of health care and research, regulatory bodies, and professional societies.
Abstract:
Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifact. For these reasons, the assessment of detector uniformity is one of the most common activities in a successful clinical quality assurance program in gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent upon acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the detection of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to reviewers so that they may be better equipped to identify performance degradation before it manifests in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions of non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time series of flood images with induced, progressively worsening defects present within the field of view. The sensitivity of conventional, global figures of merit for detecting changes in uniformity was evaluated and compared to that of the new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions of non-uniformity before any single flood image has a NEMA uniformity value in excess of 5%.
The sensitivity of these image-space algorithms was found to depend on the size and magnitude of the non-uniformities, as well as on the cause of the non-uniform region. A trend analysis of the conventional figures of merit demonstrated their sensitivity to shifts in detector uniformity. Because the image-space algorithms are computationally efficient, they should be used alongside the trending of the global figures of merit to provide the reviewer with a richer assessment of gamma camera detector uniformity characteristics.
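For reference, the global NEMA figure of merit mentioned above, integral uniformity, can be computed as below. This is a simplified sketch: the full NEMA NU 1 analysis also applies a nine-point smoothing filter and distinguishes the useful and central fields of view, both omitted here.

```python
def integral_uniformity(counts):
    # NEMA-style integral uniformity of a flood image region, in percent:
    # IU = 100 * (max - min) / (max + min) over the pixel counts.
    flat = [c for row in counts for c in row]
    hi, lo = max(flat), min(flat)
    return 100.0 * (hi - lo) / (hi + lo)
```

A perfectly flat flood gives 0%; the 5% threshold cited above corresponds to a max/min count spread of about 10% of the mean.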
Abstract:
Nanoparticles offer an ideal platform for the delivery of small-molecule drugs, subunit vaccines and genetic constructs. Besides the need for a homogeneous size distribution, defined loading efficiencies and reasonable production and development costs, one of the major bottlenecks in translating nanoparticles into clinical application is the need for rapid, robust and reproducible development techniques. Within this thesis, microfluidic methods were investigated for the manufacturing, drug or protein loading and purification of pharmaceutically relevant nanoparticles. Initially, methods to prepare small liposomes were evaluated and compared to a microfluidics-directed nanoprecipitation method. To support the implementation of statistical process control, design-of-experiments models aided process robustness and validation for the methods investigated and gave an initial overview of the size ranges obtainable with each method, whilst evaluating the advantages and disadvantages of each. The lab-on-a-chip system enabled high-throughput vesicle manufacturing, a rapid process and a high degree of process control. To investigate this method further, cationic low-transition-temperature lipids, cationic bola-amphiphiles with delocalized charge centres, neutral lipids and polymers were used in the microfluidics-directed nanoprecipitation method to formulate vesicles. Whereas both the total flow rate (TFR) and the ratio of solvent to aqueous stream (flow rate ratio, FRR) were shown to influence vesicle size for high-transition-temperature lipids, FRR was found to be the most influential factor controlling the size of vesicles consisting of low-transition-temperature lipids and of polymer-based nanoparticles. The biological activity of the resulting constructs was confirmed by in vitro transfection of pDNA constructs using cationic nanoprecipitated vesicles.
Design of experiments and multivariate data analysis revealed the mathematical relationship and significance of the factors TFR and FRR in the microfluidics process with respect to liposome size, polydispersity and transfection efficiency. Multivariate tools were used to cluster and predict specific in vivo immune responses dependent on key liposome adjuvant characteristics upon delivery of a tuberculosis antigen in a vaccine candidate. The addition of a low-solubility model drug (propofol) in the nanoprecipitation method resulted in significantly higher solubilisation of the drug within the liposomal bilayer, compared to the control method. The microfluidics method underwent scale-up work by increasing the channel diameter and parallelising the mixers in a planar fashion, resulting in an overall 40-fold increase in throughput. Furthermore, microfluidic tools were developed based on microfluidics-directed tangential flow filtration, which allowed continuous manufacturing, purification and concentration of liposomal drug products.
Abstract:
BACKGROUND: Guidance for appropriate utilisation of transthoracic echocardiograms (TTEs) can be incorporated into ordering prompts, potentially affecting the number of requests. METHODS: We incorporated data from the 2011 Appropriate Use Criteria for Echocardiography, the 2010 National Institute for Clinical Excellence Guideline on Chronic Heart Failure, and American College of Cardiology Choosing Wisely list on TTE use for dyspnoea, oedema and valvular disease into electronic ordering systems at Durham Veterans Affairs Medical Center. Our primary outcome was TTE orders per month. Secondary outcomes included rates of outpatient TTE ordering per 100 visits and frequency of brain natriuretic peptide (BNP) ordering prior to TTE. Outcomes were measured for 20 months before and 12 months after the intervention. RESULTS: The number of TTEs ordered did not decrease (338±32 TTEs/month prior vs 320±33 afterwards, p=0.12). Rates of outpatient TTE ordering decreased minimally post intervention (2.28 per 100 primary care/cardiology visits prior vs 1.99 afterwards, p<0.01). Effects on TTE ordering and ordering rate significantly interacted with time from intervention (p<0.02 for both), as the small initial effects waned after 6 months. The percentage of TTE orders with preceding BNP increased (36.5% prior vs 42.2% after for inpatients, p=0.01; 10.8% prior vs 14.5% after for outpatients, p<0.01). CONCLUSIONS: Ordering prompts for TTEs initially minimally reduced the number of TTEs ordered and increased BNP measurement at a single institution, but the effect on TTEs ordered was likely insignificant from a utilisation standpoint and decayed over time.
Abstract:
An economic model including the labor resource and the process stage configuration is proposed for designing g charts in which all the design parameters can be varied adaptively. A random shift size is considered during the economic design selection. The results obtained for a benchmark of 64 process stage scenarios show that the activity configuration and some process operating parameters influence the selection of the best control chart strategy. To model the random shift size, its exact distribution can be approximated by a discrete distribution fitted to a relatively small sample of historical data. However, an accurate estimation of the inspection costs associated with the SPC activities is far from being achieved. An illustrative example shows the implementation of the proposed economic model in a real industrial case. © 2011 Elsevier B.V. All rights reserved.
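The shift-size modelling step mentioned above, fitting a discrete distribution to a small historical sample, can be sketched as an empirical frequency table. The sample values below are hypothetical.

```python
from collections import Counter

def fit_discrete_shift(history):
    # Empirical discrete distribution of observed shift sizes:
    # map each distinct shift magnitude to its relative frequency.
    counts = Counter(history)
    n = len(history)
    return {shift: c / n for shift, c in counts.items()}

# Hypothetical historical shift magnitudes (in sigma units).
dist = fit_discrete_shift([0.5, 0.5, 1.0, 1.5, 1.0, 0.5, 2.0, 1.0])
```

The fitted probabilities can then weight the economic loss incurred at each candidate chart design instead of assuming a single known shift.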
Abstract:
Purpose - The aim of this paper is to present a synthetic chart, based on the non-central chi-square statistic, that is operationally simpler and more effective than the joint X̄ and R charts in detecting assignable cause(s). This chart also assists in identifying which parameter (the mean or the variance) changed as a result of the assignable causes. Design/methodology/approach - The approach is based on the non-central chi-square statistic, and the steady-state average run length (ARL) of the developed chart is evaluated using a Markov chain model. Findings - The proposed chart always detects process disturbances faster than the joint X̄ and R charts. Originality/value - The most important advantage of the proposed chart is that practitioners can monitor the process by looking at only one chart instead of two charts separately. © Emerald Group Publishing Limited.
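The Markov chain evaluation mentioned above treats the chart's in-control states as transient states: if Q holds the transition probabilities among them, the expected run length from state i satisfies a_i = 1 + Σ_j Q[i][j]·a_j. A minimal fixed-point sketch (the matrices used in the tests are hypothetical, not the chart in the paper):

```python
def markov_arl(Q, start=0, tol=1e-12):
    # ARL from the transient-state transition matrix Q, found by
    # iterating a_i = 1 + sum_j Q[i][j] * a_j to its fixed point.
    n = len(Q)
    a = [0.0] * n
    while True:
        new = [1.0 + sum(Q[i][j] * a[j] for j in range(n)) for i in range(n)]
        if max(abs(x - y) for x, y in zip(new, a)) < tol:
            return new[start]
        a = new
```

For a single in-control state with stay probability p this reduces to the familiar ARL = 1/(1 - p).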
Abstract:
The object of this work is the development of an application for performing on-line multivariate statistical control of an SBR plant. This tool must allow a complete multivariate statistical analysis of the batch in process, of the last completed batch and of the remaining batches processed at the plant. The application is to be built in the LabVIEW environment; the choice of this program is conditioned by the update of the plant's monitoring module, which is being developed in this same environment.
Abstract:
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certification program for the Iowa Department of Transportation (DOT) originated from Iowa Highway Research Board (IHRB) embankment quality research projects. Since this research, the Iowa DOT has applied compaction with moisture control on most embankment work under pavements. This study set out to independently evaluate the actual quality of compaction under the current specifications. Results show that Proctor tests conducted by Iowa State University (ISU), using representative material obtained from each test section where field testing was conducted, yielded optimum moisture contents and maximum dry densities different from those selected by the Iowa DOT for QC/quality assurance (QA) testing. Comparisons between the measured and selected values showed a standard error of 2.9 lb/ft³ for maximum dry density and 2.1% for optimum moisture content. The difference in optimum moisture content was as high as 4%, and the difference in maximum dry density was as high as 6.5 lb/ft³. The differences at most test locations, however, were within the allowable variation suggested in AASHTO T 99 for test results between different laboratories. The ISU testing results showed higher rates of data outside of the target limits than the available contractor QC data indicated for cohesive materials. Also, wet fill materials were often observed during construction. Several test points indicated that materials were placed and accepted wet of the target moisture contents.
The statistical analysis indicates that the results obtained from this study showed improvement over previous embankment quality research projects (TR-401 Phases I through III and TR-492) in terms of the percentage of data falling within the specification limits. Although there was evidence of improvement, QC/QA results are not consistently meeting the target limits/values. Recommendations are provided in this report for Iowa DOT consideration, with three proposed options for improving the current specifications. Option 1 provides enhancements to the current specifications in terms of material-dependent control limits, training, sampling, and process control. Option 2 addresses the development of alternative specifications that incorporate dynamic cone penetrometer or lightweight deflectometer testing into QC/QA. Option 3 addresses incorporating calibrated intelligent compaction measurements into QC/QA.
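A moisture-and-density acceptance check of the kind such QC specifications define can be sketched as below. The tolerance values (moisture within ±2 percentage points of optimum, at least 95% relative compaction) are hypothetical illustrations, not the Iowa DOT limits.

```python
def within_limits(moisture, dry_density, opt_moisture, max_density,
                  m_tol=2.0, d_min_pct=95.0):
    # Hypothetical acceptance rule: field moisture within +/- m_tol
    # percentage points of the Proctor optimum, and relative compaction
    # (field dry density / Proctor maximum) of at least d_min_pct percent.
    m_ok = abs(moisture - opt_moisture) <= m_tol
    d_ok = 100.0 * dry_density / max_density >= d_min_pct
    return m_ok and d_ok
```

Under such a rule, a point placed well wet of the target moisture fails acceptance even when its density is adequate, which is the pattern the field observations above describe.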
Abstract:
The main objective of this work was to develop the statistical quality control (SPC) activities at Oy Metsä-Botnia Ab's Rauma mill by solving problems related to working practices and to the content of the cause-and-effect diagram in the DNA SPC application. The first sub-objective was to develop the content of the cause-and-effect diagram, which is used as a method for collecting statistics on special causes and as a problem-solving aid, and to create operating models for keeping it up to date. The second sub-objective was to develop the area supervisors' weekly SPC reporting, and the third to determine the change in mill performance brought about by the SPC activities. The literature part of the work presents the tools important for process improvement as well as the basic principles of statistical thinking and SPC; it also examines the special characteristics of the pulp industry that complicate the use of SPC. In the experimental part, the cause-and-effect diagram was made easier to use by replacing the generic main categories with main categories corresponding to the control charts. Effective operating models were created for updating the cause-and-effect diagram and for the weekly SPC reporting; the weekly reporting model should, among other things, improve the flow of information at the mill. According to the statistical tests performed in this work, the performance of the Rauma mill had improved on several indicators as a result of the SPC activities, while on a few indicators it had deteriorated after the SPC activities began.
Abstract:
In industrial practice, constrained steady state optimisation and predictive control are separate, albeit closely related functions within the control hierarchy. This paper presents a method which integrates predictive control with on-line optimisation with economic objectives. A receding horizon optimal control problem is formulated using linear state space models. This optimal control problem is very similar to the one presented in many predictive control formulations, but the main difference is that it includes in its formulation a general steady state objective depending on the magnitudes of manipulated and measured output variables. This steady state objective may include the standard quadratic regulatory objective, together with economic objectives which are often linear. Assuming that the system settles to a steady state operating point under receding horizon control, conditions are given for the satisfaction of the necessary optimality conditions of the steady-state optimisation problem. The method is based on adaptive linear state space models, which are obtained by using on-line identification techniques. The use of model adaptation is justified from a theoretical standpoint and its beneficial effects are shown in simulations. The method is tested with simulations of an industrial distillation column and a system of chemical reactors.
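The flavour of the method, a receding-horizon move that trades a quadratic tracking term off against a linear economic term, can be illustrated on a scalar system. This is a toy sketch, not the paper's state-space formulation; the model coefficients and weights are hypothetical.

```python
def rh_input(x, a, b, x_ref, r, c):
    # One receding-horizon move for the scalar system x+ = a*x + b*u,
    # minimising  J(u) = (a*x + b*u - x_ref)**2 + r*u**2 + c*u,
    # where c*u is the linear economic term. Setting dJ/du = 0 gives
    # the closed-form optimal input below.
    return (2.0 * b * (x_ref - a * x) - c) / (2.0 * (b * b + r))

def settle(a, b, x0, x_ref, r, c, steps=200):
    # Apply the receding-horizon input repeatedly; the closed loop
    # settles to a steady-state operating point.
    x = x0
    for _ in range(steps):
        x = a * x + b * rh_input(x, a, b, x_ref, r, c)
    return x
```

With c = 0 the loop settles close to x_ref; a positive economic cost on the input shifts the steady state away from the setpoint, which is precisely the trade-off the integrated steady-state objective is meant to resolve.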
Abstract:
The grinding process is usually the last finishing step for a precision component in the manufacturing industries. It is used to manufacture parts of many different materials, and it therefore demands low roughness, control of dimensional and shape errors, and optimum tool life, at minimum cost and time. Damage to the parts is very expensive, since both the previous processes and the grinding itself are wasted when a part is damaged at this stage. This work investigates the efficiency of digital signal processing tools applied to acoustic emission signals for detecting thermal damage in the grinding process. To accomplish this goal, experimental work was carried out over 15 runs on a surface grinding machine operating with an aluminum oxide grinding wheel and ABNT 1045 and VC131 steels. The acoustic emission signals were acquired from a fixed sensor placed on the workpiece holder. A high-sampling-rate acquisition system operating at 2.5 MHz was used to collect the raw acoustic emission signal instead of the root mean square value usually employed. In each test, the AE data were analyzed off-line and the results compared with an inspection of each workpiece for burn and other metallurgical anomalies. A number of statistical signal processing tools were evaluated.
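A typical first step on such raw AE records is to reduce them to features such as a windowed root-mean-square series. A minimal sketch (window length and sample values are arbitrary, not those of the experiment):

```python
import math

def windowed_rms(signal, window):
    # RMS of the raw AE signal over consecutive fixed-length windows.
    # Grinding burn tends to raise AE energy, so this series is a
    # common feature for damage detection.
    out = []
    for i in range(0, len(signal) - window + 1, window):
        chunk = signal[i:i + window]
        out.append(math.sqrt(sum(s * s for s in chunk) / window))
    return out
```

Working from the raw 2.5 MHz record rather than a hardware RMS output lets the window length, and any other statistic, be chosen in post-processing.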