952 results for Process control -- Data processing


Relevance: 100.00%

Publisher:

Abstract:

The aim of this work is to develop an application for on-line multivariable statistical control of an SBR plant. This tool must allow a complete multivariable statistical analysis of the batch currently in process, of the last completed batch, and of the remaining batches processed at the plant. The application is to be developed in the LabVIEW environment; this choice is dictated by the update of the plant's monitoring module, which is being developed in the same environment.
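As an illustration of the kind of on-line multivariable statistical monitoring described, the sketch below fits a PCA model to historical batches and scores a new one with the usual pair of statistics. It is written in Python rather than LabVIEW, and the variable counts, scaling, and two-component model are assumptions for illustration, not details taken from the thesis.

```python
# Illustrative multivariate SPC on batch (lot) data: fit a PCA model on
# historical batches, then score a new batch with Hotelling's T^2 and
# the squared prediction error (SPE).
import numpy as np

rng = np.random.default_rng(0)
history = rng.normal(size=(50, 8))           # 50 past batches x 8 process variables
mu, sd = history.mean(axis=0), history.std(axis=0)
X = (history - mu) / sd                      # autoscale

# PCA via SVD, keeping 2 principal components
U, S, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T                                 # loadings (8 x 2)
lam = (S[:2] ** 2) / (len(X) - 1)            # component variances

def monitor(batch):
    """Return (T^2, SPE) for one new batch of measurements."""
    x = (batch - mu) / sd
    t = x @ P                                # scores
    t2 = float(np.sum(t ** 2 / lam))         # Hotelling's T^2
    spe = float(np.sum((x - t @ P.T) ** 2))  # residual (SPE / Q) statistic
    return t2, spe

t2, spe = monitor(rng.normal(size=8))
print(f"T2 = {t2:.2f}, SPE = {spe:.2f}")     # compare against control limits
```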

Relevance: 100.00%

Publisher:

Abstract:

This thesis describes research into the potential of modeling the activities of the Data Processing Department as an aid to the computer auditor. A methodology is composed to aid in the evaluation of internal controls, particularly the general controls relating to computer processing. Consisting of three major components, the methodology enables the auditor to compare a model of the presumed activities of the Data Processing Department against its actual activities, as recorded on the operating system log. The first component of the methodology is the construction and loading of a model of the presumed activities of the Data Processing Department from its verbal, scheduled, and reported activities. The second component is the generation of a description of the actual activities of the Data Processing Department from the information recorded on the operating system log; this is effected by reducing the log to the format described by the Standard Audit File concept. The third component is the modeling process itself, which is in fact a new analysis technique proposed for use by the EDP auditor: software that compares the model developed and loaded in the first component with the description of actual activity collated by the second component. Results from this comparison are then reviewed by the auditor, who determines whether they adequately depict the situation, or whether the model's description as specified in the first component needs to be altered and the modeling process re-initiated. In conducting the research, information and data from a production installation were used. Use of this 'real-world' input proved both the feasibility of developing a model of the reported activities of the Data Processing Department and the adequacy of the operating system log as a source of information on the department's actual activities; it also enabled the involvement and comment of practicing auditors. The research involved analysis of the effect of EDP on the audit process, the structure of the EDP audit process, data reduction, data structures, model formalization, and model processing software. Additionally, the Standard Audit File concept was verified through its use by practicing auditors, and expanded by the development of an indexed data structure, which enabled its analysis to be conducted interactively. Results from the trial implementation of the research software and methodology at a production installation confirmed the research hypothesis that the activities of the Data Processing Department can be modeled, and that there are substantial benefits for the EDP auditor in analyzing this process. The research thus provides a new source of information and develops a new analysis technique for the EDP auditor. It demonstrates the use of computer technology to monitor itself for the audit function, and reasserts auditor independence by providing access to technical detail describing the processing activities of the computer.
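The three-component comparison can be pictured with a minimal sketch; the job records and field names below are hypothetical stand-ins for the Standard Audit File content, not the thesis's actual formats.

```python
# Illustrative comparison of presumed activities (the model) against
# actual activities extracted from an operating system log.
from dataclasses import dataclass

@dataclass(frozen=True)
class JobRun:
    name: str
    shift: str          # e.g. "night"; fields here are hypothetical

# Component 1: model of presumed activities (from schedules and reports)
presumed = {JobRun("PAYROLL", "night"), JobRun("BACKUP", "night")}

# Component 2: actual activities collated from the OS log into a
# Standard-Audit-File-like structure
actual = {JobRun("PAYROLL", "night"), JobRun("ADHOC_QUERY", "night")}

# Component 3: the modeling process - exceptions in either direction
# are reported to the auditor for review.
not_run = presumed - actual       # scheduled but absent from the log
unexplained = actual - presumed   # on the log but not in the model
print("Scheduled but not run:", not_run)
print("Unexplained activity:", unexplained)
```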

Relevance: 100.00%

Publisher:

Abstract:

In the search for productivity increases, industry has invested in the development of intelligent, flexible, and self-adjusting methods capable of controlling processes through autonomous systems, whether implemented in hardware or software. Nevertheless, modelling production systems with conventional computational techniques is challenging, given their complexity and non-linearity. Compared to traditional models, approaches based on Artificial Neural Networks (ANNs) perform well in noise suppression and in the treatment of non-linear data. The challenges in the wood industry therefore justify the use of ANNs as a tool for process improvement and, consequently, for adding value to the final product. Furthermore, Artificial Intelligence techniques such as Neuro-Fuzzy Networks (NFNs) have proven effective, since NFNs combine the ability of ANNs to learn from previous examples and generalise the acquired information with the capacity of Fuzzy Logic to transform linguistic variables into rules.
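As a toy illustration of the Fuzzy Logic side of an NFN, the sketch below turns two linguistic rules into a smooth numeric output. The membership functions, rule consequents, and the moisture/feed-rate framing are invented for illustration; in a full NFN the membership parameters would be learned from data by the neural side.

```python
# Minimal zero-order Takagi-Sugeno fuzzy inference: linguistic rules
# ("IF moisture is low THEN ...") become a smooth numeric output.
import math

def low(x, c=10.0, w=4.0):    # membership of "moisture is low"
    return 1.0 / (1.0 + math.exp((x - c) / w))

def high(x, c=10.0, w=4.0):   # membership of "moisture is high"
    return 1.0 - low(x, c, w)

def feed_rate(moisture):
    # Rule 1: IF moisture is low  THEN feed_rate = 1.2
    # Rule 2: IF moisture is high THEN feed_rate = 0.6
    w1, w2 = low(moisture), high(moisture)
    return (w1 * 1.2 + w2 * 0.6) / (w1 + w2)  # weighted-average defuzzification

for m in (5.0, 10.0, 18.0):
    print(m, round(feed_rate(m), 3))
```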

Relevance: 100.00%

Publisher:

Abstract:

"January 1980."

Relevance: 100.00%

Publisher:

Abstract:

The work comprises a new theoretical development applied to aid decision making in an increasingly important commercial sector. Agile supply, where small volumes of high-margin, short-life-cycle innovative products are offered, is increasingly carried out through a complex global supply chain network. We outline an equilibrium solution in such a supply chain network, which works through limited cooperation and coordination along edges (links) in the network; the links, rather than the nodes, constitute the stochastic modelling entities. We utilise newly developed phase plane analysis to identify, model and predict characteristic behaviour in supply chain networks. The phase plane charts profile the flow of inventory and identify out-of-control conditions; they maintain quality within the network and intelligently track the way the network evolves under conditions of changing variability. The methodology is essentially distribution-free, relying as it does on the study of forecasting errors, and can be used to examine contractual details as well as strategic and game-theoretical concepts between the decision-making components (agents) of a network. We illustrate the approach with typical data drawn from an agile supply chain for fashion products.
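A minimal sketch of a phase plane trajectory built from forecasting errors, assuming simple exponential smoothing; the demand series, smoothing constant, and the choice of axes (smoothed error against its smoothed change) are illustrative assumptions rather than the paper's exact construction.

```python
# Illustrative phase plane for a supply chain link: track the smoothed
# one-step forecast error against its smoothed first difference, so a
# trajectory drifting from the origin flags an out-of-control link.
import numpy as np

rng = np.random.default_rng(1)
demand = 100 + rng.normal(0, 5, 60)
demand[40:] += 15                    # a step change mid-series

alpha = 0.3
forecast, e_s, de_s, prev_err = demand[0], 0.0, 0.0, 0.0
trajectory = []
for d in demand:
    err = d - forecast
    e_s = alpha * err + (1 - alpha) * e_s                  # smoothed error
    de_s = alpha * (err - prev_err) + (1 - alpha) * de_s   # smoothed change
    trajectory.append((e_s, de_s))
    prev_err = err
    forecast = alpha * d + (1 - alpha) * forecast          # exponential smoothing

for x, y in trajectory[::10]:
    print(f"({x:6.2f}, {y:6.2f})")   # points to plot on the phase plane
```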

Relevance: 100.00%

Publisher:

Abstract:

Objective: To develop sedation, pain, and agitation quality measures using process control methodology and evaluate their properties in clinical practice. Design: A Sedation Quality Assessment Tool was developed and validated to capture data for 12-hour periods of nursing care. Domains included pain/discomfort and sedation-agitation behaviors; sedative, analgesic, and neuromuscular blocking drug administration; ventilation status; and conditions potentially justifying deep sedation. Predefined sedation-related adverse events were recorded daily. Using an iterative process, algorithms were developed to describe the proportion of care periods with poor limb relaxation, poor ventilator synchronization, unnecessary deep sedation, agitation, and an overall optimum sedation metric. Proportion charts described processes over time (2-month intervals) for each ICU. The numbers of patients treated between sedation-related adverse events were described with G charts. Automated algorithms generated charts for 12 months of sequential data. Mean values for each process were calculated, and variation within and between ICUs was explored qualitatively. Setting: Eight Scottish ICUs over a 12-month period. Patients: Mechanically ventilated patients. Interventions: None. Measurements and Main Results: The Sedation Quality Assessment Tool agitation-sedation domains correlated with the Richmond Agitation-Sedation Scale score (Spearman ρ = 0.75) and were reliable in clinician-clinician (weighted κ = 0.66) and clinician-researcher (κ = 0.82) comparisons. The limb movement domain had fair correlation with the Behavioral Pain Scale (ρ = 0.24) and was reliable in clinician-clinician (κ = 0.58) and clinician-researcher (κ = 0.45) comparisons. Ventilator synchronization correlated with the Behavioral Pain Scale (ρ = 0.54), and reliability in clinician-clinician (κ = 0.29) and clinician-researcher (κ = 0.42) comparisons was fair to moderate. Eight hundred twenty-five patients were enrolled (range, 59-235 across ICUs), providing 12,385 care periods for evaluation (range, 655-3,481 across ICUs). The mean proportion of care periods with each quality metric varied between ICUs: excessive sedation 12-38%; agitation 4-17%; poor relaxation 13-21%; poor ventilator synchronization 8-17%; and overall optimum sedation 45-70%. Mean adverse event intervals ranged from 1.5 to 10.3 patients treated. The quality measures appeared relatively stable during the observation period. Conclusions: Process control methodology can be used to simultaneously monitor multiple aspects of pain-sedation-agitation management within ICUs. Variation within and between ICUs could be used as a trigger to explore practice variation, improve quality, and monitor this over time.
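For orientation, the sketch below computes control limits for the two chart types mentioned: a proportion (p) chart for care periods showing a quality problem, and a G chart for patients treated between adverse events. The counts are invented and the limit formulas are the textbook ones, not necessarily those used in the study.

```python
# Illustrative control limits for a p chart and a G chart. All counts invented.
import math

# p chart: proportion of 12-hour care periods with, say, agitation
periods = [120, 135, 110, 128]        # care periods per 2-month interval
events = [14, 9, 18, 12]              # periods with agitation
p_bar = sum(events) / sum(periods)    # centre line
for n, x in zip(periods, events):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
    print(f"p = {x / n:.3f}  limits = [{lcl:.3f}, {ucl:.3f}]")

# G chart: patients treated between adverse events (geometric counts)
gaps = [4, 9, 2, 7, 12]               # patients between successive events
g_bar = sum(gaps) / len(gaps)         # centre line
ucl_g = g_bar + 3 * math.sqrt(g_bar * (g_bar + 1))   # geometric 3-sigma limit
print(f"G chart centre = {g_bar:.1f}, UCL = {ucl_g:.1f}")
```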

Relevance: 100.00%

Publisher:

Abstract:

In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge in a form normally used by human operators. An architecture for process control is proposed which centres on a historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system are expressed by means of three types of rule: database updating rules, process control rules, and data deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state transition model for the application. The dividing instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
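A small sketch of how relative temporal knowledge of the kind described might be stored and propagated, assuming a qualitative "before"/"meets" vocabulary and a single transitivity rule; the events and relations are invented, and this is not the paper's formal theory.

```python
# Illustrative store of relative temporal knowledge: occurrences are
# related qualitatively ("before", "meets") without absolute times, and
# a simple closure step infers new relations by transitivity.
relations = {
    ("valve_open", "pump_on"): "meets",   # valve_open ends as pump_on starts
    ("pump_on", "alarm"): "before",
}

def infer(relations):
    """One pass of transitive inference: before/meets chains give before."""
    derived = dict(relations)
    for (a, b1), r1 in relations.items():
        for (b2, c), r2 in relations.items():
            if b1 == b2 and {r1, r2} <= {"before", "meets"} and (a, c) not in derived:
                derived[(a, c)] = "before"
    return derived

print(infer(relations))   # adds ("valve_open", "alarm"): "before"
```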

Relevance: 100.00%

Publisher:

Abstract:

Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring, and atmospheric studies. Single flight lines of airborne hyperspectral data are often in the region of tens of gigabytes in size, which means that a single aircraft can collect terabytes of remotely sensed hyperspectral data in a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude, and then geocorrected. To enable efficient processing of these large datasets, the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common Band Interleaved by Line (BIL) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the facility's automated processing; to this end APL can be used under both Windows and Linux environments, on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor, and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008, together with in situ surveyed ground control points.
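The processing chain can be pictured as a sequence of BIL-to-BIL stages, as in the conceptual sketch below; the function names are hypothetical placeholders, not the actual APL command-line tools.

```python
# Conceptual view of the APL-style processing chain; each stage reads and
# writes BIL files so stages can be mixed with other remote sensing tools.
# Function names here are placeholders, not the real APL interface.
def radiometric_calibration(raw):  return raw + ".calibrated.bil"
def sync_navigation(bil):          return bil + ".nav.bil"
def geocorrect(bil):               return bil + ".mapped.bil"
def resample(bil):                 return bil + ".resampled.bil"

def process_flight_line(raw_file):
    """Run one flight line through the four stages in order."""
    bil = radiometric_calibration(raw_file)   # sensor counts -> radiance
    bil = sync_navigation(bil)                # attach position/attitude
    bil = geocorrect(bil)                     # project to map coordinates
    return resample(bil)                      # regular grid for analysis

print(process_flight_line("eagle_line01.raw"))
```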

Relevance: 100.00%

Publisher:

Abstract:

This paper presents a new algorithm for training nonlinear optimal neuro-controllers (in the form of the model-free, action-dependent, adaptive critic paradigm). The algorithm overcomes problems with existing stochastic backpropagation training, namely the need for data storage, parameter shadowing, and poor convergence, and so offers significant benefits for online applications.
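For context, a minimal action-dependent adaptive critic in the generic form: a critic Q(x, u) is learned online from temporal differences, and the action is chosen to minimise it, with no plant model. The plant, cost, features, and learning rate below are invented, and this is not the paper's new algorithm (which specifically replaces stochastic backpropagation training).

```python
# Minimal action-dependent adaptive critic with a linear-in-features critic
# Q(x, u) = w . [x^2, u^2, x*u], learned online from temporal differences.
import numpy as np

rng = np.random.default_rng(2)
w = np.zeros(3)                      # critic weights
feat = lambda x, u: np.array([x * x, u * u, x * u])

x, gamma, lr = 1.0, 0.9, 0.05
for step in range(200):
    # greedy action for a quadratic critic: dQ/du = 2*w1*u + w2*x = 0
    u = -w[2] * x / (2 * w[1]) if w[1] > 1e-6 else rng.normal(0, 0.3)
    cost = x * x + 0.1 * u * u                          # instantaneous cost
    x_next = 0.8 * x + 0.5 * u + rng.normal(0, 0.01)    # unknown plant (invented)
    u_next = -w[2] * x_next / (2 * w[1]) if w[1] > 1e-6 else 0.0
    # temporal-difference error: Q(x,u) should track cost + gamma*Q(x',u')
    td = cost + gamma * (w @ feat(x_next, u_next)) - (w @ feat(x, u))
    w += lr * td * feat(x, u)                           # gradient critic update
    x = x_next
print("critic weights:", np.round(w, 3))
```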

Relevance: 100.00%

Publisher:

Abstract:

This paper points out a serious flaw in dynamic multivariate statistical process control (MSPC). The principal component analysis of a linear time series model that is employed to capture auto- and cross-correlation in recorded data may produce a considerable number of variables to be analysed. To give a dynamic representation of the data (based on variable correlation) and circumvent the production of a large time series structure, a linear state space model is used here instead. The paper demonstrates that, by incorporating a state space model, the number of variables to be analysed dynamically can be considerably reduced compared to conventional dynamic MSPC techniques.
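A toy sketch of the reduction being argued for: rather than stacking lagged measurements for PCA, the data are filtered down to a small state vector, which is then monitored. The state space matrices, noise levels, and the use of a Kalman filter with Hotelling's T² are assumptions for illustration.

```python
# Illustrative dynamic MSPC via a state space model: filter 3 measured
# variables down to 2 states and monitor the states. Matrices invented.
import numpy as np

A = np.array([[0.9, 0.1], [0.0, 0.8]])      # state transition (2 states)
C = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.2, 0.3]])                   # 3 measured variables
Q, R = 0.01 * np.eye(2), 0.05 * np.eye(3)

rng = np.random.default_rng(3)
x_true, x_hat, P = np.zeros(2), np.zeros(2), np.eye(2)
states = []
for k in range(300):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    y = C @ x_true + rng.multivariate_normal(np.zeros(3), R)
    x_hat, P = A @ x_hat, A @ P @ A.T + Q            # Kalman predict
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)     # gain
    x_hat = x_hat + K @ (y - C @ x_hat)              # Kalman correct
    P = (np.eye(2) - K @ C) @ P
    states.append(x_hat)

Z = np.array(states)
Z = Z - Z.mean(axis=0)
S = np.cov(Z.T)
t2 = [float(z @ np.linalg.solve(S, z)) for z in Z]   # T^2 on just 2 states
print("mean T^2 over state estimates:", round(float(np.mean(t2)), 2))
```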

Relevance: 100.00%

Publisher:

Abstract:

Polymer extrusion is a complex process, and the availability of good dynamic models is key to improved system operation. Previous modelling attempts have either failed to capture the non-linearities of the process adequately or have proved too complex for control applications. This work presents a novel approach to the problem through the modelling of extrusion viscosity and pressure, adopting a grey box modelling technique that combines mechanistic knowledge with empirical data using a genetic algorithm approach. The models are shown to outperform those of a much higher order generated by a conventional black box technique, while providing insight into the underlying processes at work within the extruder.
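A compact sketch of grey-box fitting with a genetic algorithm: a mechanistic form, here a power-law viscosity model assumed purely for illustration, fixes the structure while the GA searches its parameters against data. The data, bounds, and GA settings are all invented.

```python
# Illustrative grey-box fit: keep a mechanistic power-law viscosity model
# eta = K * gamma^(n-1) and let a small genetic algorithm search (K, n).
import numpy as np

rng = np.random.default_rng(4)
shear = np.linspace(1, 100, 30)
eta_meas = 50.0 * shear ** (0.4 - 1) + rng.normal(0, 0.3, 30)  # synthetic data

def fitness(params):
    K, n = params
    return -np.mean((K * shear ** (n - 1) - eta_meas) ** 2)    # negative MSE

pop = rng.uniform([1, 0.1], [100, 1.0], size=(40, 2))          # (K, n) pairs
for gen in range(60):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]                    # selection
    children = parents[rng.integers(0, 20, 40)].copy()         # reproduction
    children += rng.normal(0, [1.0, 0.02], size=(40, 2))       # mutation
    pop = np.clip(children, [1, 0.1], [100, 1.0])

best = max(pop, key=fitness)
print(f"K = {best[0]:.1f}, n = {best[1]:.2f}")                 # near 50, 0.4
```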

Relevance: 100.00%

Publisher:

Abstract:

Anti-islanding protection is becoming increasingly important due to the rapid installation of distributed generation from renewable resources such as wind, tidal and wave, solar PV and bio-fuels, as well as from other resources such as diesel. Unintentional islanding presents a potential risk of damage to utility plant and to equipment connected on the demand side, as well as a safety risk to the public and to personnel in utility plants. This paper investigates automatic islanding detection, achieved by deploying a statistical process control approach for fault detection on real-time data acquired through a wide area measurement system based on Phasor Measurement Unit (PMU) technology. In particular, principal component analysis (PCA) is used to project the data into the principal component subspace and the residual space, and two statistics are used to detect the occurrence of a fault. A fault reconstruction method is then used to identify the fault and its development over time. The proposed scheme has been applied to a real system, and the results confirm that the method can correctly identify the fault and the islanding site.
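A minimal sketch of the monitoring step described, assuming the standard Hotelling's T² and SPE pair for the two statistics and a simple reconstruction-based isolation of the faulty channel; the PMU-like data and the fault are simulated.

```python
# Illustrative PCA-based detection on PMU-like data: project into the
# principal component subspace and the residual space, then watch T^2 and
# SPE; isolate the fault by reconstruction. Data and limits are simulated.
import numpy as np

rng = np.random.default_rng(5)
normal = rng.normal(size=(500, 6))            # 6 PMU channels, normal operation
mu, sd = normal.mean(axis=0), normal.std(axis=0)
X = (normal - mu) / sd

U, S, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:3].T                                  # retained components
lam = (S[:3] ** 2) / (len(X) - 1)

def statistics(sample):
    x = (sample - mu) / sd
    t = x @ P
    t2 = float(np.sum(t ** 2 / lam))          # distance in the PC subspace
    spe = float(np.sum((x - t @ P.T) ** 2))   # distance in the residual space
    return t2, spe

def reconstruct(sample):
    """Simple reconstruction-based isolation: find the channel whose
    correction best restores SPE."""
    base = statistics(sample)[1]
    drops = []
    for j in range(6):
        s = sample.copy()
        s[j] = mu[j]                          # replace channel j with its mean
        drops.append(base - statistics(s)[1])
    return int(np.argmax(drops))

ok = rng.normal(size=6)
fault = ok + np.array([0, 0, 4.0, 0, 0, 0])   # step fault on channel 2
print("normal:", statistics(ok))
print("fault :", statistics(fault))           # SPE jumps for this fault
print("faulty channel:", reconstruct(fault))
```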