940 results for Electric power systems -- Quality control
Abstract:
At head of title: FCST energy R&D goals study.
Abstract:
Mode of access: Internet.
Abstract:
"January 1980."
Abstract:
"DOE/RG-0045."
Abstract:
Cover title.
Abstract:
At head of title: Microwave Research Institute, Polytechnic Institute of Brooklyn, Systems and Control Group, R-735, PIB-663, contract no. DA-30-069-ORD-1560.
Abstract:
"Submitted pursuant to: Section 8-405.1 of the Revised Public Utilities Act."
Abstract:
Includes index.
Abstract:
"22 April 1983."
Abstract:
Vols. 1-87, 1872-1940, also called no. 1-258.
Abstract:
Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures, with standardized protocols, training, and quality control, is a step towards optimizing data quality; quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
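The two sensitivity-testing strategies described in the abstract can be sketched in a few lines. This is a minimal illustration, not the MONICA Project's actual procedure: the population names, prevalence values, and quality scores below are hypothetical.

```python
# Two sensitivity-testing strategies for survey estimates:
# (1) exclude populations whose quality score falls below a threshold;
# (2) weight each population's estimate by its quality score.
# All data here are hypothetical, for illustration only.

populations = [
    # (name, estimated risk-factor prevalence, quality score in [0, 1])
    ("pop_a", 0.31, 0.9),
    ("pop_b", 0.27, 0.8),
    ("pop_c", 0.40, 0.4),  # low-quality survey
]

def unweighted_mean(pops):
    """Plain average of the population estimates."""
    return sum(p for _, p, _ in pops) / len(pops)

def exclude_low_quality(pops, threshold=0.5):
    """Strategy 1: drop populations with a quality score below threshold."""
    return [entry for entry in pops if entry[2] >= threshold]

def quality_weighted_mean(pops):
    """Strategy 2: weight each estimate by its quality score."""
    total_weight = sum(w for _, _, w in pops)
    return sum(p * w for _, p, w in pops) / total_weight

print(round(unweighted_mean(populations), 4))
print(round(unweighted_mean(exclude_low_quality(populations)), 4))
print(round(quality_weighted_mean(populations), 4))
```

Both strategies pull the pooled estimate toward the higher-quality surveys; comparing the three printed values shows how sensitive the result is to the low-quality population.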
Abstract:
Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and its limited ability to model business processes that cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility may be achieved. In this paper we present a framework that allows the workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow (structural, temporal, etc.), our focus in this paper is on a constraint that allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
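A cardinality constraint of the kind the abstract describes can be sketched as a bound on how many activities an instance may select from an allowed pool, validated at runtime. This is a hypothetical illustration, not the paper's formal model; the activity names and bounds are invented.

```python
# Sketch of a cardinality constraint: each workflow instance selects its own
# subset of activities at runtime, and validation checks that the selection
# (a) stays within the allowed pool and (b) respects the min/max cardinality.
# Pool contents and bounds are illustrative assumptions.

class CardinalityConstraint:
    def __init__(self, pool, min_n, max_n):
        self.pool = set(pool)
        self.min_n = min_n
        self.max_n = max_n

    def validate(self, selected):
        selected = set(selected)
        if not selected <= self.pool:
            return False  # instance chose an activity outside the pool
        return self.min_n <= len(selected) <= self.max_n

# Between 1 and 2 of these four optional activities per instance:
c = CardinalityConstraint({"review", "lab_test", "x_ray", "consult"}, 1, 2)

print(c.validate({"review"}))                        # True: within bounds
print(c.validate({"review", "lab_test", "x_ray"}))   # False: too many
print(c.validate({"biopsy"}))                        # False: not in pool
```

Because the constraint is checked against each instance's own selection, two instances of the same process model can legitimately execute different activity sets, which is the flexibility the framework is after.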
Abstract:
In order to survive in an increasingly customer-oriented marketplace, continuous quality improvement marks the success of the fastest-growing quality organizations. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature to provide employees at all levels with the ability to understand the relationships between processes, especially when any aspect of the process is about to degrade or fail. An example using generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
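The flavor of a fuzzy association rule linking a process parameter to a quality problem can be sketched as follows. This is a toy illustration of fuzzy support and confidence for one rule, not the paper's iterative distributed algorithm; the membership function, parameter, and records are invented.

```python
# Toy fuzzy association rule: "oven temperature is High -> defect".
# Fuzzy support/confidence replace crisp counts with degrees of membership.
# Membership ramp, parameter name, and records are illustrative assumptions.

def mu_high(temp, lo=180.0, hi=220.0):
    """Membership of `temp` in the fuzzy set 'High' (linear ramp lo..hi)."""
    if temp <= lo:
        return 0.0
    if temp >= hi:
        return 1.0
    return (temp - lo) / (hi - lo)

# (oven temperature, defect observed? 1/0) for a batch of parts
records = [(170, 0), (200, 0), (215, 1), (225, 1), (190, 0), (230, 1)]

def fuzzy_support_confidence(recs):
    n = len(recs)
    # support of the antecedent: average membership in 'High'
    sup_ante = sum(mu_high(t) for t, _ in recs) / n
    # support of the rule: min-conjunction of antecedent and consequent
    sup_rule = sum(min(mu_high(t), float(d)) for t, d in recs) / n
    conf = sup_rule / sup_ante if sup_ante else 0.0
    return sup_rule, conf

sup, conf = fuzzy_support_confidence(records)
print(round(sup, 3), round(conf, 3))
```

A high confidence for this rule would flag temperature as a parameter whose drift toward "High" co-occurs with quality problems, which is the kind of cross-process relationship the proposed system surfaces for employees.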
Abstract:
Biomass production, conversion and utilization can be carried out locally, adding value for small farmers. However, new technical inputs are needed for profitable exploitation of biomass within the constraints of land, water and skill availability, and to provide the higher quality of energy needed for rural industries. Trigeneration, the simultaneous generation of energy in three forms (electric power, heat for processing, and refrigeration), helps to fully utilize the energy stored in biomass and would be most appropriate for micro-enterprises. This paper presents concepts for trigeneration systems feasible for rural areas.
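A back-of-the-envelope energy balance shows why trigeneration utilizes biomass energy more fully than electricity generation alone. All efficiencies and fractions below are illustrative assumptions, not figures from the paper.

```python
# Rough energy balance for a biomass trigeneration unit.
# Electricity alone captures a small fraction of the fuel energy; recovering
# process heat and driving an absorption chiller from the same fuel captures
# more. Every number here is an illustrative assumption.

fuel_energy_kwh = 100.0    # energy content of the biomass fed in
eta_electric = 0.20        # assumed small-scale electric conversion efficiency
eta_heat_recovery = 0.45   # assumed share of fuel energy recovered as heat
heat_to_chiller = 0.3      # assumed fraction of recovered heat sent to cooling
cop_absorption = 0.7       # assumed COP of the absorption chiller

electricity = fuel_energy_kwh * eta_electric
recovered_heat = fuel_energy_kwh * eta_heat_recovery
cooling = recovered_heat * heat_to_chiller * cop_absorption
process_heat = recovered_heat * (1 - heat_to_chiller)

total_useful = electricity + process_heat + cooling
print(electricity, process_heat, cooling, total_useful)
```

Under these assumptions, the three useful outputs together recover roughly three times the energy that electricity generation alone would deliver from the same biomass input.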
Abstract:
Information technology companies are having to broaden their overall strategic view in deference to the premise that it is better to be market-driven than technology-led. Cost and technical performance are no longer the only considerations, as quality and service now demand equal recognition. The production of a high-volume single item has given way to that of low-volume multiple items, which in turn requires some modification of production systems and brings flexible manufacturing, just-in-time production and total quality control into sharper focus for the achievement of corporate objectives.