956 results for process parameter monitoring


Relevance: 30.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 30.00%

Publisher:

Abstract:

"The Corey H. Continuous Improvement Process is the ongoing collection and analysis of information for all entities serving children with disabilities to determine compliance with applicable state and federal requirements."--P. [1].

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a review of the modelling and control of biological nutrient removal (BNR)-activated sludge processes for wastewater treatment using distributed parameter models described by partial differential equations (PDEs). Numerical methods for solving the BNR-activated sludge process dynamics are reviewed, including the method of lines, global orthogonal collocation and orthogonal collocation on finite elements. Fundamental techniques and conceptual advances of the distributed parameter approach to the dynamics and control of activated sludge processes are briefly described. A critical analysis of the advantages of the distributed parameter approach over the conventional modelling strategy shows that the activated sludge process is more adequately described by the former, and the method is recommended for application in the wastewater industry.
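As a hedged illustration of the method of lines mentioned above, the sketch below discretises a single-substrate 1D advection-diffusion-reaction equation in space and hands the resulting ODE system to a stiff integrator; the model and all parameter values are simplified stand-ins for full BNR-activated sludge kinetics, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of lines for a 1D advection-diffusion-reaction PDE:
#   dc/dt = D d2c/dx2 - v dc/dx - k*c
# a single-substrate stand-in for full BNR-activated sludge kinetics.
D, v, k = 1e-4, 0.01, 0.05       # diffusion, advection, first-order decay (assumed)
L, n = 1.0, 50                   # reactor length [m], grid points
x = np.linspace(0, L, n)
dx = x[1] - x[0]

def rhs(t, c):
    dcdt = np.empty_like(c)
    # interior points: central differences in space
    dcdt[1:-1] = (D * (c[2:] - 2*c[1:-1] + c[:-2]) / dx**2
                  - v * (c[2:] - c[:-2]) / (2*dx)
                  - k * c[1:-1])
    dcdt[0] = 0.0                # fixed inlet concentration (Dirichlet)
    dcdt[-1] = dcdt[-2]          # zero-gradient outlet (crude Neumann)
    return dcdt

c0 = np.zeros(n); c0[0] = 1.0    # feed enters at x = 0
sol = solve_ivp(rhs, (0, 200), c0, method="BDF", t_eval=[0, 50, 100, 200])
print(sol.y[:, -1].round(3))     # concentration profile at t = 200
```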

Relevance: 30.00%

Publisher:

Abstract:

Presence-absence surveys are a commonly used method for monitoring broad-scale changes in wildlife distributions. However, the lack of power of these surveys for detecting population trends is problematic for their application in wildlife management. Options for improving power include increasing the sampling effort or arbitrarily relaxing the type I error rate. We present an alternative, whereby targeted sampling of particular habitats in the landscape using information from a habitat model increases power. The advantage of this approach is that it does not require a trade-off with either cost or the Pr(type I error) to achieve greater power. We use a demographic model of koala (Phascolarctos cinereus) population dynamics and simulations of the monitoring process to estimate the power to detect a trend in occupancy for a range of strategies, thereby demonstrating that targeting particular habitat qualities can improve power substantially. If the objective is to detect a decline in occupancy, the optimal strategy is to sample high-quality habitats. Alternatively, if the objective is to detect an increase in occupancy, the optimal strategy is to sample intermediate-quality habitats. The strategies with the highest power remained the same under a range of parameter assumptions, although observation error had a strong influence on the optimal strategy. Our approach specifically applies to monitoring for detecting long-term trends in occupancy or abundance. This is a common and important monitoring objective for wildlife managers, and we provide guidelines for more effectively achieving it.
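The power calculation described above can be approximated with a small Monte Carlo sketch: simulate a declining occupancy, overlay imperfect detection, and count how often a trend test rejects. All parameter values below are illustrative placeholders, not the paper's koala model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power_to_detect_decline(psi0=0.6, annual_decline=0.05, n_sites=100,
                            n_years=10, p_detect=0.8, n_sims=1000, alpha=0.05):
    """Monte Carlo power of a presence-absence survey to detect an
    occupancy decline, with imperfect detection (all values illustrative)."""
    detections = 0
    years = np.arange(n_years)
    for _ in range(n_sims):
        psi = psi0 * (1 - annual_decline) ** years           # true occupancy
        occupied = rng.random((n_years, n_sites)) < psi[:, None]
        observed = occupied & (rng.random((n_years, n_sites)) < p_detect)
        naive_occ = observed.mean(axis=1)                    # naive occupancy index
        slope, _, _, p_value, _ = stats.linregress(years, naive_occ)
        if p_value < alpha and slope < 0:
            detections += 1
    return detections / n_sims

print(power_to_detect_decline())             # power at baseline effort
print(power_to_detect_decline(n_sites=300))  # more sites -> higher power
```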

Relevance: 30.00%

Publisher:

Abstract:

Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and its limited ability to model business processes that cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility can be achieved. In this paper we present a framework that allows a workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow, such as structural or temporal aspects, our focus in this paper is on a constraint that allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
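The paper does not give a constraint syntax, but a minimal, hypothetical validator conveys the idea of a cardinality constraint: at instantiation time, the selected optional activities must number between a lower and an upper bound.

```python
from dataclasses import dataclass

@dataclass
class CardinalityConstraint:
    """Hypothetical cardinality constraint: an instance must include
    between `minimum` and `maximum` activities from `pool`."""
    pool: frozenset
    minimum: int
    maximum: int

    def validate(self, selected: set) -> bool:
        chosen = selected & self.pool
        return self.minimum <= len(chosen) <= self.maximum

# Partially specified model: any 1-2 of three optional review activities.
constraint = CardinalityConstraint(
    pool=frozenset({"peer_review", "manager_review", "audit"}),
    minimum=1, maximum=2)

print(constraint.validate({"peer_review", "draft"}))                    # True
print(constraint.validate({"draft"}))                                   # False: none chosen
print(constraint.validate({"peer_review", "manager_review", "audit"}))  # False: too many
```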

Relevance: 30.00%

Publisher:

Abstract:

The estimation of a concentration-dependent diffusion coefficient in a drying process is known as an inverse coefficient problem. The solution is sought in the case where the space-averaged concentration is known as a function of time (mass loss monitoring). The problem is stated as the minimization of a functional, and gradient-based algorithms are used to solve it. Many numerical and experimental examples that demonstrate the effectiveness of the proposed approach are presented. Thin-slab drying was carried out in an isothermal drying chamber built in our laboratory. The diffusion coefficients of fructose obtained with the present method are compared with existing literature results.
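A condensed sketch of this inverse problem, assuming (hypothetically) an exponential form D(c) = D0*exp(b*c) and synthetic mass-loss data: the forward model integrates the nonlinear diffusion equation on a slab, and a gradient-based optimiser minimises the misfit between simulated and "measured" space-averaged concentration curves.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

n, dx = 30, 1.0 / 30
t_obs = np.linspace(0, 2.0, 20)

def forward(params):
    """Average concentration over time for dc/dt = d/dx(D(c) dc/dx),
    with D(c) = D0*exp(b*c) (assumed form), drying surface at x = 0."""
    D0, b = params
    def rhs(t, c):
        cb = np.concatenate(([0.0], c, [c[-1]]))        # c=0 at surface, sealed back
        D = D0 * np.exp(b * 0.5 * (cb[1:] + cb[:-1]))   # D at cell faces
        flux = -D * np.diff(cb) / dx
        return -np.diff(flux) / dx
    sol = solve_ivp(rhs, (0, t_obs[-1]), np.ones(n), t_eval=t_obs, method="BDF")
    return sol.y.mean(axis=0)                           # space-averaged concentration

# Synthetic "mass loss" data from a known D(c), plus noise.
true = (0.05, 1.0)
data = forward(true) + np.random.default_rng(1).normal(0, 0.002, t_obs.size)

res = minimize(lambda p: np.sum((forward(p) - data) ** 2),
               x0=(0.02, 0.5), method="L-BFGS-B",
               bounds=[(1e-4, 1.0), (0.0, 3.0)])
print(res.x)   # recovered (D0, b); should land near (0.05, 1.0)
```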

Relevance: 30.00%

Publisher:

Abstract:

Regular monitoring of wastewater characteristics is undertaken at most wastewater treatment plants. The data acquired during this process are usually filed and forgotten. However, systematic analysis of these data can provide useful insights into plant behaviour. Conventional graphical techniques are inadequate to give a good overall picture of how wastewater characteristics vary with time and along the lagoon system. An approach based on the use of contour plots was devised that largely overcomes this problem. Superimposition of contour plots for different parameters can be used to gain a qualitative understanding of the nature and strength of relationships between the parameters. This is illustrated in an analysis of monitoring data for lagoon 115 East at the Western Treatment Plant, near Melbourne, Australia. In this illustrative analysis, relationships between ammonia removal rates and parameters such as chlorophyll a level and temperature are explored using a contour plot superimposition approach. It is concluded that this approach can help improve our understanding, not only of lagoon systems, but of other wastewater treatment systems as well.
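A minimal sketch of the superimposition idea with matplotlib, using synthetic values in place of the lagoon 115 East data: one parameter as filled contours, a second as labelled contour lines on the same time-distance axes.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative monitoring grid: weeks x sampling points along a lagoon
# (synthetic values; the paper uses real data from lagoon 115 East).
weeks = np.arange(52)
stations = np.linspace(0, 10, 6)            # km along the lagoon
T, X = np.meshgrid(weeks, stations)

temperature = 15 + 8 * np.sin(2 * np.pi * T / 52)        # seasonal signal
ammonia_removal = 0.4 + 0.03 * temperature - 0.02 * X    # crude dependence

fig, ax = plt.subplots()
cs1 = ax.contourf(T, X, temperature, cmap="coolwarm")    # filled: temperature
cs2 = ax.contour(T, X, ammonia_removal, colors="black")  # lines: removal rate
ax.clabel(cs2, fmt="%.2f")
fig.colorbar(cs1, label="Temperature (deg C)")
ax.set_xlabel("Week"); ax.set_ylabel("Distance along lagoon (km)")
ax.set_title("Superimposed contour plots (synthetic data)")
plt.show()
```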

Relevance: 30.00%

Publisher:

Abstract:

We examine current workflow modelling capability from a new angle and demonstrate a weakness of current workflow specification languages in relation to the execution of activities. This shortcoming is mainly due to serious limitations of the corresponding computational/execution model behind the business process modelling language constructs. The main purpose of this paper is the introduction of new specification/modelling constructs allowing more precise representation of complex activity states during execution. This new concept makes visible a new activity state, partial completion of an activity, which in turn allows more flexible and precise enforcement/monitoring of automated business processes.
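The paper's constructs are not reproduced here, but a hypothetical state machine shows the gist: the activity lifecycle gains a visible PARTIALLY_COMPLETED state driven by progress reports, which a monitor can act on before full completion.

```python
from enum import Enum, auto

class ActivityState(Enum):
    NOT_STARTED = auto()
    EXECUTING = auto()
    PARTIALLY_COMPLETED = auto()   # the new, visible intermediate state
    COMPLETED = auto()

class Activity:
    """Activity whose execution exposes partial completion (hypothetical model)."""
    def __init__(self, name, total_units):
        self.name, self.total_units, self.done_units = name, total_units, 0
        self.state = ActivityState.NOT_STARTED

    def report_progress(self, units):
        self.done_units = min(self.done_units + units, self.total_units)
        if self.done_units == self.total_units:
            self.state = ActivityState.COMPLETED
        elif self.done_units > 0:
            self.state = ActivityState.PARTIALLY_COMPLETED
        else:
            self.state = ActivityState.EXECUTING

# A monitor can now react before full completion, e.g. release dependants early.
act = Activity("approve_invoices", total_units=100)
act.report_progress(40)
print(act.state, f"{act.done_units}/{act.total_units}")  # PARTIALLY_COMPLETED 40/100
```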

Relevance: 30.00%

Publisher:

Abstract:

This thesis proposes a methodology to assist repetitive batch manufacturers in adopting certain aspects of Lean Production principles. The methodology concentrates on the reduction of inventory through the setting of appropriate batch sizes, taking account of the effect of sequence-dependent set-ups and the identification and elimination of bottlenecks. It uses a simple Pareto- and modified-EBQ-based analysis technique to allocate items to period-order-day classes based on a combination of each item's annual usage value and set-up cost. The period-order-day classes the items are allocated to are determined by the constraint limits in three measured dimensions: capacity, administration and finance. The methodology overcomes the limitations associated with MRP in the area of sequence-dependent set-ups, and provides a simple way of setting planning parameters that takes this effect into account, concentrating on the reduction of inventory through the systematic identification and elimination of bottlenecks via set-up reduction, so allowing batch sizes to fall. It aims to help traditional repetitive batch manufacturers on a route to continual improvement by:
- highlighting those areas where change would bring the greatest benefits;
- modelling the effect of proposed changes;
- quantifying the benefits that could be gained by implementing the proposed changes;
- simplifying the effort required to perform the modelling process.
It concentrates on increasing flexibility through managed inventory reduction, rationally decreasing batch sizes while taking account of sequence-dependent set-ups and the identification and elimination of bottlenecks. This was achieved through the development of a software modelling tool and validated through a case study approach.
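A toy sketch of the EBQ-based allocation step: compute each item's economic batch quantity, express it in days of demand, and snap it to an allowed period-order-day class. The class set, item data and holding-cost rate are invented, and the thesis's capacity/administration/finance constraint checks are not modelled.

```python
import math

# Illustrative items: (annual_demand, unit_cost, setup_cost); numbers invented.
items = {"A": (12000, 4.0, 150.0), "B": (3000, 20.0, 90.0), "C": (500, 2.5, 60.0)}
HOLDING_RATE = 0.25        # holding cost as a fraction of unit cost per year
DAYS = 250                 # working days per year
CLASSES = [5, 10, 20, 40]  # allowed period-order-day classes (assumed)

def period_order_days(annual_demand, unit_cost, setup_cost):
    holding = HOLDING_RATE * unit_cost
    ebq = math.sqrt(2 * annual_demand * setup_cost / holding)  # classic EBQ
    days = ebq / (annual_demand / DAYS)                        # batch size in days' demand
    # snap to the nearest allowed class (constraint limits not modelled here)
    return min(CLASSES, key=lambda c: abs(c - days))

# Pareto view: rank by annual usage value, then assign classes.
for name, (d, c, s) in sorted(items.items(), key=lambda kv: -kv[1][0] * kv[1][1]):
    print(name, "usage value:", d * c, "-> class:", period_order_days(d, c, s), "days")
```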

Relevance: 30.00%

Publisher:

Abstract:

The automatic interpolation of environmental monitoring network data, such as air quality or radiation levels, in a real-time setting poses a number of practical and theoretical questions. Among the problems found are (i) dealing with and communicating the uncertainty of predictions, (ii) automatic (hyper)parameter estimation, (iii) monitoring network heterogeneity, (iv) dealing with outlying extremes, and (v) quality control. In this paper we discuss these issues in light of the spatial interpolation comparison exercise held in 2004.

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a greedy Bayesian experimental design criterion for heteroscedastic Gaussian process models. The criterion is based on the Fisher information and is optimal in the sense of minimizing parameter uncertainty for likelihood-based estimators. We demonstrate the validity of the criterion under different noise regimes and present experimental results from a rabies simulator to demonstrate the effectiveness of the resulting approximately optimal designs.
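As a simplified, hedged analogue of the criterion (the paper works with heteroscedastic Gaussian process models; here a linear-in-parameters model stands in), a greedy design can add, at each step, the candidate input that maximises the log-determinant of the noise-weighted Fisher information.

```python
import numpy as np

# Candidate inputs and a simple basis (a linear-in-parameters stand-in for
# the GP setting in the paper; the real criterion uses the heteroscedastic
# GP's Fisher information).
x_cand = np.linspace(0, 1, 101)
Phi = np.column_stack([np.ones_like(x_cand), x_cand, x_cand**2])  # basis features
noise_var = 0.05 + 0.5 * x_cand**2   # heteroscedastic: noisier at large x (assumed)

def logdet_fisher(idx):
    """log det of the Fisher information sum_i phi_i phi_i^T / sigma_i^2."""
    F = (Phi[idx].T / noise_var[idx]) @ Phi[idx]
    sign, ld = np.linalg.slogdet(F + 1e-9 * np.eye(Phi.shape[1]))
    return ld

design = []
for _ in range(8):                   # greedily grow an 8-point design
    scores = [logdet_fisher(design + [j]) for j in range(len(x_cand))]
    design.append(int(np.argmax(scores)))

print(sorted(x_cand[design].round(2)))  # chosen inputs favour low-noise regions
```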

Relevance: 30.00%

Publisher:

Abstract:

The work presents a new method that combines plasma etching with extrinsic techniques to simultaneously measure matrix and surface protein and lipid deposits. The acronym for this technique is PEEMS - Plasma Etching and Emission Monitoring System. Previous work identified the presence of proteinaceous and lipoidal deposition on the surface of contact lenses and highlighted the probability that penetration of these spoilants will occur. The technique developed here allows unambiguous identification of the depth of penetration of spoilants for various material types. It is for this reason that the technique has been employed in this thesis. The technique is applied as a 'molecular' scalpel, removing known amounts of material from the target - in this case from both the anterior and posterior surfaces of a 'soft' contact lens. The residual material is then characterised by other analytical techniques such as UV/visible and fluorescence spectroscopy. Several studies have been carried out for both in vivo and in vitro spoilt materials. The analysis and identification of absorbed protein and lipid on the substrate revealed the importance of many factors in the absorption and adsorption process. The effects of material structure, protein nature (in terms of size, shape and charge) and environmental conditions were examined in order to determine the relative uptake of tear proteins. The studies were extended to real cases in order to study patient-dependent factors and lipoidal penetration.

Relevance: 30.00%

Publisher:

Abstract:

Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneity in the error characteristics of different sensors, both in terms of distribution and magnitude, presents problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, assumes that a Gaussian process prior is imposed over the (latent) process being studied and that the sensor model forms part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated in a projected process kriging framework which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
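The sequential approximate inference itself is beyond a short sketch, but the simpler Gaussian-but-heterogeneous case below illustrates why sensor models matter: each sensor's own error variance enters the GP likelihood as a non-constant diagonal term, rather than one shared nugget.

```python
import numpy as np

def rbf(a, b, ell=0.3, sf2=1.0):
    """Squared-exponential covariance."""
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 40))
sensor_var = np.where(x < 0.5, 0.01, 0.25)   # two sensor types: precise vs noisy
y = np.sin(2 * np.pi * x) + rng.normal(0, np.sqrt(sensor_var))

# GP posterior with a diagonal but *non-constant* noise term: each sensor's
# own error variance enters the likelihood (Gaussian case only; the paper's
# method also covers non-Gaussian sensor models via approximate inference).
K = rbf(x, x) + np.diag(sensor_var)
xs = np.linspace(0, 1, 9)
Ks = rbf(xs, x)
mean = Ks @ np.linalg.solve(K, y)
var = rbf(xs, xs).diagonal() - np.einsum("ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)

for p, m, v in zip(xs, mean, var):
    print(f"x={p:.2f}  mean={m:+.2f}  sd={np.sqrt(v):.2f}")
```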

Relevance: 30.00%

Publisher:

Abstract:

An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action 'open tap' leads with certainty to 'tap open', whereas whether there will be a fluid flow and how long it might last is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time is proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
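A toy sketch in the spirit of Qualitative Process Theory (not the thesis's system): the plan action 'open tap' asserts only the direct effect 'tap-open'; whether fluid flows, and for how long, emerges from simulating a process that stays active while its preconditions and quantity conditions hold.

```python
from dataclasses import dataclass, field

@dataclass
class World:
    facts: set = field(default_factory=set)
    tank_level: int = 5          # qualitative amount of fluid in the tank

@dataclass
class Process:
    """QPT-style process: active while preconditions + quantity conditions hold."""
    name: str
    preconditions: set
    quantity_ok: callable
    influence: callable

flow = Process(
    name="fluid-flow",
    preconditions={"tap-open"},
    quantity_ok=lambda w: w.tank_level > 0,
    influence=lambda w: setattr(w, "tank_level", w.tank_level - 1),
)

w = World()
w.facts.add("tap-open")          # the *direct* effect of the plan action 'open tap'
# The *indirect* effect (flow, and for how long) emerges from process simulation:
while flow.preconditions <= w.facts and flow.quantity_ok(w):
    flow.influence(w)
print("flow stopped, tank level:", w.tank_level)   # 0: the flow lasted 5 steps
```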

Relevance: 30.00%

Publisher:

Abstract:

Computer-integrated monitoring is a very large area of engineering in which on-line, real-time data acquisition with the aid of sensors solves many problems in the manufacturing industry, as opposed to the old method of data logging followed by graphical analysis. The raw data collected this way are, however, useless in the absence of a proper computerised management system. The transfer of data between the management and shop-floor processes was impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems because they are governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers. This is still in the early stages of development and is currently very expensive. This research programme shows how such a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated to form a single system, achieving data transfer using a cheaper but superior alternative to MAP. Standard communication character sets and hardware, such as ASCII and UARTs, are used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously and yet receive data from each other and process it in their own CPUs with no human intervention.
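As a generic, hedged illustration of the machine-independent idea (the thesis's actual protocol is not specified here), a reading can be framed as a plain ASCII record with start/end characters and a checksum, which any UART-equipped machine can emit or parse regardless of its CPU or programming language.

```python
STX, ETX = "\x02", "\x03"

def checksum(payload: str) -> str:
    """Two-hex-digit XOR checksum over the ASCII payload."""
    c = 0
    for ch in payload:
        c ^= ord(ch)
    return f"{c:02X}"

def frame(sensor_id: str, value: float) -> str:
    """Encode a reading as a plain-ASCII record any machine can parse."""
    payload = f"{sensor_id},{value:.2f}"
    return f"{STX}{payload}{checksum(payload)}{ETX}"

def parse(record: str):
    assert record[0] == STX and record[-1] == ETX, "bad framing"
    payload, cksum = record[1:-3], record[-3:-1]
    assert checksum(payload) == cksum, "corrupted record"
    sensor_id, value = payload.split(",")
    return sensor_id, float(value)

msg = frame("SPINDLE_TEMP", 71.5)     # produced on the shop-floor machine
print(repr(msg))
print(parse(msg))                     # consumed by the management system
```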