913 results for Drilling process monitoring
Abstract:
An experiment was conceived in which we monitored degradation of GlcDGD. Independent of the fate of the [14C]glucosyl headgroup after hydrolysis from the glycerol backbone, the 14C enters the aqueous or gas phase, whereas the intact lipid is insoluble and remains in the sediment phase. Total degradation of GlcDGD is then obtained by combining the increase of radioactivity in the aqueous and gaseous phases. We chose two different sediments to perform this experiment. One is microbially active surface sediment sampled in February 2010 from the upper tidal flat of the German Wadden Sea near Wremen (53° 38' 0" N, 8° 29' 30" E). The other is deep subsurface sediment recovered from the northern Cascadia Margin during Integrated Ocean Drilling Program Expedition 311 [site U1326, 138.2 meters below seafloor (mbsf), in situ temperature 20 °C, water depth 1,828 m]. We performed both live and killed-control experiments for comparison. Surface and subsurface sediment slurries were incubated in the dark at in situ temperatures of 4 °C and 20 °C, respectively, for 300 d. The sterilized slurry was stored at 20 °C. All incubations were carried out under an N2 headspace to ensure anaerobic conditions. The sampling frequency was high during the first half-month, i.e., after 1, 2, 7, and 14 d; thereafter, the sediment slurry was sampled every 2 months. At each time point, samples were taken in triplicate for radioactivity measurements. After 300 d of incubation, no significant changes of radioactivity in the aqueous phase were detected. This may be the result of either the rapid turnover of released [14C]glucose or the relatively high limit of detection caused by the slight solubility (equivalent to 2% of initial radioactivity) of GlcDGD in water. Therefore, total degradation of GlcDGD in the dataset was calculated by combining the radioactivity of DIC, CH4, and CO2, leading to a minimum estimate.
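As a rough illustration of how the combined-phase calculation described in this abstract might be set up, here is a minimal sketch. All numbers, pool names and variable names below are hypothetical placeholders, not data or code from the study.

```python
# Minimal sketch of the combined-phase degradation estimate described above.
# All values are hypothetical placeholders, not data from the study.

initial_activity_dpm = 1.0e6          # total [14C] activity added with GlcDGD (dpm)

# Triplicate measurements at one time point, per carbon pool (dpm)
dic_dpm = [1800.0, 1750.0, 1900.0]    # dissolved inorganic carbon (aqueous phase)
ch4_dpm = [250.0, 230.0, 260.0]       # methane (gas phase)
co2_dpm = [400.0, 390.0, 420.0]       # carbon dioxide (gas phase)

def mean(values):
    return sum(values) / len(values)

# Total degradation is taken as the activity recovered in DIC, CH4 and CO2,
# expressed as a fraction of the activity initially added -- a minimum estimate,
# since 14C remaining in intermediate pools is not counted.
degraded_dpm = mean(dic_dpm) + mean(ch4_dpm) + mean(co2_dpm)
fraction_degraded = degraded_dpm / initial_activity_dpm

print(f"minimum estimate of GlcDGD degraded: {fraction_degraded:.2%}")
```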
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Mode of access: Internet.
Abstract:
"The Corey H. Continuous Improvement Process is the ongoing collection and analysis of information for all entities serving children with disabilities to determine compliance with applicable state and federal requirements."--P. [1].
Abstract:
Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and/or its limited ability to model business processes that cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility can be achieved. In this paper we present a framework that allows the workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow, such as structural or temporal aspects, our focus in this paper is on a constraint that allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
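The paper defines cardinality constraints formally; purely as an illustration of the underlying idea of validating a runtime selection of activities against a declared constraint, here is a hypothetical sketch. The class, field and activity names are invented and not taken from the paper.

```python
# Hypothetical sketch of a cardinality constraint on dynamic activity selection:
# a workflow instance may include any subset of the candidate activities, provided
# the number selected lies within declared bounds and mandatory ones are present.

from dataclasses import dataclass

@dataclass
class CardinalityConstraint:
    candidate_activities: frozenset   # activities that may be selected at runtime
    mandatory_activities: frozenset   # subset that must always be included
    min_count: int                    # lower bound on number of selected activities
    max_count: int                    # upper bound on number of selected activities

    def validate(self, selected: set) -> list:
        """Return a list of violation messages (empty list means the selection is valid)."""
        violations = []
        unknown = selected - self.candidate_activities
        if unknown:
            violations.append(f"activities not in candidate set: {sorted(unknown)}")
        missing = self.mandatory_activities - selected
        if missing:
            violations.append(f"mandatory activities missing: {sorted(missing)}")
        if not (self.min_count <= len(selected) <= self.max_count):
            violations.append(
                f"{len(selected)} activities selected, "
                f"expected between {self.min_count} and {self.max_count}")
        return violations

# Example: an instance must run 2 to 3 of the listed checks, and
# 'fraud_screening' is always required.
constraint = CardinalityConstraint(
    candidate_activities=frozenset({"fraud_screening", "medical_review",
                                    "police_report_check", "witness_interview"}),
    mandatory_activities=frozenset({"fraud_screening"}),
    min_count=2, max_count=3)

print(constraint.validate({"fraud_screening", "medical_review"}))   # [] -> valid
print(constraint.validate({"medical_review"}))                      # two violations
```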
Abstract:
The estimation of a concentration-dependent diffusion coefficient in a drying process is known as an inverse coefficient problem. The solution is sought where the space-average concentration is known as a function of time (mass-loss monitoring). The problem is stated as the minimization of a functional, and gradient-based algorithms are used to solve it. Many numerical and experimental examples that demonstrate the effectiveness of the proposed approach are presented. Thin-slab drying was carried out in an isothermal drying chamber built in our laboratory. The diffusion coefficients of fructose obtained with the present method are compared with existing literature results.
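To make the structure of such an inverse problem concrete, here is a sketch under assumed simplifications: a parameterised diffusivity D(c) = d0·exp(beta·c), a simple explicit finite-difference forward model for a thin slab, synthetic stand-in "measurements" of the space-average concentration, and a least-squares functional minimised with a gradient-based routine (finite-difference gradients). None of the parameter values, grids or function names come from the paper.

```python
# Sketch of an inverse coefficient problem: recover the parameters of a
# concentration-dependent diffusivity D(c) = d0*exp(beta*c) from the space-average
# concentration of a drying slab (the quantity followed by mass-loss monitoring).
# Forward model, parameters and data are illustrative placeholders.

import numpy as np
from scipy.optimize import minimize

L, nx = 1.0e-3, 11                        # slab half-thickness (m), grid points
dx = L / (nx - 1)
t_obs = np.linspace(0.0, 3600.0, 25)      # observation times (s)

def forward(params):
    """Space-average concentration at t_obs; params = (log10 d0, beta)."""
    d0, beta = 10.0 ** params[0], params[1]
    c = np.ones(nx)                       # normalised initial concentration
    c[-1] = 0.0                           # drying surface held at zero
    dt = 0.2 * dx**2 / (d0 * np.exp(abs(beta)))   # explicit-scheme stability margin
    times, means = [0.0], [c.mean()]
    while times[-1] < t_obs[-1]:
        D = d0 * np.exp(beta * c)
        flux = 0.5 * (D[1:] + D[:-1]) * (c[1:] - c[:-1]) / dx
        c[1:-1] += dt / dx * (flux[1:] - flux[:-1])
        c[0] = c[1]                       # zero-gradient (symmetry) at the slab centre
        c[-1] = 0.0
        times.append(times[-1] + dt)
        means.append(c.mean())
    return np.interp(t_obs, times, means)

true_params = np.array([np.log10(3.0e-10), 1.5])
m_obs = forward(true_params)              # stand-in for a measured mass-loss record

def functional(params):                   # least-squares misfit to be minimised
    return np.sum((forward(params) - m_obs) ** 2)

res = minimize(functional, x0=[-10.0, 0.5], method="L-BFGS-B",
               bounds=[(-11.0, -9.0), (0.0, 2.5)], options={"eps": 1e-6})
print("estimated d0, beta:", 10.0 ** res.x[0], res.x[1])
```

Working in log10(d0) keeps both parameters of comparable magnitude, which makes the finite-difference gradients used by the optimiser better behaved.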
Abstract:
Regular monitoring of wastewater characteristics is undertaken at most wastewater treatment plants. The data acquired during this process are usually filed and forgotten. However, systematic analysis of these data can provide useful insights into plant behaviour. Conventional graphical techniques are inadequate to give a good overall picture of how wastewater characteristics vary with time and along the lagoon system. An approach based on the use of contour plots was devised that largely overcomes this problem. Superimposition of contour plots for different parameters can be used to gain a qualitative understanding of the nature and strength of relationships between the parameters. This is illustrated in an analysis of monitoring data for lagoon 115 East at the Western Treatment Plant, near Melbourne, Australia. In this illustrative analysis, relationships between ammonia removal rates and parameters such as chlorophyll a level and temperature are explored using a contour plot superimposition approach. It is concluded that this approach can help improve our understanding not only of lagoon systems but of other wastewater treatment systems as well.
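A minimal sketch of the superimposition idea: two monitoring parameters gridded over time and distance along a lagoon, one drawn as filled contours and the other overlaid as labelled contour lines. The data below are synthetic placeholders, not measurements from lagoon 115 East, and the axis ranges and units are assumed for illustration only.

```python
# Illustrative sketch of contour-plot superimposition for two monitoring parameters.
# All data here are synthetic placeholders.

import numpy as np
import matplotlib.pyplot as plt

days = np.linspace(0, 365, 60)               # time axis (days)
ponds = np.linspace(0, 10, 40)               # distance along the lagoon (pond number)
T, X = np.meshgrid(days, ponds)

ammonia = 40 * np.exp(-X / 6) * (1 + 0.3 * np.sin(2 * np.pi * T / 365))
chlorophyll = 150 * np.exp(-(X - 5) ** 2 / 8) * (1 + 0.5 * np.cos(2 * np.pi * T / 365))

fig, ax = plt.subplots(figsize=(8, 4))
filled = ax.contourf(T, X, ammonia, levels=10, cmap="viridis")
fig.colorbar(filled, ax=ax, label="ammonia-N (mg/L)")
lines = ax.contour(T, X, chlorophyll, levels=6, colors="white", linewidths=1)
ax.clabel(lines, fmt="%.0f")                 # label chlorophyll-a contours (ug/L)
ax.set_xlabel("day of year")
ax.set_ylabel("distance along lagoon (pond)")
ax.set_title("Superimposed contours: ammonia (filled) and chlorophyll a (lines)")
plt.show()
```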
Abstract:
We examine current workflow modelling capability from a new angle and demonstrate a weakness of current workflow specification languages in relation to the execution of activities. This shortcoming is mainly due to serious limitations of the corresponding computational/execution model behind the business process modelling language constructs. The main purpose of this paper is the introduction of new specification/modelling constructs allowing for more precise representation of complex activity states during execution. This new concept makes visible a new activity state, partial completion of an activity, which in turn allows for more flexible and precise enforcement/monitoring of automated business processes.
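Purely to illustrate what making partial completion visible might look like in an execution model, here is a hypothetical sketch. The state names, the progress-reporting interface and the example activity are invented, not the paper's constructs or notation.

```python
# Hypothetical sketch of an activity lifecycle that exposes partial completion
# to monitoring, rather than only "running" and "completed".

from enum import Enum, auto

class ActivityState(Enum):
    NOT_STARTED = auto()
    RUNNING = auto()
    PARTIALLY_COMPLETED = auto()   # some units of work done, activity still open
    COMPLETED = auto()
    FAILED = auto()

class Activity:
    def __init__(self, name, total_units):
        self.name = name
        self.total_units = total_units
        self.done_units = 0
        self.state = ActivityState.NOT_STARTED

    def report_progress(self, units_done):
        """Record completed units of work and derive the externally visible state."""
        self.done_units = min(self.done_units + units_done, self.total_units)
        if self.done_units == 0:
            self.state = ActivityState.RUNNING
        elif self.done_units < self.total_units:
            self.state = ActivityState.PARTIALLY_COMPLETED
        else:
            self.state = ActivityState.COMPLETED

# A monitor can now react to partial completion, e.g. release a dependent activity
# once most of the work is done instead of waiting for full completion.
review = Activity("review_claims_batch", total_units=50)
review.report_progress(42)
print(review.state, review.done_units / review.total_units)   # PARTIALLY_COMPLETED 0.84
```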
Abstract:
Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context, where observations are collected and reported by a network of sensors and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moments estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If these extreme data destabilise the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets, together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two-component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. We discuss the issue of whether to treat or ignore extreme values, making the distinction between the robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
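As a generic illustration of why a Huber-type robust criterion reduces the influence of extreme sensor readings, the sketch below compares a least-squares (Gaussian) estimate of a location with a Huber estimate when one reading is wildly off. This is just the standard Huber loss on a toy dataset, not the REML or projected process kriging code of the paper; the readings and the threshold value are assumptions.

```python
# Residuals beyond the Huber threshold contribute linearly rather than
# quadratically, so a single faulty reading barely moves the fitted value.

import numpy as np
from scipy.optimize import minimize_scalar

def huber_loss(residuals, delta=1.5):
    r = np.abs(residuals)
    quad = 0.5 * r ** 2                       # Gaussian-like core
    lin = delta * (r - 0.5 * delta)           # linear tails for |r| > delta
    return np.where(r <= delta, quad, lin).sum()

readings = np.array([0.9, 1.1, 1.0, 0.95, 1.05, 12.0])   # last value: faulty sensor

gaussian_fit = readings.mean()                 # least squares == Gaussian likelihood
huber_fit = minimize_scalar(lambda m: huber_loss(readings - m)).x

print(f"Gaussian (least-squares) estimate: {gaussian_fit:.2f}")   # dragged up to ~2.83
print(f"Huber estimate:                    {huber_fit:.2f}")      # ~1.30, near the bulk
```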
Abstract:
The work presents a new method that combines plasma etching with extrinsic techniques to simultaneously measure matrix and surface protein and lipid deposits. The acronym for this technique is PEEMS: Plasma Etching and Emission Monitoring System. Previous work has identified the presence of proteinaceous and lipoidal deposition on the surface of contact lenses and highlighted the probability that penetration of these spoilants will occur. The technique developed here allows unambiguous identification of the depth of penetration of spoilants for various material types, and it is for this reason that the technique has been employed in this thesis. The technique is applied as a 'molecular' scalpel, removing known amounts of material from the target, in this case from both the anterior and posterior surfaces of a 'soft' contact lens. The residual material is then characterised by other analytical techniques such as UV/visible and fluorescence spectroscopy. Several studies have been carried out for both in vivo and in vitro spoilt materials. The analysis and identification of absorbed protein and lipid on the substrate revealed the importance of many factors in the absorption and adsorption process. The effects of the material structure, protein nature (in terms of size, shape and charge) and environmental conditions were examined in order to determine the relative uptake of tear proteins. The studies were extended to real cases in order to study patient-dependent factors and lipoidal penetration.
Abstract:
Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneity in the error characteristics of different sensors, both in terms of distribution and magnitude, presents problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, assumes that a Gaussian process prior is imposed over the (latent) process being studied and that the sensor model forms part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated in a projected process kriging framework, which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
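To make the model structure concrete, here is a sketch of a latent Gaussian process prior combined with a non-Gaussian sensor likelihood, where the latent values at the observation sites are estimated by MAP. A heavy-tailed Student-t likelihood stands in for an arbitrary sensor model; the kernel, hyperparameters and simulated data are assumptions, and this is not the paper's sequential or projected-process algorithm.

```python
# Sketch: GP prior over a latent field + Student-t sensor likelihood, MAP estimate.
# Kernel, parameters and data are illustrative only.

import numpy as np
from scipy.optimize import minimize
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)                              # sensor locations
f_true = np.sin(x)                                      # latent process
y = f_true + 0.2 * rng.standard_t(df=3, size=x.size)    # heavy-tailed sensor noise

def rbf(a, b, variance=1.0, lengthscale=1.5):
    return variance * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

K = rbf(x, x) + 1e-6 * np.eye(x.size)                   # GP prior covariance (jittered)
K_chol = cho_factor(K)

def neg_log_posterior(f, scale=0.2, df=3.0):
    prior = 0.5 * f @ cho_solve(K_chol, f)                     # GP prior term
    r = (y - f) / scale
    lik = 0.5 * (df + 1.0) * np.sum(np.log1p(r ** 2 / df))     # Student-t likelihood
    return prior + lik

f_map = minimize(neg_log_posterior, x0=np.zeros(x.size), method="L-BFGS-B").x
print("rms error of MAP latent field:", np.sqrt(np.mean((f_map - f_true) ** 2)))
```

With a Gaussian likelihood this MAP estimate would be available in closed form; the point of the non-Gaussian sensor model is precisely that it is not, which is what motivates the approximate inference scheme the abstract describes.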
Abstract:
An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action 'open tap' leads with certainty to 'tap open', whereas whether there will be a fluid flow, and how long it might last, is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time is proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
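A toy sketch of the distinction the abstract draws between an action's direct effects and the process-mediated indirect effects, loosely in the spirit of Qualitative Process Theory. The facts, process definitions and names are invented for illustration and do not reproduce the thesis's representation.

```python
# Direct effects are asserted by the action itself; indirect effects arise from
# processes that become active once their preconditions hold.

direct_effects = {"open tap": {"tap open"}}

processes = {
    # a process is active while all of its preconditions hold; while active it
    # exerts influences whose duration generally cannot be fixed at plan time
    "fluid flow": {"preconditions": {"tap open", "mains pressure"},
                   "influences": {"kettle level increasing"}},
    "heating":    {"preconditions": {"kettle on", "kettle has water"},
                   "influences": {"water temperature increasing"}},
}

def apply_action(state, action):
    """Add the action's direct effects, then derive influences of active processes."""
    state = state | direct_effects.get(action, set())
    active = {name for name, p in processes.items() if p["preconditions"] <= state}
    influences = set().union(*[processes[n]["influences"] for n in active])
    return state, active, influences

state = {"mains pressure"}
state, active, influences = apply_action(state, "open tap")
print(active)        # {'fluid flow'} -- an indirect consequence, not a direct effect
print(influences)    # {'kettle level increasing'}
```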