963 results for Process Monitoring
Abstract:
We examine current workflow modelling capability from a new angle and demonstrate a weakness of current workflow specification languages with respect to the execution of activities. This shortcoming stems mainly from serious limitations of the computational/execution model behind the business process modelling language constructs. The main purpose of this paper is to introduce new specification/modelling constructs that allow a more precise representation of complex activity states during execution. The new concept makes visible a new activity state, partial completion of an activity, which in turn allows more flexible and precise enforcement/monitoring of automated business processes.
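As a rough illustration of the idea (not the paper's actual constructs; all names below are illustrative assumptions), a minimal sketch of an activity whose execution state exposes partial completion to a process monitor:

```python
# Minimal sketch: an activity whose state can report partial completion,
# so a monitor can act before the activity has fully completed or failed.
# All names and the step-counting scheme are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class ActivityState(Enum):
    NOT_STARTED = auto()
    PARTIALLY_COMPLETED = auto()  # the additional state discussed above
    COMPLETED = auto()
    FAILED = auto()


@dataclass
class Activity:
    name: str
    total_steps: int
    done_steps: int = 0
    failed: bool = False

    @property
    def state(self) -> ActivityState:
        if self.failed:
            return ActivityState.FAILED
        if self.done_steps == 0:
            return ActivityState.NOT_STARTED
        if self.done_steps < self.total_steps:
            return ActivityState.PARTIALLY_COMPLETED
        return ActivityState.COMPLETED


# A process monitor can now distinguish "half done" from "not started":
a = Activity("approve_invoice", total_steps=4, done_steps=2)
print(a.state, f"{a.done_steps}/{a.total_steps} steps")
```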
Abstract:
Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context, where observations are collected and reported by a network of sensors and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two-component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. We discuss the issue of whether to treat or ignore extreme values, making the distinction between the robust methods which ignore outliers and transformation methods which treat them as part of the (transformed) process. Using a case study, based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
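As a minimal sketch of the robust-likelihood idea mentioned above (the Huber threshold and the residuals below are illustrative assumptions, not values from the paper), extreme residuals can be penalised linearly rather than quadratically and hence downweighted when the covariance is re-estimated:

```python
# Minimal sketch of a Huber-type robustification: residuals beyond a damping
# threshold c contribute linearly rather than quadratically, so a faulty
# sensor reporting an extreme value cannot dominate the covariance/variogram
# fit. Threshold and example residuals are illustrative assumptions.
import numpy as np


def huber_rho(r, c=1.345):
    """Huber loss: quadratic for |r| <= c, linear beyond."""
    r = np.asarray(r, dtype=float)
    quad = 0.5 * r**2
    lin = c * np.abs(r) - 0.5 * c**2
    return np.where(np.abs(r) <= c, quad, lin)


def huber_weights(r, c=1.345):
    """Iteratively-reweighted-fitting weights implied by the Huber loss."""
    r = np.asarray(r, dtype=float)
    return np.where(np.abs(r) <= c, 1.0, c / np.abs(r))


# Standardised residuals; the last one mimics a malfunctioning sensor.
resid = np.array([-0.4, 0.9, -1.1, 0.2, 12.0])
print(huber_rho(resid))      # extreme residual penalised only linearly
print(huber_weights(resid))  # ...and heavily downweighted in refitting
```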
Abstract:
The work presents a new method that combines plasma etching with extrinsic techniques to simultaneously measure matrix and surface protein and lipid deposits. The acronym for this technique is PEEMS - Plasma Etching and Emission Monitoring System. Previous work has identified the presence of proteinaceous and lipoidal deposition on the surface of contact lenses and highlighted the probability that penetration of these spoilants will occur. The technique developed here allows unambiguous identification of the depth of penetration of spoilants for various material types, and it is for this reason that it has been employed in this thesis. The technique is applied as a 'molecular' scalpel, removing known amounts of material from the target, in this case from both the anterior and posterior surfaces of a 'soft' contact lens. The residual material is then characterised by other analytical techniques such as UV/visible and fluorescence spectroscopy. Several studies have been carried out for both in vivo and in vitro spoilt materials. The analysis and identification of absorbed protein and lipid in the substrate revealed the importance of many factors in the absorption and adsorption process. The effects of material structure, protein nature (in terms of size, shape and charge) and environmental conditions were examined in order to determine the relative uptake of tear proteins. The studies were extended to real cases in order to study the patient-dependent factors and lipoidal penetration.
Abstract:
Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneities in the error characteristics of different sensors, both in terms of distribution and magnitude, present problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, imposes a Gaussian process prior over the (latent) process being studied, with the sensor model forming part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding to evaluate, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated within a projected process kriging framework, which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
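A minimal sketch of the model-based view described above, assuming an RBF kernel and a Laplace (heavy-tailed) sensor model purely for illustration; the paper's sequential, projected process inference is not reproduced here:

```python
# Minimal sketch: a GP prior over the latent field plus a per-sensor
# likelihood term that need not be Gaussian. Kernel parameters, noise scales
# and the Laplace sensor model are illustrative assumptions.
import numpy as np


def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)


def log_gp_prior(f, x, lengthscale=1.0, variance=1.0, jitter=1e-8):
    K = rbf_kernel(x, x, lengthscale, variance) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, f)
    return -0.5 * alpha @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)


def log_lik_gaussian(y, f, sigma=0.1):
    return -0.5 * np.sum(((y - f) / sigma) ** 2 + np.log(2 * np.pi * sigma**2))


def log_lik_laplace(y, f, b=0.1):
    # Heavy-tailed sensor model: more forgiving of occasional extreme readings.
    return -np.sum(np.abs(y - f) / b + np.log(2 * b))


x = np.linspace(0, 1, 5)
f = np.sin(2 * np.pi * x)                       # a candidate latent field
y = f + np.array([0, 0.05, -0.02, 0.03, 2.5])   # last sensor misbehaves

# Unnormalised log posteriors under the two sensor models:
print(log_gp_prior(f, x) + log_lik_gaussian(y, f))
print(log_gp_prior(f, x) + log_lik_laplace(y, f))
```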
Abstract:
An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action `open tap' leads with certainty to `tap open', whereas whether there will be a fluid flow and how long it might last is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time are proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final Chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
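A minimal sketch of the tap example (my own illustration, not the thesis' representation scheme): the direct effect of opening the tap is certain, while the flow process runs only as long as its conditions hold:

```python
# Minimal sketch: the direct effect of `open tap` is known when the plan is
# made, but whether a flow process runs, and for how long, depends on world
# state. Quantities and rates are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class World:
    tap_open: bool = False
    tank_level: float = 3.0   # litres available upstream (assumed)
    cup_level: float = 0.0


def open_tap(w: World) -> None:
    w.tap_open = True          # direct, certain effect of the action


def flow_process_active(w: World) -> bool:
    # Indirect effect: flow exists only while its conditions hold.
    return w.tap_open and w.tank_level > 0.0


def simulate(w: World, steps: int, rate: float = 1.0) -> None:
    for _ in range(steps):
        if flow_process_active(w):
            amount = min(rate, w.tank_level)
            w.tank_level -= amount
            w.cup_level += amount


w = World()
open_tap(w)
simulate(w, steps=5)
print(w)   # the flow stopped by itself once the tank ran dry
```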
Abstract:
Computer-integrated monitoring is a very large area in engineering, where on-line, real-time data acquisition with the aid of sensors solves many problems in the manufacturing industry that the old method of data logging and graphical analysis could not. The raw data collected this way, however, is useless in the absence of a proper computerised management system. In the past, transferring data between the management and shop-floor processes was impossible unless all the computers in the system were fully compatible with each other, which limits the efficiency of such systems because they are governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers; it is still in the early stages of development and is currently very expensive. This research programme shows how a shop-floor data acquisition system and a complete management system running on entirely different computers can be integrated into a single system, achieving data transfer using a cheaper but superior alternative to MAP. Standard communication character sets and hardware, such as ASCII and UARTs, are used, yet the technique is powerful enough that totally incompatible computers are shown to run different programs (in different languages) simultaneously while receiving data from each other and processing it in their own CPUs with no human intervention.
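As a hedged sketch of the kind of machine-independent exchange described (the field layout and checksum are assumptions, not the thesis' actual protocol), values can be framed as plain ASCII records that any computer able to handle ASCII can parse, regardless of CPU or programming language:

```python
# Minimal sketch of machine-independent data exchange over a byte link such
# as a UART: values are sent as plain ASCII fields with a simple checksum.
# Field layout and checksum choice are illustrative assumptions.
def encode_record(machine_id: str, readings: list) -> bytes:
    body = ",".join([machine_id] + [f"{r:.2f}" for r in readings])
    checksum = sum(body.encode("ascii")) % 256
    return f"${body}*{checksum:02X}\r\n".encode("ascii")


def decode_record(frame: bytes):
    text = frame.decode("ascii").strip()
    if not (text.startswith("$") and "*" in text):
        raise ValueError("malformed frame")
    body, checksum = text[1:].rsplit("*", 1)
    if sum(body.encode("ascii")) % 256 != int(checksum, 16):
        raise ValueError("checksum mismatch")
    machine_id, *values = body.split(",")
    return machine_id, [float(v) for v in values]


frame = encode_record("LATHE_03", [1450.0, 62.5])
print(frame)                 # ASCII frame with trailing hex checksum
print(decode_record(frame))  # ('LATHE_03', [1450.0, 62.5])
```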
Abstract:
Research in the present thesis focuses on the norms, strategies, and approaches which translators employ when translating humour in Children's Literature from English into Greek. It is based on process-oriented descriptive translation studies, since the focus is on investigating the process of translation. Viewing translation as a cognitive process and a problem-solving activity, this thesis employs Think-aloud protocols (TAPs) in order to investigate translators' minds. As it is not possible to directly observe the human mind at work, an attempt is made to ask the translators themselves to reveal their mental processes in real time by verbalising their thoughts while carrying out a translation task involving humour. In this study, thirty participants at three different levels of expertise in translation competence, i.e. ten beginner, ten competent, and ten expert translators, were requested to translate two humorous extracts from the fictional diary novel The Secret Diary of Adrian Mole, Aged 13 ¾ by Sue Townsend (1982) from English into Greek. As they translated, they were asked to verbalise their thoughts and explain them, whenever possible, so that their strategies and approaches could be detected and, subsequently, the norms that govern these strategies and approaches could be revealed. The thesis consists of four parts: the introduction, the literature review, the study, and the conclusion, and is developed in eleven chapters. The introduction contextualises the study within translation studies (TS) and presents its rationale, research questions, aims, and significance. Chapters 1 to 7 present an extensive and inclusive literature review identifying the principles and axioms that guide and inform the study. In these seven chapters the following areas are critically introduced: Children's Literature (Chapter 1), Children's Literature Translation (Chapter 2), Norms in Children's Literature (Chapter 3), Strategies in Children's Literature (Chapter 4), Humour in Children's Literature Translation (Chapter 5), Development of Translation Competence (Chapter 6), and Translation Process Research (Chapter 7). In Chapters 8 to 11 the fieldwork is described in detail. The pilot and the main study are described with reference to the environments and settings, the participants, the researcher-observer, the data and its analysis, and the limitations of the study. The findings of the study are presented and analysed in Chapter 9. Three models are then suggested for systematising translators' norms, strategies, and approaches, thus filling the existing gap in the field. Pedagogical norms (e.g. appropriateness/correctness, familiarity, simplicity, comprehensibility, and toning down), literary norms (e.g. sound of language and fluency), and source-text norms (e.g. equivalence) were revealed to be the most prominent general and specific norms governing the translators' strategies and approaches in the process of translating humour in Children's Literature. The data also revealed that monitoring and communication strategies (e.g. additions, omissions, and exoticism) were the prevalent strategies employed by translators. In Chapter 10 the main findings and outcomes, together with potential secondary benefits (beneficial outcomes), are discussed on the basis of the research questions and aims of the study, and the implications of the study are tackled in Chapter 11. In the conclusion, suggestions for future directions are given and final remarks noted.
Abstract:
The Product Service Systems (PSS), servitization, and Service Science literature continues to grow as organisations seek to protect and improve their competitive position. The potential of technology applications to deliver service delivery systems, facilitated by the ability to make real-time decisions based upon 'in the field' performance, is also significant. Research identifies four key questions to be addressed, namely: how far along the servitization continuum should the organisation go in a single strategic step? Does the organisation have the structure and infrastructure to support this transition? What level of condition monitoring should it employ? Is the product positioned correctly in the value chain to adopt condition monitoring technology? Strategy consists of three dimensions, namely content, context, and process. The literature relating to PSS, servitization, and strategy discusses the concepts relative to content and context, but none of it offers a process for delivering an aligned strategy that results in a service delivery system enabled by condition-based management. This paper presents a tested, iterative strategy formulation methodology which is the result of a structured development programme.
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology, and logical insurance plans. The risk-based model uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and to analyze their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
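A minimal AHP sketch (the 3x3 comparison matrix below is invented for illustration and is not data from the paper): risk-factor weights are obtained from the principal eigenvector of a pairwise comparison matrix, with a standard consistency check:

```python
# Minimal AHP sketch: derive risk-factor weights from a pairwise comparison
# matrix via the principal eigenvector and check consistency. The matrix
# (e.g. corrosion vs. third-party damage vs. operational error) is invented.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
cr = ci / ri                             # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```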
High stress monitoring of prestressing tendons in nuclear concrete vessels using fibre-optic sensors
Abstract:
Maintaining the structural health of prestressed concrete nuclear containments is a key element in ensuring nuclear reactors are capable of meeting their safety requirements. This paper discusses the attachment, fabrication and characterisation of optical fibre strain sensors suitable for the prestress monitoring of irradiated steel prestressing tendons. The all-metal fabrication and welding process allowed the instrumented strand to simultaneously monitor and apply stresses up to 1300 MPa (80% of the steel's ultimate tensile strength), with no adverse effects on the strand's mechanical properties or integrity. After sensor relaxation through cyclic stress treatment, strain transfer between the optical fibre sensors and the strand remained at 69%. The fibre strain sensors could also withstand the non-axial forces induced as the strand was deflected around a 4.5 m bend radius. Further development of this technology has the potential to augment current prestress monitoring practices, allowing distributed measurements of short- and long-term prestress losses in nuclear prestressed-concrete vessels. © 2014 Elsevier B.V.
Abstract:
Reactive, but not a reactant. Heterogeneous catalysts play an unseen role in many of today's processes and products. With the increasing emphasis on sustainability in both products and processes, this handbook is the first to combine the hot topics of heterogeneous catalysis and clean technology. It focuses on the development of heterogeneous catalysts for use in clean chemical synthesis, dealing with how modern spectroscopic techniques can aid the design of catalysts for use in liquid phase reactions, their application in industrially important chemistries - including selective oxidation, hydrogenation, solid acid- and base-catalyzed processes - as well as the role of process intensification and use of renewable resources in improving the sustainability of chemical processes. With its emphasis on applications, this book is of high interest to those working in the industry.
Abstract:
The growth in complexity and functional importance of integrated navigation systems (INS) leads to high losses when the equipment fails. This paper is devoted to the development of an INS diagnosis system that can identify the cause of a malfunction. The proposed solutions allow any changes in the sensors' dynamic and accuracy characteristics to be taken into account through the coefficients of the appropriate error models. Under actual conditions of INS operation, the determination of the current values of the sensor model and estimation filter parameters relies on identification procedures. The results of full-scale experiments are given, which corroborate the expediency of parametric identification of the INS error models during bench testing.
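A minimal sketch of parametric identification from bench-test data, assuming a simple bias-plus-scale-factor sensor error model purely for illustration (not the error models used in the paper):

```python
# Minimal sketch: identify the parameters of a simple sensor error model
# from bench-test data, measured = (1 + scale_error) * reference + bias + noise.
# The model form, reference profile and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

reference = np.linspace(-10.0, 10.0, 200)          # known bench input
true_scale, true_bias = 0.02, 0.15
measured = (1 + true_scale) * reference + true_bias + rng.normal(0, 0.05, reference.size)

# Linear least squares: measured = reference * (1 + k) + b
X = np.column_stack([reference, np.ones_like(reference)])
(one_plus_k, b), *_ = np.linalg.lstsq(X, measured, rcond=None)

print(f"identified scale error: {one_plus_k - 1:.4f} (true {true_scale})")
print(f"identified bias:        {b:.4f} (true {true_bias})")
```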
Abstract:
This paper presents a Web-Centric [3] extension to a previously developed glaucoma expert system that will provide access for doctors and patients from any part of the world. Once implemented, this telehealth solution will publish the services of the Glaucoma Expert System on the World Wide Web, allowing patients and doctors to interact with it from their own homes. The web extension will also allow the expert system itself to be proactive and to send diagnosis alerts to the registered doctor and patient, informing each of any emergencies and therefore allowing them to take immediate action. The existing Glaucoma Expert System uses fuzzy logic learning algorithms, applied to historical patient data, to update and improve its set of diagnosis rules. This process, collectively called the learning process, would benefit greatly from a web-based framework that could provide services such as patient data transfer and web-based distribution of updated rules [1].
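As a hedged sketch of the kind of fuzzy rule such a system might evaluate (the membership functions, thresholds and rule below are invented for illustration and are not the system's actual rule set):

```python
# Minimal fuzzy-rule sketch: fuzzify two clinical readings, apply a min
# (AND) rule, and report a graded risk. All values here are illustrative
# assumptions, not the expert system's real rules or thresholds.
def ramp_up(x, lo, hi):
    """Membership rising from 0 at lo to 1 at hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)


def glaucoma_risk(iop_mmhg: float, cup_disc_ratio: float) -> float:
    high_iop = ramp_up(iop_mmhg, 18.0, 28.0)
    large_cdr = ramp_up(cup_disc_ratio, 0.4, 0.8)
    # Rule: IF IOP is high AND cup/disc ratio is large THEN risk is high.
    return min(high_iop, large_cdr)


print(glaucoma_risk(iop_mmhg=24.0, cup_disc_ratio=0.7))  # graded risk in [0, 1]
```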
Abstract:
Floods represent the most devastating natural hazards in the world, affecting more people and causing more property damage than any other natural phenomenon. One important problem associated with flood monitoring is flood extent extraction from satellite imagery, since it is impractical to survey the flooded area through field observations. This paper presents a method for flood extent extraction from synthetic-aperture radar (SAR) images that is based on intelligent computations. In particular, we apply artificial neural networks, Kohonen's self-organizing maps (SOMs), to SAR image segmentation and classification. We tested our approach on data from three different satellite sensors: ERS-2/SAR (during flooding on the Tisza river, Ukraine and Hungary, 2001), ENVISAT/ASAR WSM (Wide Swath Mode) and RADARSAT-1 (during flooding on the Huaihe river, China, 2007). The results obtained show the efficiency of our approach.
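A minimal, self-contained SOM sketch on synthetic pixel intensities (grid size, learning schedule and the synthetic "image" are assumptions; this is not the authors' processing pipeline): each pixel is labelled by its winning neuron, giving an unsupervised segmentation such as water versus land:

```python
# Minimal SOM sketch applied to 1-band pixel intensities (a stand-in for a
# SAR backscatter image). After training, each pixel is assigned to its
# best-matching neuron, yielding an unsupervised segmentation.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic image: dark "water" region and brighter "land" speckle (assumed).
image = np.concatenate([rng.normal(0.1, 0.03, 500), rng.normal(0.6, 0.1, 500)])
pixels = image.reshape(-1, 1)

n_neurons = 4
weights = rng.uniform(0.0, 1.0, size=(n_neurons, 1))

n_iter = 3000
for t in range(n_iter):
    x = pixels[rng.integers(len(pixels))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
    lr = 0.5 * (1 - t / n_iter)                            # decaying learning rate
    sigma = max(1.0 * (1 - t / n_iter), 0.1)               # decaying neighbourhood width
    dist = np.abs(np.arange(n_neurons) - bmu)
    h = np.exp(-(dist**2) / (2 * sigma**2))[:, None]       # neighbourhood function
    weights += lr * h * (x - weights)

labels = np.argmin(np.abs(pixels - weights.T), axis=1)     # segment each pixel
print("neuron codebook values:", np.round(weights.ravel(), 3))
print("pixels per segment:", np.bincount(labels, minlength=n_neurons))
```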
Abstract:
A distance-based inconsistency indicator, defined by the third author for the consistency-driven pairwise comparisons method, is extended to the incomplete case. The corresponding optimization problem is transformed into an equivalent linear programming problem. The results can be applied during the process of filling in the matrix, as the decision maker receives automatic feedback: as soon as a serious error occurs among the matrix elements, even due to a misprint, a significant increase in the inconsistency index is reported. High inconsistency can thus be flagged not only at the end of the process of filling in the matrix but also during the completion process. Numerical examples are also provided.
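A minimal sketch of a distance-based inconsistency indicator for the complete case (the matrix below, including its deliberate "misprint", is invented; the paper's incomplete-case extension and its linear-programming formulation are not reproduced):

```python
# Minimal sketch: for every triad (i, k, j) compare a_ij with a_ik * a_kj and
# take the worst relative deviation over all triads. A single erroneous entry
# makes the indicator jump, which is what allows on-the-fly feedback.
import numpy as np
from itertools import combinations


def distance_inconsistency(A: np.ndarray) -> float:
    n = A.shape[0]
    worst = 0.0
    for i, k, j in combinations(range(n), 3):
        a, b, c = A[i, j], A[i, k], A[k, j]
        dev = min(abs(1 - a / (b * c)), abs(1 - (b * c) / a))
        worst = max(worst, dev)
    return worst


consistent = np.array([[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1.0]])
misprint = consistent.copy()
misprint[0, 2] = 8.0          # a single erroneous (misprinted) entry
misprint[2, 0] = 1 / 8.0

print(distance_inconsistency(consistent))  # 0.0 for a fully consistent matrix
print(distance_inconsistency(misprint))    # clearly larger, flagging the error
```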