893 results for Statistical process control
Abstract:
Continuum partial differential equation models are often used to describe the collective motion of cell populations, with various types of motility represented by the choice of diffusion coefficient, and cell proliferation captured by the source terms. Previously, the choice of diffusion coefficient has been largely arbitrary, with the decision to choose a particular linear or nonlinear form generally based on calibration arguments rather than making any physical connection with the underlying individual-level properties of the cell motility mechanism. In this work we provide a new link between individual-level models, which account for important cell properties such as varying cell shape and volume exclusion, and population-level partial differential equation models. We work in an exclusion process framework, considering aligned, elongated cells that may occupy more than one lattice site, in order to represent populations of agents with different sizes. Three different idealizations of the individual-level mechanism are proposed, and these are connected to three different partial differential equations, each with a different diffusion coefficient: one linear, one nonlinear and degenerate, and one nonlinear and nondegenerate. We test the ability of these three models to predict the population-level response of a cell spreading problem for both proliferative and nonproliferative cases. We also explore the potential of our models to predict long-time travelling wave invasion rates and extend our results to two-dimensional spreading and invasion. Our results show that each model can accurately predict density data for nonproliferative systems, but that only one does so for proliferative systems. Hence great care must be taken when predicting density data for proliferative systems with varying cell shape.
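As a toy illustration of the exclusion-process framework described above, the sketch below simulates single-site agents on a one-dimensional lattice, where any move onto an occupied site is aborted (volume exclusion). The elongated, multi-site cells and proliferation terms of the paper are not represented; all parameter values are illustrative.

```python
import random

def simulate_exclusion(L=100, N=20, steps=2000, seed=1):
    """Single-site 1D exclusion process: each step, a random agent
    attempts a unit move left or right; moves onto occupied or
    off-lattice sites are aborted (volume exclusion)."""
    rng = random.Random(seed)
    start = (L - N) // 2
    occupied = set(range(start, start + N))  # agents begin in a central block
    for _ in range(steps):
        agent = rng.choice(sorted(occupied))
        target = agent + rng.choice((-1, 1))
        if 0 <= target < L and target not in occupied:
            occupied.remove(agent)
            occupied.add(target)
    return occupied
```

Recording the occupancy histogram over many realisations is the usual way such a simulation is compared against the density predicted by the corresponding PDE.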
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view. In this latter case, models facilitate stakeholder communication and software system design. Research has investigated several proposals as regards measures for business process models, from a rather correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet, design decisions usually have to build on thresholds, which can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
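The threshold-derivation idea can be illustrated with a minimal sketch: for a single structural measure, scan every candidate cut-off and keep the one maximizing Youden's J (a common selection criterion on a ROC curve). The paper's actual method is an adaptation of ROC curves and is not reproduced here; the data below are invented.

```python
def best_threshold(values, labels):
    """Pick the cut-off on a measure that maximizes Youden's J = TPR - FPR.
    values: measure per model; labels: 1 = model contains an error, 0 = error-free."""
    P = sum(labels)             # number of erroneous models
    N = len(labels) - P         # number of error-free models
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        j = tp / P - fp / N
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

A threshold found this way would then be validated on a held-out collection, as the paper does with its separate EPC model set.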
Abstract:
Crisis holds the potential for profound change in organizations and industries. The past 50 years of crisis management highlight key shifts in crisis practice, creating opportunities for multiple theories and research tracks. Defining crises such as Tylenol, Exxon Valdez, and September 11 terrorist attacks have influenced or challenged the principles of best practice of crisis communication in public relations. This study traces the development of crisis process and practice by identifying shifts in crisis research and models and mapping these against key management theories and practices. The findings define three crisis domains: crisis planning, building and testing predictive models, and mapping and measuring external environmental influences. These crisis domains mirror but lag the evolution of management theory, suggesting challenges for researchers to reshape the research agenda to close the gap and lead the next stage of development in the field of crisis communication for effective organizational outcomes.
Abstract:
Nowadays, business process management is an important approach for managing organizations from an operational perspective. As a consequence, it is common to see organizations develop collections of hundreds or even thousands of business process models. Such large collections of process models bring new challenges and provide new opportunities, as the knowledge that they encapsulate needs to be properly managed. Therefore, a variety of techniques for managing large collections of business process models is being developed. The goal of this paper is to provide an overview of the management techniques that currently exist, as well as the open research challenges that they pose.
Abstract:
The lack of satisfactory consensus for characterizing system intelligence, together with the absence of structured analytical decision models, has prevented developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted in this area. This research is designed to identify the key intelligence indicators and develop analytical models for computing the system intelligence score of a smart building system in an intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present a framework. The models presented in this study applied system intelligence theory and the conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Then, two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and the analytic network process (ANP), were employed to develop the system intelligence analytical models. Top intelligence indicators of the IBMS include: self-diagnosis of operation deviations; an adaptive limiting control algorithm; and year-round time schedule performance. The developed conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators and to lay the foundation for a systemic framework that provides developers and building stakeholders with a consolidated, inclusive tool for the system intelligence evaluation of proposed component design configurations.
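For the AHP step, a priority vector over indicators can be approximated from a pairwise-comparison matrix with the row geometric mean method, a standard shortcut for the principal eigenvector. This is a generic sketch of the basic computation, not the paper's model; the matrix below is invented.

```python
import math

def ahp_weights(M):
    """Row geometric mean approximation of the AHP priority vector.
    M is a pairwise comparison matrix: M[i][j] is how much more
    important criterion i is than criterion j (reciprocal matrix)."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]                   # normalize to sum to 1
```

In practice a consistency ratio is also computed to check that the expert judgments in the matrix are not contradictory before the weights are used.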
Abstract:
Bauxite refinery residues (red mud) are derived from the Bayer process by the digestion of crushed bauxite in concentrated sodium hydroxide at elevated temperatures and pressures. This slurry residue, if untreated, is unsuitable for discharge directly into the environment and is usually stored in tailing dams. The liquid portion has the potential for discharge, but requires pre-treatment before this can occur. The seawater neutralisation treatment facilitates a significant reduction in pH and dissolved metal concentrations, through the precipitation of hydrotalcite-like compounds and some other Mg, Ca, and Al hydroxide and carbonate minerals. The hydrotalcite-like compounds, precipitated during seawater neutralisation, also remove a range of transition metals, oxy-anions and other anionic species through a combination of intercalation and adsorption reactions: smaller anions are intercalated into the hydrotalcite matrix, while larger molecules are adsorbed on the particle surfaces. A phenomenon known as 'reversion' can occur if the seawater neutralisation process is not properly controlled. Reversion causes an increase in the pH and dissolved impurity levels of the neutralised effluent, rendering it unsuitable for discharge. It is believed that slow dissolution of components of the red mud residue and compounds formed during the neutralisation process are responsible for reversion. This investigation looked at characterising natural hydrotalcite (Mg6Al2(OH)16(CO3)∙4H2O) and 'Bayer' hydrotalcite (synthesised using the seawater neutralisation process) using a variety of techniques including X-ray diffraction, infrared and Raman spectroscopy, and thermogravimetric analysis. This investigation showed that Bayer hydrotalcite comprises a mixture of 3:1 and 4:1 hydrotalcite structures and exhibits similar chemical characteristics to the 4:1 synthetic hydrotalcite.
Hydrotalcite formed from the seawater neutralisation of Bauxite refinery residues has been found not to cause reversion. Other components in red mud were investigated to determine the cause of reversion and this investigation found three components that contributed to reversion: 1) tricalcium aluminate, 2) hydrocalumite and 3) calcium hydroxide. Increasing the amount of magnesium in the neutralisation process has been found to be successful in reducing reversion.
Abstract:
Transcending traditional national borders, the Internet is an evolving technology that has opened up many new international market opportunities. However, ambiguity remains, with limited research and understanding of how the Internet influences the firm’s internationalisation process components. As a consequence, there has been a call for further investigation of the phenomenon. Thus, the purpose of this study was to investigate the Internet’s impact on the internationalisation process components, specifically, information availability, information usage, interactive communication and international market growth. Analysis was undertaken using structural equation modelling. Findings highlight the mediating impact of the Internet on information and knowledge transference in the internationalisation process. Contributions of the study test conceptualisations and give statistical validation of interrelationships, while illuminating the Internet’s impact on firm internationalisation.
Abstract:
Accurate reliability prediction for large-scale, long-lived engineering assets is a crucial foundation for effective asset risk management and optimal maintenance decision making. However, a lack of failure data for assets that fail infrequently, and changing operational conditions over long periods of time, make accurate reliability prediction for such assets very challenging. To address this issue, we present a Bayesian-Markov approach to reliability prediction using prior knowledge and condition monitoring data. In this approach, Bayesian theory is used to incorporate prior information about failure probabilities and current information about asset health to make statistical inferences, while Markov chains are used to update and predict the health of assets based on condition monitoring data. The prior information can be supplied by domain experts, extracted from previous comparable cases, or derived from basic engineering principles. Our approach differs from existing hybrid Bayesian models, which are normally used to update the parameter estimation of a given distribution, such as the Weibull-Bayesian distribution, or the transition probabilities of a Markov chain. Instead, our new approach can be used to update predictions of failure probabilities when failure data are sparse or nonexistent, as is often the case for large-scale, long-lived engineering assets.
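The general combination described can be sketched in a few lines: a Markov chain propagates a belief over discrete health states, and Bayes' rule conditions that belief on a condition-monitoring observation. The states, transition matrix, and observation likelihoods below are invented for illustration and are not the paper's model.

```python
def markov_predict(belief, P, steps=1):
    """Propagate a belief over health states through the chain:
    belief <- belief @ P, repeated `steps` times."""
    n = len(P)
    for _ in range(steps):
        belief = [sum(belief[i] * P[i][j] for i in range(n)) for j in range(n)]
    return belief

def bayes_update(belief, likelihood):
    """Condition the belief on a condition-monitoring observation:
    posterior[i] ∝ belief[i] * P(observation | state i)."""
    post = [b * l for b, l in zip(belief, likelihood)]
    z = sum(post)
    return [p / z for p in post]
```

Alternating these two steps (predict, then update whenever a monitoring reading arrives) yields a running failure-probability estimate even when no failures have yet been observed, with the prior transition matrix standing in for missing failure data.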
Abstract:
As organizations reach higher levels of business process management maturity, they often find themselves maintaining very large process model repositories, representing valuable knowledge about their operations. A common practice within these repositories is to create new process models, or extend existing ones, by copying and merging fragments from other models. We contend that if these duplicate fragments, a.k.a. exact clones, can be identified and factored out as shared subprocesses, the repository's maintainability can be greatly improved. With this purpose in mind, we propose an indexing structure to support fast detection of clones in process model repositories. Moreover, we show how this index can be used to efficiently query a process model repository for fragments. This index, called RPSDAG, is based on a novel combination of a method for process model decomposition (namely the Refined Process Structure Tree), with established graph canonization and string matching techniques. We evaluated the RPSDAG with large process model repositories from industrial practice. The experiments show that a significant number of non-trivial clones can be efficiently found in such repositories, and that fragment queries can be handled efficiently.
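The indexing idea can be caricatured with a dictionary keyed on a canonical form of each fragment: fragments that map to the same key are exact clones. The sketch below uses a sorted labeled-edge list as the key, which is far weaker than the RPSDAG's combination of the Refined Process Structure Tree with graph canonization; all fragment data are invented.

```python
from collections import defaultdict

def canonical_key(edges):
    """Crude canonical form: the sorted tuple of labeled edges.
    (Works only when node labels identify nodes uniquely.)"""
    return tuple(sorted(edges))

def find_clones(fragments):
    """Group fragment ids by canonical key; any group with more than
    one member is a set of exact clones."""
    index = defaultdict(list)
    for fid, edges in fragments.items():
        index[canonical_key(edges)].append(fid)
    return [sorted(group) for group in index.values() if len(group) > 1]
```

Because the index is keyed, looking up whether a new fragment already exists in the repository is a single dictionary probe rather than a pairwise comparison against every stored model.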
Abstract:
In order to make good decisions about the design of information systems, an essential skill is to understand process models of the business domain the system is intended to support. Yet, little knowledge to date has been established about the factors that affect how model users comprehend the content of process models. In this study, we use theories of semiotics and cognitive load to theorize how model and personal factors influence how model viewers comprehend the syntactical information of process models. We then report on a four-part series of experiments, in which we examined these factors. Our results show that additional semantical information impedes syntax comprehension, and that theoretical knowledge eases syntax comprehension. Modeling experience further contributes positively to comprehension efficiency, measured as the ratio of correct answers to the time taken to provide answers. We discuss implications for practice and research.
Abstract:
Observer failure and feedback instability can occur when some of the sensors of a satellite attitude control system (SACS) fail. A fault diagnosis and isolation (FDI) method based on a fault observer is first introduced to detect and isolate the faulty sensor. Based on the FDI result, the system's state-space equation is transformed into a corresponding triangular canonical form that decouples the normal subsystem from the faulty subsystem. KX fault-tolerant observers for the system in its different modes are then designed and embedded for online monitoring. The outputs of all KX fault-tolerant observers are selected by a switching process, so that the SACS remains partially observable and stable even when some sensors break down. Simulation results demonstrate the effectiveness and superiority of the proposed method.
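As a minimal illustration of observer-based monitoring in general, the scalar Luenberger observer below reconstructs the state of a toy system x_{k+1} = a·x_k from its measurements. The KX fault-tolerant observers of the abstract are considerably more involved; every value here is an assumption made for the sketch.

```python
def run_observer(a=0.9, gain=0.5, x0=1.0, xhat0=0.0, steps=30):
    """Scalar Luenberger observer: xhat_{k+1} = a*xhat_k + gain*(y_k - xhat_k).
    The estimation error contracts by a factor (a - gain) each step,
    so the estimate converges whenever |a - gain| < 1."""
    x, xhat = x0, xhat0
    for _ in range(steps):
        y = x                          # measurement (noise-free toy case)
        xhat = a * xhat + gain * (y - xhat)
        x = a * x
    return x, xhat
```

In a fault-tolerant scheme, a bank of such observers runs in parallel (one per sensor mode) and a switch selects the estimate consistent with the FDI verdict.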
Abstract:
Public participation in the planning and design of major public infrastructure and construction (PIC) projects is crucial to their success, as the interests of different stakeholders can be systematically captured and built into the finalised scheme. However, public participation may not always yield a mutually acceptable solution, especially when the interests of stakeholders are diverse and conflicting. Confrontations and disputes can arise unless the concerns or needs of the community are carefully analysed and addressed. The aim of this paper is to propose a systematic method of analysing stakeholder concerns relating to PIC projects by examining the degree of consensus and/or conflict involved. The results of a questionnaire survey and a series of interviews with different entities are provided, which indicate the existence of a significant divergence of views among stakeholder groups and that conflicts arise when there is a mismatch between people's perceptions concerning money and happiness on the one hand and development and damages on the other. Policy- and decision-makers should strive to resolve at least the majority of conflicts that arise throughout the lifecycle of major PIC projects so as to maximise their chance of success.
Abstract:
Whereas many good examples can be found of the study of urban morphology informing the design of new residential areas in Europe, it is much more difficult to find examples relating to other land uses and outside of Europe. This paper addresses a particular issue, the control and coordination of large and complex development schemes within cities, and, in doing so, considers commercial and mixed-use schemes outside of Europe. It is argued that urban morphology has much to offer for both the design of such development and its implementation over time. Firstly, lessons are drawn from the work of Krier and Rossi in Berlin, the form-based guidance developed in Chelmsford, UK, and the redesign and coordination of the Melrose Arch project in Johannesburg, SA. A recent development at Boggo Road in Brisbane, Australia, is then subjected to a more detailed examination. It is argued that the scheme has been unsatisfactory in terms of both design and implementation. An alternative framework based on historical morphological studies is proposed that would overcome these deficiencies. It is proposed that this points the way to a general approach that could be incorporated within the planning process internationally.
Abstract:
This article provides a tutorial introduction to visual servo control of robotic manipulators. Since the topic spans many disciplines our goal is limited to providing a basic conceptual framework. We begin by reviewing the prerequisite topics from robotics and computer vision, including a brief review of coordinate transformations, velocity representation, and a description of the geometric aspects of the image formation process. We then present a taxonomy of visual servo control systems. The two major classes of systems, position-based and image-based systems, are then discussed in detail. Since any visual servo system must be capable of tracking image features in a sequence of images, we also include an overview of feature-based and correlation-based methods for tracking. We conclude the tutorial with a number of observations on the current directions of the research field of visual servo control.
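The image-based class of systems can be illustrated with the classic proportional law v = -λ(s - s*), here applied directly to an image-plane point feature with the interaction (image Jacobian) matrix taken as the identity for simplicity; a real system estimates that matrix from feature depth and camera parameters. All values below are illustrative.

```python
def ibvs_servo(s, s_star, lam=0.5, iters=20):
    """Proportional image-based visual servo on a point feature:
    error e = s - s*, commanded update s <- s - lam * e, i.e. the
    control law v = -lam * e with an identity interaction matrix."""
    for _ in range(iters):
        s = [si - lam * (si - ti) for si, ti in zip(s, s_star)]
    return s
```

With 0 < lam < 2 the error decays geometrically by the factor (1 - lam) per iteration, which is why a pure proportional gain suffices in this idealized setting.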
Abstract:
This paper presents a behavioral car-following model based on empirical trajectory data that is able to reproduce the spontaneous formation and ensuing propagation of stop-and-go waves in congested traffic. By analyzing individual drivers’ car-following behavior throughout oscillation cycles it is found that this behavior is consistent across drivers and can be captured by a simple model. The statistical analysis of the model’s parameters reveals that there is a strong correlation between driver behavior before and during the oscillation, and that this correlation should not be ignored if one is interested in microscopic output. If macroscopic outputs are of interest, simulation results indicate that an existing model with fewer parameters can be used instead. This is shown for traffic oscillations caused by rubbernecking as observed in the US 101 NGSIM dataset. The same experiment is used to establish the relationship between rubbernecking behavior and the period of oscillations.
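A car-following model "with fewer parameters" of the kind the abstract alludes to is Newell's simplified rule, in which a follower reproduces the leader's trajectory shifted by a time lag τ and a space gap d; whether this is the exact model referred to is an assumption, and the trajectory below is invented.

```python
def newell_follower(leader, tau=1, d=7.0):
    """Newell's simplified car-following: x_f(t) = x_l(t - tau) - d,
    with the follower holding its initial offset for t < tau.
    leader: list of leader positions, one per time step."""
    return [leader[max(0, t - tau)] - d for t in range(len(leader))]
```

Because each vehicle simply time- and space-shifts its leader's trajectory, a speed drop by the first vehicle propagates upstream unchanged, which is the mechanism behind the stop-and-go waves discussed above.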