48 results for Business Process Model Validation
Abstract:
In this paper, we present a top-down approach for integrated process modelling and distributed process execution. The integrated process model can be utilized for global monitoring and visualization, and the distributed process models for local execution. Our main focus in this paper is the presentation of the approach to support automatic generation and linking of distributed process models from an integrated process definition.
Abstract:
This paper presents a methodology for deriving business process descriptions based on the terms of a business contract. The aim is to assist process modellers in structuring collaborative interactions between parties, including their internal processes, to ensure contract-compliant behaviour. The methodology requires a formal model of contracts to facilitate process derivations and to form a basis for contract analysis tools and run-time process execution.
Abstract:
As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers into further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.
Abstract:
This paper addresses the problem of ensuring compliance of business processes, implemented within and across organisational boundaries, with the constraints stated in related business contracts. In order to deal with the complexity of this problem we propose two solutions that allow for systematic and increasingly automated support for addressing two specific compliance issues. One solution provides a set of guidelines for progressively transforming contract conditions into business processes that are consistent with those conditions, thus avoiding violation of the rules in the contract. Another solution compares rules in business contracts with rules in business processes to check for possible inconsistencies. Both approaches rely on a computer-interpretable representation of contract conditions that embodies contract semantics. This semantics is described in terms of a logic-based formalism allowing for the description of obligation, prohibition, permission and violation conditions in contracts. The semantics was based on an analysis of typical building blocks of many commercial, financial and government contracts. The study proved that our contract formalism provides a good foundation for describing key types of conditions in contracts, and it has also given several insights into valuable transformation techniques and formalisms needed to establish better alignment between these two traditionally separate areas of research and endeavour. The study also revealed a number of new areas of research, some of which we intend to address in the near future.
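The second solution described above, checking process behaviour against contract rules, can be illustrated with a toy sketch. This is not the paper's logic-based formalism; the rule and action names are invented, and only two modalities (obligations with deadlines and prohibitions) are shown.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    modality: str                    # "obligation" or "prohibition"
    action: str
    deadline: Optional[int] = None   # step index by which an obligation is due

def check_compliance(rules, trace):
    """Return human-readable violations of `rules` by a process `trace`."""
    violations = []
    for rule in rules:
        if rule.modality == "prohibition":
            for i, action in enumerate(trace):
                if action == rule.action:
                    violations.append(f"step {i}: prohibited action '{action}'")
        elif rule.modality == "obligation":
            window = trace if rule.deadline is None else trace[: rule.deadline + 1]
            if rule.action not in window:
                violations.append(f"obligation '{rule.action}' not fulfilled")
    return violations

# Invented example: the process ships before payment, violating a prohibition.
rules = [
    Rule("obligation", "send_invoice", deadline=2),
    Rule("prohibition", "ship_before_payment"),
]
trace = ["receive_order", "ship_before_payment", "send_invoice"]
print(check_compliance(rules, trace))
```

A real formalism would also track permissions and the conditions that trigger or discharge each obligation; this sketch only shows the comparison step between a rule set and an observed process.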
Abstract:
High-frequency beach water table fluctuations due to wave run-up and run-down have been observed in the field [Waddell, 1976]. Such fluctuations affect the infiltration/exfiltration process across the beach face and the interstitial oxygenation process in the beach ecosystem. Accurate representation of high-frequency water table fluctuations is important in the modeling of (1) the interaction between seawater and groundwater and, more importantly, the effects on swash sediment transport, and (2) the biological activities in the beach ecosystem. Capillarity effects provide a mechanism for high-frequency water table fluctuations. Previous modeling approaches adopted the assumption of saturated flow only and failed to predict the propagation of high-frequency fluctuations in the aquifer. In this paper we develop a modified kinematic boundary condition (KBC) for the water table which incorporates capillarity effects. The application of this KBC in a boundary element model enables the simulation of high-frequency water table fluctuations due to wave run-up. Numerical tests were carried out for a rectangular domain with small-amplitude oscillations; the behavior of the water table responses was found to be similar to that predicted by an analytical solution based on the one-dimensional Boussinesq equation. The model was also applied to simulate the water table response to wave run-up on a sloping beach. The results showed features similar to the water table fluctuations observed in the field. In particular, these fluctuations are standing-wave-like, with the amplitude becoming increasingly damped inland. We conclude that the modified KBC presented here is a reasonable approximation of capillarity effects on beach water table fluctuations. However, further model validation is necessary before the model can confidently be used to simulate high-frequency water table fluctuations due to wave run-up.
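The inland damping behaviour the abstract compares against can be sketched with the classical small-amplitude solution of the linearized 1-D Boussinesq equation under sinusoidal forcing at the beach face. This is only the baseline analytical benchmark, not the paper's modified boundary condition, and all parameter values below are illustrative.

```python
import math

def water_table_fluctuation(x, t, A=0.1, period=10.0, K=1e-3, n=0.3, D=2.0):
    """Damped water table oscillation eta(x, t) from the linearized 1-D
    Boussinesq equation with sinusoidal forcing at x = 0.

    x : distance inland (m), t : time (s), A : forcing amplitude (m),
    period : forcing period (s), K : hydraulic conductivity (m/s),
    n : effective porosity, D : mean saturated aquifer thickness (m).
    """
    omega = 2.0 * math.pi / period
    # k acts as both the decay rate and the wave number of the response
    k = math.sqrt(n * omega / (2.0 * K * D))
    return A * math.exp(-k * x) * math.cos(omega * t - k * x)

# The envelope A * exp(-k * x) decays exponentially inland, matching the
# standing-wave-like, increasingly damped fluctuations described above.
for x in (0.0, 0.5, 1.0, 2.0):
    print(f"x = {x:3.1f} m  eta(x, 0) = {water_table_fluctuation(x, 0.0):+.5f} m")
```

High forcing frequencies give a large k, so in a saturated-flow-only model such fluctuations die out within a short distance of the beach face; capillarity provides the mechanism that lets them propagate further, which is what the modified KBC captures.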
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
In this paper, we consider testing for additivity in a class of nonparametric stochastic regression models. Two test statistics are constructed and their asymptotic distributions are established. We also conduct a small sample study for one of the test statistics through a simulated example. (C) 2002 Elsevier Science (USA).
Abstract:
Medical microbiology and virology laboratories use nucleic acid tests (NAT) to detect genomic material of infectious organisms in clinical samples. Laboratories choose to perform assembled (or in-house) NAT if commercial assays are not available or if assembled NAT are more economical or accurate. One reason commercial assays are more expensive is because extensive validation is necessary before the kit is marketed, as manufacturers must accept liability for the performance of their assays, assuming their instructions are followed. On the other hand, it is a particular laboratory's responsibility to validate an assembled NAT prior to using it for testing and reporting results on human samples. There are few published guidelines for the validation of assembled NAT. One procedure that laboratories can use to establish a validation process for an assay is detailed in this document. Before validating a method, laboratories must optimise it and then document the protocol. All instruments must be calibrated and maintained throughout the testing process. The validation process involves a series of steps including: (i) testing of dilution series of positive samples to determine the limits of detection of the assay and their linearity over concentrations to be measured in quantitative NAT; (ii) establishing the day-to-day variation of the assay's performance; (iii) evaluating the sensitivity and specificity of the assay as far as practicable, along with the extent of cross-reactivity with other genomic material; and (iv) assuring the quality of assembled assays using quality control procedures that monitor the performance of reagent batches before introducing new lots of reagent for testing.
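Step (i) of the validation process above, establishing linearity over a dilution series, can be sketched as a least-squares fit of measured against expected log concentrations. The dilution data below are invented for illustration, and the acceptance thresholds shown are assumptions, not the document's recommendations.

```python
# Hypothetical linearity check for a quantitative NAT over a ten-fold
# dilution series (step (i) above). All values are illustrative.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Expected vs measured log10 copies per reaction (made-up data):
expected = [6.0, 5.0, 4.0, 3.0, 2.0]
measured = [5.95, 5.02, 3.97, 3.05, 1.90]

intercept, slope, r2 = linear_fit(expected, measured)
print(f"slope = {slope:.3f}, r^2 = {r2:.4f}")
# A common (assumed) acceptance rule: slope close to 1 and a high r^2
# (e.g. > 0.98) across the concentration range to be reported.
```

The same dilution series, rerun on separate days, also feeds step (ii): the day-to-day spread of the fitted slope and intercept gives a simple measure of the assay's reproducibility.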
Abstract:
Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers and their resellers promote as best practice, the aim of this research was to determine whether practitioners still embrace conceptual modeling seriously. In addition, what are the most popular techniques and tools used for conceptual modeling? What are the major purposes for which conceptual modeling is used? The study found that the top six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects). Technique use was also found to follow an inverted U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence the decision of analysts to continue to use modeling, viz., communication (using diagrams) to/from stakeholders, internal knowledge (lack of) of techniques, user expectations management, understanding models' integration into the business, and tool/software deficiencies. The highest ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Minimal representations are known to have no redundant elements, and are therefore of great importance. Based on the notions of performance and size indices and measures for process systems, the paper proposes conditions for a process model being minimal in a set of functionally equivalent models with respect to a size norm. Generalized versions of known procedures to obtain minimal process models for a given modelling goal, model reduction based on sensitivity analysis, and incremental model building are proposed and discussed. The notions and procedures are illustrated and compared on a simple example, a nonlinear fermentation process with different modelling goals, and on a heat exchanger modelling case study. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
The aim of the study presented was to implement a process model to simulate the dynamic behaviour of a pilot-scale process for anaerobic two-stage digestion of sewage sludge. The model was implemented to support experimental investigations of the anaerobic two-stage digestion process. The model concept, implemented in the simulation software package MATLAB(TM)/Simulink(R), is a derivative of the IWA Anaerobic Digestion Model No. 1 (ADM1) that has been developed by the IWA task group for mathematical modelling of anaerobic processes. In the present study the original model concept has been adapted and applied to replicate a two-stage digestion process. Testing procedures, including balance checks and 'benchmarking' tests, were carried out to verify the accuracy of the implementation. These combined measures ensured a faultless model implementation without numerical inconsistencies. Parameters for both the thermophilic and the mesophilic process stages have been estimated successfully using data from lab-scale experiments described in the literature. Due to the high number of parameters in the structured model, it was necessary to develop a customised procedure that limited the range of parameters to be estimated. The accuracy of the optimised parameter sets has been assessed against experimental data from pilot-scale experiments. Under these conditions, the model predicted the dynamic behaviour of a pilot-scale two-stage digestion process reasonably well. (C) 2004 Elsevier Ltd. All rights reserved.
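The structure of such a two-stage simulation can be sketched in a deliberately minimal form: two completely mixed reactors in series, each degrading a lumped substrate with Monod kinetics. ADM1 itself tracks dozens of state variables and processes; this sketch only illustrates the staged simulation loop, and every parameter value is invented.

```python
def step(stage, feed, dt):
    """Advance one completely mixed stage by one explicit-Euler step (days)."""
    D = 1.0 / stage["HRT"]                               # dilution rate (1/d)
    mu = stage["mu_max"] * stage["S"] / (stage["Ks"] + stage["S"])  # Monod growth
    dS = D * (feed - stage["S"]) - mu * stage["X"] / stage["Y"]     # substrate balance
    dX = (mu - stage["kd"] - D) * stage["X"]                        # biomass balance
    stage["S"] = max(stage["S"] + dS * dt, 0.0)
    stage["X"] = max(stage["X"] + dX * dt, 0.0)

def simulate(days=30.0, dt=0.001, S_in=10.0):
    """Two stages in series: stage 1 (short HRT) feeds stage 2 (long HRT)."""
    stages = [
        {"S": S_in, "X": 0.5, "HRT": 2.0,  "mu_max": 3.0, "Ks": 0.5, "Y": 0.1, "kd": 0.02},
        {"S": S_in, "X": 0.5, "HRT": 15.0, "mu_max": 2.0, "Ks": 0.3, "Y": 0.1, "kd": 0.02},
    ]
    for _ in range(int(days / dt)):
        step(stages[0], S_in, dt)
        step(stages[1], stages[0]["S"], dt)   # stage-1 effluent is stage-2 feed
    return stages

stages = simulate()
print(f"stage 1 effluent: {stages[0]['S']:.3f} g/L, "
      f"stage 2 effluent: {stages[1]['S']:.3f} g/L")
```

The balance checks mentioned in the abstract correspond, in this simplified setting, to verifying that substrate removed and biomass produced are consistent with the yield Y at steady state.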
Abstract:
Validation procedures play an important role in establishing the credibility of models, improving their relevance and acceptability. This article reviews the testing of models relevant to environmental and natural resource management with particular emphasis on models used in multicriteria analysis (MCA). Validation efforts for a model used in a MCA catchment management study in North Queensland, Australia, are presented. Determination of face validity is found to be a useful approach in evaluating this model, and sensitivity analysis is useful in checking the stability of the model. (C) 2000 Elsevier Science Ltd. All rights reserved.
Abstract:
The ontological analysis of conceptual modelling techniques is increasingly popular. Related research has explored not only the ontological deficiencies of classical techniques such as ER or UML, but also those of business process modelling techniques such as ARIS, and even of Web services standards such as BPEL4WS. While the selected ontologies are reasonably mature, the actual process of an ontological analysis still lacks rigor. The current procedure leaves significant room for individual interpretation and is one reason for criticism of the entire ontological analysis. This paper proposes a procedural model for ontological analysis based on the use of meta models, the involvement of more than one coder, and metrics. The model is explained with examples from various ontological analyses.
Abstract:
Current initiatives in the field of Business Process Management (BPM) strive for the development of a BPM standard notation by pushing the Business Process Modeling Notation (BPMN). However, such a proposed standard notation needs to be carefully examined. Ontological analysis is an established theoretical approach to evaluating modelling techniques. This paper reports on the outcomes of an ontological analysis of BPMN and explores identified issues by reporting on interviews conducted with BPMN users in Australia. Complementing this analysis we consolidate our findings with previous ontological analyses of process modelling notations to deliver a comprehensive assessment of BPMN.
Abstract:
E-Business Information Systems (eBIS) are Information Systems (IS) that support organizations in realizing their e-Business strategy, resulting in various benefits. Those systems therefore strongly focus on the fulfilment of e-Business requirements. In order to realise the expected benefits, organizations need to turn to their eBIS and measure the maturity of those systems. In doing so, they need to identify the status of those systems with regard to their suitability to support the e-Business strategy, while also identifying required IS improvements. In our research we aim to develop a maturity model, dedicated particularly to the area of e-Business Information Systems, which can be used easily and objectively to measure the current maturity of any Information System that supports e-Business. This research-in-progress paper presents the initial results of our research.