853 results for Business process modelling
Abstract:
Besides classical criteria such as cost and overall organizational efficiency, an organization’s ability to be creative and to innovate is of increasing importance in markets that are overwhelmed with commodity products and services. Business Process Management (BPM), as an approach to model, analyze, and improve business processes, has been successfully applied not only to enhance performance and reduce cost but also to facilitate business imperatives such as risk management and knowledge management. Can BPM also facilitate the management of creativity? We can find many examples where enterprises unintentionally reduced or even killed creativity and innovation for the sake of control, performance, and cost reduction. Based on the experience we have gained in case studies with organizations from the creative industries (film industry, visual effects production, etc.), we believe that BPM can be a facilitator providing the glue between creativity management and well-established business principles. In this article we introduce the notions of creativity-intensive processes and pockets of creativity as new BPM concepts. We further propose a set of exemplary strategies that enable process owners and process managers to achieve control without sacrificing creativity. Our aim is to set the baseline for further discussions on what we call creativity-oriented BPM.
Abstract:
Process choreographies describe interactions between different business partners and the dependencies between these interactions. While different proposals have been made for capturing choreographies at an implementation level, it remains unclear how choreographies should be described on a conceptual level. While the Business Process Modeling Notation (BPMN) is already in use for describing choreographies in terms of interconnected interface behavior models, this paper introduces interaction modeling using BPMN. Such interaction models do not suffer from incompatibility issues and are better suited for human modelers. BPMN extensions are proposed, and a mapping from interaction models to interface behavior models is presented.
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as the structured error handling and repetition constructs provided by contemporary process modeling notations such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
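The subprocess-extraction step described in this abstract can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the event-log format, the correlation attribute name `order_id`, and the activity sets are all assumed for the example.

```python
# Illustrative sketch (not the paper's implementation): splitting a flat
# event log into a parent log and a subprocess log, assuming subprocess
# events are correlated to their parent via a data attribute.

def split_log(events, sub_activities, corr_attr="order_id"):
    """Partition events into a parent log and a subprocess log.

    events: list of dicts with at least 'case' and 'activity' keys, plus
    the (hypothetical) correlation attribute `corr_attr`.
    sub_activities: activities identified as belonging to the subprocess.
    """
    parent_log, sub_log = [], []
    for ev in events:
        if ev["activity"] in sub_activities:
            # Re-key subprocess events on the correlation attribute so the
            # subprocess can then be mined as its own flat log.
            sub_log.append({**ev, "case": ev[corr_attr]})
        else:
            parent_log.append(ev)
    return parent_log, sub_log

events = [
    {"case": "c1", "activity": "Receive order", "order_id": "o1"},
    {"case": "c1", "activity": "Pick item",     "order_id": "o1"},
    {"case": "c1", "activity": "Pack item",     "order_id": "o1"},
    {"case": "c1", "activity": "Ship",          "order_id": "o1"},
]

parent, sub = split_log(events, {"Pick item", "Pack item"})
```

Each resulting log could then be fed to any flat discovery algorithm, as the abstract describes.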
Abstract:
Recent literature on Enterprise System (ES) implementation projects highlights the importance of Knowledge Integration (KI) for implementation success. The fundamental characteristics of ES (integration of modules, a business process view, and aspects of information transparency) necessitate that all frequent end-users share a reasonable amount of common knowledge and integrate their knowledge to yield new knowledge. Unfortunately, the importance of KI is often overlooked, and little is known about the role of KI in ES success. In this chapter, the authors study the impact of KI on ES success in the post-implementation phase, in support of organizations' returns on their ES investments. They adopt the post-implementation segment of ES utilization to explore whether the KI approach is causally linked to ES success. The research model was tested in a multi-industry sample in Malaysia, with data gathered from managerial and operational employees spread across six large organizations. Consistent with knowledge-based theory, the results show that KI was valid and significantly related to the outcome of ES that relates to an organization's performance, which the authors refer to as ES success. The positive impact of KI on ES success leads the authors to highlight the importance of ontological KI in the complexity of the ES environment. The authors believe that focusing on an ontology through the KI perspective can make significant contributions to current ES problems.
Abstract:
Business Process Management has substantially matured over the last two decades. The techniques, methods and systems available to scope, model, analyze, implement, execute, monitor and even mine a process have been scientifically researched and can in most cases be deployed in practice. In fact, many of these BPM capabilities are nowadays a commodity. However, an opportunity-rich environment and rapidly emerging digital disruptions require new BPM capabilities. In light of this context, this paper proposes three future research and development directions for BPM academics and professionals. First, Ambidextrous BPM demands a shift of focus from exploitative to explorative BPM. Second, Value-driven BPM postulates a stronger focus on desired outcomes as opposed to available BPM methods. Third, Customer Process Management suggests complementing the dominant internal view of BPM with a stronger, design-inspired view on the process experiences of external stakeholders.
Abstract:
This research suggests information technology (IT) governance structures for managing cloud computing services. Interest in acquiring IT resources as a utility from the cloud computing environment is gaining momentum. Cloud computing services present organizations with opportunities to manage their IT expenditure on an ongoing basis, and with access to modern IT resources with which to innovate and manage their continuity. However, cloud computing services are no silver bullet. Organizations need appropriate governance structures and policies in place to manage these services. The decisions made by these governance structures will ensure their effective management, facilitating a better fit of cloud computing services into organizations' existing processes to achieve business (process-level) and financial (firm-level) objectives. Using a triangulation approach, we suggest four governance structures for managing cloud computing services: a chief cloud officer, a cloud management committee, a cloud service facilitation centre, and a cloud relationship centre. We propose that these governance structures relate directly to organizations' cloud computing services-related business objectives, and indirectly to their cloud computing services-related financial objectives. Field survey data from actual and prospective cloud computing service adopters suggest that the proposed governance structures would indeed contribute directly to cloud computing-related business objectives and indirectly to cloud computing-related financial objectives.
Abstract:
In May/June 2014, the Program Committee Chairs of BPM’15 conducted a survey of present and past attendees and submitters to the BPM conference to gather feedback on the general perception of the conference. The survey is available at http://survey.qut.edu.au/f/180586/6bb1/. In particular, the survey included questions about the reputation of the conference, the reasons why survey participants submitted papers, and whether they plan to submit to BPM’15, and solicited input on a number of suggested changes and additions to the conduct of the conference series.
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations: observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
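The general idea of searching execution scenarios with a cost-based objective can be illustrated with a toy genetic algorithm. This is a sketch under assumed figures (the resource names, durations, rates and GA parameters are all invented), not the paper's cost structure or algorithm.

```python
import random

# Toy GA sketch (assumed figures, not the paper's method): each "scenario"
# assigns a resource to each of TASKS tasks; the objective is a weighted
# sum of total duration and total resource cost.

DURATION = {"fast": 1.0, "slow": 3.0}   # hypothetical task durations (hours)
RATE = {"fast": 4.0, "slow": 1.0}       # hypothetical hourly rates
TASKS = 6

def cost(scenario, w_time=1.0, w_cost=1.0):
    time = sum(DURATION[r] for r in scenario)
    money = sum(DURATION[r] * RATE[r] for r in scenario)
    return w_time * time + w_cost * money

def evolve(pop_size=20, generations=40, seed=42):
    rng = random.Random(seed)
    pop = [[rng.choice(["fast", "slow"]) for _ in range(TASKS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                    # cheapest scenarios first
        survivors = pop[: pop_size // 2]      # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, TASKS)
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < 0.2:            # point mutation
                i = rng.randrange(TASKS)
                child[i] = rng.choice(["fast", "slow"])
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

With these numbers the "fast" resource dominates on the combined objective, so the search converges toward assigning it everywhere; in a realistic setting the weights `w_time` and `w_cost` would encode the time/cost trade-off the abstract mentions.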
Abstract:
Enterprise Resource Planning (ERP) systems are integrated, enterprise-wide standard information systems that automate all aspects of an organisation's business processes. The ERP philosophy is that business systems incorporating sales, marketing, manufacturing, distribution, personnel and finance modules can be supported by a single integrated system, with all of the company's data captured in a central database. The ERP packages of vendors such as SAP, Baan, J.D. Edwards and Intentia represent more than a common systems platform for a business: they prescribe information blueprints of how an organisation's business processes should operate. In this paper, the scale and strategic importance of ERP systems are identified and the problem of ERP implementation is defined. Five company examples are analysed using a Critical Success Factors (CSFs) theoretical framework. The paper offers a framework for managers which provides the basis for developing an ERP implementation strategy. The case analysis identifies different approaches to ERP implementation, highlights the critical role of legacy systems in influencing the implementation process, and identifies the importance of business process change and software configuration in addition to factors already cited in the literature, such as top management support and communication. The implications of the results and future research opportunities are outlined.
Abstract:
A rapidly changing business environment and legacy IT problems have resulted in many organisations implementing standard package solutions. This 'common systems' approach establishes a common IT and business process infrastructure within organisations, and its increasing dominance raises several important strategic issues: to what extent do common systems impose common business processes and management systems on competing firms, and what is the source of competitive advantage if the majority of firms employ almost identical information systems and business processes? A theoretical framework based on research into legacy systems and earlier IT strategy literature is used to analyse three case studies in the manufacturing, chemical and IT industries. It is shown that the organisations are treating common systems as the core of their ability to manage business transactions. To achieve competitive advantage they are clothing these common systems with information systems designed to capture information about competitors, customers and suppliers, and to provide a basis for sharing knowledge within the organisation and ultimately with economic partners. The relevance of these approaches to other organisations and industries is analysed, and an attempt is made at outlining the strategic options open to firms beyond the implementation of common business systems.
Abstract:
Customer Relationship Management (CRM) packaged software has become a key contributor to attempts at aligning business and IT strategies in recent years. Throughout the 1990s there was, in many organisations' strategies, a shift from the need to manage transactions toward relationship management. Where Enterprise Resource Planning packages dominated the era of transaction management, CRM packages lead in regard to relationships. At present, balanced views of CRM packages are scantly presented; discussion instead relies on vendor rhetoric. This paper uses case study research to analyse some of the issues associated with CRM packages. These issues include the limitations of CRM packages, the need for a relationship orientation and the problems of a dominant management perspective of CRM. It is suggested that these issues could be more readily accommodated by organisational detachment from beliefs in IT as utopia, consideration of prior IS theory and practice, and a more informed approach to CRM package selection.
Abstract:
Enterprise Resource Planning (ERP) software is the dominant strategic platform for supporting enterprise-wide business processes. However, it has been criticised for being inflexible and not meeting specific organisation and industry requirements. An alternative, Best of Breed (BoB), integrates components of standard package and/or custom software. The objective is to develop enterprise systems that are more closely aligned with the business processes of an organisation. A case study of a BoB implementation facilitates a comparative analysis of the issues associated with this strategy and the single-vendor ERP alternative. The paper illustrates the differences in complexity of implementation, levels of functionality, business process alignment potential and associated maintenance.
Abstract:
Through the application of process mining, valuable evidence-based insights can be obtained about business processes in organisations. As a result, the field has seen increased uptake in recent years, as evidenced by success stories and growing tool support. However, despite this impact, current performance analysis capabilities remain somewhat limited in the context of information-poor event logs; for example, natural daily and weekly patterns are not considered. In this paper a new framework for analysing event logs is defined, based on the concept of the event gap. The framework allows for a systematic approach to sophisticated performance-related analysis of event logs containing varying degrees of information. The paper formalises a range of event gap types and then presents an implementation as well as an evaluation of the proposed approach.
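The basic building block of such an analysis, the gap between consecutive events of a case, can be sketched as follows. The log format and timestamps are assumed for illustration; this is not the paper's framework, which formalises a richer range of gap types.

```python
from datetime import datetime

# Sketch (assumed log format, not the paper's framework): computing the
# gap between consecutive events of each case in an event log.

def event_gaps(log):
    """log: list of (case_id, activity, ISO-8601 timestamp) tuples.
    Returns, per case, the gaps in seconds between consecutive events."""
    by_case = {}
    # ISO-8601 strings sort chronologically, so sorting by (case, timestamp)
    # orders each case's events in time.
    for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        by_case.setdefault(case, []).append(datetime.fromisoformat(ts))
    return {
        case: [(b - a).total_seconds() for a, b in zip(times, times[1:])]
        for case, times in by_case.items()
    }

log = [
    ("c1", "register", "2024-03-01T09:00:00"),
    ("c1", "approve",  "2024-03-01T09:30:00"),
    ("c1", "pay",      "2024-03-01T11:30:00"),
]

gaps = event_gaps(log)  # {"c1": [1800.0, 7200.0]}
```

On top of such raw gaps, one could then factor in calendar context (e.g. daily and weekly patterns), which is exactly the kind of refinement the abstract motivates.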
Abstract:
The question of what makes for good BPM is often raised. A recent call from Paul Harmon on the BPTrends Discussion LinkedIn group for key issues in BPM received 189 answers within two months, with additional answers still appearing. I teamed up with a number of BPM researchers and practitioners to bring together our joint experience in a BPM workshop at the University of Liechtenstein in 2013, where we developed ten principles of good BPM, later published in the Business Process Management Journal (vom Brocke et al., 2014). The paper, which has received considerable attention in academia, was ranked the journal's most downloaded paper in the month it was published. Slides on Slideshare that provide a brief summary of the paper have been accessed more than 3,000 times since they were first put online in March 2014. Given the importance of the topic (what makes for good BPM) and the positive response to the ten principles, I wrote this note with the co-authors of the original BPMJ paper to outline the ten principles and illustrate how to use them in practice. We invite all readers to engage in this discussion via any channel they find appropriate.
Abstract:
Designed for undergraduate and postgraduate students, academic researchers and industrial practitioners, this book provides comprehensive case studies on numerical computing of industrial processes and step-by-step procedures for conducting industrial computing. It assumes minimal knowledge of numerical computing and computer programming, making it easy to read, understand and follow. Topics discussed include fundamentals of industrial computing, finite difference methods, the Wavelet-Collocation Method, the Wavelet-Galerkin Method, High Resolution Methods, and comparative studies of the various methods. These are discussed using examples of carefully selected models from real processes of industrial significance. The step-by-step procedures in all these case studies can be easily applied to other industrial processes without a need for major changes, and thus provide readers with useful frameworks for the applications of engineering computing in fundamental research problems and practical development scenarios.
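As a taste of the simplest family of techniques the book covers, a central finite difference approximates a second derivative from three nearby function values. This is a generic textbook sketch, not an excerpt from the book.

```python
import math

# Central finite difference for a second derivative:
#   f''(x) ≈ (f(x - h) - 2 f(x) + f(x + h)) / h**2
# The truncation error is O(h**2); h must not be so small that
# floating-point cancellation dominates.

def second_derivative(f, x, h=1e-4):
    return (f(x - h) - 2.0 * f(x) + f(x + h)) / h**2

approx = second_derivative(math.sin, 1.0)
exact = -math.sin(1.0)   # (sin x)'' = -sin x
```

The same stencil, applied at every grid point, is what turns a spatial derivative in an industrial process model into the algebraic system that finite difference methods solve.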