853 results for Business Process Management, BPM life cycle, quality, root cause analysis
Abstract:
With increasing pressure to provide environmentally responsible infrastructure products and services, stakeholders are placing significant focus on the early identification of the financial viability and outcomes of infrastructure projects. Traditionally, there has been an imbalance between sustainability measures and project budget. On one hand, the industry tends to apply a first-cost mentality when developing infrastructure projects. On the other, environmental experts and technology innovators often push for the greenest products and systems without much concern for cost. This situation is changing quickly as the industry comes under pressure to continue to return a profit while better adapting to current and emerging global issues of sustainability. For the infrastructure sector to contribute to sustainable development, it will need to increase value and efficiency. There is thus a great need for tools that enable decision makers to evaluate competing initiatives and identify the most sustainable approaches to procuring infrastructure projects. To ensure that these objectives are achieved, the concept of life-cycle costing analysis (LCCA) will play a significant role in the economics of an infrastructure project. Recently, a few research initiatives have applied LCCA models to road infrastructure, focusing on the traditional economics of a project. There is little coverage of life-cycle costing as a method to evaluate the criteria and assess the economic implications of pursuing sustainability in road infrastructure projects. To rectify this problem, this paper reviews the theoretical basis of previous LCCA models before discussing their inability to account for sustainability indicators in road infrastructure projects. It then introduces ongoing research aimed at developing a new model that integrates new cost elements based on sustainability indicators with the traditional and proven LCCA approach. It is expected that the research will produce a working model for sustainability-based life-cycle cost analysis.
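A conventional LCCA of the kind the paper treats as its baseline reduces all costs incurred over a project's life to a single present value. The sketch below is a minimal illustration of that computation; the function names, discount rate and cost figures are assumptions for illustration, not taken from the paper. A sustainability-based extension would add further discounted cost elements for the sustainability indicators.

```python
# Minimal sketch of a traditional LCCA computation: all cost categories are
# discounted to present value and summed. Names and figures are illustrative only.

def present_value(amount, discount_rate, year):
    """Discount a future cost to its present value."""
    return amount / (1.0 + discount_rate) ** year

def life_cycle_cost(initial_cost, annual_costs, discount_rate):
    """Initial cost plus the discounted stream of future costs.

    annual_costs maps a year (1, 2, ...) to the cost incurred in that year,
    e.g. maintenance, rehabilitation or user delay costs.
    """
    return initial_cost + sum(
        present_value(cost, discount_rate, year)
        for year, cost in annual_costs.items()
    )

if __name__ == "__main__":
    # Hypothetical road project: construction now, maintenance in later years.
    costs = {5: 200_000, 10: 350_000, 15: 200_000}
    print(f"Life-cycle cost: {life_cycle_cost(2_500_000, costs, 0.04):,.0f}")
```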
Abstract:
Public awareness and the nature of highway construction works demand that sustainability measures be first on the development agenda. However, in the current economic climate, individual volition and enthusiasm for such high capital investments do not present as strong a case for decision making as the financial picture of pursuing sustainability. Some stakeholders consider sustainability to be extra work that costs additional money. Nevertheless, stakeholders have realised its importance in infrastructure development, and they are keen to identify the available alternatives and their financial implications on a life-cycle basis. Highway infrastructure development is a complex process that requires expertise and tools to evaluate investment options, such as environmentally sustainable features for road and highway development. Life-cycle cost analysis (LCCA) is a valuable approach to investment decision making for construction works. However, LCCA applications in highway development are still limited. Current models, for example, focus on economic issues alone and do not deal with sustainability factors, which are more difficult to quantify and encapsulate in estimation modules. This paper reports on research that identifies sustainability-related factors in highway construction projects, in the quantitative and qualitative forms of a multi-criteria analysis. These factors are then incorporated into past and proven LCCA models to produce a new long-term decision support model. The research has, via questionnaires, model building, the analytic hierarchy process (AHP) and case studies, identified, evaluated and processed highway sustainability-related cost elements. These cost elements need to be verified by industry before being integrated into the further development of the model. The Australian construction industry will then have a practical tool to evaluate investment decisions that provide an optimum balance between financial viability and sustainability deliverables.
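The analytic hierarchy process mentioned above derives weights for competing criteria from pairwise comparison judgements. The following is a minimal sketch of the standard column-normalisation approximation of the priority vector; the criteria and judgements are hypothetical and not drawn from the reported questionnaire.

```python
# Minimal AHP sketch: approximate priority weights from a pairwise comparison
# matrix by averaging its normalised columns. Criteria and judgements are
# purely illustrative.

def ahp_weights(matrix):
    """Return approximate priority weights for a reciprocal comparison matrix."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    normalised = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(normalised[i]) / n for i in range(n)]

if __name__ == "__main__":
    criteria = ["initial cost", "maintenance cost", "environmental impact"]
    # comparisons[i][j] = how strongly criterion i is preferred over criterion j
    comparisons = [
        [1.0, 2.0, 0.5],
        [0.5, 1.0, 0.25],
        [2.0, 4.0, 1.0],
    ]
    for name, weight in zip(criteria, ahp_weights(comparisons)):
        print(f"{name}: {weight:.2f}")
```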
Abstract:
Creative processes, for instance the development of visual effects or computer games, are increasingly becoming part of the agenda of information systems researchers and practitioners. Such processes derive their managerial challenges from the fact that they comprise both well-structured, transactional parts and creative parts. The latter often cannot be precisely specified in terms of control flow, required resources, and outcome. The processes' high uncertainty sets boundaries for the application of traditional business process management concepts, such as process automation, process modeling, process performance measurement, and risk management. Organizations must thus exercise caution when it comes to managing creative processes and supporting them with information technology. This, in turn, requires a profound understanding of the concept of creativity in business processes. In response, the present paper introduces a framework for conceptualizing creativity within business processes. The conceptual framework describes three types of uncertainty and constraints as well as the interrelationships among them. The study is grounded in the findings from three case studies conducted in the film and visual effects industry. Moreover, we provide initial evidence for the framework's validity beyond this narrow focus. The framework is intended to serve as a sensitizing device that can guide further information systems research on creativity-related phenomena.
Abstract:
This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models - typically representing variants of the same underlying process - with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
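At the level of shared fragments, the intuition behind merged models and digests resembles a union versus a recurrence filter over the fragments of the input variants. The toy sketch below illustrates only that set-level intuition; the paper's algorithms operate on process graphs, and the fragment names here are made up.

```python
# Toy illustration of the merged-model / digest intuition over fragment sets.
# The actual algorithms work on process graphs; fragment names are invented.
from collections import Counter

variants = {
    "claims_v1": {"register claim", "assess damage", "approve payment"},
    "claims_v2": {"register claim", "assess damage", "reject claim"},
    "claims_v3": {"register claim", "approve payment", "notify customer"},
}

# Merged model: subsumes every fragment occurring in any variant (union).
merged = set().union(*variants.values())

# Digest: the most recurring fragments across the collection.
occurrences = Counter(f for frags in variants.values() for f in frags)
digest = {f for f, count in occurrences.items() if count >= 2}

print("merged:", sorted(merged))
print("digest:", sorted(digest))
```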
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to collect numerous business process models. Such models may be linked with each other or mutually overlap, supersede one another and evolve over time. Moreover, they may be represented at different abstraction levels depending on the target audience and modeling purpose, and may be available in multiple languages (e.g. due to company mergers). Thus, it is common that organizations struggle with keeping track of their process models. This demonstration introduces AProMoRe (Advanced Process Model Repository) which aims to facilitate the management of (large) process model collections.
Abstract:
Business process model repositories capture precious knowledge about an organization or a business domain. In many cases, these repositories contain hundreds or even thousands of models and they represent several man-years of effort. Over time, process model repositories tend to accumulate duplicate fragments, as new process models are created by copying and merging fragments from other models. This calls for methods to detect duplicate fragments in process models that can be refactored as separate subprocesses in order to increase readability and maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
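The core idea behind such a clone index is that each fragment is keyed by a canonical representation, so duplicates land in the same bucket. The sketch below illustrates that idea under simplifying assumptions; the canonicalisation and repository contents are invented and do not reproduce the paper's indexing structure.

```python
# Minimal sketch of duplicate-fragment detection: fragments sharing the same
# canonical form are reported as clone groups. Data and canonicalisation are
# illustrative; the paper's index targets repositories with hundreds of models.
from collections import defaultdict

def canonical_form(edges):
    """Order-insensitive canonical key for a fragment's labelled edges."""
    return tuple(sorted(edges))

repository = {
    ("order", "F1"): [("receive order", "check stock"), ("check stock", "ship goods")],
    ("return", "F7"): [("check stock", "ship goods"), ("receive order", "check stock")],
    ("invoice", "F3"): [("issue invoice", "record payment")],
}

index = defaultdict(list)
for fragment_id, edges in repository.items():
    index[canonical_form(edges)].append(fragment_id)

clone_groups = [ids for ids in index.values() if len(ids) > 1]
print(clone_groups)  # fragments F1 and F7 are clones of each other
```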
Abstract:
Recent years have seen an increased uptake of business process management technology in industry. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models; thus, finding these models and adapting them may be more effective and less error-prone than developing them from scratch. Since process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. To make our approach more applicable, we consider the semantic similarity between labels. Experiments are conducted to demonstrate that our approach is efficient.
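Taking semantic label similarity into account means a query label should also match tasks whose labels are synonymous rather than identical. The toy sketch below shows only that matching step; the synonym table and models are hypothetical, and the paper's contribution is the index that avoids scanning every model in the repository.

```python
# Toy sketch of label matching with semantic similarity: a query label matches
# a model if the model contains an identical or synonymous task label.
# Synonym table and models are illustrative only.

SYNONYMS = {
    "approve invoice": {"authorise invoice", "sign off invoice"},
    "ship goods": {"dispatch goods"},
}

def labels_match(query_label, model_label):
    if query_label == model_label:
        return True
    return model_label in SYNONYMS.get(query_label, set())

models = {
    "P1": {"receive order", "authorise invoice", "ship goods"},
    "P2": {"receive order", "reject order"},
}

def retrieve(query_label):
    return [name for name, labels in models.items()
            if any(labels_match(query_label, label) for label in labels)]

print(retrieve("approve invoice"))  # ['P1'] despite the differing label
```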
Abstract:
As order dependencies between process tasks can become complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with the correction of the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Via this technique a number of process model alternatives are identified that resolve one or more errors in the original model. The technique has been implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
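Simulated annealing explores candidate edits to the model, occasionally accepting changes that temporarily make things worse so the search can escape local optima. The skeleton below sketches such a loop where the objective is the number of remaining behavioural errors; the edit operators and the soundness check are placeholders to be supplied by a repair tool, not the paper's actual operators.

```python
# Generic simulated-annealing skeleton for model repair: the objective counts
# remaining behavioural errors, and candidate edits perturb the model.
# count_errors and random_edit are placeholders for a soundness checker and
# edit operators; they are not the paper's concrete technique.
import math
import random

def anneal(model, count_errors, random_edit,
           start_temp=10.0, cooling=0.95, steps=1000):
    current, current_cost = model, count_errors(model)
    best, best_cost = current, current_cost
    temp = start_temp
    for _ in range(steps):
        candidate = random_edit(current)
        cost = count_errors(candidate)
        # Always accept improvements; accept worse candidates with a
        # probability that shrinks as the temperature drops.
        if cost < current_cost or random.random() < math.exp((current_cost - cost) / temp):
            current, current_cost = candidate, cost
        if current_cost < best_cost:
            best, best_cost = current, current_cost
        if best_cost == 0:  # a sound alternative has been found
            break
        temp *= cooling
    return best, best_cost
```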
Abstract:
We all live in a yellow submarine… When I go to work in the morning, in the office building that hosts our BPM research group, on the way up to our level I come by this big breakout room that hosts a number of computer scientists, working away at the next generation software algorithms and iPad applications (I assume). I have never actually been in that room, but every now and then the door is left ajar for a while and I can spot couches, lots (I mean, lots!) of monitors, the odd scientist, a number of Lara Croft posters, and the usual room equipment you’d probably expect from computer scientists (and, no, it’s not like that evil Dennis guy from the Jurassic Park movie, buried in chips, coke, and flickering code screens… It’s also not like the command room from the Nebuchadnezzar, Neo’s hovercraft in the Matrix movies, although I still strongly believe these green lines of code make a good screensaver).
Abstract:
One of the prominent topics in Business Service Management is business models for (new) services. Business models are useful for service management and engineering as they provide a broader and more holistic perspective on services. Business models are particularly relevant for service innovation, as this requires paying attention to the business models that make new services viable, and business model innovation can drive the innovation of new and established services. Before we can look at business models for services, we first need to understand what business models are. This is not straightforward, as business models are still not well understood and the knowledge about them is fragmented across different disciplines, such as information systems, strategy, innovation, and entrepreneurship. This whitepaper, ‘Understanding business models,’ introduces readers to business models. It contributes to enhancing the understanding of business models, in particular their conceptualisation, by discussing and integrating business model definitions, frameworks and archetypes from different disciplines. After reading this whitepaper, the reader will have a well-developed understanding of what business models are and how the concept is sometimes interpreted and used in different ways. It will help the reader in assessing their own understanding of business models and that of others. This will contribute to a better and more beneficial use of business models, an increase in shared understanding, and making it easier to work with business model techniques and tools.
Abstract:
Business transformations are large-scale organizational change programs that, evidence suggests, are often unsuccessful. Our interest is in identifying the management capabilities required for the successful execution of these projects. We advance a service-oriented view of the enterprise, which suggests that different management services need to be identified and integrated in order to execute a business transformation. To identify those management services that require integration, we conducted an exploratory empirical study of the demand for management services in the US and Asia, and we show that two archetypes of management services exist in business transformation initiatives: transactional and transformational management services. We identify the relevant set of transactional and transformational services and discuss what the demand for these services implies for the execution of business transformations.
Abstract:
Real-world business processes rely on the availability of scarce, shared resources, both human and non-human. Current workflow management systems support allocation of individual human resources to tasks but lack support for the full range of resource types used in practice, and the inevitable constraints on their availability and applicability. Based on past experience with resource-intensive workflow applications, we derive generic requirements for a workflow system which can use its knowledge of resource capabilities and availability to help create feasible task schedules. We then define the necessary architecture for implementing such a system and demonstrate its practicality through a proof-of-concept implementation. This work is presented in the context of a real-life surgical care process observed in a number of German hospitals.
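The essence of such resource-aware scheduling is checking a task's resource requirements against resource capabilities and availability before committing it to a time slot. The sketch below is a deliberately simplified feasibility check; the resource model and data are assumptions for illustration, not the system's actual architecture.

```python
# Simplified feasibility check: a task can be scheduled in a time slot only if
# every required capability is offered by a resource that is free in that slot.
# The data model is illustrative, not the workflow system's actual one.

resources = {
    "surgeon_a": {"capabilities": {"surgery"}, "busy_slots": {9, 10}},
    "or_room_1": {"capabilities": {"operating_room"}, "busy_slots": {10}},
}

def feasible(required_capabilities, slot):
    """True if every required capability is covered by a free resource."""
    def available(capability):
        return any(capability in r["capabilities"] and slot not in r["busy_slots"]
                   for r in resources.values())
    return all(available(c) for c in required_capabilities)

print(feasible({"surgery", "operating_room"}, 11))  # True: both free at 11
print(feasible({"surgery", "operating_room"}, 10))  # False: both busy at 10
```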
Abstract:
This paper proposes a novel approach for identifying risks in executable business processes and detecting them at run time. The approach considers risks in all phases of the business process management lifecycle and is realized via a distributed, sensor-based architecture. At design time, sensors are defined to specify risk conditions which, when fulfilled, are a likely indicator that a fault will occur. Both historical and current execution data can be used to compose such conditions. At run time, each sensor independently notifies a sensor manager when a risk is detected. In turn, the sensor manager interacts with the monitoring component of a process automation suite to present the results to the user, who may take remedial actions. The proposed architecture has been implemented in the YAWL system and its performance has been evaluated in practice.
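In this kind of architecture, each sensor evaluates its risk condition over process execution data and notifies the sensor manager when the condition holds. The following minimal sketch illustrates that interaction; the class names and the example condition are assumptions and do not reflect the YAWL implementation.

```python
# Minimal sketch of sensor-based risk detection: each sensor checks its risk
# condition against current execution data and notifies the manager when it
# fires. Names and the example condition are illustrative, not YAWL's API.

class RiskSensor:
    def __init__(self, name, condition):
        self.name = name
        self.condition = condition  # callable: execution data -> bool

    def check(self, execution_data, manager):
        if self.condition(execution_data):
            manager.notify(self.name, execution_data)

class SensorManager:
    def notify(self, sensor_name, execution_data):
        # In the real architecture this prompts the monitoring component so the
        # user can take remedial action; here we simply report the risk.
        print(f"risk detected by {sensor_name}: {execution_data}")

# Hypothetical risk condition: a claim has been waiting too long for approval.
overdue_approval = RiskSensor(
    "overdue approval",
    lambda data: data["waiting_hours"] > 48,
)

manager = SensorManager()
overdue_approval.check({"case": "claim-17", "waiting_hours": 60}, manager)
```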
Abstract:
Identifying, modelling and documenting business processes usually requires the collaboration of many stakeholders, who may be spread across companies in inter-organizational business settings. While there are many process modelling tools available, the support they provide for remote collaboration is still limited. This demonstration showcases a novel prototype application that uses collaborative virtual environment and augmented reality technologies to improve remote collaborative process modelling, with the aim of assisting common collaboration tasks by providing an increased sense of immersion in an intuitive shared work and task space. Our tool is easily deployed using open-source software and commodity hardware, and is expected to reduce travel costs for large-scale process modelling projects spanning national and international centres within an enterprise.