695 results for Bologna process
Abstract:
This research contributes a fully-operational approach for managing business process risk in near real-time. The approach consists of a language for defining risks on top of process models, a technique to detect such risks as they eventuate during the execution of business processes, a recommender system for making risk-informed decisions, and a technique to automatically mitigate the detected risks when they are no longer tolerable. Through the incorporation of risk management elements in all stages of the lifecycle of business processes, this work contributes to the effective integration of the fields of Business Process Management and Risk Management.
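To make the idea concrete, the sketch below shows one possible way to express and detect a risk as a predicate over running process instances. It is a minimal illustration only; the class names and the example rule are assumptions for this sketch, not the authors' actual risk definition language.

```python
# Illustrative sketch only: a toy risk rule evaluated over running process
# instances. The names (RiskRule, ProcessInstance) and the example rule are
# hypothetical, not the authors' risk definition language.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable, List

@dataclass
class ProcessInstance:
    case_id: str
    current_activity: str
    started_at: datetime

@dataclass
class RiskRule:
    name: str
    condition: Callable[[ProcessInstance], bool]  # risk eventuates when this holds

def detect_risks(instances: List[ProcessInstance], rules: List[RiskRule]):
    """Return (case_id, rule name) pairs for which a risk condition holds."""
    return [(i.case_id, r.name) for i in instances for r in rules if r.condition(i)]

# Example rule: an order approval running longer than two hours is a risk.
overdue = RiskRule(
    name="approval overdue",
    condition=lambda i: i.current_activity == "Approve order"
    and datetime.now() - i.started_at > timedelta(hours=2),
)
running = [ProcessInstance("case-7", "Approve order", datetime.now() - timedelta(hours=3))]
print(detect_risks(running, [overdue]))  # [('case-7', 'approval overdue')]
```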
Abstract:
The purpose of this paper is to review existing knowledge management (KM) practices within the field of asset management, identify gaps, and propose a new approach to managing knowledge for asset management. Existing approaches to KM in the field of asset management are incomplete, with the focus primarily on the application of data and information systems, for example the use of an asset register. It is contended these approaches provide access to explicit knowledge and overlook the importance of tacit knowledge acquisition, sharing and application. In doing so, current KM approaches within asset management tend to neglect the significance of relational factors, whereas studies in the knowledge management field have shown that relational modes such as social capital are imperative for effective KM outcomes. In this paper, we argue that incorporating a relational approach to KM is more likely to contribute to the exchange of ideas and the development of creative responses necessary to improve decision-making in asset management. This conceptual paper uses extant literature to explain knowledge management antecedents and explore its outcomes in the context of asset management. KM is a component in the new Integrated Strategic Asset Management (ISAM) framework developed in conjunction with asset management industry associations (AAMCoG, 2012) that improves asset management performance. In this paper we use Nahapiet and Ghoshal’s (1998) model to explain the antecedents of a relational approach to knowledge management. Further, we develop an argument that relational knowledge management is likely to contribute to the improvement of the ISAM framework components, such as Organisational Strategic Management, Service Planning and Delivery. The main contribution of the paper is a novel and robust approach to managing knowledge that leads to the improvement of asset management outcomes.
Abstract:
Process modelling is an integral part of any process industry. Several sugar factory models have been developed over the years to simulate the unit operations. An enhanced and comprehensive milling process simulation model has been developed to analyse the performance of the milling train and to assess the impact of changes and advanced control options for improved operational efficiency. The developed model is incorporated in a proprietary software package ‘SysCAD’. As an example, the milling process model has been used to predict a significant loss of extraction by returning the cush from the juice screen before #3 mill instead of before #2 mill as is more commonly done. Further work is being undertaken to more accurately model extraction processes in a milling train, to examine extraction issues dynamically and to integrate the model into a whole factory model.
Abstract:
Purpose – Context-awareness has emerged as an important principle in the design of flexible business processes. The goal of the research is to develop an approach to extend context-aware business process modeling toward location-awareness. The purpose of this paper is to identify and conceptualize location-dependencies in process modeling. Design/methodology/approach – This paper uses a pattern-based approach to identify location-dependency in process models. The authors design specifications for these patterns. The authors present illustrative examples and evaluate the identified patterns through a literature review of published process cases. Findings – This paper introduces location-awareness as a new perspective to extend context-awareness in BPM research, by introducing relevant location concepts such as location-awareness and location-dependencies. The authors identify five basic location-dependent control-flow patterns that can be captured in process models, and identify location-dependencies in several existing case studies of business processes. Research limitations/implications – The authors focus exclusively on the control-flow perspective of process models. Further work needs to extend the research to address location-dependencies in process data or resources. Further empirical work is needed to explore determinants and consequences of the modeling of location-dependencies. Originality/value – As existing literature mostly focuses on the broad context of business processes, location in process modeling is still treated as a “second-class citizen” in theory and in practice. This paper discusses the vital role of location-dependencies within business processes. The proposed five basic location-dependent control-flow patterns are novel and useful to explain location-dependency in business process models. They provide a conceptual basis for further exploration of location-awareness in the management of business processes.
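As a rough illustration of what a location-dependent control-flow pattern might look like at execution time, the snippet below routes a case to a different branch depending on where it currently is. The function, branch and attribute names are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: one possible encoding of a location-dependent
# XOR split, routing a case to a branch based on its current location.
# The branch and attribute names are hypothetical, not taken from the paper.

def location_dependent_split(case: dict) -> str:
    """Pick the next activity based on where the case currently is."""
    location = case.get("location")
    if location == "warehouse":
        return "Pack goods on site"
    if location == "customer premises":
        return "Schedule pickup"
    return "Route to nearest depot"

print(location_dependent_split({"case_id": "7", "location": "warehouse"}))
# -> Pack goods on site
```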
Abstract:
Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises for example when the repository covers multiple variants of the same processes or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
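The following is a minimal sketch of the clustering step, assuming a pairwise distance matrix between process-model fragments (for instance, a graph-edit distance) has already been computed; the fragments and distance values are invented for illustration, and the article's own similarity measure and parameters may differ.

```python
# Minimal sketch of clustering approximate clones from a precomputed distance
# matrix; fragments and distances are made up for illustration.
import numpy as np
from sklearn.cluster import DBSCAN
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

fragments = ["F1", "F2", "F3", "F4"]
dist = np.array([
    [0.0, 0.1, 0.8, 0.9],
    [0.1, 0.0, 0.7, 0.8],
    [0.8, 0.7, 0.0, 0.2],
    [0.9, 0.8, 0.2, 0.0],
])

# DBSCAN over the precomputed distances: fragments within eps of each other cluster.
db_labels = DBSCAN(eps=0.3, min_samples=2, metric="precomputed").fit_predict(dist)

# Hierarchical agglomerative clustering (HAC) on the same distances, cut at 0.3.
hac_labels = fcluster(linkage(squareform(dist), method="average"),
                      t=0.3, criterion="distance")

print(dict(zip(fragments, db_labels)), dict(zip(fragments, hac_labels)))
```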
Abstract:
In this chapter we describe a critical fairytales unit taught to 4.5 to 5.5 year olds in a context of intensifying pressure to raise literacy achievement. The unit was infused with lessons on reinterpreted fairytales followed by process drama activities built around a sophisticated picture book, Beware of the Bears (MacDonald, 2004). The latter entailed a text analytic approach to critical literacy derived from systemic functional linguistics (Halliday, 1978; Halliday & Matthiessen, 2004). This approach provides a way of analysing how words and discourse are used to represent the world in a particular way and shape reader relations with the author in a particular field (Janks, 2010).
Abstract:
Process models describe someone’s understanding of processes. Processes can be described using unstructured, semi-formal or diagrammatic representation forms. These representations are used in a variety of task settings, ranging from understanding processes to executing or improving processes, with the implicit assumption that the chosen representation form will be appropriate for all task settings. We explore the validity of this assumption by examining empirically the preference for different process representation forms depending on the task setting and cognitive style of the user. Based on data collected from 120 business school students, we show that preferences for process representation formats vary depending on the application purpose and the cognitive styles of the participants. However, users consistently prefer diagrams over other representation formats. Our research informs a broader research agenda on task-specific applications of process modeling. We offer several recommendations for further research in this area.
Abstract:
To effectively manage the challenges being faced by construction organisations in a fast changing business environment, many organisations are attempting to integrate knowledge management (KM) into their business operations. KM activities interact with each other and form a process which receives input from its internal business environment and produces outputs that should be justified by its business performance. This paper aims to provide further understanding on the dynamic nature of the KM process. Through a combination of path analysis and system dynamics simulation, this study found that: 1) an improved business performance enables active KM activities and provides feedback and guidance for formulating learning-based policies; and 2) effective human resource recruitment policies can enlarge the pool of individual knowledge, which leads to a more conducive internal business environment, as well as a higher KM activity level. Consequently, the desired business performance level can be reached within a shorter time frame.
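A toy stock-and-flow simulation in the spirit of the feedback loops described above is sketched below; every variable name, equation and parameter value is an illustrative assumption, not the study's calibrated system dynamics model.

```python
# Illustrative stock-and-flow sketch: recruitment feeds a knowledge stock,
# KM activity lifts performance, and performance feeds back into learning.
# All equations and parameters are assumptions for illustration only.

def simulate(months=36, recruitment_rate=2.0, attrition=0.02):
    knowledge_pool = 100.0   # stock: pool of individual knowledge (arbitrary units)
    performance = 1.0        # business performance index
    history = []
    for month in range(months):
        km_activity = 0.01 * knowledge_pool * performance        # KM activity level
        performance += 0.05 * km_activity - 0.02 * performance   # KM activity lifts performance
        inflow = recruitment_rate + 0.5 * performance            # recruitment + performance feedback
        knowledge_pool += inflow - attrition * knowledge_pool
        history.append((month, knowledge_pool, performance))
    return history

for month, pool, perf in simulate()[::12]:
    print(f"month {month:2d}: knowledge={pool:6.1f}, performance={perf:4.2f}")
```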
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and, hence, provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has provided a repertoire of many analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach where process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. This approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events on this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal. It was realised using the open-source process mining framework ProM. Moreover, this paper also reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
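A minimal sketch of the state-reconstruction idea is shown below, assuming a flat event log with start and complete events per activity instance; the column names and the tiny log are invented for illustration and do not reflect the ProM-based implementation described in the article.

```python
# Illustrative sketch: reconstruct the process state (started but not yet
# completed activity instances) at a given point in time from an event log.
from datetime import datetime

log = [
    {"case": "1", "activity": "Lodge claim",  "type": "start",    "time": "2024-05-01 09:00"},
    {"case": "1", "activity": "Lodge claim",  "type": "complete", "time": "2024-05-01 09:20"},
    {"case": "1", "activity": "Assess claim", "type": "start",    "time": "2024-05-01 10:00"},
    {"case": "2", "activity": "Lodge claim",  "type": "start",    "time": "2024-05-01 09:30"},
]

def state_at(log, at):
    """Activity instances that have started but not yet completed at time `at`."""
    cutoff = datetime.strptime(at, "%Y-%m-%d %H:%M")
    active = set()
    for e in sorted(log, key=lambda e: e["time"]):
        if datetime.strptime(e["time"], "%Y-%m-%d %H:%M") > cutoff:
            break
        if e["type"] == "start":
            active.add((e["case"], e["activity"]))
        else:
            active.discard((e["case"], e["activity"]))
    return active

# Replaying states at successive timestamps yields the frames of the animation.
print(state_at(log, "2024-05-01 10:15"))
# -> {('1', 'Assess claim'), ('2', 'Lodge claim')}
```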
Abstract:
In 2007 some of us were fortunate enough to be in Dundee for the Royal College of Nursing’s Annual International Nursing Research Conference. A highlight of that conference was an enactment of the process and context debate. The chair asked for volunteers and various members of the audience came forward, giving the impression that they were nurses and that it was a chance selection. The audience accepted these individuals as their representatives and once they had gathered on stage we all expected the debate to begin. But the large number of researchers in the audience gave little thought to the selection and recruitment process they had just witnessed. Then the selected representatives stood up and sang a cappella. Suddenly the context was different and we questioned the process. The point was made: process or context, or both?
Abstract:
Contemporary cities no longer offer the same types of permanent environments that we planned for in the latter part of the twentieth century. Our public spaces are increasingly temporary, transient, and ephemeral. The theories, principles and tactics with which we designed these spaces in the past are no longer appropriate. We need a new theory for understanding the creation, use, and reuse of temporary public space. More than a theory, we need new architectural tactics or strategies that can be reliably employed to create successful temporary public spaces. This paper will present ongoing research that starts that process through critical review and technical analysis of existing and historic temporary public spaces. Through the analysis of a number of public spaces that were either designed for temporary use or became temporary through changing social conditions, this research identifies the tactics and heuristics used in such projects. These tactics and heuristics are then analysed to extract some broader principles for the design of temporary public space. The theories of time-related building layers, a model of environmental sustainability, and the recycling of social meaning are all explored. The paper will go on to identify a number of key questions that need to be explored and addressed by a theory for such developments: How can we retain social meaning in the fabric of the city and its public spaces while we disassemble it and recycle it into new purposes? What role will preservation have in the rapidly changing future; will exemplary temporary spaces be preserved and thereby become no longer temporary? Does the environmental advantage of recycling materials, components and spaces outweigh the removal or social loss of temporary public space? This research starts to identify the knowledge gaps and proposes a number of strategies for making public space in an age of temporary, recyclable, and repurposed urban infrastructure; a way of creating lighter, cheaper, quicker, and temporary interventions.
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning data management considerations of in-memory databases. However, limited insights are available on the impacts on applications and their supportive middleware platforms and how they need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded into SAP HANA that provides BPMN-based process automation capabilities.
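To give a flavour of a table-driven process runtime living inside the database, the sketch below advances a case's token along a sequence flow purely with SQL; SQLite's in-memory engine stands in for SAP HANA here, and the two-table schema is a made-up simplification rather than the prototype described in the paper.

```python
# Rough illustration only: a toy "process runtime in the database".
# SQLite in-memory is a stand-in for SAP HANA; the schema is hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE flow (from_task TEXT, to_task TEXT);   -- process model (sequence flows)
    CREATE TABLE token (case_id TEXT, task TEXT);       -- process state (one token per case)
    INSERT INTO flow VALUES ('Receive order', 'Check credit'),
                            ('Check credit', 'Ship goods');
    INSERT INTO token VALUES ('case-1', 'Receive order');
""")

def complete_task(case_id, task):
    """Advance a case's token along the sequence flow, entirely inside the DB."""
    db.execute("""
        UPDATE token SET task = (SELECT to_task FROM flow WHERE from_task = ?)
        WHERE case_id = ? AND task = ?
    """, (task, case_id, task))

complete_task("case-1", "Receive order")
print(db.execute("SELECT * FROM token").fetchall())   # [('case-1', 'Check credit')]
```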
Abstract:
Accurate process model elicitation continues to be a time consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Metasonic GmbH has developed a process elicitation tool for their process suite. As part of a research engagement with Metasonic, staff from QUT, Australia have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed a 3D Virtual World tool for process elicitation, took the outcomes of their research project to Metasonic for evaluation, and finally, Metasonic’s response to the initial proof of concept.