578 results for Process Re-engineering
Abstract:
Process improvement and innovation are risky endeavors, like swimming in unknown waters. In this chapter, I will discuss how process innovation through BPM can benefit from Research-as-a-Service, that is, from the application of research concepts in the processes of BPM projects. A further subject will be how innovations can be converted from confidence-based to evidence-based models due to affordances of digital infrastructures such as large-scale enterprise software or social media. I will introduce the relevant concepts, provide illustrations of digital capabilities that allow for innovation, and share a number of key takeaway lessons for how organizations can innovate on the basis of digital opportunities and the principle of evidence-based BPM: the grounding of all process decisions in facts rather than fiction.
Abstract:
Organizations executing similar business processes need to understand the differences and similarities in activities performed across work environments. Presently, research interest is directed towards the potential of visualization for the display of process models, to support users in their analysis tasks. Although the recent literature on process mining and comparison provides several methods and algorithms for process and log comparison, few contributions explore novel visualization approaches. This paper analyses process comparison from a design perspective, providing practical visualization techniques to support process analysis. The design of the visual comparison has been tackled from three different points of view: the general model, the projected model and the side-by-side comparison, in order to support the needs of business analysts. A case study is presented showing the application of process mining and visualization techniques to patient treatment across two Australian hospitals.
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
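The "liquid" collection idea above rests on continuously measuring how well a model still explains observed behavior. A minimal sketch of that comparison step, with invented activity names and a deliberately naive model representation (the set of traces the model allows, rather than a real process model), might look like this:

```python
# Toy sketch (not the chapter's implementation): keep a process model
# "in sync" with observed behavior by measuring how much of the event
# log the model still explains, and flagging when it drifts.

# A "model" here is simply the set of activity sequences it allows
# (hypothetical example traces).
model = {
    ("register", "check", "approve"),
    ("register", "check", "reject"),
}

# Observed traces extracted from an event log.
log = [
    ("register", "check", "approve"),
    ("register", "check", "approve"),
    ("register", "check", "reject"),
    ("register", "approve"),          # deviating behavior: skipped "check"
]

def log_fitness(model, log):
    """Fraction of observed traces the model can replay."""
    fitting = sum(1 for trace in log if trace in model)
    return fitting / len(log)

fitness = log_fitness(model, log)
print(f"fitness = {fitness:.2f}")   # 3 of 4 traces fit -> 0.75
if fitness < 0.9:                   # drift threshold is an assumption
    print("model out of sync with observed behavior; re-discover")
```

Real conformance checking aligns each trace against the model's behavior rather than testing set membership, but the feedback loop, measure, compare, re-discover, is the same.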
Abstract:
Existing techniques for automated discovery of process models from event logs generally produce flat process models. Thus, they fail to exploit the notion of subprocess as well as the error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of hierarchical BPMN models containing interrupting and non-interrupting boundary events and activity markers. The technique employs functional and inclusion dependency discovery techniques in order to elicit a process-subprocess hierarchy from the event log. Given this hierarchy and the projected logs associated with each node in the hierarchy, parent process and subprocess models are then discovered using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. By employing approximate dependency discovery techniques, it is possible to filter out noise in the event log arising, for example, from data entry errors or missing events. A validation with one synthetic and two real-life logs shows that process models derived by the proposed technique are more accurate and less complex than those derived with flat process discovery techniques. Meanwhile, a validation on a family of synthetically generated logs shows that the technique is resilient to varying levels of noise.
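The dependency-discovery step can be illustrated with a much-simplified sketch: a functional dependency between two event attributes (every value of one attribute maps to exactly one value of the other) is evidence that events at the finer granularity form subprocess instances of the coarser one. The attribute names below are hypothetical, and this is far simpler than the paper's approximate technique:

```python
# Illustrative sketch (simplified far beyond the paper's technique):
# detect a functional dependency between event-log attributes, the kind
# of evidence used to elicit a process-subprocess hierarchy.

events = [
    {"order_id": "o1", "item_id": "i1"},
    {"order_id": "o1", "item_id": "i2"},
    {"order_id": "o2", "item_id": "i3"},
    {"order_id": "o1", "item_id": "i1"},
]

def functional_dependency(events, lhs, rhs):
    """True if each value of `lhs` maps to exactly one value of `rhs`."""
    mapping = {}
    for e in events:
        if lhs in e and rhs in e:
            if mapping.setdefault(e[lhs], e[rhs]) != e[rhs]:
                return False
    return True

# item_id -> order_id holds: every item belongs to exactly one order,
# so item-level events can be grouped into subprocess instances under
# the parent order process.
print(functional_dependency(events, "item_id", "order_id"))  # True
print(functional_dependency(events, "order_id", "item_id"))  # False
```

An exact check like this would be defeated by a single data-entry error, which is why the paper relies on approximate dependency discovery instead.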
Abstract:
With the increasing competitiveness in global markets, many developing nations are striving to constantly improve their services in search of the next competitive edge. As a result, the demand and need for Business Process Management (BPM) in these regions is rising rapidly. Yet there exists a lack of professional expertise and knowledge to cater to that need. Therefore, the development of well-structured BPM training/education programs has become an urgent requirement for these industries. Furthermore, the lack of textbooks or other self-educating material that goes beyond the basics of BPM further reinforces the need for case-based teaching and related cases that prepare the next generation of professionals in these countries. Teaching cases create an authentic learning environment where the complexities and challenges of the ‘real world’ can be presented in a narrative, enabling students to develop crucial skills such as problem analysis, problem solving and creativity within constraints, as well as the application of appropriate tools (BPMN) and techniques (including best practices and benchmarking) within richer and more realistic scenarios. The aim of this paper is to provide a comprehensive teaching case demonstrating the means to tackle a developing nation’s legacy government process undermined by inefficiency and ineffectiveness. The paper also includes thorough teaching notes. The article is presented in three main parts: (i) Introduction, which provides a brief background setting the context of this paper; (ii) the teaching case; and (iii) teaching notes.
Abstract:
This study proposes that technology adoption be considered a multi-stage process constituting several distinct stages. Using the Theory of Planned Behaviour (TPB) and Ettlie’s adoption stages, and employing data gathered from 162 owners of Small and Medium-sized Enterprises (SMEs), our findings show that the determinants of the intention to adopt packaged software fluctuate significantly across adoption stages.
Abstract:
This research examines the important emerging area of online customer experience (OCE) using data collected from an online survey of frequent and infrequent online shoppers. The study examines a model of antecedents for cognitive and affective experiential states and their influence on outcomes, such as online shopping satisfaction and repurchase intentions. The model also examines the relationships between perceived risk, trust, satisfaction and repurchase intentions. Theoretically, the study provides a broader understanding of OCE, through insights into two shopper segments identified as being important in e-retailing. For managers, the study highlights areas of OCE and their implications for ongoing management of the online channel.
Abstract:
Contemporary higher education institutions are making significant efforts to develop cohesive, meaningful and effective learning experiences for Science, Technology, Engineering and Mathematics (STEM) curricula to prepare graduates for challenges in the modern knowledge economy, thus enhancing their employability (Carnevale et al., 2011). This can inspire innovative redesign of learning experiences embedded in technology-enhanced educational environments and the development of research-informed, pedagogically reliable strategies fostering interactions between the various agents of the learning-teaching process. This paper reports on the results of a project aimed at enhancing students’ learning experiences by redesigning a large, first-year mathematics unit for Engineering students at a large metropolitan public university. Within the project, the current study investigates the effectiveness of selected technology-mediated pedagogical approaches used over three semesters. Grounded in user-centred instructional design, the pedagogical approaches explored the opportunities for learning created by designing an environment containing technological, social and educational affordances. A qualitative analysis of mixed-type questionnaires distributed to students indicated important inter-relations between participants’ frames of reference of the learning-teaching process and stressed the importance (and difficulty) of creating an appropriate functional context. Conclusions drawn from this study may inform instructional design for blended delivery of STEM-focused programs that endeavour to enhance students’ employability by educating work-ready graduates.
Abstract:
Despite recent efforts to assess the release of nanoparticles to the workplace during different nanotechnology activities, a generalizable trend in particle release has yet to be identified. This study aimed to characterize the release of synthetic clay nanoparticles from a laboratory-based jet milling process by quantifying the variations arising from the primary particle size and surface treatment of the material used, as well as the feed rate of the machine. A broad range of materials was used in this study, and the emitted particle mass (PM2.5) and number concentrations (PNC) were measured at the release source. Analysis of variance, followed by linear mixed-effects modeling, was applied to quantify the variations in PM2.5 and PNC of the released particles caused by the abovementioned factors. The results confirmed that using materials of different primary size and surface treatment affects the release of particles from the same process by causing statistically significant variations in PM2.5 and PNC. The interaction of these two factors should also be taken into account, as it resulted in variations in the measured particle release properties. Furthermore, the feed rate of the milling machine was confirmed to be another influencing parameter. Although this research does not identify a specific pattern in the release of synthetic clay nanoparticles from the jet milling process that is generalizable to other similar settings, it emphasizes that each tested case should be handled individually in terms of exposure considerations.
Abstract:
Drying of food materials offers a significant increase in shelf life, along with the modification of quality attributes due to simultaneous heat and mass transfer. Shrinkage and variations in porosity are common micro- and macrostructural changes that take place during the drying of most food materials. Although extensive research has been carried out on the prediction of shrinkage and porosity over the course of drying, no single model exists that considers both material properties and process conditions. In this study, an attempt has been made to develop and validate shrinkage and porosity models of food materials during drying, considering both process parameters and sample properties. The stored energy within the sample, elastic potential energy, glass transition temperature and physical properties of the sample such as initial porosity, particle density, bulk density and moisture content have been taken into consideration. Physical properties were measured and the models validated using a universal testing machine (Instron 2 kN), a profilometer (Nanovea) and a pycnometer. In addition, COMSOL Multiphysics 4.4 was used to solve the heat and mass transfer physics. Results obtained from the shrinkage and porosity models are quite consistent with the experimental data. Successful implementation of these models would ensure the use of optimum energy in the course of drying and better quality retention of dried foods.
Abstract:
This paper reflects on the critical need for an urgent transformation of higher education curricula globally, to equip society with professionals who can address our 21st-century sustainable living challenges. Specifically, it discusses a toolkit called the ‘Engineering Sustainable Solutions Program’, which is a freely available, rigorously reviewed and robust content resource for higher education institutions to access content on innovations and opportunities in the process of evolving the curriculum...
Abstract:
Australian rural landscapes are facing a crisis from land degradation due to rising salinity levels, soil acidification and soil erosion. There is growing consensus amongst the business community, government departments and research organisations that the real solution to these problems, and to the broader sustainability dilemma, comes from taking a ‘whole of system’ approach to agricultural and rangelands management. This article introduces two cutting-edge concepts – Biomimicry and Natural Sequence Farming – to illustrate how whole-system thinking can effectively and profitably address the challenges facing agriculture and rangelands.
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A number of optimisation techniques are proposed to explore and assess alternative execution scenarios. The objective function is represented by a cost structure that captures different process dimensions. An experimental evaluation is conducted to analyse the performance and scalability of the optimisation techniques: integer linear programming (ILP), hill climbing, tabu search, and our earlier proposed hybrid genetic algorithm approach. The findings demonstrate that the hybrid genetic algorithm is scalable and performs better than the other techniques. Moreover, we argue that the use of ILP is unrealistic in this setup, as it cannot handle complex cost functions such as the ones we propose. Finally, we show how cost-related insights can be gained from improved execution scenarios and how these can be utilised to put forward recommendations for reducing process-related cost and overhead within organisations.
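The hill-climbing baseline among the compared techniques can be sketched as follows. The scenario encoding (one resource chosen per activity) and the cost weights are invented for illustration and are not taken from the paper; real objective functions there capture richer process dimensions:

```python
# Hedged sketch of hill climbing over alternative execution scenarios.
# A scenario assigns each of 5 activities to one of 3 resources with
# different speed/cost trade-offs (all numbers are illustrative).

SPEED = [1.0, 1.5, 2.0]   # resource processing-time multipliers
RATE  = [3.0, 1.5, 1.0]   # resource cost per time unit

def cost(scenario):
    """Weighted sum of total time and total monetary cost."""
    time = sum(SPEED[r] for r in scenario)
    money = sum(SPEED[r] * RATE[r] for r in scenario)
    return 0.5 * time + 0.5 * money

def neighbours(scenario):
    """All scenarios differing in exactly one activity's resource."""
    for i in range(len(scenario)):
        for r in range(len(SPEED)):
            if r != scenario[i]:
                yield scenario[:i] + (r,) + scenario[i + 1:]

def hill_climb(start):
    """Greedily move to the cheapest neighbour until no improvement."""
    current = start
    while True:
        best = min(neighbours(current), key=cost)
        if cost(best) >= cost(current):
            return current
        current = best

start = (0, 0, 0, 0, 0)
best = hill_climb(start)
print(best, cost(best))   # converges to all-resource-1 assignments
```

With a separable cost like this the climb trivially finds the optimum; tabu search and the hybrid genetic algorithm matter precisely when the cost landscape has local optima that greedy moves cannot escape.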
Abstract:
Companies standardise and automate their business processes in order to improve process efficiency and minimise operational risks. However, it is difficult to eliminate all process risks during the process design stage, due to the fact that processes often run in complex and changeable environments and rely on human resources. Timely identification of process risks is crucial in order to ensure the achievement of process goals. Business processes are often supported by information systems that record information about their executions in event logs. In this article we present an approach, and a supporting tool, for the evaluation of overall process risk and for the prediction of process outcomes based on the analysis of information recorded in event logs. It can help managers evaluate the overall risk exposure of their business processes, track the evolution of overall process risk, identify changes and predict process outcomes based on the current value of overall process risk. The approach was implemented and validated using synthetic event logs and through a case study with a real event log.
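One simple way to make "overall process risk" and its evolution concrete is a sliding-window estimate over completed cases. This is a hedged sketch, not the article's actual metric: the outcome labels, window size and data are all invented:

```python
# Hedged sketch (not the article's metric): estimate overall process
# risk as the fraction of recently completed cases that ended in an
# undesired outcome, over a sliding window so its evolution can be
# tracked from the event log.
from collections import deque

completed = [  # (case_id, outcome) in completion order; data invented
    ("c1", "ok"), ("c2", "ok"), ("c3", "late"), ("c4", "ok"),
    ("c5", "late"), ("c6", "late"), ("c7", "ok"), ("c8", "late"),
]

def risk_series(cases, window=4):
    """Overall process risk after each completed case."""
    recent = deque(maxlen=window)   # keeps only the last `window` flags
    series = []
    for _, outcome in cases:
        recent.append(outcome != "ok")
        series.append(sum(recent) / len(recent))
    return series

series = risk_series(completed)
print([round(r, 2) for r in series])
# risk peaks at 0.75 as late cases accumulate in the window
```

A manager watching this series would see risk exposure drift upward well before every case fails, which is the kind of early signal the approach aims to provide; the article's predictions of individual process outcomes go beyond this aggregate view.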