829 results for S-PROCESS-RICH
Abstract:
This paper addresses the problem of identifying and explaining behavioral differences between two business process event logs. The paper presents a method that, given two event logs, returns a set of statements in natural language capturing behavior that is present or frequent in one log, while absent or infrequent in the other. This log delta analysis method allows users to diagnose differences between normal and deviant executions of a process or between two versions or variants of a process. The method relies on a novel approach to losslessly encode an event log as an event structure, combined with a frequency-enhanced technique for differencing pairs of event structures. A validation of the proposed method shows that it accurately diagnoses typical change patterns and can explain differences between normal and deviant cases in a real-life log, more compactly and precisely than previously proposed methods.
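The method described above differences event structures; as a rough, hypothetical illustration of the flavor of frequency-enhanced log differencing, the sketch below compares directly-follows frequencies between two toy logs and emits natural-language difference statements. The activity names, logs, and threshold are invented for illustration, not taken from the paper.

```python
# Minimal sketch of frequency-based log delta analysis. A deliberate
# simplification: the paper differences event structures, not raw
# directly-follows pairs. A log is a list of traces; a trace is a list
# of activity labels.
from collections import Counter
from itertools import pairwise  # Python 3.10+

log_normal = [["A", "B", "C"], ["A", "B", "C"], ["A", "C"]]
log_deviant = [["A", "C", "B"], ["A", "C"], ["A", "C", "B"]]

def df_frequencies(log):
    """Average occurrences per trace of each directly-follows pair."""
    counts = Counter(p for trace in log for p in pairwise(trace))
    return {pair: n / len(log) for pair, n in counts.items()}

def diff_statements(log_a, log_b, threshold=0.3):
    """Yield a statement for each pair whose frequency differs notably."""
    fa, fb = df_frequencies(log_a), df_frequencies(log_b)
    for pair in sorted(set(fa) | set(fb)):
        delta = fa.get(pair, 0.0) - fb.get(pair, 0.0)
        if abs(delta) >= threshold:
            where = "the first log" if delta > 0 else "the second log"
            yield (f"'{pair[0]}' is directly followed by '{pair[1]}' "
                   f"more often in {where}")

for stmt in diff_statements(log_normal, log_deviant):
    print(stmt)
```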
Abstract:
Process improvement and innovation are risky endeavors, like swimming in unknown waters. In this chapter, I will discuss how process innovation through BPM can benefit from Research-as-a-Service, that is, from the application of research concepts in the processes of BPM projects. A further subject will be how innovations can be converted from confidence-based to evidence-based models through the affordances of digital infrastructures such as large-scale enterprise software or social media. I will introduce the relevant concepts, provide illustrations of digital capabilities that allow for innovation, and share a number of key takeaway lessons on how organizations can innovate on the basis of digital opportunities and the principles of evidence-based BPM: the grounding of all process decisions in facts rather than fiction.
Abstract:
Organizations executing similar business processes need to understand the differences and similarities in activities performed across work environments. Research interest is currently directed towards the potential of visualization for the display of process models, to support users in their analysis tasks. Although the recent literature on process mining and comparison provides several methods and algorithms for process and log comparison, few contributions explore novel visualization approaches. This paper analyses process comparison from a design perspective, providing practical visualization techniques to support process analysis. The design of the visual comparison is tackled from three points of view, the general model, the projected model, and the side-by-side comparison, in order to support the needs of business analysts. A case study is presented showing the application of process mining and visualization techniques to patient treatment across two Australian hospitals.
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
Abstract:
Existing techniques for automated discovery of process models from event logs generally produce flat process models. Thus, they fail to exploit the notion of subprocess, as well as the error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of hierarchical BPMN models containing interrupting and non-interrupting boundary events and activity markers. The technique employs functional and inclusion dependency discovery techniques in order to elicit a process-subprocess hierarchy from the event log. Given this hierarchy and the projected logs associated with each node in the hierarchy, parent process and subprocess models are then discovered using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. By employing approximate dependency discovery techniques, it is possible to filter out noise in the event log arising, for example, from data entry errors or missing events. A validation with one synthetic and two real-life logs shows that process models derived by the proposed technique are more accurate and less complex than those derived with flat process discovery techniques. Meanwhile, a validation on a family of synthetically generated logs shows that the technique is resilient to varying levels of noise.
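As a hypothetical sketch of one ingredient of such a technique, the snippet below shows approximate (noise-tolerant) inclusion-dependency checking, used here to test whether a candidate subprocess log references a parent log's case identifiers. The logs, attribute values, and threshold are invented for illustration.

```python
# Minimal sketch of approximate inclusion-dependency discovery, the kind of
# check used to relate a parent log to candidate subprocess logs. All values
# and the error threshold are hypothetical.

def inclusion_strength(child_values, parent_values):
    """Fraction of distinct child values that also appear among parent values."""
    child, parent = set(child_values), set(parent_values)
    return len(child & parent) / len(child) if child else 0.0

# Toy data: the subprocess log references the parent's case ids through a
# foreign-key-like attribute.
parent_case_ids = ["c1", "c2", "c3", "c4"]
subprocess_refs = ["c1", "c2", "c2", "c3", "c9"]  # "c9" is noise (e.g., a data entry error)

strength = inclusion_strength(subprocess_refs, parent_case_ids)
# With an approximate threshold, the dependency still holds despite the noise.
if strength >= 0.75:
    print(f"subprocess log appears contained in parent log (strength {strength:.2f})")
```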
Abstract:
By definition, regulatory rules (in a legal context called norms) are intended to achieve specific behaviour from business processes, and may be relevant to the whole or part of a business process. They can impose conditions on different aspects of process models, e.g., control flow, data, and resources. Based on their rule sets, norms can be classified into various classes and sub-classes according to their effects. This paper presents an abstract framework consisting of a list of norms and a generic compliance checking approach based on the idea of (possible) executions of processes. The proposed framework is independent of any existing formalism, and provides a conceptually rich and exhaustive ontology and semantics of norms needed for business process compliance checking. The possible uses of the proposed framework include comparing different compliance management frameworks (CMFs).
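By way of illustration only, the sketch below checks two common norm classes from the compliance literature (an achievement obligation and a prohibition) against a single possible execution of a process. The activity names, and the reduction of an execution to an ordered activity list, are hypothetical simplifications of the framework's ontology.

```python
# Illustrative only: checking simple norm classes against one (possible)
# execution of a process, recorded as an ordered list of activities.

trace = ["receive_order", "check_credit", "ship_goods", "send_invoice"]

def achievement(trace, trigger, duty):
    """Achievement obligation: once `trigger` occurs, `duty` must occur afterwards."""
    if trigger not in trace:
        return True  # norm never activated, trivially complied with
    return duty in trace[trace.index(trigger) + 1:]

def prohibition(trace, forbidden):
    """Prohibition: `forbidden` must never occur."""
    return forbidden not in trace

print(achievement(trace, "ship_goods", "send_invoice"))  # True: invoice follows shipping
print(prohibition(trace, "cancel_order"))                # True: never cancelled
```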
Abstract:
Perhaps no other patient safety intervention depends as acutely on effective interprofessional teamwork for patient survival as the hospital rapid response system (RRS). Yet little is known about nurse-physician relationships when rescuing at-risk patients. This study compared nursing and medical staff perceptions of a mature RRS at a large tertiary hospital. Findings indicate the RRS may be failing to address a hierarchical culture and systems-level barriers to early recognition of and response to patient deterioration.
Abstract:
Objectives: The objective of this study was to develop process quality indicators (PQIs) to support the improvement of care services for older people with cognitive impairment in emergency departments (EDs). Methods: A structured research approach was taken to the development of PQIs for the care of older people with cognitive impairment in EDs, combining available evidence with expert opinion (phase 1), a field study (phase 2), and formal voting (phase 3). A systematic review of the literature identified ED processes targeting the specific care needs of older people with cognitive impairment. Existing relevant PQIs were also included. By integrating the scientific evidence and clinical expertise, new PQIs were drafted and, along with the existing PQIs, extensively discussed by an advisory panel. These indicators were field tested in eight hospitals using a cohort of persons aged 70 years and older. After analysis of the field study data (indicator prevalence, variability across sites), the advisory panel further refined the PQIs in a second meeting, and formally voted to select those PQIs most appropriate for care evaluation. Results: In addition to seven previously published PQIs relevant to the care of older persons, 15 new indicators were created. These 22 PQIs were then field tested. PQIs designed specifically for the older ED population with cognitive impairment were scored only for patients with identified cognitive impairment. Following formal voting, a total of 11 PQIs were included in the set. These PQIs targeted cognitive screening, delirium screening, delirium risk assessment, evaluation of acute change in mental status, delirium etiology, proxy notification, collateral history, involvement of a nominated support person, pain assessment, post-discharge follow-up, and ED length of stay. Conclusions: This article presents a set of PQIs for the evaluation of care for older people with cognitive impairment in EDs. The variation in indicator triggering across ED sites suggests that there are opportunities for quality improvement in care for this vulnerable group. Applied PQIs will identify an emergency service's implementation of care strategies for cognitively impaired older ED patients. Awareness of the PQI triggers at an ED level enables the implementation of targeted interventions to improve any suboptimal processes of care. Further validation of the indicators, and assessment of their utility in a wider population, is now indicated.
Abstract:
The thesis develops a summative evaluation framework for the NOURISH Randomised Controlled Trial and applies it to a selection of maternal feeding choice outcomes of the programme. The research is based on an ecological model of the complex set of factors that need to be acted upon to address childhood obesity. The novel approach extends the standard RCT effect evaluation, while also conducting an in-depth evaluation of the process of programme development and implementation. Research findings identify adaptations of intervention delivery that could improve its effectiveness and translatability in a new cycle of the programme.
Abstract:
With increasing competitiveness in global markets, many developing nations are striving to constantly improve their services in search of the next competitive edge. As a result, the demand and need for Business Process Management (BPM) in these regions is rising rapidly, yet there exists a lack of professional expertise and knowledge to cater to that need. The development of well-structured BPM training/education programs has therefore become an urgent requirement for these industries. Furthermore, the lack of textbooks or other self-education material that goes beyond the basics of BPM reinforces the need for case-based teaching and for related cases that prepare the next generation of professionals in these countries. Teaching cases create an authentic learning environment in which the complexities and challenges of the 'real world' can be presented in a narrative, enabling students to develop crucial skills such as problem analysis, problem solving, and creativity within constraints, as well as the application of appropriate tools (BPMN) and techniques (including best practices and benchmarking) within richer, more realistic scenarios. The aim of this paper is to provide a comprehensive teaching case demonstrating how to tackle a developing nation's legacy government process undermined by inefficiency and ineffectiveness. The paper also includes thorough teaching notes. The article is presented in three main parts: (i) an introduction that provides brief background and sets the context of the paper, (ii) the teaching case, and (iii) the teaching notes.
Abstract:
This study proposes that technology adoption be considered a multi-stage process comprising several distinct stages. Using the Theory of Planned Behaviour (TPB) and Ettlie's adoption stages, and employing data gathered from 162 owners of Small and Medium-sized Enterprises (SMEs), our findings show that the determinants of the intention to adopt packaged software fluctuate significantly across adoption stages.
Abstract:
QUT Library Research Support has simplified and streamlined the process of research data management planning, storage, discovery, and reuse through collaboration, the use of integrated and tailored online tools, and a simplification of the metadata schema. This poster presents the integrated data management services at QUT, including QUT's Data Management Planning Tool, Research Data Finder, Spatial Data Finder and Software Finder, and information on the simplified Registry Interchange Format – Collections and Services (RIF-CS) schema. The QUT Data Management Planning (DMP) Tool was built using the Digital Curation Centre's DMP Online tool and modified to suit QUT's needs and policies. The tool allows researchers and Higher Degree Research students to plan how to handle research data throughout the active phase of their research. The plan is promoted as a 'live' document, and researchers are encouraged to update it as required. The information entered into the plan can be kept private or shared with supervisors, project members, and external examiners. A plan is mandatory when requesting storage space on the QUT Research Data Storage Service. QUT's Research Data Finder is integrated with QUT's Academic Profiles and the Data Management Planning Tool to create a seamless data management process. This process aims to encourage the creation of high-quality, rich records that facilitate the discovery and reuse of quality data. The RIF-CS schema used in QUT's Research Data Finder was simplified to 'RIF-CS lite' to reflect mandatory and optional metadata requirements. RIF-CS lite removed schema fields that were underused or surplus to the needs of users and systems. This has reduced the number of metadata fields required from users and made system integration far simpler: field content is easily shared across services, making the collection of metadata as transparent as possible.
Abstract:
This book represents a landmark effort to probe and analyze the theory and empirics of designing water disaster management policies. It consists of seven chapters that examine, in-depth and comprehensively, issues that are central to crafting effective policies for water disaster management. The authors use historical surveys, institutional analysis, econometric investigations, empirical case studies, and conceptual-theoretical discussions to clarify and illuminate the complex policy process. The specific topics studied in this book include a review and analysis of key policy areas and research priority areas associated with water disaster management, community participation in disaster risk reduction, the economics and politics of ‘green’ flood control, probabilistic flood forecasting for flood risk management, polycentric governance and flood risk management, drought management with the aid of dynamic inter-generational preferences, and how social resilience can inform SA/SIA for adaptive planning for climate change in vulnerable areas. A unique feature of this book is its analysis of the causes and consequences of water disasters and efforts to address them successfully through policy-rich, cross-disciplinary and transnational papers. This book is designed to help enrich the sparse discourse on water disaster management policies and galvanize water professionals to craft creative solutions to tackle water disasters efficiently, equitably, and sustainably. This book should also be of considerable use to disaster management professionals, in general, and natural resource policy analysts.
Abstract:
Christina Waterson interviews Queensland interior designer Marisha McAuliffe about her PhD research into process, and why some people are drawn to create.
Abstract:
Despite recent efforts to assess the release of nanoparticles to the workplace during different nanotechnology activities, a generalizable trend in particle release has yet to be identified. This study aimed to characterize the release of synthetic clay nanoparticles from a laboratory-based jet milling process by quantifying the variations arising from the primary particle size and surface treatment of the material used, as well as the feed rate of the machine. A broad range of materials was used in this study, and the emitted particle mass (PM2.5) and particle number concentrations (PNC) were measured at the release source. Analysis of variance, followed by linear mixed-effects modeling, was applied to quantify the variations in the PM2.5 and PNC of the released particles caused by the abovementioned factors. The results confirmed that using materials of different primary size and surface treatment affects particle release from the same process, causing statistically significant variations in PM2.5 and PNC. The interaction of these two factors should also be taken into account, as it resulted in variations in the measured particle release properties. Furthermore, the feed rate of the milling machine was confirmed to be another influencing parameter. Although this research does not identify a specific pattern in the release of synthetic clay nanoparticles from jet milling that is generalizable to other similar settings, it emphasizes that each case should be handled individually in terms of exposure considerations.
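To illustrate the kind of analysis the abstract names, here is a minimal sketch of a linear mixed-effects model fitted with statsmodels on fabricated toy data. The factor levels, effect sizes, and grouping by measurement run are all invented assumptions, not the study's data or model specification.

```python
# Sketch of a linear mixed-effects model for particle release, on fabricated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "size": rng.choice(["small", "large"], n),           # primary particle size
    "treatment": rng.choice(["coated", "uncoated"], n),  # surface treatment
    "feed_rate": rng.choice([1.0, 2.0, 4.0], n),         # machine feed rate (hypothetical units)
    "run": rng.choice(["r1", "r2", "r3", "r4"], n),      # repeated measurement runs
})
# Fabricated response: PM2.5 driven by the three factors plus noise.
df["pm25"] = (
    5 + 2 * (df["size"] == "small") + 1.5 * (df["treatment"] == "coated")
    + 0.8 * df["feed_rate"] + rng.normal(0, 1, n)
)

# Fixed effects for the material factors and their interaction, plus feed rate;
# random intercept per measurement run.
model = smf.mixedlm("pm25 ~ size * treatment + feed_rate", df, groups=df["run"])
print(model.fit().summary())
```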