270 results for 319-C0010A
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
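The core log-splitting step can be pictured with a small Python sketch. The attribute names and log structure below are hypothetical, not those used in the paper: events carrying an attribute that references a parent instance are routed to a subprocess log, while the remaining events form the parent-process log, and each log can then be fed to an existing flat discovery technique.

```python
from collections import defaultdict

def split_log(event_log, parent_attr="parent_case_id"):
    """Split a flat event log into a parent-process log and a subprocess log
    based on a (hypothetical) attribute referencing the parent instance."""
    parent_log = defaultdict(list)      # case_id -> parent-process activities
    subprocess_log = defaultdict(list)  # parent case id -> subprocess activities
    for event in event_log:             # event: dict with 'case_id', 'activity', optional parent_attr
        if parent_attr in event:
            subprocess_log[event[parent_attr]].append(event["activity"])
        else:
            parent_log[event["case_id"]].append(event["activity"])
    return parent_log, subprocess_log

# Toy log: the 'Check item' events reference the parent order they belong to.
events = [
    {"case_id": "o1", "activity": "Receive order"},
    {"case_id": "s1", "activity": "Check item", "parent_case_id": "o1"},
    {"case_id": "s2", "activity": "Check item", "parent_case_id": "o1"},
    {"case_id": "o1", "activity": "Ship order"},
]
parent_log, subprocess_log = split_log(events)   # each log is then mined separately
```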
Abstract:
Researchers are increasingly grappling with ways of theorizing social media and its use. This review essay proposes that the theory of Information Grounds (IG) may provide a valuable lens for understanding how social media fosters collaboration and social engagement among information professionals. The paper presents literature that helps us understand how social media can be seen as IG, and maps the characteristics of social media to the seven propositions of IG theory. This work is part of a wider study investigating the ways in which Information Technology (IT) professionals experience social media.
Abstract:
The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these distinct systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first identifies Data Synchronization Error (DSE) as the most inherent factor among the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), because both have been widely applied over the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as benchmark data after a certain level of noise is added to account for the higher noise levels present in SHM-oriented WSNs. From this source, a large number of simulations were run to generate multiple DSE-corrupted datasets and so facilitate statistical analyses. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques, and the use of channel projection for the time-domain OMA technique to cope with DSE, are recommended.
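As a rough illustration of how DSE-corrupted datasets might be generated from wired benchmark accelerations, the Python sketch below (the array shape, offset range and noise ratio are assumptions, not values from the paper) shifts each channel by a random number of samples and adds measurement noise before an OMA technique such as FDD or SSI-data would be applied.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_dse(accelerations, max_offset_samples=5, noise_ratio=0.05):
    """accelerations: (n_samples, n_channels) array from the wired benchmark system."""
    n, n_channels = accelerations.shape
    corrupted = np.empty_like(accelerations)
    for ch in range(n_channels):
        offset = rng.integers(0, max_offset_samples + 1)        # per-channel synchronisation error
        shifted = np.roll(accelerations[:, ch], offset)         # delay the channel by 'offset' samples
        noise = noise_ratio * np.std(accelerations[:, ch]) * rng.standard_normal(n)
        corrupted[:, ch] = shifted + noise                      # add measurement noise as well
    return corrupted

# Monte Carlo: many DSE-corrupted copies of one (toy) benchmark record.
benchmark = rng.standard_normal((2000, 8))
datasets = [add_dse(benchmark) for _ in range(100)]
```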
Abstract:
Digital signatures are often used by trusted authorities to make unique bindings between a subject and a digital object; for example, certificate authorities certify that a public key belongs to a domain name, and time-stamping authorities certify that a certain piece of information existed at a certain time. Traditional digital signature schemes, however, impose no uniqueness conditions, so a trusted authority could make multiple certifications for the same subject but different objects, whether intentionally, by accident, or under (legal or illegal) coercion. We propose the notion of a double-authentication-preventing signature, in which the value to be signed is split into two parts: a subject and a message. If a signer ever signs two different messages for the same subject, enough information is revealed to allow anyone to compute valid signatures on behalf of the signer. This double-signature forgeability property discourages signers from misbehaving, a form of self-enforcement, and would give binding authorities such as CAs cryptographic arguments to resist legal coercion. We give a generic construction using a new type of trapdoor function with extractability properties, which we show can be instantiated using the group of sign-agnostic quadratic residues modulo a Blum integer.
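To make the double-signature forgeability idea concrete, here is a toy numeric sketch in Python. It is not the paper's construction (which uses extractable trapdoor functions over sign-agnostic quadratic residues); it only illustrates, with an assumed linear "signing" equation and a hypothetical per-subject value, how two signatures on the same subject but different messages leak the secret key.

```python
# Toy illustration of key extraction from a double signature; NOT the paper's scheme.
import hashlib

p = 2**127 - 1  # public prime modulus (toy size)

def H(*parts):
    digest = hashlib.sha256("||".join(parts).encode()).digest()
    return int.from_bytes(digest, "big") % p

def sign(sk, subject, message):
    a = H("subject-key", str(sk), subject)   # deterministic per-subject value, reused for this subject
    m = H("msg", message)
    s = (a * m + sk) % p                     # toy signing equation: s = a*m + sk (mod p)
    return m, s

def extract_key(sig1, sig2):
    """Two signatures on the same subject, different messages: solve the two
    linear equations in the unknowns (a, sk) and recover sk."""
    (m1, s1), (m2, s2) = sig1, sig2
    a = ((s1 - s2) * pow(m1 - m2, -1, p)) % p
    return (s1 - a * m1) % p

sk = 123456789
sig1 = sign(sk, "example.org", "certificate A")
sig2 = sign(sk, "example.org", "certificate B")
assert extract_key(sig1, sig2) == sk         # double signing reveals the secret key
```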
Abstract:
Business Process Management has matured substantially over the last two decades. The techniques, methods and systems available to scope, model, analyze, implement, execute, monitor and even mine a process have been scientifically researched and can in most cases be deployed in practice. In fact, many of these BPM capabilities are nowadays a commodity. However, an opportunity-rich environment and rapidly emerging digital disruptions require new BPM capabilities. In this context, this paper proposes three future research and development directions for BPM academics and professionals. First, Ambidextrous BPM demands a shift of focus from exploitative to explorative BPM. Second, Value-driven BPM postulates a stronger focus on the desired outcomes as opposed to the available BPM methods. Third, Customer Process Management suggests complementing the dominant internal view of BPM with a stronger, design-inspired view of the process experiences of external stakeholders.
Abstract:
A significant number of the privatisations used to operate and maintain critical networked infrastructures have failed to meet contractual expectations and the expectations of the community. The author carried out empirical research exploring four urban water systems. This research revealed that, of the four forms of privatisation examined, the alliance form was particularly suited to the stewardship of an urban water system. The question then is whether these findings from urban water can be generalised to the operation and maintenance (O&M) of infrastructure generally. The answer is increasingly important as governments seek financial sustainability by reapplying the contestability strategy and outsourcing and privatising further services and activities. This paper first examines the issues encountered with O&M privatisations. Second, the findings on the stewardship achieved by the four case study water systems are unpacked, with particular focus on the alliance form. Third, the key variables found to have distinct causal links to the stewardship-like behaviour of the private participants in the alliance case study are described. Fourth, the variables that may be crucial to the successful application of the alliance form to the broader range of infrastructures are separated out. Fifth, the paper sets the path for research into these crucial features of the alliance form.
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift register based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
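As a toy illustration of the shifted-keystream phenomenon, the Python sketch below uses a generic shift register, not CSA-SC, with an arbitrary register length, tap set and seed: when two loadings leave the generator in states related by extra clockings (the situation a slid pair creates), the resulting keystreams are shifts of one another.

```python
# Toy shift-register keystream generator; NOT CSA-SC. Register length, taps
# and seed are arbitrary choices for illustration only.
def keystream(state, taps=(0, 2, 3, 5), length=64):
    state = list(state)
    out = []
    for _ in range(length):
        out.append(state[-1])                 # output the last stage
        fb = 0
        for t in taps:                        # linear feedback from the taps
            fb ^= state[t]
        state = [fb] + state[:-1]             # shift and insert the feedback bit
    return out

seed = [1, 0, 1, 1, 0, 0, 1, 0]
ks_a = keystream(seed, length=64)
ks_b = keystream(seed, length=72)[8:]         # same register, observed 8 clockings later
assert ks_a[8:] == ks_b[:56]                  # ks_b is ks_a shifted by 8 positions
```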
Abstract:
Outdoor robots such as planetary rovers must be able to navigate safely and reliably in order to successfully perform missions in remote or hostile environments. Mobility prediction is critical to achieving this goal due to the inherent control uncertainty faced by robots traversing natural terrain. We propose a novel algorithm for stochastic mobility prediction based on multi-output Gaussian process regression. Our algorithm considers the correlation between heading and distance uncertainty and provides a predictive model that can easily be exploited by motion planning algorithms. We evaluate our method experimentally and report results from over 30 trials in a Mars-analogue environment that demonstrate the effectiveness of our method and illustrate the importance of mobility prediction in navigating challenging terrain.
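A minimal sketch of multi-output Gaussian process regression in Python is given below, using an intrinsic-coregionalisation-style kernel; the kernel form, task covariance and toy data are assumptions for illustration, not the authors' model. It shows how correlated heading and distance outcomes can be predicted jointly from commanded actions.

```python
import numpy as np

def rbf(X1, X2, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel between two sets of inputs."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(size=(25, 2))                        # toy inputs: commanded (heading, distance)
Y = np.column_stack([np.sin(3 * X[:, 0]),            # toy outputs: heading error ...
                     X[:, 1] + 0.3 * np.sin(3 * X[:, 0])])  # ... and a correlated distance error

B = np.array([[1.0, 0.6],                            # task covariance: correlation between
              [0.6, 1.0]])                           # heading and distance uncertainty
Kx = rbf(X, X)
K = np.kron(B, Kx) + 1e-6 * np.eye(2 * len(X))       # joint covariance over both outputs
alpha = np.linalg.solve(K, Y.T.reshape(-1))          # outputs stacked as [heading..., distance...]

def predict(Xs):
    """Joint predictive means for heading and distance error at new inputs."""
    Ks = np.kron(B, rbf(Xs, X))
    mean = Ks @ alpha
    m = len(Xs)
    return mean[:m], mean[m:]

heading_mean, distance_mean = predict(rng.uniform(size=(5, 2)))
```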
Abstract:
This paper presents a full system demonstration of dynamic sensor-based reconfiguration of a networked robot team. Robots sense obstacles in their environment locally and dynamically adapt their global geometric configuration to conform to an abstract goal shape. We present a novel two-layer planning and control algorithm for team reconfiguration that is decentralised and assumes local (neighbour-to-neighbour) communication only. The approach is designed to be resource-efficient and we show experiments using a team of nine mobile robots with modest computation, communication, and sensing. The robots use acoustic beacons for localisation and can sense obstacles in their local neighbourhood using IR sensors. Our results demonstrate globally-specified reconfiguration from local information in a real robot network, and highlight limitations of standard mesh networks in implementing decentralised algorithms.
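The sketch below is not the paper's two-layer algorithm; it is a minimal Python illustration, with assumed names and gains, of the general idea of decentralised shape control: each robot updates its position using only its neighbours' positions and the desired offsets of an abstract goal shape.

```python
import numpy as np

def formation_step(positions, neighbours, offsets, gain=0.2):
    """One decentralised update: each robot sees only its neighbours.

    positions:  dict robot_id -> np.array([x, y]) current position
    neighbours: dict robot_id -> list of neighbouring robot ids (local comms only)
    offsets:    dict robot_id -> np.array([x, y]) slot in the abstract goal shape
    """
    new_positions = {}
    for i, p in positions.items():
        correction = np.zeros(2)
        for j in neighbours[i]:
            desired = offsets[i] - offsets[j]              # desired displacement from neighbour j
            correction += (positions[j] + desired) - p     # error with respect to that neighbour
        new_positions[i] = p + gain * correction
    return new_positions

# Three robots on a line converging towards a small triangle (up to translation).
positions = {1: np.array([0.0, 0.0]), 2: np.array([1.0, 0.0]), 3: np.array([2.0, 0.0])}
neighbours = {1: [2], 2: [1, 3], 3: [2]}
offsets = {1: np.array([0.0, 0.0]), 2: np.array([1.0, 0.0]), 3: np.array([0.5, 0.8])}
for _ in range(50):
    positions = formation_step(positions, neighbours, offsets)
```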
Abstract:
A better understanding of the behaviour of prepared cane and bagasse, and the ability to model the mechanical behaviour of bagasse as it is squeezed in a milling unit to extract juice, would help identify how to improve the current process, for example to reduce final bagasse moisture. Previous investigations have proven that juice flow through bagasse obeys Darcy's permeability law, that the grip of the rough surface of the grooves on the bagasse can be represented by the Mohr-Coulomb failure criterion for soils, and that the internal mechanical behaviour of bagasse is critical state behaviour similar to that of sand and clay. Finite element models available in current commercial software have adequate permeability models; however, no commercially available software appears to contain an adequate mechanical model for bagasse. Commercial packages offer only a few material models for soil and similar materials, while the code for the hundreds of models developed at universities and government research centres remains confidential. Progress has been made over the last ten years towards implementing a mechanical model for bagasse in finite element software. This paper builds on that progress and takes a further step towards an adequate material model: the fifth and final loading condition identified previously, shearing of heavily over-consolidated bagasse, is addressed.
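For reference, the Mohr-Coulomb criterion mentioned above reduces to a simple check: slip at the groove surface occurs once the shear stress exceeds c + sigma_n * tan(phi). The Python sketch below uses illustrative parameter values only, not measured bagasse properties.

```python
import math

def mohr_coulomb_slips(shear_stress, normal_stress, cohesion=10e3, friction_angle_deg=35.0):
    """Stresses in Pa; returns True if the interface fails (the bagasse slips on the groove)."""
    limit = cohesion + normal_stress * math.tan(math.radians(friction_angle_deg))
    return abs(shear_stress) > limit

print(mohr_coulomb_slips(shear_stress=40e3, normal_stress=50e3))  # False: the grip holds
```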
Abstract:
This book constitutes the proceedings of the Second Asia Pacific Conference on Business Process Management held in Brisbane, QLD, Australia, in July 2014. In all, 33 contributions from 12 countries were submitted. After each submission was reviewed by at least three Program Committee members, nine full papers were accepted for publication in this volume. These nine papers cover various topics that can be categorized under four main research focuses in BPM: process mining, process modeling and repositories, process model comparison, and process analysis.
Abstract:
Variations that exist in the treatment of patients (with similar symptoms) across different hospitals substantially impact the quality and costs of healthcare. Consequently, it is important to understand the similarities and differences between the practices of different hospitals. This paper presents a case study on the application of process mining techniques to measure and quantify the differences in the treatment of patients presenting with chest pain symptoms across four South Australian hospitals. Our case study focuses on cross-organisational benchmarking of processes and their performance. Techniques such as clustering, process discovery, performance analysis, and scientific workflows were applied to facilitate such comparative analyses. Lessons learned in overcoming unique challenges in cross-organisational process mining, such as ensuring population comparability, data granularity comparability, and experimental repeatability, are also presented.
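A minimal Python sketch of the kind of cross-hospital comparison involved is shown below; the log structure, activity names and overlap measure are illustrative assumptions, not the case study's actual pipeline. Traces are reduced to activity-sequence variants, and the variant frequency profiles of two hospitals are compared.

```python
from collections import Counter

def variant_profile(log):
    """log: dict case_id -> ordered list of activities; returns variant frequencies."""
    variants = Counter(tuple(trace) for trace in log.values())
    total = sum(variants.values())
    return {v: n / total for v, n in variants.items()}

def variant_overlap(profile_a, profile_b):
    """Share of behaviour (by frequency) common to both hospitals."""
    return sum(min(profile_a.get(v, 0.0), profile_b.get(v, 0.0))
               for v in set(profile_a) | set(profile_b))

hospital_a = {"c1": ["Triage", "ECG", "Discharge"], "c2": ["Triage", "ECG", "Admit"]}
hospital_b = {"c1": ["Triage", "ECG", "Discharge"], "c2": ["Triage", "Admit"]}
print(variant_overlap(variant_profile(hospital_a), variant_profile(hospital_b)))  # 0.5
```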
Abstract:
The use of Portable Medical Devices (PMDs) has become increasingly widespread over the last few years. A combination of factors, including advances in technology, the pressure to reduce public health costs and the desire to make health solutions accessible to a wider patient base, is contributing to the growth of the PMD market. Design has a clear role to play in the current and future context of the PMD landscape. In this paper, we identify emerging trends in the design of PMDs, including changes in their form, purpose and mode of use, and explore how these trends are likely to fundamentally impact the nature of healthcare and the patient experience from an experience design perspective. We conclude by identifying a research opportunity for design within the healthcare and PMD context.
Abstract:
This book is about understanding the nature and application of reflection in higher education. It provides a theoretical model to guide the implementation of reflective learning and reflective practice across multiple disciplines and international contexts in higher education. The book presents research into the ways in which reflection is considered and implemented across different professional disciplines, while maintaining a common purpose to transform and improve learning and/or practice. Chapter 13, 'Refining a Teaching Pattern: Reflection Around Artefacts', explores reflective practice around artefacts, in this case fashion design garment samples.
Abstract:
This paper reports on the 2nd ShARe/CLEFeHealth evaluation lab, which continues our evaluation resource building activities for the medical domain. In this lab we focus on patients' information needs, as opposed to the more common campaign focus on the specialised information needs of physicians and other healthcare workers. The usage scenario of the lab is to ease patients' and next-of-kins' understanding of eHealth information, in particular clinical reports. The 1st ShARe/CLEFeHealth evaluation lab was held in 2013 and consisted of three tasks: Task 1 focused on named entity recognition and normalization of disorders; Task 2 on normalization of acronyms/abbreviations; and Task 3 on information retrieval to address questions patients may have when reading clinical reports. This year's lab introduces a new challenge in Task 1 on visual-interactive search and exploration of eHealth data. Its aim is to help patients (or their next-of-kin) with readability issues related to their hospital discharge documents and with related information search on the Internet. Task 2 continues the information extraction work of the 2013 lab, specifically focusing on disorder attribute identification and normalization from clinical text. Finally, this year's Task 3 further extends the 2013 information retrieval task by cleaning the 2013 document collection and introducing a new query generation method and multilingual queries. The de-identified clinical reports used by the three tasks were from US intensive care and originated from the MIMIC II database. Other text documents for Tasks 1 and 3 were from the Internet and originated from the Khresmoi project. Task 2 annotations originated from the ShARe annotations. For Tasks 1 and 3, new annotations, queries, and relevance assessments were created. In total, 50, 79, and 91 people registered their interest in Tasks 1, 2, and 3, respectively, and 24 unique teams participated, with 1, 10, and 14 teams in Tasks 1, 2, and 3, respectively. The teams were from Africa, Asia, Canada, Europe, and North America. The Task 1 submission, reviewed by 5 expert peers, related to the task evaluation category of Effective use of interaction and targeted the needs of both expert and novice users. The best system achieved an Accuracy of 0.868 in Task 2a, an F1-score of 0.576 in Task 2b, and a Precision at 10 (P@10) of 0.756 in Task 3. The results demonstrate the substantial community interest in, and the capabilities of, these systems in making clinical reports easier for patients to understand. The organisers have made data and tools available for future research and development.
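As a quick reference for one of the reported measures, the Python snippet below computes Precision at 10 (P@10), the fraction of relevant documents among the top 10 retrieved; the document IDs and relevance judgements shown are made up for illustration.

```python
def precision_at_k(ranked_doc_ids, relevant_ids, k=10):
    """Fraction of the top-k ranked documents that are relevant."""
    top = ranked_doc_ids[:k]
    return sum(1 for d in top if d in relevant_ids) / k

ranking = ["d3", "d7", "d1", "d9", "d4", "d8", "d2", "d5", "d6", "d0"]
relevant = {"d3", "d1", "d4", "d5"}
print(precision_at_k(ranking, relevant))  # 0.4
```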