345 results for Audit processes
Abstract:
Business process analysis and process mining, particularly within the health care domain, remain under-utilised. Applied research that employs such techniques on routinely collected health care data enables stakeholders to empirically investigate care as it is delivered by different health providers. However, cross-organisational mining and the comparative analysis of processes present a set of unique challenges: ensuring population and activity comparability, visualising the mined models, and interpreting the results. Without addressing these issues, health providers will find it difficult to use process mining insights, and the potential benefits of evidence-based process improvement within health will remain unrealised. In this paper, we present a brief introduction to the nature of health care processes; a review of the literature on process mining in health; and a case study conducted to explore and learn how health care data and cross-organisational comparisons with process mining techniques may be approached. The case study applies process mining techniques to administrative and clinical data for patients who present with chest pain symptoms at one of four public hospitals in South Australia. We demonstrate an approach that provides detailed insights into clinical (quality of patient health) and fiscal (hospital budget) pressures in health care practice. We conclude by discussing the key lessons learned from our experience in conducting business process analysis and process mining based on data from four different hospitals.
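As a minimal sketch of the kind of cross-organisational discovery step the abstract describes: the paper does not name a toolkit, so the example below uses the open-source pm4py library, and the file and column names are hypothetical placeholders.

```python
# Illustrative only: hypothetical event log of chest-pain presentations,
# one row per recorded activity; pm4py is an assumed (not stated) toolkit.
import pandas as pd
import pm4py

df = pd.read_csv("chest_pain_events.csv")
df["timestamp"] = pd.to_datetime(df["timestamp"])
df = pm4py.format_dataframe(df, case_id="episode_id",
                            activity_key="activity",
                            timestamp_key="timestamp")

# Mine one model per hospital so processes can be compared across organisations.
for hospital, sub in df.groupby("hospital"):
    log = pm4py.convert_to_event_log(sub)
    dfg, starts, ends = pm4py.discover_dfg(log)
    print(hospital, f"{len(dfg)} directly-follows relations")
```

Mining per hospital, rather than over the pooled log, is what makes the population and activity comparability issues raised above visible in the discovered models.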
Abstract:
This study describes the first aid used and the clinical outcomes of all patients who presented to the Royal Children's Hospital, Brisbane, Australia in 2005 with an acute burn injury. A retrospective audit of the charts of 459 patients was performed, and information concerning the burn injury, first-aid treatment, and clinical outcomes was collected. First aid was used on 86.1% of patients, with 8.7% receiving no first aid and treatment unknown in 5.2% of cases. Most patients received cold water as first aid (80.2%); however, only 12.1% applied the cold water for the recommended 20 minutes or longer. Recommended first aid (cold water for ≥20 minutes) was associated with significantly reduced reepithelialization time for children with contact injuries (P=.011). Superficial-depth burns were significantly more likely to be associated with the use of recommended first aid (P=.03). Suboptimal treatment was more common for children younger than 3.5 years (P<.001) and for children with friction burns. This report is one of the few publications to relate first-aid treatment to clinical outcomes. Some positive clinical outcomes were associated with recommended first-aid use; however, wound outcomes were more strongly associated with burn depth and mechanism of injury. There is also a need for greater public awareness of recommended first-aid treatment.
Abstract:
In nature, the interactions between agents in a complex system (fish schools, colonies of ants) are governed by information that is created locally. Each agent self-organizes (adjusts) its behaviour, not through a central command centre, but based on variables that emerge from interactions with other system agents in its neighbourhood. Self-organization has been proposed as a mechanism to explain the tendency of individual performers in field-invasion sports teams to interact with each other, displaying functional co-adaptive behaviours without the need for central control. The relevance of self-organization as a mechanism explaining pattern-forming dynamics within attacker-defender interactions in field-invasion sports is well supported in the literature. Nonetheless, other levels of interpersonal coordination, such as intra-team interactions, still raise important questions, particularly with reference to the role of leadership or match strategies prescribed in advance by a coach. The existence of key properties of complex systems, such as system degeneracy, nonlinearity and contextual dependency, suggests that self-organization is a functional mechanism to explain the emergence of interpersonal coordination tendencies within intra-team interactions. In this opinion article we propose how leadership may act as a key constraint on the emergent, self-organizational tendencies of performers in field-invasion sports.
Abstract:
The integration of separate, yet complementary, cortical pathways appears to play a role in visual perception and action when intercepting objects. The ventral system is responsible for object recognition and identification, while the dorsal system facilitates continuous regulation of action. This dual-system model implies that empirically manipulating different sources of visual information during performance of an interceptive action might lead to the emergence of distinct gaze and movement pattern profiles. To test this idea, we recorded the hand kinematics and eye movements of participants as they attempted to catch balls projected from a novel apparatus that synchronised or de-synchronised accompanying video images of a throwing action and the ball trajectory. Results revealed that ball-catching performance was less successful when patterns of hand movements and gaze behaviours were constrained by the absence of advance perceptual information from the thrower's actions. Under these task constraints, participants began tracking the ball later, followed less of its trajectory, and adapted their actions by initiating movements later and moving the hand faster. There were no performance differences when the throwing-action image and ball speed were synchronised or de-synchronised, since hand movements were closely linked to information from the ball trajectory. Results are interpreted relative to the two-visual-system hypothesis, demonstrating that accurate interception requires the integration of advance visual information from the kinematics of the throwing action and from the ball's flight trajectory.
Abstract:
An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are 'doubly stochastic', i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model over a large discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
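To make the "doubly stochastic" structure concrete, a minimal log-Gaussian Cox process on a discretized grid can be written out as follows (our notation, not necessarily the paper's):

```latex
% Latent Gaussian field over grid cells i = 1, ..., n, with kernel matrix K:
\begin{align*}
  z &\sim \mathcal{N}(\mu, K) \\
  \lambda_i &= \exp(z_i) \\
  y_i \mid \lambda_i &\sim \operatorname{Poisson}(\lambda_i \, |A_i|)
\end{align*}
% where |A_i| is the area of cell i. The model is doubly stochastic because
% the predictive mass function marginalises over the random intensity:
\[
  p(y) = \int \prod_{i=1}^{n} \operatorname{Poisson}\!\big(y_i ; e^{z_i} |A_i|\big)\,
         \mathcal{N}(z ; \mu, K)\, \mathrm{d}z .
\]
```

The grid approximation replaces the continuous intensity integral with per-cell terms, which is what makes large-scale inference and subsequent 3D rendering of the predictive distribution tractable.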
Abstract:
We report a new approach that uses the single-beam Z-scan technique to discriminate between excited state absorption (ESA) and two- and three-photon nonlinear absorption. By measuring the apparent delay or advance of the pulse in reaching the detector, the nonlinear absorption can be unambiguously identified as either instantaneous or transient. The simple method does not require a large range of input fluences or a sophisticated pump-probe experimental apparatus. The technique is easily extended to any absorption process dependent on pulse width, and to nonlinear refraction measurements. We demonstrate, in particular, that the large nonlinear absorption in ZnO nanocones exposed to nanosecond 532 nm pulses is due mostly to ESA, not pure two-photon absorption.
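The physical distinction being exploited can be seen in the standard propagation equations (a textbook-level sketch, not the paper's own derivation): instantaneous multiphoton loss follows the pulse envelope, whereas ESA loss depends on a population that accumulates during the pulse.

```latex
% Instantaneous two-photon absorption (2PA): loss tracks the intensity.
\[
  \frac{dI}{dz} = -\alpha I - \beta I^{2}
\]
% Excited-state absorption: loss follows an accumulated population N(t),
% so it lags the pulse peak and reshapes the transmitted pulse, producing
% the apparent delay measured at the detector.
\[
  \frac{dI}{dz} = -\alpha I - \sigma_{\mathrm{ESA}} N I,
  \qquad
  \frac{\partial N}{\partial t} = \frac{\alpha I}{\hbar \omega}
\]
```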
Abstract:
Dispute resolution in strata schemes in Peninsular Malaysia should focus on more than just "settlement." The quality of the outcome, its sustainability and its relevance in supporting the basic principles of a good neighbourhood and self-governance in a strata scheme are also fundamental. Based on the comprehensive law movement, this thesis develops a theoretical framework for strata scheme disputes within the parameters of therapeutic jurisprudence, preventive law, alternative dispute resolution (ADR) and problem-solving courts. The therapeutic orientation of this model offers approaches that promote positive communication between disputing parties, preserve neighbour relations and optimise people's psychological and emotional well-being.
Abstract:
Enterprises, both public and private, have rapidly begun exploiting the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term dating from the 1960s, long used to describe the problems of unqualified dependency on information systems. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and open datasets used in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over-50-year-old phrase reflecting mistrust in computer systems, namely “garbage in, garbage out” or “GIGO”, describes the problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP, open datasets and “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable, unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some current and appropriate technologies being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
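As a concrete baseline for the kind of authenticity checking discussed above, verifying that a downloaded open data set matches a digest published out-of-band by its custodian can be sketched as below. This is a minimal illustration only: the file name and digest are hypothetical placeholders, and it addresses integrity alone, not the naming, identity or audit services the paper argues are still missing.

```python
# Minimal integrity check: compare a data set's SHA-256 digest with one
# announced by the publisher through a separate, trusted channel.
import hashlib

PUBLISHED_SHA256 = "..."  # hypothetical digest announced by the custodian

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

if sha256_of("open_dataset.csv") != PUBLISHED_SHA256:
    raise ValueError("data set failed integrity check: possible impersonation")
```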
Abstract:
Balancing the demands of research and ethics is always challenging, and even more so when recruiting vulnerable groups. Within the context of current legislation and international human rights declarations, it is strongly advocated that research can and must be undertaken with all recipients of health care services. Research in the field of intellectual disability presents particular challenges in regard to consenting processes. This paper is a critical reflection on, and analysis of, the complex processes undertaken and events that occurred in gaining informed consent from people with intellectual disability to participate in a study exploring their experiences of being an inpatient in mental health hospitals within Aotearoa/New Zealand. A framework based on capacity, information and voluntariness is presented, with excerpts from the field used to explore consenting processes. The practical implications of the processes utilised are then discussed in order to stimulate debate regarding clearer and enhanced methods of gaining informed consent from people with intellectual disability.
Abstract:
Purpose: The paper examines the impact of internal auditors' involvement in Enterprise Risk Management (ERM) on perceptions of their willingness to report a breakdown in risk procedures, and whether a strong relationship with the audit committee affects such willingness to report. The study also investigates the use of ERM and the role of internal audit in ERM in Australian private and public sector entities. Design/methodology/approach: The study uses an experimental design, manipulating (i) the internal auditor's involvement in ERM and (ii) the strength of the relationship between internal audit and the audit committee. Participants are 117 certified internal auditors. The study also gathers descriptive data on the use of ERM. Findings: The study indicates that high involvement in ERM affects perceptions of internal auditors' willingness to report a breakdown in risk procedures to the audit committee. However, a strong relationship with the audit committee does not appear to affect their perceived willingness to report. The study also finds that the majority of organisations have recently adopted ERM. Internal auditors are involved in ERM assurance activities, but some also engage in activities that could compromise objectivity.
Abstract:
In recent years, the trade-off between flexibility and support has become a leading issue in workflow technology. In this paper we show how an imperative modeling approach used to define stable and well-understood processes can be complemented by a modeling approach that enables automatic process adaptation and exploits planning techniques to deal with environmental changes and exceptions that may occur during process execution. To this end, we designed and implemented a Custom Service that allows the YAWL execution environment to delegate the execution of subprocesses and activities to the SmartPM execution environment, which is able to automatically adapt a process to deal with emerging changes and exceptions. We demonstrate the feasibility and validity of the approach by showing the design and execution of an emergency management process defined for train derailments.
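As a rough illustration of the delegation pattern (conceptual only: every name below is hypothetical and does not reflect YAWL's or SmartPM's actual interfaces), the Custom Service essentially receives an enabled work item, hands the subprocess to the adaptive engine, and returns the outcome to the parent workflow:

```python
# Conceptual sketch of delegating a work item to an adaptive engine.
# All endpoint and field names are invented for illustration.
import requests  # assumes the adaptive engine exposes an HTTP endpoint

SMARTPM_URL = "http://localhost:8080/smartpm/execute"  # hypothetical

def handle_enabled_work_item(work_item: dict) -> dict:
    """Delegate a checked-out work item and return its (adapted) result."""
    # 1. Hand the subprocess to the adaptive engine, which may re-plan
    #    around exceptions (e.g. a blocked route at a derailment site).
    response = requests.post(SMARTPM_URL, json={
        "process": work_item["subprocess_id"],
        "context": work_item["data"],
    })
    response.raise_for_status()
    # 2. Return the outcome so the work item can be completed and the
    #    stable, imperative parent model can resume.
    return response.json()
```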
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach for modeling processes in the context of a dynamic environment and adaptive process participants' behavior. The approach allows execution constraints to be defined that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. A flexible process graph focuses on what can be done to perform a process; process participants' routing decisions are based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can associate any number of nodes. Hypergraphs are used to formally define the execution semantics of processes. We provide a process scenario to motivate and illustrate the approach.
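A minimal sketch of the hypergraph idea (our illustration, not the paper's formalisation): directed hyperedges associate any number of source and target activities, and the set of enabled activities is computed from the current process state rather than from a fixed control-flow path.

```python
# Illustrative flexible-process-graph structure; constraint semantics are
# simplified to "all sources done before any target may start".
from dataclasses import dataclass, field

@dataclass(frozen=True)
class HyperEdge:
    sources: frozenset  # activities that must all be completed ...
    targets: frozenset  # ... before any of these targets may start

@dataclass
class FlexibleProcessGraph:
    activities: set
    edges: list
    completed: set = field(default_factory=set)

    def enabled(self):
        """Activities runnable in the current process state."""
        blocked = set()
        for e in self.edges:
            if not e.sources <= self.completed:  # precondition unmet
                blocked |= e.targets
        return self.activities - self.completed - blocked

# Usage: after triage, x_ray and blood_test may run in any order or together.
g = FlexibleProcessGraph(
    activities={"triage", "x_ray", "blood_test", "discharge"},
    edges=[HyperEdge(frozenset({"triage"}), frozenset({"x_ray", "blood_test"})),
           HyperEdge(frozenset({"x_ray", "blood_test"}), frozenset({"discharge"}))],
)
g.completed.add("triage")
print(g.enabled())  # {'x_ray', 'blood_test'}
```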
Abstract:
Service processes, such as giving financial advice, booking a business trip or conducting a consulting project, have emerged as units of analysis of high interest to the business process and service management communities, in both practice and academia. While the transactional nature of production processes is relatively well understood and deployed, the less predictable and highly interactive nature of service processes still lacks appropriate methodological grounding in many areas. This paper proposes the framework of a process laboratory as a new IT artefact to facilitate the holistic analysis and simulation of such service processes. Using financial services as an example, we show how such a process laboratory can be used to reduce the complexity of service process analysis and to facilitate operational service process control.
Abstract:
Conceptual modeling is an important tool for understanding and revealing weaknesses of business processes. Yet, current practice in reengineering projects often treats the as-is process model simply as a brainstorming tool. This approach relies heavily on the intuition of the participants and misses a clear description of the quality requirements. Against this background, we identify four generic categories of business process quality and populate them with quality requirements from related research. We refer to the resulting framework as the Quality of Business Process (QoBP) framework. Furthermore, we present the findings from applying the QoBP framework in a case study with a major Australian bank, showing that it helps to systematically fill the white space between as-is and to-be process modeling.