425 results for Internal processes
Abstract:
The care processes of healthcare providers are typically considered human-centric, flexible, evolving, complex, and multi-disciplinary. Consequently, gaining insight into the dynamics of these care processes can be an arduous task. This study presents a novel event-log-based approach for extracting valuable medical and organizational information from past executions of care processes. Care processes are analyzed with a preferential set of process mining techniques in order to discover recurring patterns, analyze and characterize process variants, and identify adverse medical events.
Abstract:
Since their inception in 1962, Petri nets have been used in a wide variety of application domains. Although Petri nets are graphical and easy to understand, they have formal semantics and allow for analysis techniques ranging from model checking and structural analysis to process mining and performance analysis. Over time, Petri nets have emerged as a solid foundation for Business Process Management (BPM) research. The BPM discipline develops methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. Mainstream business process modeling notations and workflow management systems use token-based semantics borrowed from Petri nets. Moreover, state-of-the-art BPM analysis techniques use Petri nets as an internal representation. Users of BPM methods and tools are often not aware of this. This paper aims to unveil the seminal role of Petri nets in BPM.
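The token-based semantics mentioned above can be made concrete with a minimal sketch (plain Python; the place and transition names are illustrative, not from the paper): a transition is enabled when its input places hold enough tokens, and firing it consumes tokens from input places and produces tokens on output places.

```python
def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes tokens from input places and produces on output places.
    assert enabled(marking, pre)
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# A two-step sequence: start -> task_a -> task_b -> done
m0 = {"start": 1}
m1 = fire(m0, {"start": 1}, {"ready": 1})
m2 = fire(m1, {"ready": 1}, {"done": 1})
print(m2)  # {'start': 0, 'ready': 0, 'done': 1}
```

This is exactly the firing rule that workflow notations borrow: a case "token" flows through the process as transitions fire.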
Abstract:
Crashes at any particular transport network location consist of a chain of events arising from a multitude of potential causes and/or contributing factors, whose nature is likely to reflect geometric characteristics of the road, spatial effects of the surrounding environment, and human behavioural factors. It is postulated that these potential contributing factors do not arise from the same underlying risk process, and thus should be explicitly modelled and understood. The state of the practice in road safety network management applies a safety performance function (SPF) that represents a single risk process to explain crash variability across network sites. This study aims to elucidate the importance of differentiating among the various underlying risk processes contributing to the observed crash count at any particular network location. To demonstrate the principle of this theoretical and corresponding methodological approach, the study explores engineering factors (e.g. segment length, speed limit) and unobserved spatial factors (e.g. climatic factors, presence of schools) as two explicit sources of crash contributing factors. A Bayesian Latent Class (BLC) analysis is used to explore these two sources and to incorporate prior information about their contribution to crash occurrence. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared with a traditional Negative Binomial (NB) model. A comparison of goodness-of-fit measures indicates that the dual-risk-process model outperforms the single-risk-process NB model, indicating the need for further research to capture all three crash generation processes in SPFs.
Abstract:
This chapter interrogates what recognition of prior learning (RPL) can and does mean in the higher education sector—a sector in the grip of the widening participation agenda and an open access age. The chapter discusses how open learning is making inroads into recognition processes and examines two studies in open learning recognition. A case study relating to e-portfolio-style RPL for entry into a Graduate Certificate in Policy and Governance at a metropolitan university in Queensland is described. In the first instance, candidates who do not possess a relevant Bachelor degree need to demonstrate skills in governmental policy work in order to be eligible to gain entry to a Graduate Certificate (at Australian Qualifications Framework Level 8) (Australian Qualifications Framework Council, 2013, p. 53). The chapter acknowledges the benefits and limitations of recognition in open learning and those of more traditional RPL, anticipating future developments in both (or their convergence).
Abstract:
The power to influence others in ever-expanding social networks in the new knowledge economy is tied to capabilities with digital media production. This chapter draws on research in elementary classrooms to examine the repertoires of cross-disciplinary knowledge that literacy learners need to produce innovative digital media via the "social web". It focuses on the knowledge processes that occurred when elementary students engaged in multimodal text production with new digital media. It draws on Kalantzis and Cope's (2008) heuristic for theorizing "Knowledge Processes" in the Learning by Design approach to pedagogy. Learners demonstrate eight "Knowledge Processes" across different subject domains, skills areas, and sensibilities. Drawing on data from media-based lessons across several classrooms and schools, this chapter examines what kinds of knowledge students utilize when they produce digital, multimodal texts in the classroom. The Learning by Design framework is used as an analytic tool to theorize how students learn when they engage in a specific domain of learning: digital media production.
Abstract:
This article elaborates the impact that crises of authority provoked by animal magnetism, mesmerism, and hypnosis in the 19th century had for field formation in American education. Four layers of analysis elucidate how curriculum history’s repetitive focus on public school policy and classroom practice became possible. First, the article surveys external conditions of possibility for the enactment of compulsory public schooling. Second, “internal” conditions of possibility for the formation of educational objects (e.g., types of children) are documented via the processes of différance that were generated from within the experiences of confinement. Third, the article maps how these were interpenetrated by animal magnetic debates that were lustered and planished in education’s emerging field, including impact upon behavior management practices, the contouring of expertise and authority, the role of Will in intelligence testing and child development theories, and the redefinition of public and private. Last, the article examines implications for curriculum history, whether policy- or practice-oriented, especially around the question of influence, the theorization of child mind, and philosophies of Being.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions not only affects the outcome of collaboration but also collaboration itself. We created CubeBPM, a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. We are currently in the process of conducting a study that aims at assessing the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants to become more active in collaboration.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders together with modeling experts create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the collaboration outcome and also the perception of the collaboration itself. In order to overcome this problem we created CubeBPM, a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment, thus improving modeling performance and collaboration.
Abstract:
This is the fourth TAProViz workshop run at the 13th International Conference on Business Process Management (BPM). The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated by at least three program committee members, in a double-blind manner, on the basis of significance, originality, technical quality, and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.
Abstract:
Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to perform Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GP) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
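As an illustrative sketch of the GIMH mechanism the abstract builds on (not the paper's GP-accelerated method), the key idea is that the noisy log-likelihood estimate attached to the current state is recycled until the next acceptance; re-estimating it at every iteration instead would give the MCWM variant. The model, prior, and tuning values below are assumptions made for the example.

```python
import math
import random

def gimh(log_lik_hat, log_prior, theta0, n_iters, step=1.0, rng=random):
    """Minimal GIMH-style pseudo-marginal random-walk sampler.

    `log_lik_hat` returns a (possibly noisy) unbiased log-likelihood
    estimate. The estimate for the current state travels with the state
    until a proposal is accepted -- the defining feature of GIMH.
    """
    theta, ll = theta0, log_lik_hat(theta0)
    chain = []
    for _ in range(n_iters):
        prop = theta + rng.gauss(0.0, step)
        ll_prop = log_lik_hat(prop)
        log_alpha = (ll_prop + log_prior(prop)) - (ll + log_prior(theta))
        if math.log(rng.random()) < log_alpha:
            theta, ll = prop, ll_prop  # accept: estimate is recycled from here
        chain.append(theta)
    return chain

# Sanity check with an exact (noise-free) standard-normal log-likelihood
# and a flat prior, started away from the mode.
rng = random.Random(1)
chain = gimh(lambda t: -0.5 * t * t, lambda t: 0.0, 3.0, 2000, rng=rng)
```

With an exact likelihood this reduces to ordinary Metropolis-Hastings; the pseudo-marginal point is that plugging in any unbiased estimator leaves the target distribution intact.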
Abstract:
Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks": activities that classify a case as "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if it is rejected, it is cancelled and all work performed on the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent on other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose ordering knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
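The traditional redesign heuristic mentioned above can be sketched in a few lines: run checks in decreasing order of rejection rate per unit of mean effort, so that cheap, highly selective checks knock out cases before expensive ones run. The check names and figures below are illustrative; the paper's contribution replaces these static statistics with per-case runtime predictions.

```python
def order_knockout_checks(checks):
    """Order knockout checks by descending rejection rate per unit effort,
    the classic static heuristic for minimizing overprocessing waste."""
    return sorted(
        checks,
        key=lambda c: c["rejection_rate"] / c["mean_effort"],
        reverse=True,
    )

checks = [
    {"name": "credit_check",   "mean_effort": 10.0, "rejection_rate": 0.05},
    {"name": "document_check", "mean_effort": 2.0,  "rejection_rate": 0.30},
    {"name": "fraud_check",    "mean_effort": 5.0,  "rejection_rate": 0.20},
]
ordered = [c["name"] for c in order_knockout_checks(checks)]
print(ordered)  # ['document_check', 'fraud_check', 'credit_check']
```

Because the ratios (0.15, 0.04, 0.005 per unit effort) are fixed per check, this ordering is the same for every case, which is exactly the coarseness the runtime predictive approach addresses.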
Abstract:
One of the distinctive features of Gold Coast urbanisation is its historically ad hoc approach to development with little or no strategic planning to guide it. Many have commented on the lack of planning on the Gold Coast, calling it 'an experiment in freedom' or a 'free enterprise city'. Following a major restructuring of Queensland's local councils, the 1990s witnessed a shift from ad hoc decision making to more systematic planning on the Gold Coast. Understanding the past is important for shaping the future. This paper reviews the history of regulatory planning on the Gold Coast, encompassing decisions affecting the form and development of its earliest settlements through to its periods of greatest construction and most streamlined decision-making. It focuses mainly on past planning processes, the problems identified in each planning exercise and the interventions introduced, asking whether these were implemented or not and why. The paper positions the Gold Coast as a physical embodiment of this history of decision making, assessing the effects on the city as a whole of specific measures either affording freedoms or insisting on accountability to various levels of regulation. It examines how the absence of some planning measures influenced the form of the city and its internal arrangements, and considers how the shift from ad hoc decision making towards more systematic planning efforts affected the city's urbanisation. The lessons that the Gold Coast example provides will resonate with places elsewhere in Australia and the world, if not always in scale, then certainly in substance.
Abstract:
The emphasis on collegiality and collaboration in the literature on teachers' work and school reform has tended to underplay the significance of teacher autonomy. This thesis explores the dynamics of teachers' understandings and experiences of individual teacher autonomy (as contrasted with collective autonomy) in an independent school in Queensland that promoted itself as a 'teachers' school' with a strong commitment to individual teacher autonomy. The research was a case study drawing on methodological signposts from critical, feminist and traditional ethnography. Intensive fieldwork in the school over five months incorporated the ethnographic techniques of observation, interviews and document analysis. Teachers at Thornton College understood their experience of individual autonomy at three interrelated levels: their work in the classroom, their working life in the school, and their voice in the decision-making processes of the school. They felt that they experienced a great deal of individual autonomy at each of these three levels. These understandings and experiences of autonomy were encumbered or enabled by a range of internal and external stakeholder groups. There were also a number of structural influences (community perceptions, market forces, school size, time and bureaucracy) emerging from the economic, social and political structures in Australian society which influenced the experience of autonomy by teachers. The experience of individual teacher autonomy was constantly shifting, but there were some emergent patterns. Consensus on educational goals and vision, and strong expressions of trust and respect between teachers and stakeholders in the school, characterised the contexts in which teachers felt they experienced high levels of autonomy in their work. The demand for accountability and the desire for relatedness motivated stakeholders and structural forces to influence teacher autonomy.
Some significant gaps emerged between the rhetoric of a commitment to individual teacher autonomy and the decision-making practices in the school, which gave ultimate power to the co-principals. Despite the rhetoric and promotion of non-hierarchical structures and collaborative decision-making processes, many teachers perceived that their experience of individual autonomy remained subject to the exercise of 'partial democracy' by school leaders.
Abstract:
Analytical solutions of partial differential equation (PDE) models describing reactive transport phenomena in saturated porous media are often used as screening tools to provide insight into contaminant fate and transport processes. While many practical modelling scenarios involve spatially variable coefficients, such as spatially variable flow velocity, v(x), or spatially variable decay rate, k(x), most analytical models deal with constant coefficients. Here we present a framework for constructing exact solutions of PDE models of reactive transport. Our approach is relevant for advection-dominated problems, and is based on a regular perturbation technique. We present a description of the solution technique for a range of one-dimensional scenarios involving constant and variable coefficients, and we show that the solutions compare well with numerical approximations. Our general approach applies to a range of initial conditions and various forms of v(x) and k(x). Instead of simply documenting specific solutions for particular cases, we present a symbolic worksheet, as supplementary material, which enables the solution to be evaluated for different choices of the initial condition, v(x) and k(x). We also discuss how the technique generalizes to apply to models of coupled multispecies reactive transport as well as higher dimensional problems.
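A minimal numerical check illustrates the leading-order balance of such a regular perturbation approach when dispersion is treated as a small perturbation: the O(1) term satisfies the advection-reaction equation v(x) c0'(x) + k(x) c0(x) = 0, whose solution is c0(x) = c(0) exp(-∫0^x k(t)/v(t) dt). The coefficients v(x) = 1 + x and k(x) = 1 are chosen for the example, not taken from the paper.

```python
def c0(x):
    # Leading-order term for v(x) = 1 + x, k(x) = 1, c(0) = 1:
    # c0(x) = exp(-∫_0^x dt / (1 + t)) = 1 / (1 + x) in closed form.
    return 1.0 / (1.0 + x)

def leading_order_residual(x, h=1e-6):
    # Residual of v(x) c0'(x) + k(x) c0(x); should vanish for the exact
    # solution. c0' is approximated with a central finite difference.
    dc0 = (c0(x + h) - c0(x - h)) / (2.0 * h)
    return (1.0 + x) * dc0 + c0(x)

# Largest residual over a grid on [0, 1]; near machine precision.
worst = max(abs(leading_order_residual(i / 10.0)) for i in range(11))
```

Higher-order terms of the expansion would then correct c0 for the (small) dispersive flux, which is where the paper's general framework takes over.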