Abstract:
The power to influence others in ever-expanding social networks in the new knowledge economy is tied to capabilities with digital media production. This chapter draws on research in elementary classrooms to examine the repertoires of cross-disciplinary knowledge that literacy learners need to produce innovative digital media via the “social web”. It focuses on the knowledge processes that occurred when elementary students engaged in multimodal text production with new digital media, drawing on Kalantzis and Cope’s (2008) heuristic for theorizing “Knowledge Processes” in the Learning by Design approach to pedagogy. Learners demonstrate eight “Knowledge Processes” across different subject domains, skills areas, and sensibilities. Drawing on data from media-based lessons across several classrooms and schools, this chapter examines what kinds of knowledge students utilize when they produce digital, multimodal texts in the classroom. The Learning by Design framework is used as an analytic tool to theorize how students learn when they engage in a specific domain of learning – digital media production.
Abstract:
Analyzing and redesigning business processes is a complex task which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions affects not only the outcome of the collaboration but also the collaboration itself. We created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive display wall. We are currently conducting a study that aims to assess the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants become more active in collaboration.
Abstract:
Analyzing and redesigning business processes is a complex task which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders, together with modeling experts, create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects both the outcome of the collaboration and the perception of the collaboration itself. To overcome this problem we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment, thus improving modeling performance and collaboration.
Abstract:
This is the fourth TAProViz workshop, run at the 13th International Conference on Business Process Management (BPM). The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated double-blind by at least three program committee members on the basis of significance, originality, technical quality and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.
Abstract:
Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to perform Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GP) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
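As a rough illustration of the general idea (a sketch, not the authors' GP-GIMH implementation), the following trains a GP on noisy log-likelihood estimates from a hypothetical pilot run and then uses the surrogate inside a plain Metropolis-Hastings loop; all data, kernels and settings are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Hypothetical pilot run: parameter draws and noisy log-likelihood estimates
# (stand-ins for a short MCWM run on a real latent variable model).
theta_pilot = rng.normal(0.0, 1.0, size=(200, 1))
loglik_pilot = -0.5 * theta_pilot[:, 0] ** 2 + rng.normal(0.0, 0.1, 200)

# Fit a GP surrogate to the log-likelihood surface; the WhiteKernel term
# absorbs the Monte Carlo noise in the unbiased estimates.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                              normalize_y=True).fit(theta_pilot, loglik_pilot)

def log_posterior(theta):
    # Surrogate log-likelihood plus a standard normal log-prior (illustrative).
    return gp.predict(theta.reshape(1, -1))[0] - 0.5 * theta[0] ** 2

# Metropolis-Hastings using the cheap surrogate instead of re-estimating
# the intractable likelihood at every iteration.
theta = np.zeros(1)
lp = log_posterior(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.5, size=1)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta[0])
```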
Abstract:
Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case into "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
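A minimal sketch of the core idea: ordering knockout checks per case by predicted rejection probability per unit of effort. The check tuple shape and classifier interface are illustrative assumptions, not the paper's implementation.

```python
def order_checks(checks, case_features):
    """Order knockout checks for one case so that the checks most likely to
    reject it, relative to their effort, run first. This minimises the
    expected effort wasted on earlier checks when a later check knocks the
    case out.

    checks: list of (name, effort, model) tuples, where model.predict_proba
    returns [[P(accept), P(reject)]] for the given case (illustrative API,
    e.g. a scikit-learn classifier trained on historical cases).
    """
    def reject_prob(model):
        return model.predict_proba([case_features])[0][1]

    return sorted(checks,
                  key=lambda c: reject_prob(c[2]) / c[1],
                  reverse=True)
```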
Abstract:
Analytical solutions of partial differential equation (PDE) models describing reactive transport phenomena in saturated porous media are often used as screening tools to provide insight into contaminant fate and transport processes. While many practical modelling scenarios involve spatially variable coefficients, such as spatially variable flow velocity, v(x), or spatially variable decay rate, k(x), most analytical models deal with constant coefficients. Here we present a framework for constructing exact solutions of PDE models of reactive transport. Our approach is relevant for advection-dominant problems, and is based on a regular perturbation technique. We present a description of the solution technique for a range of one-dimensional scenarios involving constant and variable coefficients, and we show that the solutions compare well with numerical approximations. Our general approach applies to a range of initial conditions and various forms of v(x) and k(x). Instead of simply documenting specific solutions for particular cases, we present a symbolic worksheet, as supplementary material, which enables the solution to be evaluated for different choices of the initial condition, v(x) and k(x). We also discuss how the technique generalizes to apply to models of coupled multispecies reactive transport as well as higher dimensional problems.
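As a concrete illustration of the leading-order step in such a regular perturbation approach (a sketch under assumed forms of v(x) and k(x), not the paper's supplementary worksheet), consider the steady advection-reaction balance obtained when dispersion is treated as a small perturbation:

```python
import sympy as sp

x, A = sp.symbols('x A', positive=True)

# Hypothetical spatially variable coefficients (illustrative choices only)
v = 1 + x / 2      # flow velocity v(x)
k = x              # decay rate k(x)

# O(1) balance of a regular perturbation expansion in the dispersion
# coefficient: v(x) c0'(x) + k(x) c0(x) = 0, solved by an integrating factor
c0 = A * sp.exp(-sp.integrate(k / v, x))

# Confirm that c0 satisfies the leading-order equation exactly
print(sp.simplify(v * sp.diff(c0, x) + k * c0))   # -> 0
```

Higher-order terms in the expansion then reintroduce the dispersive flux as successive corrections to c0, and the same structure can be re-evaluated for other choices of v(x) and k(x).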
Abstract:
Over the last two decades, there has been an increasing awareness of, and interest in, the use of spatial moment techniques to provide insight into a range of biological and ecological processes. Models that incorporate spatial moments can be viewed as extensions of mean-field models. These mean-field models often consist of systems of classical ordinary differential equations and partial differential equations, whose derivation, at some point, hinges on the simplifying assumption that individuals in the underlying stochastic process encounter each other at a rate that is proportional to the average abundance of individuals. This assumption has several implications, the most striking of which is that mean-field models essentially neglect any impact of the spatial structure of individuals in the system. Moment dynamics models extend traditional mean-field descriptions by accounting for the dynamics of pairs, triples and higher n-tuples of individuals. This means that moment dynamics models can, to some extent, account for how the spatial structure affects the dynamics of the system in question.
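As a standard illustration of the closure step described here (drawn from the well-known spatial logistic model of Law, Murrell and Dieckmann, not necessarily the formulation used in this work), the first spatial moment N(t), the mean density of individuals with birth rate b, intrinsic death rate d and competition strength d', evolves as

$$\frac{\mathrm{d}N}{\mathrm{d}t} = (b - d)\,N(t) - d'\int w(\xi)\,C(\xi,t)\,\mathrm{d}\xi,$$

where C(ξ, t) is the second moment (the density of pairs of individuals displaced by ξ) and w is a competition kernel. The mean-field assumption replaces C(ξ, t) by N(t)², collapsing this to the classical logistic equation; a moment dynamics model instead evolves C(ξ, t) itself, whose equation involves triples and must in turn be closed.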
Abstract:
This article presents a method for checking the conformance between an event log capturing the actual execution of a business process, and a model capturing its expected or normative execution. Given a business process model and an event log, the method returns a set of statements in natural language describing the behavior allowed by the process model but not observed in the log and vice versa. The method relies on a unified representation of process models and event logs based on a well-known model of concurrency, namely event structures. Specifically, the problem of conformance checking is approached by folding the input event log into an event structure, unfolding the process model into another event structure, and comparing the two event structures via an error-correcting synchronized product. Each behavioral difference detected in the synchronized product is then verbalized as a natural language statement. An empirical evaluation shows that the proposed method scales up to real-life datasets while producing more concise and higher-level difference descriptions than state-of-the-art conformance checking methods.
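For readers unfamiliar with the underlying concurrency model, here is a minimal sketch of a prime event structure and its configurations (illustrative only; the paper's folding, unfolding and error-correcting synchronized product are substantially more involved):

```python
from itertools import combinations

class PrimeEventStructure:
    """Events with a causality relation (cause must occur before effect) and
    a symmetric conflict relation (two events that can never co-occur)."""

    def __init__(self, events, causality, conflict):
        self.events = set(events)
        self.causality = set(causality)               # pairs (cause, effect)
        self.conflict = {frozenset(c) for c in conflict}

    def is_configuration(self, config):
        """A configuration is a causally closed, conflict-free set of events,
        i.e. one possible (partial) run of the process."""
        config = set(config)
        closed = all(cause in config
                     for cause, effect in self.causality if effect in config)
        free = all(frozenset(pair) not in self.conflict
                   for pair in combinations(config, 2))
        return closed and free

# "a, then either b or c": b and c conflict, and both causally depend on a
es = PrimeEventStructure({"a", "b", "c"}, {("a", "b"), ("a", "c")}, [("b", "c")])
print(es.is_configuration({"a", "b"}))       # True
print(es.is_configuration({"b"}))            # False: missing cause a
print(es.is_configuration({"a", "b", "c"}))  # False: b and c conflict
```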
Abstract:
There is growing acknowledgement of the importance of franchising within all modern global economies. Despite this, little is understood with regard to the actual impact of franchising on local economies. This research aims to reframe the contribution of franchising by considering the process of franchisation. This study employed a mixed-method approach, utilizing critical realism to facilitate an outcomes-based explanation of firm survival. The focus of the study was upon generative mechanisms that were assumed to give rise to particular events from which (pizza) firm survival was enhanced vis-à-vis all other community members. A database of 2440 firms (or in excess of 21,000 company years) combined with archival records, interviews and the researcher’s observations provided the researcher with access to the nature of interaction occurring between firms. It was found that the survival of local firms was influenced positively by the day-to-day actions of franchise operators. However, it is argued that to understand how any such advantage may fall to local independent firms, we need to better appreciate the multitude of local processes related to such industries. This research re-examines several ecological concepts with a view to enabling a clearer investigation of underlying local processes. It also represents an authentic autecological approach to the study of firms.
Abstract:
Accounting information systems (AIS) capture and process accounting data and provide valuable information for decision-makers. However, in a rapidly changing environment, continual management of the AIS is necessary for organizations to optimise performance outcomes. We suggest that building a dynamic AIS capability enhances accounting process performance and organizational performance. Using the dynamic capabilities framework (Teece 2007), we propose that a dynamic AIS capability can be developed through the synergy of three competencies: a flexible AIS, a complementary business intelligence system, and accounting professionals with IT technical competency. Using survey data, we find evidence of a positive association between a dynamic AIS capability, accounting process performance, and overall firm performance. The results suggest that developing a dynamic AIS resource can add value to an organization. This study provides guidance for organizations looking to leverage the performance outcomes of their AIS environment.
Abstract:
We explore how a standardization effort (i.e., when a firm pursues standards to further innovation) involves different search processes for knowledge and innovation outcomes. Using an inductive case study of Vanke, a leading Chinese property developer, we show how varying degrees of knowledge complexity and codification combine to produce a typology of four types of search process: active, integrative, decentralized and passive, resulting in four types of innovation outcome: modular, radical, incremental and architectural. We argue that when the standardization effort in a firm involves highly codified knowledge, incremental and architectural innovation outcomes are fostered, while modular and radical innovations are hindered. We discuss how standardization efforts can result in a second-order innovation capability, and conclude by calling for comparative research in other settings to understand how standardization efforts can be suited to different types of search process in different industry contexts.
Abstract:
This paper reports on the outcomes from a preliminary evaluation of technologies and processes intended to support the Assurance of Learning initiative in the business faculty of an Australian university. The study investigated how existing institutional information systems and operational processes could be used to support direct measures of student learning and the attainment of intended learning goals. The levels at which learning outcomes had been attained were extracted from the University Learning Management System (LMS), based on rubric data for three assessments in two units. Spreadsheets were used to link rubric criteria to the learning goals associated with the assessments as identified in a previous curriculum mapping exercise, and to aggregate the outcomes. Recommendations arising from this preliminary study are made to inform a more comprehensive pilot based on this approach, and to manage the quality of student learning experiences in the context of existing processes and reporting structures.
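As an illustration of the linking-and-aggregation step (hypothetical column names and data; the study itself used spreadsheets over LMS rubric exports and an earlier curriculum map):

```python
import pandas as pd

# Rubric attainment levels exported from the LMS (illustrative shape/values)
rubric = pd.DataFrame({
    "student":   ["s1", "s1", "s2", "s2"],
    "criterion": ["c1", "c2", "c1", "c2"],
    "level":     [3, 2, 4, 1],
})

# Curriculum map linking rubric criteria to learning goals
mapping = pd.DataFrame({
    "criterion":     ["c1", "c2"],
    "learning_goal": ["LG1", "LG2"],
})

# Join criteria to goals, then aggregate attainment per learning goal
attainment = (rubric.merge(mapping, on="criterion")
                    .groupby("learning_goal")["level"]
                    .agg(["mean", "count"]))
print(attainment)
```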
Abstract:
Assessing build-up and wash-off process uncertainty is important for accurate interpretation of model outcomes, facilitating informed decision making for developing effective stormwater pollution mitigation strategies. Uncertainty inherent to pollutant build-up and wash-off processes influences the variations in pollutant loads entrained in stormwater runoff from urban catchments. However, build-up and wash-off predictions from stormwater quality models do not adequately represent such variations due to poor characterisation of the variability of these processes in mathematical models. Incorporating process variability into the mathematical form of current models facilitates accounting for process uncertainty without significantly affecting model prediction performance. Moreover, the investigation of uncertainty propagation from build-up to wash-off confirmed that uncertainty in the build-up process significantly influences wash-off process uncertainty. Specifically, the behaviour of particles <150 µm during build-up primarily influences uncertainty propagation, resulting in appreciable variations in the pollutant load and composition during a wash-off event.
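As a generic sketch of propagating build-up uncertainty into wash-off predictions (using the classical exponential wash-off equation rather than the modified models developed in this work; the distribution and all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Uncertain build-up load B (kg/ha), represented here by a lognormal distribution
B = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)

# Classical exponential wash-off: W = B * (1 - exp(-k * I * t))
k, I, t = 0.18, 10.0, 1.5   # wash-off coefficient (1/mm), rain intensity (mm/h), hours
W = B * (1 - np.exp(-k * I * t))

# Build-up uncertainty propagates directly into the wash-off load distribution
print(f"wash-off load: median {np.median(W):.1f} kg/ha, "
      f"90% interval [{np.quantile(W, 0.05):.1f}, {np.quantile(W, 0.95):.1f}]")
```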