358 results for QUANTUM PROCESSES
Abstract:
An efficient method for the analysis of hydroquinone at trace levels in water samples has been developed in the form of a fluorescent probe based on graphene quantum dots (GQDs). The analytical signal, fluorescence quenching, was generated by benzoquinone intermediates formed during the catalytic oxidation of hydroquinone by horseradish peroxidase (HRP). In this mechanism, the benzoquinone acts as an electron acceptor that perturbs the surface state of the GQDs via an electron transfer effect. The water-soluble GQDs were prepared directly by the pyrolysis of citric acid, and with this hybrid enzyme system the detection limit for hydroquinone was as low as 8.4 × 10⁻⁸ M. Furthermore, the analysis was almost unaffected by other phenolic and quinone compounds, such as phenol, resorcinol and other quinones; the developed GQD method therefore produced satisfactory results for the analysis of hydroquinone in several different lake water samples.
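A hedged sketch of how such quenching data are typically turned into a concentration estimate, assuming a conventional Stern-Volmer-type calibration (the specific calibration used in the paper is not stated in the abstract):

```latex
% Assumed Stern-Volmer-type calibration; [Q] is the quencher concentration,
% here the benzoquinone generated in proportion to the hydroquinone present.
\[
  \frac{F_0}{F} = 1 + K_{\mathrm{SV}}\,[Q],
\]
% F_0 and F are the GQD fluorescence intensities without and with quencher,
% and K_SV is the Stern-Volmer constant obtained from the calibration curve.
```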
Abstract:
Crashes at any particular transport network location consist of a chain of events arising from a multitude of potential causes and/or contributing factors, whose nature is likely to reflect geometric characteristics of the road, spatial effects of the surrounding environment, and human behavioural factors. It is postulated that these potential contributing factors do not arise from the same underlying risk process, and thus should be explicitly modelled and understood. The state of the practice in road safety network management applies a safety performance function (SPF) that represents a single risk process to explain crash variability across network sites. This study aims to elucidate the importance of differentiating among the various underlying risk processes contributing to the observed crash count at any particular network location. To demonstrate the principle of this theoretical and corresponding methodological approach, the study explores engineering factors (e.g. segment length, speed limit) and unobserved spatial factors (e.g. climatic factors, presence of schools) as two explicit sources of crash contributing factors. A Bayesian Latent Class (BLC) analysis is used to explore these two sources and to incorporate prior information about their contribution to crash occurrence. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared with a traditional Negative Binomial (NB) model. A comparison of goodness-of-fit measures indicates that the dual-risk-process model outperforms the single-risk-process NB model, indicating the need for further research to capture all three crash generation processes in SPFs.
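To make the contrast concrete, here is a minimal sketch of the two model forms being compared, with notation assumed purely for illustration (the paper's exact covariates and parameterisation may differ):

```latex
% Single-risk-process negative binomial SPF for site i (notation assumed):
\[
  y_i \sim \mathrm{NB}(\mu_i, \phi), \qquad
  \ln \mu_i = \beta_0 + \beta_1 \ln(\mathrm{length}_i) + \beta_2\,\mathrm{speed}_i + \cdots
\]
% Latent-class extension with two risk processes (e.g. engineering vs. spatial),
% mixing two class-specific SPFs with weight \pi:
\[
  p(y_i) = \pi\, \mathrm{NB}\!\left(y_i \mid \mu_i^{(1)}, \phi_1\right)
         + (1 - \pi)\, \mathrm{NB}\!\left(y_i \mid \mu_i^{(2)}, \phi_2\right).
\]
```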
Abstract:
This chapter interrogates what recognition of prior learning (RPL) can and does mean in the higher education sector—a sector in the grip of the widening participation agenda and an open access age. The chapter discusses how open learning is making inroads into recognition processes and examines two studies in open learning recognition. A case study relating to e-portfolio-style RPL for entry into a Graduate Certificate in Policy and Governance at a metropolitan university in Queensland is described. In the first instance, candidates who do not possess a relevant Bachelor degree need to demonstrate skills in governmental policy work in order to be eligible to gain entry to a Graduate Certificate (at Australian Qualifications Framework Level 8) (Australian Qualifications Framework Council, 2013, p. 53). The chapter acknowledges the benefits and limitations of recognition in open learning and those of more traditional RPL, anticipating future developments in both (or their convergence).
Abstract:
The power to influence others in ever-expanding social networks in the new knowledge economy is tied to capabilities with digital media production. This chapter draws on research in elementary classrooms to examine the repertoires of cross-disciplinary knowledge that literacy learners need to produce innovative digital media via the “social web”. It focuses on the knowledge processes that occurred when elementary students engaged in multimodal text production with new digital media. It draws on Kalantzis and Cope’s (2008) heuristic for theorizing “Knowledge Processes” in the Learning by Design approach to pedagogy. Learners demonstrate eight “Knowledge Processes” across different subject domains, skill areas, and sensibilities. Drawing data from media-based lessons across several classrooms and schools, this chapter examines what kinds of knowledge students utilize when they produce digital, multimodal texts in the classroom. The Learning by Design framework is used as an analytic tool to theorize how students learn when they engage in a specific domain of learning – digital media production.
Abstract:
Analyzing and redesigning business processes is a complex task which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions affects not only the outcome of the collaboration but also the collaboration itself. We created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. We are currently conducting a study that aims to assess the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants to become more active in collaboration.
Abstract:
Analyzing and redesigning business processes is a complex task which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders, together with modeling experts, create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the outcome of the collaboration as well as the perception of the collaboration itself. To overcome this problem we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment, thus improving modeling performance and collaboration.
Abstract:
Two-dimensional topological insulators are critically important for realizing novel topological applications. A quantum spin Hall (QSH) state has been achieved experimentally, albeit at a low critical temperature because of the narrow band gap of the bulk material. Using density functional theory (DFT) with a state-of-the-art hybrid functional, we demonstrated that hydrogenated GaBi bilayers (HGaBi) form a stable topological insulator with a large nontrivial band gap of 0.320 eV, making them candidates for achieving QSH states at room temperature. The nontrivial topological property of the HGaBi lattice is also confirmed by the appearance of gapless edge states in the nanoribbon structure. Our results provide a versatile platform for hosting nontrivial topological states usable for important nanoelectronic device applications.
Abstract:
This is the fourth TAProViz workshop, run at the 13th International Conference on Business Process Management (BPM). The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated by at least three program committee members, in a double-blind manner, on the basis of significance, originality, technical quality and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.
Abstract:
Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
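A minimal Python sketch of the general idea, assuming a toy one-parameter model, a noisy unbiased log-likelihood estimator, and scikit-learn's Gaussian process regression; all names and settings here are illustrative and not the authors' GP-GIMH implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def noisy_loglik(theta, n_particles=50):
    """Unbiased but noisy log-likelihood estimate (toy stand-in for the
    particle/importance-sampling estimate used in a latent variable model)."""
    true_loglik = -0.5 * (theta - 1.0) ** 2          # hypothetical exact value
    return true_loglik + rng.normal(0, 1.0 / np.sqrt(n_particles))

# --- pilot run: collect (theta, estimated log-likelihood) pairs ---
pilot_thetas = rng.uniform(-2, 4, size=200)
pilot_logliks = np.array([noisy_loglik(t) for t in pilot_thetas])

# --- fit a GP surrogate to the noisy log-likelihood estimates ---
gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=1.0),
    alpha=0.05,            # absorbs the noise in the pilot estimates
    normalize_y=True,
)
gp.fit(pilot_thetas.reshape(-1, 1), pilot_logliks)

def log_prior(theta):
    return -0.5 * theta ** 2 / 10.0                  # vague Gaussian prior

# --- Metropolis-Hastings using the GP predictive mean as the log-likelihood ---
theta, samples = 0.0, []
log_post = gp.predict(np.array([[theta]]))[0] + log_prior(theta)
for _ in range(5000):
    prop = theta + rng.normal(0, 0.5)
    log_post_prop = gp.predict(np.array([[prop]]))[0] + log_prior(prop)
    if np.log(rng.uniform()) < log_post_prop - log_post:
        theta, log_post = prop, log_post_prop
    samples.append(theta)

print("posterior mean (GP-accelerated MH):", np.mean(samples[1000:]))
```

The expensive likelihood estimator is only called during the short pilot phase; every iteration of the main chain queries the cheap GP surrogate instead.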
Abstract:
Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks", meaning activities that classify a case as "accepted" or "rejected", such that if the case is accepted it proceeds forward, while if it is rejected, it is cancelled and all work performed in the case is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose to order knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach where knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
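A minimal sketch of runtime knockout-check ordering, assuming each check has a known mean effort and a predictive model supplying a per-case rejection probability; the check names and probability models below are purely hypothetical:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class KnockoutCheck:
    name: str
    effort: float                                # mean processing effort of the check
    predict_reject: Callable[[Dict], float]      # per-case rejection probability model

def order_checks(checks: List[KnockoutCheck], case: Dict) -> List[KnockoutCheck]:
    """Order knockout checks for one case: highest predicted
    rejection-probability-to-effort ratio first, so a case that will be
    rejected is knocked out with as little wasted effort as possible."""
    return sorted(checks,
                  key=lambda c: c.predict_reject(case) / c.effort,
                  reverse=True)

# Hypothetical checks for a loan-application style process.
checks = [
    KnockoutCheck("credit_history", effort=5.0,
                  predict_reject=lambda case: 0.6 if case["score"] < 500 else 0.05),
    KnockoutCheck("income_verification", effort=2.0,
                  predict_reject=lambda case: 0.3 if case["income"] < 30000 else 0.02),
    KnockoutCheck("fraud_screening", effort=8.0,
                  predict_reject=lambda case: 0.1),
]

case = {"score": 430, "income": 45000}
for check in order_checks(checks, case):
    print(check.name, round(check.predict_reject(case) / check.effort, 3))
```

The traditional redesign heuristic applies the same ratio with fixed, process-wide rejection rates; the runtime variant re-orders the checks for every case using the model's predictions.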
Abstract:
Analytical solutions of partial differential equation (PDE) models describing reactive transport phenomena in saturated porous media are often used as screening tools to provide insight into contaminant fate and transport processes. While many practical modelling scenarios involve spatially variable coefficients, such as spatially variable flow velocity, v(x), or spatially variable decay rate, k(x), most analytical models deal with constant coefficients. Here we present a framework for constructing exact solutions of PDE models of reactive transport. Our approach is relevant for advection-dominant problems, and is based on a regular perturbation technique. We present a description of the solution technique for a range of one-dimensional scenarios involving constant and variable coefficients, and we show that the solutions compare well with numerical approximations. Our general approach applies to a range of initial conditions and various forms of v(x) and k(x). Instead of simply documenting specific solutions for particular cases, we present a symbolic worksheet, as supplementary material, which enables the solution to be evaluated for different choices of the initial condition, v(x) and k(x). We also discuss how the technique generalizes to apply to models of coupled multispecies reactive transport as well as higher dimensional problems.
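As a worked illustration of the class of model and expansion described, assuming a standard one-dimensional advection-dispersion-reaction equation and an inverse Peclet number as the small parameter (the paper's exact formulation may differ):

```latex
% Assumed one-dimensional reactive transport model with variable coefficients:
\[
  \frac{\partial c}{\partial t}
  + v(x)\,\frac{\partial c}{\partial x}
  = D\,\frac{\partial^2 c}{\partial x^2} - k(x)\,c .
\]
% For advection-dominant problems the inverse Peclet number, \epsilon = D/(vL),
% is small, suggesting the regular perturbation expansion
\[
  c(x,t) = c_0(x,t) + \epsilon\, c_1(x,t) + \epsilon^2 c_2(x,t) + \cdots ,
\]
% where c_0 solves the purely advective-reactive problem and the higher-order
% terms provide successive corrections for dispersion.
```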
Abstract:
Over the last two decades, there has been an increasing awareness of, and interest in, the use of spatial moment techniques to provide insight into a range of biological and ecological processes. Models that incorporate spatial moments can be viewed as extensions of mean-field models. These mean-field models often consist of systems of classical ordinary differential equations and partial differential equations, whose derivation, at some point, hinges on the simplifying assumption that individuals in the underlying stochastic process encounter each other at a rate that is proportional to the average abundance of individuals. This assumption has several implications, the most striking of which is that mean-field models essentially neglect any impact of the spatial structure of individuals in the system. Moment dynamics models extend traditional mean-field descriptions by accounting for the dynamics of pairs, triples and higher n-tuples of individuals. This means that moment dynamics models can, to some extent, account for how the spatial structure affects the dynamics of the system in question.
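A schematic of the structure being described, with notation assumed purely for illustration:

```latex
% Schematic structure of a spatial moment hierarchy (notation assumed):
\[
  \frac{\mathrm{d} Z_1}{\mathrm{d} t} = f(Z_1, Z_2), \qquad
  \frac{\partial Z_2(\xi, t)}{\partial t} = g(Z_1, Z_2, Z_3),
\]
% Z_1: average density; Z_2(\xi): density of pairs at separation \xi;
% Z_3: triple density. A mean-field model closes the hierarchy at first
% order by assuming Z_2(\xi) \approx Z_1^2 (no spatial structure), whereas
% moment dynamics models retain Z_2 and close at second or higher order.
```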
Abstract:
This article presents a method for checking the conformance between an event log capturing the actual execution of a business process, and a model capturing its expected or normative execution. Given a business process model and an event log, the method returns a set of statements in natural language describing the behavior allowed by the process model but not observed in the log and vice versa. The method relies on a unified representation of process models and event logs based on a well-known model of concurrency, namely event structures. Specifically, the problem of conformance checking is approached by folding the input event log into an event structure, unfolding the process model into another event structure, and comparing the two event structures via an error-correcting synchronized product. Each behavioral difference detected in the synchronized product is then verbalized as a natural language statement. An empirical evaluation shows that the proposed method scales up to real-life datasets while producing more concise and higher-level difference descriptions than state-of-the-art conformance checking methods.
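A deliberately simplified illustration of verbalizing behavioral differences, using a plain directly-follows comparison rather than the event structures and error-correcting synchronized product of the paper, with hypothetical activity names:

```python
from itertools import pairwise   # Python 3.10+

def directly_follows(traces):
    """Set of (a, b) pairs where activity b directly follows a in some trace."""
    return {(a, b) for trace in traces for a, b in pairwise(trace)}

# Hypothetical behaviour: traces allowed by the model vs. traces seen in the log.
model_traces = [["Receive", "Check", "Approve", "Notify"],
                ["Receive", "Check", "Reject", "Notify"]]
log_traces   = [["Receive", "Check", "Approve", "Notify"],
                ["Receive", "Approve", "Notify"]]          # skips "Check"

model_df, log_df = directly_follows(model_traces), directly_follows(log_traces)

statements = (
    [f'In the model, "{b}" can directly follow "{a}", but this was never observed in the log.'
     for a, b in sorted(model_df - log_df)]
    + [f'In the log, "{b}" directly follows "{a}", but the model does not allow this.'
       for a, b in sorted(log_df - model_df)]
)
print("\n".join(statements))
```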
Abstract:
Plasmonics is a recently emerged technology that enables the compression of electromagnetic waves into minuscule metallic structures, thus enabling the focusing and routing of light on the nanoscale. Plasmonic waveguides can be used to miniaturise integrated chip circuits while increasing the data transmission speed. Plasmonic waveguides are used to route plasmons around a circuit and are a major focus of this thesis. Plasmons are also highly sensitive to the surrounding dielectric environment; using this property we have experimentally realised a refractive index sensor that detects refractive index changes in solutions.
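The sensitivity to the surrounding dielectric can be seen from the standard surface plasmon polariton dispersion relation for a metal-dielectric interface, given here as general background rather than as the specific waveguide or sensor geometry of the thesis:

```latex
% Surface plasmon polariton wavevector at a metal-dielectric interface:
\[
  k_{\mathrm{spp}} = \frac{\omega}{c}
  \sqrt{\frac{\varepsilon_m(\omega)\,\varepsilon_d}{\varepsilon_m(\omega) + \varepsilon_d}},
  \qquad \varepsilon_d = n^2 ,
\]
% so a change in the refractive index n of the surrounding solution shifts the
% plasmon resonance, which is what a plasmonic refractive index sensor detects.
```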