13 results for Process-dissociation Framework
in University of Queensland eSpace - Australia
Abstract:
Following study, participants received 2 tests. The 1st was a recognition test; the 2nd was designed to tap recollection. The objective was to examine performance on Test 1 conditional on Test 2 performance. In Experiment 1, contrary to process dissociation assumptions, exclusion errors better predicted subsequent recollection than did inclusion errors. In Experiments 2 and 3, with alternate questions posed on Test 2, words having high estimates of recollection with one question had high estimates of familiarity with the other question. Results supported the following: (a) the 2-test procedure has considerable potential for elucidating the relationship between recollection and familiarity; (b) there is substantial evidence for dependency between such processes when estimates are obtained using the process dissociation and remember-know procedures; and (c) order of information access appears to depend on the question posed to the memory system.
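For context, the process-dissociation assumptions tested here are the standard independence equations of Jacoby (1991). A minimal sketch of those estimates (the formulas are standard; the specific rates below are hypothetical):

```python
# Standard process-dissociation estimates under the independence
# assumption that the abstract's dependency findings call into question.
#   Inclusion: respond "old" via recollection OR familiarity:
#       p_inclusion = R + F * (1 - R)
#   Exclusion: respond "old" in error via familiarity without recollection:
#       p_exclusion = F * (1 - R)

def process_dissociation(p_inclusion, p_exclusion):
    recollection = p_inclusion - p_exclusion
    familiarity = p_exclusion / (1 - recollection)
    return recollection, familiarity

# Hypothetical rates: 75% inclusion responses, 30% exclusion errors
r, f = process_dissociation(0.75, 0.30)
print(f"R = {r:.2f}, F = {f:.2f}")   # R = 0.45, F = 0.55
```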
Abstract:
Two studies investigated the context deletion effect, the attenuation of priming in implicit memory tests of words when words have been studied in text rather than in isolation. In Experiment 1, stem completion for single words was primed to a greater extent by words studied alone than in sentence contexts, and a higher proportion of completions from studied words was produced under direct instructions (cued recall) than under indirect instructions (produce the first completion that comes to mind). The effect of a sentence context was eliminated when participants were instructed to attend to the target word during the imagery generation task used in the study phase. In Experiment 2, the effect of a sentence context at study was reduced when the target word was presented in distinctive format within the sentence, and the study task (grammatical judgment) was directed at a word other than the target. The results implicate conceptual and perceptual processes that distinguish a word from its context in priming in word stem completion.
Abstract:
Three experiments are reported that examined the process by which trainees learn decision-making skills during a critical incident training program. Formal theories of category learning were used to identify two processes that may be responsible for the acquisition of decision-making skills: rule learning and exemplar learning. Experiments 1 and 2 used the process dissociation procedure (L. L. Jacoby, 1998) to evaluate the contribution of these processes to performance. The results suggest that trainees used a mixture of rule and exemplar learning. Furthermore, these learning processes were influenced by different aspects of training structure and design. The goal of Experiment 3 was to develop training techniques that enable trainees to use a rule adaptively. Trainees were tested on cases that represented exceptions to the rule. Unexpectedly, the results suggest that providing general instruction regarding the kinds of conditions in which a decision rule does not apply caused trainees to fixate on the specific conditions mentioned and impaired their ability to identify other conditions in which the rule might not apply. The theoretical, methodological, and practical implications of the results are discussed.
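As an illustration of the two candidate processes, here is a toy contrast between a one-dimensional decision rule and exemplar similarity in the spirit of the generalized context model; the stimuli, decisions, and sensitivity parameter are hypothetical, not taken from the paper:

```python
import math

# Stored training cases: (feature vector, decision), all hypothetical.
exemplars = [((0.2, 0.3), "wait"), ((0.8, 0.7), "act"), ((0.9, 0.4), "act")]

def rule_classify(stimulus):
    # Rule learning: a single criterion on one feature dimension.
    return "act" if stimulus[0] > 0.5 else "wait"

def exemplar_classify(stimulus, sensitivity=6.0):
    # Exemplar learning: each stored case votes with weight
    # exp(-c * distance); the best-supported decision wins.
    support = {"act": 0.0, "wait": 0.0}
    for features, decision in exemplars:
        support[decision] += math.exp(-sensitivity * math.dist(stimulus, features))
    return max(support, key=support.get)

# A borderline case: the rule says "act", but the probe most resembles
# a stored "wait" case -- the kind of exception Experiment 3 targets.
probe = (0.51, 0.30)
print(rule_classify(probe), exemplar_classify(probe))   # act wait
```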
Abstract:
As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers into further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.
Abstract:
Adsorption of pure nitrogen, argon, acetone, chloroform and an acetone-chloroform mixture on graphitized thermal carbon black is considered at sub-critical conditions by means of molecular layer structure theory (MLST). In the present version of the MLST an adsorbed fluid is treated as a sequence of 2D molecular layers, whose Helmholtz free energies are obtained directly from the analysis of experimental adsorption isotherms of the pure components. The interaction of the nearest layers is accounted for in the framework of a mean field approximation. This approach allows quantitative correlation of experimental nitrogen and argon adsorption isotherms both in the monolayer region and in the range of multi-layer coverage up to 10 molecular layers. In the case of acetone and chloroform the approach also leads to excellent quantitative correlation of adsorption isotherms, while molecular approaches such as non-local density functional theory (NLDFT) fail to describe those isotherms. We extend our new method to calculate the Helmholtz free energy of an adsorbed mixture using a simple mixing rule, which allows us to predict mixture adsorption isotherms from pure-component adsorption isotherms. The approach, which accounts for the difference in composition across molecular layers, is tested against experimental data for acetone-chloroform mixture (a non-ideal mixture) adsorption on graphitized thermal carbon black at 50 degrees C.
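The abstract does not state the paper's mixing rule; for orientation, one common "simple mixing rule" for the Helmholtz free energy of a binary layer is ideal mixing of the pure-component values plus the ideal entropy of mixing:

```latex
% Illustrative ideal mixing rule for layer mole fractions x_1 + x_2 = 1
% (the paper's exact rule is not given in the abstract):
F_{\mathrm{mix}} = x_1 F_1 + x_2 F_2 + RT\left(x_1 \ln x_1 + x_2 \ln x_2\right)
```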
Abstract:
In this paper, a new control design method is proposed for stable processes that can be described by Hammerstein-Wiener models. The internal model control (IMC) framework is extended to accommodate multiple IMC controllers, one for each subsystem. The concept of passive systems is used to construct the IMC controllers, which approximate the inverses of the subsystems to achieve dynamic control performance. The Passivity Theorem is used to ensure closed-loop stability.
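To make the model structure concrete, here is a minimal discrete-time Hammerstein-Wiener simulation: a static input nonlinearity f, a linear block, and a static output nonlinearity g. The chosen f, g and first-order plant are hypothetical illustrations; the paper's multi-controller IMC design itself is not reproduced here.

```python
import math

def f(u):                  # static input nonlinearity (e.g. actuator saturation)
    return math.tanh(u)

def g(v):                  # static output nonlinearity (e.g. sensor curve)
    return v + 0.1 * v**3

def simulate(inputs, a=0.9, b=0.1):
    """Linear block between the nonlinearities: x[k+1] = a*x[k] + b*f(u[k])."""
    x, outputs = 0.0, []
    for u in inputs:
        x = a * x + b * f(u)
        outputs.append(g(x))
    return outputs

# Step response of the composite system; an IMC controller for each
# subsystem would approximately invert g, the linear block, and f in turn.
print(round(simulate([1.0] * 50)[-1], 2))   # ~0.80, i.e. g(tanh(1.0))
```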
Abstract:
The principles of sustainable development (or ecologically sustainable development, ESD, as it is known in Australia) are now accepted as the foundation for natural resource management worldwide, and there are increasing community expectations that they will be implemented explicitly. Previous attempts to assess sustainable development for fisheries have mostly failed because the methods have been too restrictive, often attempting to develop a single set of indicators. In 2000, all the fishery agencies and major stakeholder groups in Australia supported the development of a National ESD Framework. This initiative resulted in a practical system generated through a series of case studies and stakeholder workshops. The Australian National ESD Framework divides ESD into eight major components within the three main categories of ecological well-being, human well-being, and ability to contribute. Four main steps are used to complete an ESD report for a fishery: (1) identify relevant issues, (2) prioritise these using risk assessment, (3) complete appropriately detailed reports on each issue, and (4) compile the material into a report. The tools to assist this process are now available and have been used to generate reports for many Australian fisheries.
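A minimal sketch of those four reporting steps, with hypothetical issues and a simple likelihood-times-consequence score standing in for the framework's risk assessment:

```python
# Step 1: identify relevant issues (names and scores are hypothetical).
issues = [
    {"name": "target stock status",  "likelihood": 4, "consequence": 5},
    {"name": "bycatch of seabirds",  "likelihood": 2, "consequence": 4},
    {"name": "employment in region", "likelihood": 3, "consequence": 2},
]

# Step 2: prioritise using risk = likelihood x consequence.
for issue in issues:
    issue["risk"] = issue["likelihood"] * issue["consequence"]
issues.sort(key=lambda i: i["risk"], reverse=True)

# Steps 3-4: report on each issue in priority order, then compile.
print("\n".join(f"risk {i['risk']:>2}: {i['name']}" for i in issues))
```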
Abstract:
Workflow systems have traditionally focused on so-called production processes, which are characterized by predefinition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we provide a discussion of both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts.
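As a toy illustration of static verification, the sketch below checks a small constraint set for direct conflicts and lets duplicate constraints collapse; the "requires"/"excludes" constraint forms and activity names are hypothetical, not the paper's notation:

```python
from itertools import combinations

# Hypothetical constraints over activities; the set collapses the
# duplicate "requires" entry (trivial redundancy removal).
constraints = {
    ("requires", "review", "submit"),
    ("excludes", "submit", "review"),   # conflicts with the line above
    ("requires", "review", "submit"),   # duplicate: absorbed by the set
}

def conflicts(cs):
    # "A requires B" clashes with "A excludes B"; exclusion is mutual,
    # so the activity pair is compared unordered.
    return [(c1, c2) for c1, c2 in combinations(cs, 2)
            if {c1[0], c2[0]} == {"requires", "excludes"}
            and {c1[1], c1[2]} == {c2[1], c2[2]}]

print(conflicts(constraints))   # flags the requires/excludes clash
```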
Abstract:
A methodological framework for conducting a systematic, mostly qualitative, meta-synthesis of community-based rehabilitation (CBR) project evaluation reports is described. Developed in the course of an international pilot study, the framework proposes a systematic review process in phases which are strongly collaborative, methodologically rigorous and detailed. Through this suggested process, valuable descriptive data about CBR practice, strategies and outcomes may be synthesized. It is anticipated that future application of this methodology will contribute to an improved evidence base for CBR, which will facilitate the development of more appropriate policy and practice guidelines for disability service delivery in developing countries. The methodology will also have potential applications in areas beyond CBR which are similarly 'evidence poor' (lacking empirical research) but 'data rich' (with plentiful descriptive and evaluative reports).
Abstract:
We investigate the quantum many-body dynamics of dissociation of a Bose-Einstein condensate of molecular dimers into pairs of constituent bosonic atoms and analyze the resulting atom-atom correlations. The quantum fields of both the molecules and atoms are simulated from first principles in three dimensions using the positive-P representation method. This allows us to provide an exact treatment of the molecular field depletion and s-wave scattering interactions between the particles, as well as to extend the analysis to nonuniform systems. In the simplest uniform case, we find that the major source of atom-atom decorrelation is atom-atom recombination which produces molecules outside the initially occupied condensate mode. The unwanted molecules are formed from dissociated atom pairs with nonopposite momenta. The net effect of this process-which becomes increasingly significant for dissociation durations corresponding to more than about 40% conversion-is to reduce the atom-atom correlations. In addition, for nonuniform systems we find that mode mixing due to inhomogeneity can result in further degradation of the correlation signal. We characterize the correlation strength via the degree of squeezing of particle number-difference fluctuations in a certain momentum-space volume and show that the correlation strength can be increased if the signals are binned into larger counting volumes.
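For reference, the correlation strength described here is conventionally quantified by the normalized variance of the particle number difference between two counting volumes; the normalisation below is the standard definition (the abstract itself does not spell it out):

```latex
V_{-} = \frac{\langle [\Delta(\hat{N}_1 - \hat{N}_2)]^2 \rangle}
             {\langle \hat{N}_1 \rangle + \langle \hat{N}_2 \rangle},
\qquad V_{-} < 1 \;\text{indicates squeezing below the shot-noise level.}
```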
Abstract:
Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and/or its limited ability to model business processes that cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility may be achieved. In this paper we present a framework that allows the workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow (structural, temporal, etc.), our focus in this paper is on a constraint which allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
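A minimal sketch of what validating one such cardinality constraint could look like: an instance must include between lo and hi activities drawn from a designated pool (the activity names and bounds are hypothetical illustrations):

```python
def satisfies_cardinality(instance, pool, lo, hi):
    """True if the instance selects between lo and hi activities from pool."""
    return lo <= len(set(instance) & pool) <= hi

pool = {"x_ray", "blood_test", "mri", "ultrasound"}
instance = ["admit", "x_ray", "blood_test", "discharge"]

# Constraint: choose 1 to 2 diagnostic activities from the pool.
print(satisfies_cardinality(instance, pool, 1, 2))   # True
```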
Abstract:
Achieving consistency between a specification and its implementation is an important part of software development. In this paper, we present a method for generating passive test oracles that act as self-checking implementations. The implementation is verified using an animation tool to check that the behavior of the implementation matches the behavior of the specification. We discuss how to integrate this method into a framework developed for systematically animating specifications, which means a tester can significantly reduce testing time and effort by reusing work products from the animation. One such work product is a testgraph: a directed graph that partially models the states and transitions of the specification. Testgraphs are used to generate sequences for animation, and during testing, to execute these same sequences on the implementation.
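To illustrate, a testgraph can be as simple as an adjacency list whose edges are specification operations; enumerating walks yields the sequences replayed during both animation and implementation testing. The states and operations below are hypothetical:

```python
# A toy testgraph for a stack-like specification.
testgraph = {
    "empty":    [("push", "nonempty")],
    "nonempty": [("push", "nonempty"), ("pop", "empty")],
}

def sequences(state, length):
    """Enumerate all operation sequences of the given length from state."""
    if length == 0:
        return [[]]
    return [[op] + tail
            for op, nxt in testgraph[state]
            for tail in sequences(nxt, length - 1)]

for seq in sequences("empty", 3):
    print(seq)   # each sequence drives the spec (oracle) and the implementation
```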