18 results for Human-computer interaction.
in University of Queensland eSpace - Australia
Abstract:
Experiments with simulators allow psychologists to better understand the causes of human errors and build models of cognitive processes to be used in human reliability assessment (HRA). This paper investigates an approach to task failure analysis based on patterns of behaviour, in contrast to more traditional event-based approaches. It considers, as a case study, a formal model of an air traffic control (ATC) system which incorporates controller behaviour. The cognitive model is formalised in the CSP process algebra. Patterns of behaviour are expressed as temporal logic properties. A model-checking technique is then used to verify whether the decomposition of the operator's behaviour into patterns is sound and complete with respect to the cognitive model. The decomposition is shown to be incomplete and a new behavioural pattern is identified, which appears to have been overlooked in the analysis of the data provided by the experiments with the simulator. This illustrates how formal analysis of operator models can yield fresh insights into how failures may arise in interactive systems.
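As an illustration only (this property is not taken from the paper; the pattern names P1..Pn and the step predicate act are hypothetical placeholders), completeness of such a decomposition can be phrased as a temporal logic property requiring every action of the cognitive model to be covered by at least one behavioural pattern, where G is the "always" operator:

    G ( act -> (P1 \/ P2 \/ ... \/ Pn) )

A counterexample trace returned by the model checker then witnesses behaviour that falls outside every known pattern, which is one way an overlooked pattern such as the one reported here could surface.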
Abstract:
This study extends previous media equation research, which showed that flattery from a computer can produce the same general effects as flattery from humans. Specifically, the study explored the potential moderating effect of experience on the impact of flattery from a computer. One hundred and fifty-eight students from the University of Queensland voluntarily participated in the study. Participants interacted with a computer and were exposed to one of three kinds of feedback: praise (sincere praise), flattery (insincere praise), or control (generic feedback). Questionnaire measures assessing participants' affective state, attitudes and opinions were taken. Participants of high experience, but not low experience, displayed a media equation pattern of results, reacting to flattery from a computer in a manner congruent with people's reactions to flattery from other humans. High-experience participants tended to believe that the computer spoke the truth, experienced more positive affect as a result of flattery, and judged the computer's performance more favourably. These findings are interpreted in light of previous research, and the implications for software design in fields such as entertainment and education are considered.
Abstract:
This paper describes methods used to support collaboration and communication between practitioners, designers and engineers when designing ubiquitous computing systems. We tested methods such as “Wizard of Oz” and design games in a real domain, the dental surgery, in an attempt to create a system that is affordable, minimally disruptive of the natural flow of work, and that improves human-computer interaction. In doing so we found that such activities allowed the practitioners to be on a ‘level playing ground’ with designers and engineers. The findings we present suggest that dentists are willing to engage in detailed exploration and constructive critique of technical design possibilities if the design ideas and prototypes are presented in the context of their work practice and are of a resolution and relevance that allow them to jointly explore and question with the design team. This paper is an extension of a short paper submitted to the Participatory Design Conference, 2004.
Abstract:
The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human-computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, and Perform Action. Once a cycle is complete, the operator returns to the Scan process. It is also possible to truncate a cycle and return to Scan after any of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified in consultation with domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model we are calibrating the probability and timing models that comprise each process using experimental data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task. Its behaviour should also interpolate when it is challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects. This model is fitted to the calibration data, and provides an extrapolation to classifications in scenarios outside the calibration data. A simple strategy is used to calibrate the timing component of the model, and the results for reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
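As a rough sketch of the classification calibration step (illustrative only, not code from the paper; the features, threshold, and synthetic labels below are invented), the Classify process can be fitted as a logistic regression over scenario features and then queried for the probability that a new scenario is judged a conflict:

# Sketch: fitting the OCM Classify step as a logistic regression.
# The features (minimum predicted separation, time to closest approach)
# and the synthetic "subject" labels are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical calibration data: one row per trial,
# columns = (min_separation_nm, time_to_closest_approach_s).
X = rng.uniform(low=[0.0, 30.0], high=[10.0, 300.0], size=(200, 2))
# Hypothetical subject classifications: 1 = "conflict", 0 = "no conflict".
y = (X[:, 0] < 5.0).astype(int)

model = LogisticRegression()
model.fit(X, y)

# Probability that a new scenario would be classified as a conflict.
new_scenario = np.array([[4.2, 120.0]])
p_conflict = model.predict_proba(new_scenario)[0, 1]
print(f"P(classified as conflict) = {p_conflict:.2f}")

In the OCM proper, such a classification model would feed the Decide Action process, and the timing component would be calibrated separately against the subjects' reaction times, as described above.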
Abstract:
Even when data repositories exhibit near-perfect data quality, users may formulate queries that do not correspond to the information requested. Users' poor information retrieval performance may arise either from problems understanding the data models that represent real-world systems or from weak query skills. This research focuses on users' understanding of the data structures, i.e., their ability to map the information request onto the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users' performance when undertaking component-, record-, and aggregate-level tasks. For the hypotheses comparing different representations with equivalent semantics, the results indicate that participants using the parsimonious data model performed better on component-level tasks, whereas participants using the ontologically clearer data model performed better on record- and aggregate-level tasks.
Abstract:
The following topics are dealt with: Requirements engineering; components; design; formal specification analysis; education; model checking; human computer interaction; software design and architecture; formal methods and components; software maintenance; software process; formal methods and design; server-based applications; review and testing; measurement; documentation; management and knowledge-based approaches.
GABA(A) receptor beta isoform protein expression in human alcoholic brain: interaction with genotype
Abstract:
This paper describes a series of design games, specifically aimed at exploring shifts in human agency in order to inform the design of context-aware applications. The games focused on understanding information handling issues in dental practice, with participants from a university dental school playing an active role in the activities. Participatory design activities help participants to reveal implicit technical resources that can then be made explicit in technologies, assisting humans in gracefully managing their interactions with and amidst technical systems.
Abstract:
There is still a great deal of opportunity for research on contextual interactive immersion in virtual heritage environments. The general failure of virtual environment technology to create engaging and educational experiences may be attributable not just to deficiencies in technology or in visual fidelity, but also to a lack of contextual and performative interaction, such as that found in games. However, little has been written so far on exactly how game-style interaction can help improve virtual learning environments.