209 results for Human Machine Interface (HMI)
at University of Queensland eSpace - Australia
Abstract:
Users of safety-critical systems are expected to effectively control or monitor complex systems, with errors potentially leading to catastrophe. For such systems, safety is of paramount importance and must be designed into the human-machine interface. While many case studies show how inadequate design practice led to poor safety and usability, concrete guidance on good design practices is scarce. The paper argues that the pattern language paradigm, widely used in the software design community, is a suitable means of documenting appropriate design strategies. We discuss how typical usability-related properties (e.g., flexibility) need some adjustment before they can be used to assess safety-critical systems, and we document a pattern language based on the corresponding "safety-usability" principles.
Abstract:
Ecological interface design (EID) is proving to be a promising approach to the design of interfaces for complex dynamic systems. Although the principles of EID and examples of its effective use are widely available, few readily available examples exist of how the individual displays that constitute an ecological interface are developed. This paper presents the semantic mapping process within EID in the context of prior theoretical work in this area. The semantic mapping process that was used in developing an ecological interface for the Pasteurizer II microworld is outlined, and the results of an evaluation of the ecological interface against a more conventional interface are briefly presented. Subjective reports indicate features of the ecological interface that made it particularly valuable for participants. Finally, we outline the steps of an analytic process for using EID. The findings presented here can be applied in the design of ecological interfaces or of configural displays for dynamic processes.
Abstract:
In this paper we use sensor-annotated abstraction hierarchies (AHs; Reising & Sanderson, 1996, 2002a, b) to show that, unless a system is appropriately instrumented, configural displays designed according to the principles of ecological interface design (EID) might be vulnerable to misinterpretation when sensors become unreliable or are unavailable. Building on foundations established in Reising and Sanderson (2002a), we use a pasteurization process control example to show how sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on a configural display that is part of an ecological interface. Our analyses suggest that configural displays showing higher-order properties of a system are especially vulnerable under some conservative instrumentation configurations. However, sensor-annotated AHs can be used to indicate where corrective instrumentation might be placed. We argue that if EID is to be effectively employed in the design of displays for complex systems, then the information needs of the human operator need to be considered while instrumentation requirements are being formulated. Rasmussen's abstraction hierarchy, and particularly its extension to the analysis of information captured by or derived from sensors, may therefore be a useful adjunct to upstream instrumentation design.
Abstract:
In this paper we establish a foundation for understanding the instrumentation needs of complex dynamic systems if interfaces based on ecological interface design (EID) are to be robust in the face of instrumentation failures. EID-based interfaces often include configural displays that reveal the higher-order properties of complex systems. However, concerns have been expressed that such displays might be misleading when instrumentation is unreliable or unavailable. Rasmussen's abstraction hierarchy (AH) formalism can be extended to include representations of sensors near the functions or properties about which they provide information, resulting in what we call a sensor-annotated abstraction hierarchy. Sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on higher-order system information by showing how the data provided by individual sensors propagate within and across levels of abstraction in the AH. The use of sensor-annotated AHs with a configural display is illustrated with a simple water reservoir example. We argue that if EID is to be effectively employed in the design of interfaces for complex systems, then the information needs of the human operator need to be considered at the earliest stages of system development, while instrumentation requirements are being formulated. In this way, Rasmussen's AH promotes a formative approach to instrumentation engineering.
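The propagation idea described in the two abstracts above can be illustrated with a small sketch. This is a minimal illustration rather than the authors' tooling: the reservoir, its sensors, and the propagation rule are assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation): a toy sensor-annotated
# abstraction hierarchy for a single water reservoir. Node names, sensors,
# and the propagation rule are illustrative assumptions.

class Node:
    def __init__(self, name, sensor=None, derive=None, inputs=()):
        self.name = name          # function or property in the AH
        self.sensor = sensor      # direct sensor reading, if instrumented
        self.derive = derive      # function computing the value from inputs
        self.inputs = inputs      # lower-level nodes this node depends on

    def value(self):
        """Return (value, reliable): derived values are only as trustworthy
        as the sensors they are computed from."""
        if self.sensor is not None:
            return self.sensor, True
        if self.derive is None:
            return None, False                 # no sensor and no way to derive
        results = [n.value() for n in self.inputs]
        vals = [v for v, _ in results]
        if any(v is None for v in vals):
            return None, False                 # missing data propagates upward
        return self.derive(*vals), all(ok for _, ok in results)

# Physical-function level: flow sensors on inlet and outlet (units assumed litres/s).
inflow = Node("inflow rate", sensor=3.0)
outflow = Node("outflow rate", sensor=None)    # sensor unavailable or failed

# Abstract-function level: mass balance derived from the two flows.
mass_balance = Node("net mass balance",
                    derive=lambda qin, qout: qin - qout,
                    inputs=(inflow, outflow))

print(mass_balance.value())   # (None, False): the higher-order property cannot be trusted

outflow.sensor = 1.8          # restore the missing instrumentation
print(mass_balance.value())   # (1.2, True): the configural display is reliable again
```

Annotating each higher-order node with the sensors it depends on makes it explicit, at design time, which instrumentation policies leave the configural display exposed.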
Abstract:
As the use of technological devices in everyday environments becomes more prevalent, it is clear that access to these devices has become an important aspect of occupational performance. Children are increasingly required to competently manipulate technology such as the computer to fulfil the occupational roles of student and player. Occupational therapists are in a position to facilitate a successful interface between children and standard computer technologies. The literature has supported the use of direct manipulation interfaces in computing, which require mastery of devices such as the mouse. Identifying children likely to experience difficulties with mouse use will inform the development of appropriate methods of intervention promoting mouse skill and further enhance participation in occupational tasks. The aim of this paper is to discuss the development of an assessment of mouse proficiency for children. It describes the construction of the assessment, the content of the test, and its content validity.
Abstract:
The Test of Mouse Proficiency (TOMP) was developed to assist occupational therapists and education professionals assess computer mouse competency skills in children from preschool to upper primary (elementary) school age. The preliminary reliability and validity of TOMP are reported in this paper. Methods used to examine the internal consistency, test-retest reliability, and criterion- and construct-related validity of the test are elaborated. In the continuing process of test refinement, these preliminary studies support to varying degrees the reliability and validity of TOMP. Recommendations for further validation of the assessment are discussed along with indications for potential clinical application.
Abstract:
'Free will' and its corollary, the concept of individual responsibility, are keystones of the justice system. This paper shows that, if we accept a physics that disallows time reversal, the concept of 'free will' is undermined by an integrated understanding of the influence of genetics and environment on human behavioural responses. The analysis is undertaken by modelling life as a novel statistico-deterministic version of a Turing machine, i.e. as a series of transitions between states at successive instants of time. Using this model it is proven by induction that the entire course of life is independent of the action of free will. Although determined by the prior state, the probability of transitions between states in response to a standard environmental stimulus is not equal to 1, and the transitions may differ quantitatively at the molecular level and qualitatively at the level of the whole organism. Transitions between states correspond to behaviours. It is shown that the behaviour of identical twins (or clones), although determined, would be incompletely predictable and non-identical, creating an illusion of the operation of 'free will'. 'Free will' is a convenient construct for current judicial systems and social control because it allows rationalization of punishment for those whose behaviour falls outside socially defined norms. Indeed, it is conceivable that the maintenance of ideas of free will has co-evolved with community morality to reinforce its operation. If the concept of free will is to be maintained, it would require revision of our current physical theories.
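The state-machine model described above can be made concrete with a short sketch: two "twins" governed by identical transition rules and an identical sequence of stimuli nonetheless diverge, because each transition is probabilistic rather than certain. The states, stimuli, and probabilities below are invented purely for illustration.

```python
# Illustrative sketch only: life as a statistico-deterministic state machine.
# Identical rules and environment, yet incompletely predictable trajectories.
import random

TRANSITIONS = {
    # (current_state, stimulus) -> list of (next_state, probability)
    ("calm", "provocation"): [("calm", 0.7), ("aggressive", 0.3)],
    ("aggressive", "provocation"): [("aggressive", 0.9), ("calm", 0.1)],
}

def step(state, stimulus, rng):
    """Next state is determined in distribution by the prior state,
    but no single outcome has probability 1."""
    cumulative = 0.0
    r = rng.random()
    for next_state, p in TRANSITIONS[(state, stimulus)]:
        cumulative += p
        if r < cumulative:
            return next_state
    return TRANSITIONS[(state, stimulus)][-1][0]

def life_course(seed, stimuli, start="calm"):
    rng = random.Random(seed)
    states = [start]
    for s in stimuli:
        states.append(step(states[-1], s, rng))
    return states

stimuli = ["provocation"] * 5
twin_a = life_course(seed=1, stimuli=stimuli)   # same rules, same environment,
twin_b = life_course(seed=2, stimuli=stimuli)   # yet the trajectories can diverge
print(twin_a)
print(twin_b)
```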
Abstract:
Recent structural studies of proteins mediating membrane fusion reveal intriguing similarities between diverse viral and mammalian systems. Particularly striking is the close similarity between the transmembrane envelope glycoproteins from the retrovirus HTLV-1 and the filovirus Ebola. These similarities suggest similar mechanisms of membrane fusion. The model that fits most currently available data suggests fusion activation in viral systems is driven by a symmetrical conformational change triggered by an activation event such as receptor binding or a pH change. The mammalian vesicle fusion mediated by the SNARE protein complex most likely occurs by a similar mechanism but without symmetry constraints.
Abstract:
The ligand-binding region of the low-density lipoprotein (LDL) receptor is formed by seven N-terminal, imperfect, cysteine-rich (LB) modules. This segment is followed by an epidermal growth factor precursor homology domain with two N-terminal, tandem, EGF-like modules that are thought to participate in LDL binding and recycling of the endocytosed receptor to the cell surface. EGF-A and the concatemer, EGF-AB, of these modules were expressed in Escherichia coli. Correct protein folding of EGF-A and the concatemer EGF-AB was achieved in the presence or absence of calcium ions, in contrast to the LB modules, which require them for correct folding. Homonuclear and heteronuclear 1H-15N NMR spectroscopy at 17.6 T was used to determine the three-dimensional structure of the concatemer. Both modules are formed by two pairs of short, anti-parallel beta-strands. In the concatemer, these modules have a fixed relative orientation, stabilized by calcium ion binding and hydrophobic interactions at the interface. 15N longitudinal and transverse relaxation rates, and {1H}-15N heteronuclear NOEs, were used to derive a model-free description of the backbone dynamics of the molecule. The concatemer appears relatively rigid, particularly near the calcium ion-binding site at the module interface, with an average generalized order parameter of 0.85 +/- 0.11. Some mutations causing familial hypercholesterolemia may now be rationalized. Mutations of D41, D43 and E44 in the EGF-B calcium ion-binding region may affect the stability of the linker and thus the orientation of the tandem modules. The diminutive core also provides little structural stabilization, necessitating the presence of disulfide bonds. The structure and dynamics of EGF-AB contrast with those of the N-terminal LB modules, which require calcium ions both for folding to form the correct disulfide connectivities and for maintenance of the folded structure, and which are connected by highly mobile linking peptides.
Abstract:
We report here a validated method for the quantification of a new immunosuppressant drug, everolimus (SDZ RAD), using HPLC-tandem mass spectrometry. Whole blood samples (500 µl) were prepared by protein precipitation, followed by C-18 solid-phase extraction. Mass spectrometric detection was by selected reaction monitoring with an electrospray interface operating in positive ionization mode. The assay was linear from 0.5 to 100 µg/l (r^2 > 0.996, n = 9). The analytical recovery and inter-day imprecision, determined using whole blood quality control samples (n = 5) at 0.5, 1.2, 20.0, and 75.0 µg/l, were 100.3-105.4% and <= 7.6%, respectively. The assay had a mean relative recovery of 94.8 +/- 3.8%. Extracted samples were stable for up to 24 h. Fortified everolimus blood samples were stable at -80 °C for at least 8 months, and everolimus was found to be stable in blood when taken through at least three freeze-thaw cycles. The reported method provides accurate, precise and specific measurement of everolimus in blood over a wide analytical range and is currently supporting phase II and III clinical trials.
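For readers unfamiliar with the validation figures quoted above, the sketch below shows how analytical recovery and inter-day imprecision (as a coefficient of variation) of this kind are commonly computed from replicate quality-control measurements. The replicate values are invented for illustration and are not the paper's data.

```python
# Hypothetical numbers only: computing analytical recovery and inter-day
# imprecision (CV%) from replicate quality-control (QC) measurements.
from statistics import mean, stdev

nominal = 20.0                                   # QC nominal concentration, µg/l
measured = [20.4, 21.1, 19.8, 20.9, 21.3]        # five inter-day replicates (invented)

recovery_pct = mean(measured) / nominal * 100    # analytical recovery
cv_pct = stdev(measured) / mean(measured) * 100  # inter-day imprecision as CV%

print(f"recovery = {recovery_pct:.1f}%, imprecision (CV) = {cv_pct:.1f}%")
```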
Abstract:
Interfaces designed according to ecological interface design (EID) display higher-order relations and properties of a work domain so that adaptive operator problem solving can be better supported under unanticipated system conditions. Previous empirical studies of EID have assumed that the raw data required to derive and communicate higher-order information would be available and reliable. The present research examines the relative advantages of an EID interface over a conventional piping-and-instrumentation diagram (PID) when instrumentation is maximally or only minimally adequate. Results show an interaction between interface and the adequacy of the instrumentation. Failure diagnosis performance with the EID interface with maximally adequate instrumentation is best overall. Performance with the EID interface drops more drastically from maximally to minimally adequate instrumentation than does performance with the PID interface, to the point where the EID interface with minimally adequate instrumentation supports nonsignificantly worse performance than does the equivalent PID interface. Actual or potential applications of this research include design of instrumentation and displays for complex industrial processes.
Abstract:
Promiscuous human leukocyte antigen (HLA) binding peptides are ideal targets for vaccine development. Existing computational models for prediction of promiscuous peptides used hidden Markov models and artificial neural networks as prediction algorithms. We report a system based on support vector machines that outperforms previously published methods. Preliminary testing showed that it can predict peptides binding to HLA-A2 and -A3 super-type molecules with excellent accuracy, even for molecules where no binding data are currently available.
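A minimal sketch of the kind of classifier this abstract describes is given below. It is not the published predictor: the peptides, labels, and the simple one-hot encoding are illustrative assumptions, and a real system would be trained on curated HLA binding data.

```python
# Minimal sketch, not the published predictor: a support vector machine
# classifying 9-mer peptides as HLA binders versus non-binders.
import numpy as np
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide):
    """Encode a peptide as a flat binary vector (length x 20)."""
    vec = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        vec[i, AMINO_ACIDS.index(aa)] = 1.0
    return vec.ravel()

# Toy training data (invented): 9-mer peptides labelled 1 = binder, 0 = non-binder.
peptides = ["KLNEPVLLL", "GILGFVFTL", "AAAWYLWEV",
            "QQQQQQQQQ", "PPPPPPPPP", "DDDDDDDDD"]
labels = [1, 1, 1, 0, 0, 0]

X = np.array([one_hot(p) for p in peptides])
clf = SVC(kernel="rbf", C=1.0).fit(X, labels)

# Score a new candidate peptide.
candidate = one_hot("KLWEPVLTL")
print(clf.predict([candidate])[0])            # predicted class (1 = binder)
print(clf.decision_function([candidate])[0])  # signed distance from the margin
```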
Abstract:
Machine learning techniques have been recognized as powerful tools for learning from data. One of the most popular learning techniques, Back-Propagation (BP) artificial neural networks, can be used as a computer model to predict peptides binding to Human Leukocyte Antigens (HLA). The major advantage of computational screening is that it reduces the number of wet-lab experiments that need to be performed, significantly reducing cost and time. A recently developed method, the Extreme Learning Machine (ELM), which has superior properties over BP, has been investigated to accomplish such tasks. In our work, we found that the ELM is as good as, if not better than, BP in terms of time complexity, the deviation of accuracy across experiments, and, most importantly, resistance to over-fitting in the prediction of peptide binding to HLA.
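The sketch below shows the general ELM technique named in the abstract, not the authors' exact model: hidden-layer weights are drawn at random and kept fixed, and only the output weights are solved in closed form, which is where the speed advantage over iterative back-propagation comes from. The toy data standing in for encoded peptides are assumptions.

```python
# Minimal Extreme Learning Machine (ELM) sketch: random fixed hidden layer,
# output weights solved via the Moore-Penrose pseudoinverse.
import numpy as np

class ELM:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random projection followed by a sigmoid activation.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # No iterative back-propagation: a single least-squares solve.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage with random feature vectors standing in for encoded peptides.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 180))
y_train = rng.integers(0, 2, size=100).astype(float)
model = ELM(n_hidden=50).fit(X_train, y_train)
print((model.predict(X_train) > 0.5).astype(int)[:10])  # thresholded scores
```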
Abstract:
A Web interface agent is used with web browsers to assist users in searching and interacting with the WWW. It is used for a variety of purposes, such as web-enabled remote control, web interactive visualization, and e-commerce activities. The user may be aware or unaware of its existence. The intelligence of an interface agent lies in its capability for learning and decision-making when performing interactive functions on behalf of a user. However, since the Web is an open system environment, the reasoning mechanism in an agent should be able to adapt to changes and make decisions in exceptional situations, and therefore use meta knowledge. This paper proposes a framework for a Reflective Web Interface Agent (RWIA) that provides causal connections between the application interfaces and the knowledge model of the interface agent. A prototype is also implemented for the purpose of demonstration.
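The reflective architecture described above can be sketched as a base level that performs the interactive function and a meta level that holds knowledge about the base level and is causally connected to it, so that revising the meta-level model changes the agent's behaviour. This is an illustrative sketch only, not the RWIA prototype; all names and the fallback strategy are invented.

```python
# Illustrative sketch only (not the RWIA prototype): a reflective interface
# agent whose meta level adapts the base-level strategy in exceptional cases.

class BaseLevel:
    """Application-facing behaviour: filter search results for the user."""
    def __init__(self):
        self.strategy = "keyword"

    def search(self, query, results):
        if self.strategy == "keyword":
            return [r for r in results if query.lower() in r.lower()]
        # fallback strategy installed by the meta level
        return sorted(results, key=len)[:3]

class MetaLevel:
    """Meta knowledge about the base level, causally connected to it:
    changing the model here changes the agent's behaviour."""
    def __init__(self, base):
        self.base = base

    def observe(self, query, hits):
        if not hits:                       # exceptional situation: no matches
            self.base.strategy = "fallback"

class ReflectiveAgent:
    def __init__(self):
        self.base = BaseLevel()
        self.meta = MetaLevel(self.base)

    def handle(self, query, results):
        hits = self.base.search(query, results)
        self.meta.observe(query, hits)     # reflection: adapt the base level
        return hits or self.base.search(query, results)

agent = ReflectiveAgent()
corpus = ["Interface agents", "Web visualization", "E-commerce"]
print(agent.handle("interface", corpus))   # keyword strategy succeeds
print(agent.handle("zzz", corpus))         # meta level switches to the fallback
```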