918 results for Human-computer systems
Abstract:
Issued Mar. 1979.
Abstract:
Thesis--George Washington University.
Abstract:
Includes bibliographies and indexes.
Abstract:
Title from caption.
Abstract:
Title from caption.
Abstract:
"Sponsored jointly by the USAEC and Columbia University."
Abstract:
"B-279427"--P. 1.
Abstract:
"Under the auspices of the U.S. Department of Energy by the Los Alamos National Laboratory under contract W-7405-Eng.36"--P. [3] of cover.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Three different, well-established systems for e-referral were examined. They ranged from a system in a single country handling a large number of cases (60,000 per year) to a global system covering many countries which handled fewer cases (150 per year). Nonetheless, there appeared to be a number of common features. Whether the purpose is e-transfer or e-consultation, the underlying model of the e-referral process is the same: the referrer initiates an e-request; the organization managing the process receives it and allocates it for reply; and the responder replies to the initiator. Various things can go wrong, so the organization managing the e-referral process needs to be able to track requests through the system; this requires various performance metrics. E-referral can be conducted using email, or as messages passed either directly between computer systems or via a Web link to a server. The experience of the three systems studied shows that significant changes in work practice are needed to launch an e-referral service successfully. The use of e-referral between primary and secondary care improves access to services and can be shown to be cost-effective.
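The four-step model described in this abstract (initiate, receive, allocate, reply) is generic enough to sketch as a small state machine. The class and method names below are illustrative assumptions, not drawn from any of the three systems studied; the `pending()` method stands in for the kind of request-tracking metric the managing organization needs.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Stages of the common e-referral model: initiate -> receive -> allocate -> reply.
class Stage(Enum):
    INITIATED = auto()
    RECEIVED = auto()
    ALLOCATED = auto()
    REPLIED = auto()

@dataclass
class Referral:
    ref_id: int
    stage: Stage = Stage.INITIATED

class ReferralOrganisation:
    """Hypothetical managing organization: tracks each request through the
    system so that unanswered requests can be reported as a metric."""

    def __init__(self):
        self.requests = {}

    def initiate(self, ref_id):
        self.requests[ref_id] = Referral(ref_id)

    def receive(self, ref_id):
        self.requests[ref_id].stage = Stage.RECEIVED

    def allocate(self, ref_id):
        self.requests[ref_id].stage = Stage.ALLOCATED

    def reply(self, ref_id):
        self.requests[ref_id].stage = Stage.REPLIED

    def pending(self):
        # Simple performance metric: requests not yet answered.
        return [r.ref_id for r in self.requests.values()
                if r.stage is not Stage.REPLIED]

org = ReferralOrganisation()
org.initiate(1); org.initiate(2)
org.receive(1); org.allocate(1); org.reply(1)
print(org.pending())  # -> [2]
```

A real service would persist these transitions with timestamps, so that turnaround-time metrics can be computed per stage.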
Abstract:
This study extends previous media equation research, which showed that flattery from a computer can produce the same general effects as flattery from humans. Specifically, the study explored the potential moderating effect of experience on the impact of flattery from a computer. One hundred and fifty-eight students from the University of Queensland voluntarily participated in the study. Participants interacted with a computer and were exposed to one of three kinds of feedback: praise (sincere praise), flattery (insincere praise), or control (generic feedback). Questionnaire measures assessing participants' affective state, attitudes, and opinions were taken. Participants of high experience, but not low experience, displayed a media equation pattern of results, reacting to flattery from a computer in a manner congruent with people's reactions to flattery from other humans. High-experience participants tended to believe that the computer spoke the truth, experienced more positive affect as a result of flattery, and judged the computer's performance more favourably. These findings are interpreted in light of previous research, and the implications for software design in fields such as entertainment and education are considered. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
This article presents the proceedings of a symposium held at the meeting of the International Society for Biomedical Research on Alcoholism (ISBRA) in Mannheim, Germany, in October 2004. Chronic alcoholism follows a fluctuating course, which provides a naturalistic experiment in vulnerability, resilience, and recovery of human neural systems in response to presence, absence, and history of the neurotoxic effects of alcoholism. Alcohol dependence is a progressive chronic disease that is associated with changes in neuroanatomy, neurophysiology, neural gene expression, psychology, and behavior. Specifically, alcohol dependence is characterized by a neuropsychological profile of mild to moderate impairment in executive functions, visuospatial abilities, and postural stability, together with relative sparing of declarative memory, language skills, and primary motor and perceptual abilities. Recovery from alcoholism is associated with a partial reversal of CNS deficits that occur in alcoholism. The reversal of deficits during recovery from alcoholism indicates that brain structure is capable of repair and restructuring in response to insult in adulthood. Indirect support of this repair model derives from studies of selective neuropsychological processes, structural and functional neuroimaging studies, and preclinical studies on degeneration and regeneration during the development of alcohol dependence and recovery from dependence. Genetics and brain regional specificity contribute to unique changes in neuropsychology and neuroanatomy in alcoholism and recovery. This symposium includes state-of-the-art presentations on changes that occur during active alcoholism as well as those that may occur during recovery-abstinence from alcohol dependence.
Included are human neuroimaging and neuropsychological assessments, changes in human brain gene expression, allelic combinations of genes associated with alcohol dependence, preclinical studies investigating mechanisms of alcohol-induced neurotoxicity, and neuroprogenitor cell expansion during recovery from alcohol dependence.
Abstract:
An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to understanding the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics of those processes. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in the power of computers, and the ability to harness parallel computer systems, have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law, with no rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between events. In some of the larger events highly complex slip patterns are observed.
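The stick-slip mechanism this abstract describes can be illustrated with a deliberately minimal single-degree-of-freedom spring-slider toy, standing in for the paper's 3D particle-based model: a block coupled by a spring to a plate driven at constant velocity slips whenever the accumulated shear stress exceeds a fixed friction threshold (no rate dependency, echoing the simple friction law used in the paper). All parameter values below are invented for illustration and carry no physical units.

```python
# Minimal 1D stick-slip sketch: constant driving velocity, threshold friction.
k = 1.0              # spring stiffness coupling block to driving plate
v_drive = 0.01       # constant driving velocity per time step
mu_static = 1.0      # friction threshold (no rate dependence)
slip_fraction = 0.8  # fraction of stored elastic displacement released per slip

x_plate, x_block = 0.0, 0.0
events = []  # record slip size of each event
for step in range(10_000):
    x_plate += v_drive                 # plate advances steadily
    stress = k * (x_plate - x_block)   # elastic shear stress on the block
    if stress > mu_static:             # stress overcomes friction: slip
        slip = slip_fraction * stress / k
        x_block += slip                # block catches up, stress drops
        events.append(slip)

print(f"{len(events)} slip events, sizes {min(events):.3f}-{max(events):.3f}")
```

In this single-block toy the events are nearly identical in size; the wide range of event sizes and complex slip patterns reported in the paper emerge only from the many interacting particles of the full 3D model.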
Abstract:
The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, and Perform Action. Once a cycle is complete, the operator returns to the Scan process. It is also possible to truncate a cycle and return to Scan after any of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified with the help of domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model we are calibrating the probability and timing models that comprise each process using experimental data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task. Its behaviour should also interpolate when challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects. This model is fitted to the calibration data, and provides an extrapolation to classifications in scenarios outside of the calibration data.
A simple strategy is used to calibrate the timing component of the model, and the results for reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
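As a rough illustration of the logistic-regression calibration step described in this abstract, the sketch below fits p(conflict | separation) by gradient ascent on synthetic data. The separation feature, the true parameter values, and the training settings are all invented stand-ins; the paper's actual calibration uses recorded subject classifications, not simulated ones.

```python
import math
import random

# Generate synthetic "subject classifications": smaller separation between
# aircraft makes a conflict call more likely (true model: w=-1, b=4).
random.seed(0)
data = []
for _ in range(500):
    sep = random.uniform(0.0, 10.0)                 # hypothetical separation
    p_true = 1.0 / (1.0 + math.exp(-(4.0 - sep)))   # closer -> more conflicts
    data.append((sep, 1 if random.random() < p_true else 0))

# Fit logistic regression by batch gradient ascent on the log-likelihood.
w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    gw = gb = 0.0
    for sep, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * sep + b)))  # predicted P(conflict)
        gw += (y - p) * sep
        gb += (y - p)
    w += lr * gw / len(data)
    b += lr * gb / len(data)

print(round(w, 2), round(b, 2))  # slope should come out negative
```

Once fitted, the same model can be evaluated at separations outside the calibration range, which is the extrapolation property the abstract relies on.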