16 results for Man-Computer Interaction
in Aston University Research Archive
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
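As a loose illustration of the simulation approach described above, the following sketch simulates a simple reorder-point stock control policy and measures the two control indices named in the abstract, Service Level and Average Stock Value. The policy and all parameters (demand rate, lead time, reorder point, order quantity) are hypothetical, not drawn from the thesis.

```python
# A minimal sketch of a stock control simulation, under assumed parameters.
import random

def simulate(reorder_point, order_qty, demand_rate=4.0, lead_time=5,
             periods=10_000, seed=1):
    random.seed(seed)
    stock = order_qty
    pipeline = []                      # outstanding orders: (arrival_period, qty)
    served = demanded = stock_sum = 0
    for t in range(periods):
        # Receive any replenishment orders due this period.
        stock += sum(q for due, q in pipeline if due == t)
        pipeline = [(due, q) for due, q in pipeline if due != t]
        # Approximate Poisson demand as a sum of Bernoulli trials.
        demand = sum(random.random() < demand_rate / 10 for _ in range(10))
        served += min(demand, stock)
        demanded += demand
        stock = max(stock - demand, 0)
        # Reorder when the stock position falls to the reorder point.
        position = stock + sum(q for _, q in pipeline)
        if position <= reorder_point:
            pipeline.append((t + lead_time, order_qty))
        stock_sum += stock
    return served / demanded, stock_sum / periods   # service level, avg stock

service_level, avg_stock = simulate(reorder_point=25, order_qty=40)
print(f"service level: {service_level:.3f}, average stock: {avg_stock:.1f}")
```

A Buyer's "judicious intervention" of the kind the abstract describes would amount to overriding the automatic reorder decision in particular periods; the comparison of indices with and without such overrides is the kind of experiment the thesis quantifies.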
Abstract:
The following thesis describes the computer modelling of radio frequency capacitively coupled methane/hydrogen plasmas and the consequences for the reactive ion etching of (100) GaAs surfaces. In addition, a range of etching experiments was undertaken over a matrix of pressure, power and methane concentration. The resulting surfaces were investigated using X-ray photoelectron spectroscopy, and the results were discussed in terms of physical and chemical models of particle/surface interactions, in addition to the predictions for energies, angles and relative fluxes to the substrate of the various plasma species. The model consisted of a Monte Carlo code which followed electrons and ions through the plasma and sheath potentials whilst taking account of collisions with background neutral gas molecules. The ionisation profile output from the electron module was used as input for the ionic module. Momentum-scattering interactions of ions with gas molecules were investigated via different models and compared against results given by a quantum mechanical code. The interactions were treated as central potential scattering events and the resulting neutral cascades were followed. The resulting predictions for ion energies at the cathode compared well with experimental ion energy distributions, and this verified the particular form of the electrical potentials used and their applicability to the particular plasma-cell geometry used in the etching experiments. The final code was used to investigate the effect of external plasma parameters on the mass distribution, energy and angles of all species impinging on the electrodes. Comparisons of electron energies in the plasma also agreed favourably with measurements made using a Langmuir electric probe. The surface analysis showed the surfaces all to be depleted in arsenic due to its preferential removal, and the resultant Ga:As ratio in the surface was found to be directly linked to the etch rate. The etch rate was determined by the methane flux, which was predicted by the code.
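The following is a minimal sketch of the Monte Carlo idea outlined above: ions fall through a sheath potential and occasionally undergo charge-exchange collisions with the background gas, which reset their kinetic energy and so shape the ion energy distribution at the cathode. The sheath voltage, collision probability and step grid are illustrative assumptions, not values from the thesis.

```python
# A toy Monte Carlo for ion transport through a collisional sheath.
import random

def ion_energy_distribution(n_ions=100_000, sheath_voltage=300.0,
                            steps=100, collide_prob=0.02, seed=1):
    random.seed(seed)
    dV = sheath_voltage / steps          # potential drop per step (eV per charge)
    energies = []
    for _ in range(n_ions):
        energy = 0.0
        for _ in range(steps):
            energy += dV                 # acceleration by the sheath field
            if random.random() < collide_prob:
                energy = 0.0             # charge exchange: the ion restarts cold
        energies.append(energy)
    return energies

e = ion_energy_distribution()
print(f"mean ion energy at cathode: {sum(e) / len(e):.1f} eV of a possible 300 eV")
```

Collisionless ions arrive with the full sheath energy, while each charge-exchange event produces a lower-energy arrival, giving the broadened distributions that such codes compare against experiment.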
Abstract:
This thesis initially presents an 'assay' of the literature pertaining to individual differences in human-computer interaction. A series of experiments is then reported, designed to investigate the association between a variety of individual characteristics and various computer task and interface factors. Predictor variables included age, computer expertise, and psychometric tests of spatial visualisation, spatial memory, logical reasoning, associative memory, and verbal ability. These were studied in relation to a variety of computer-based tasks, including: (i) word processing and its component elements; (ii) the location of target words within passages of text; (iii) the navigation of networks and menus; (iv) command generation using menus and command line interfaces; (v) the search and selection of icons and text labels; (vi) information retrieval. A measure of self-report workload was also included in several of these experiments. The main experimental findings included: (i) an interaction between spatial ability and the manipulation of semantic but not spatial interface content; (ii) verbal ability being only predictive of certain task components of word processing; (iii) age differences in word processing and information retrieval speed but not accuracy; (iv) evidence of compensatory strategies being employed by older subjects; (v) evidence of performance strategy differences which disadvantaged high spatial subjects in conditions of low spatial information content; (vi) interactive effects of associative memory, expertise and command strategy; (vii) an association between logical reasoning and word processing but not information retrieval; (viii) an interaction between expertise and cognitive demand; and (ix) a stronger association between cognitive ability and novice performance than expert performance.
Abstract:
Handheld and mobile technologies have witnessed significant advances in functionality, leading to their widespread use as both business and social networking tools. Human-Computer Interaction and Innovation in Handheld, Mobile and Wearable Technologies reviews concepts relating to the design, development, evaluation, and application of mobile technologies. Studies on mobile user interfaces, mobile learning, and mobile commerce contribute to the growing body of knowledge on this expanding discipline.
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
This paper examines the application of commercial, non-invasive electroencephalography (EEG)-based brain-computer interfaces (BCIs) with serious games. Two different EEG-based BCI devices were used to fully control the same serious game. The first device (NeuroSky MindSet) uses only a single dry electrode and requires no calibration. The second device (Emotiv EPOC) uses 14 wet sensors and requires additional training of a classifier. User testing was performed on both devices with sixty-two participants, measuring the player experience as well as key aspects of serious games, primarily learnability, satisfaction, performance and effort. Recorded feedback indicates that BCIs in their current state can be used as alternative game interfaces after familiarisation and, in some cases, calibration. Comparative analysis showed significant differences between the two devices: the first device provides more satisfaction to the players, whereas the second is more effective in terms of adaptation and interaction with the serious game.
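As a hedged illustration of how a single-electrode device of the MindSet's kind might drive a game, the sketch below turns a stream of 0-100 "attention" readings into discrete game commands. The thresholding scheme and numbers are invented for illustration and are not the study's actual control mapping.

```python
# A toy mapping from a single-channel attention signal to game commands.
def game_commands(attention_stream, threshold=60, hold=3):
    """Yield a 'move' command after `hold` consecutive readings above threshold."""
    run = 0
    for value in attention_stream:
        run = run + 1 if value >= threshold else 0
        if run >= hold:
            run = 0
            yield "move"

readings = [40, 55, 62, 70, 66, 30, 80, 85, 90]
print(list(game_commands(readings)))   # -> ['move', 'move']
```

Requiring several consecutive high readings before acting is one simple way to trade responsiveness for robustness against the noisy signals such consumer devices produce.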
Abstract:
Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, their robustness and the wide availability of free dedicated toolboxes. Frequently, however, the literature reports insufficient detail about the SVM implementation and/or parameter selection, making it impossible to reproduce a study's analysis and results. To perform an optimised classification and report a proper description of the results, a comprehensive critical overview of the application of SVMs is necessary. The aim of this paper is to provide a review of the usage of SVMs in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVMs for HCI is discussed, and critical comparisons with other classifiers are reported.
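The reporting practice the review calls for can be illustrated with a short sketch: an SVM whose kernel and parameters are selected by an explicit, stateable search. The feature data here is synthetic; real use would substitute EEG/EMG feature vectors (e.g. band powers). The sketch assumes scikit-learn, one free toolbox of the kind the abstract mentions.

```python
# SVM classification with explicit, reportable parameter selection.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                  # 200 trials, 8 synthetic features
y = (X[:, :2].sum(axis=1) > 0).astype(int)     # synthetic binary labels

pipe = make_pipeline(StandardScaler(), SVC())
grid = GridSearchCV(pipe, {"svc__kernel": ["linear", "rbf"],
                           "svc__C": [0.1, 1, 10],
                           "svc__gamma": ["scale", 0.1]},
                    cv=5)
grid.fit(X, y)
# Stating these values is exactly the detail the review finds missing.
print("best parameters:", grid.best_params_)
print(f"cross-validated accuracy: {grid.best_score_:.3f}")
```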
Abstract:
Flow control in computer communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. The thesis presents the modelling of a finite-resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network-access control of external messages at the entry nodes, and hop-level control between nodes. The model is solved by a heuristic technique, based on an equivalent reduced network and heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages and overcomes the limitations of exact methods: it can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation, and the interaction between the three levels of flow control is investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources, and the selection of the optimum window limit is considered. Several advanced network-access schemes are postulated to improve the performance of the network as well as that of selected traffic streams, and numerical results are presented. A model for the dynamic control of input traffic is developed; based on Markov decision theory, an optimal control policy is formulated. Numerical results are given, and throughput-delay performance is shown to be better with dynamic control than with static control.
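For readers unfamiliar with the mean value analysis algorithm that the heuristic extends, here is a minimal sketch of exact MVA for a closed network of single-server FCFS stations. The service times and visit ratios are illustrative, not taken from the thesis.

```python
# Exact Mean Value Analysis for a closed queueing network.
def mva(service_times, visit_ratios, n_customers):
    """Return network throughput and per-station mean queue lengths."""
    M = len(service_times)
    queue = [0.0] * M
    for n in range(1, n_customers + 1):
        # Residence time at each station: service time times (1 + queue seen on arrival).
        resid = [service_times[i] * (1.0 + queue[i]) for i in range(M)]
        throughput = n / sum(visit_ratios[i] * resid[i] for i in range(M))
        queue = [throughput * visit_ratios[i] * resid[i] for i in range(M)]
    return throughput, queue

X, Q = mva(service_times=[0.05, 0.02, 0.08], visit_ratios=[1.0, 2.0, 0.5],
           n_customers=20)
print(f"throughput: {X:.2f} messages/unit time, queue lengths: "
      + ", ".join(f"{q:.2f}" for q in Q))
```

The heuristic extensions referred to in the abstract modify this basic recursion to accommodate finite buffers and multiple virtual circuits, which the exact algorithm cannot handle at scale.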
Abstract:
In the present work the neutron emission spectra from a graphite cube, and from natural uranium, lithium fluoride, graphite, lead and steel slabs bombarded with 14.1 MeV neutrons, were measured to test nuclear data and calculational methods for D-T fusion reactor neutronics. The neutron spectra were measured with an organic scintillator, using a pulse shape discrimination technique based on a charge comparison method to reject gamma-ray counts. A computer programme was used to analyse the experimental data by the differentiation unfolding method. The 14.1 MeV neutron source was obtained from the T(d,n)4He reaction by the bombardment of a T-Ti target with a deuteron beam of energy 130 keV. The total neutron yield was monitored by the associated particle method using a silicon surface barrier detector. The numerical calculations were performed using the one-dimensional discrete-ordinate neutron transport code ANISN with the ZZ-FEWG1/31-1F cross-section library. A computer programme based on a Gaussian smoothing function was used to smooth the calculated data and match it to the experimental data. There was general agreement between measured and calculated spectra for the range of materials studied. The ANISN calculations, carried out with a P3-S8 approximation and with the slab assemblies represented by a hollow sphere with no reflection at the internal boundary, were adequate to model the experimental data; hence it appears that the cross-section set is satisfactory for the materials tested and needs no modification in the range 14.1 MeV to 2 MeV. It would also be possible to carry out a study on fusion reactor blankets, using cylindrical geometry and including a series of concentric cylindrical shells to represent the torus wall, possible neutron converter and breeder regions, and reflector and shielding regions.
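The Gaussian smoothing step can be illustrated with a short sketch: convolving a calculated spectrum with a Gaussian response so that it can be compared with the detector-broadened measurement. The spectrum, binning and resolution width below are illustrative assumptions, not values from the thesis.

```python
# Gaussian smoothing of a binned spectrum to mimic detector resolution.
import math

def gaussian_smooth(energies, spectrum, sigma):
    """Convolve a binned spectrum with a Gaussian of width sigma (same units as energies)."""
    smoothed = []
    for e0 in energies:
        weights = [math.exp(-0.5 * ((e - e0) / sigma) ** 2) for e in energies]
        norm = sum(weights)
        smoothed.append(sum(w * s for w, s in zip(weights, spectrum)) / norm)
    return smoothed

energies = [2 + 0.5 * i for i in range(25)]                          # 2 to 14 MeV bins
spectrum = [1.0 if abs(e - 14.0) < 0.3 else 0.1 for e in energies]   # toy source peak
print(["%.2f" % s for s in gaussian_smooth(energies, spectrum, sigma=0.5)])
```

The sharp calculated peak is spread over neighbouring bins, which is what allows a transport-code prediction to be laid directly over a measured scintillator spectrum.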
Abstract:
Owing to the rise in the volume of literature, problems arise in the retrieval of required information. Various retrieval strategies have been proposed, but most of them are not flexible enough for their users. Specifically, most of these systems assume that users know exactly what they are looking for before approaching the system, and that users are able to express their information needs precisely according to laid-down specifications. There has, however, been described a retrieval program, THOMAS, which aims at satisfying incompletely defined user needs through a man-machine dialogue which does not require any rigid queries. Unlike most systems, THOMAS attempts to satisfy the user's needs from a model which it builds of the user's area of interest. This model is a subset of the program's "world model", a database in the form of a network whose nodes represent concepts. Since various concepts have varying degrees of similarity and association, this thesis contends that instead of models which assume equal levels of similarity between concepts, the links between concepts should have values assigned to them to indicate the degree of similarity between the concepts. Furthermore, the world model of the system should be structured such that concepts which are related to one another are clustered together, so that a user interaction would involve only the relevant clusters rather than the entire database, such clusters being determined by the system, not the user. This thesis also attempts to link the design work with the current notion in psychology centred on the use of the computer to simulate human cognitive processes. In this case, an attempt has been made to model a dialogue between two people: the information seeker and the information expert. The system, called THOMAS-II, has been implemented and found to require less effort from the user than THOMAS.
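The weighted-link, clustered world model argued for above can be sketched as follows: links carry similarity values, and a query activates only the cluster of concepts whose accumulated similarity stays above a threshold. The concepts, weights and threshold are invented for illustration and are not taken from THOMAS-II.

```python
# A toy weighted concept network with threshold-bounded cluster activation.
network = {
    "retrieval":  {"indexing": 0.8, "query": 0.9, "database": 0.5},
    "indexing":   {"retrieval": 0.8, "thesaurus": 0.6},
    "query":      {"retrieval": 0.9, "dialogue": 0.7},
    "dialogue":   {"query": 0.7, "user model": 0.8},
    "user model": {"dialogue": 0.8},
    "thesaurus":  {"indexing": 0.6},
    "database":   {"retrieval": 0.5},
}

def relevant_cluster(start, threshold=0.5):
    """Spread out from a concept, keeping paths whose product similarity >= threshold."""
    cluster, frontier = {start: 1.0}, [start]
    while frontier:
        node = frontier.pop()
        for neighbour, weight in network[node].items():
            score = cluster[node] * weight
            if score >= threshold and score > cluster.get(neighbour, 0.0):
                cluster[neighbour] = score
                frontier.append(neighbour)
    return cluster

print(relevant_cluster("retrieval"))
```

Only concepts reachable above the threshold are activated ("thesaurus" is pruned here), so a dialogue touches the relevant cluster rather than the entire database, exactly the behaviour the thesis argues for.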
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types, and looks at the use of 'state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. 'component makers') and other manufacturing companies (i.e. 'component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, and suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily 'information-based' rather than 'decision-based', whilst the availability of low-cost computers and 'packaged software' has enabled foundries to 'get their feet wet' without the financial penalties which characterised many of the early attempts at computer assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work was oriented specifically towards the functions of production planning and scheduling, and introduces the concept of 'manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries, and proved practically successful following their implementation in a wide variety of foundries. The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
This work attempts to create a systemic design framework for man-machine interfaces which is self-consistent, compatible with other concepts, and applicable to real situations. This is tackled by examining the current architecture of computer applications packages. The treatment in the main is philosophical and theoretical, and analyses the origins, assumptions and current practice of the design of applications packages. It proposes that the present form of packages is fundamentally contradictory to the notion of packaging itself, because as an indivisible ready-to-implement solution, current package architecture displays the following major disadvantages. First, it creates problems as a result of user-package interactions, in which the designer tries to mould all potential individual users, no matter how diverse they are, into one model. This is worsened by the minute provision, if any, of important properties such as flexibility, independence and impartiality. Second, it displays a rigid structure that reduces the variety and/or multi-use of the component parts of such a package. Third, it dictates specific hardware and software configurations, which probably reduces the number of degrees of freedom of its user. Fourth, it increases the dependence of its user upon its supplier through inadequate documentation and understanding of the package. Fifth, it tends to cause a degeneration of the design expertise of data processing practitioners. In view of this understanding, an alternative methodological design framework, consistent both with the systems approach and with the role of a package in its likely context, is proposed. The proposition is based upon an extension of the identified concept of the hierarchy of holons, which facilitates the examination of the complex relationships of a package with its two principal environments: first, the user's characteristics and decision-making practices and procedures, implying an examination of the user's M.I.S. network; second, the software environment and its influence upon a package regarding support, control and operation of the package. The framework is built gradually as the discussion advances around the central theme of a compatible M.I.S., software and model design. This leads to the formation of an alternative package architecture based upon the design of a number of independent, self-contained small parts. This is believed to constitute a nucleus around which packages can be designed more effectively, and an approach which is also applicable to the design of many other man-machine systems.
Abstract:
This article examines the extent to which institutions, the Low German cultural scene and individual speakers of Low German make use of modern communication technologies such as the Internet, and whether computer-mediated communication can help to halt the decline of Low German. The basic approach is a sociolinguistic one, which understands the Internet as a space for social action in which individuals and institutions communicate. From such a perspective, the focus of interest is less on the medium or the genre than on the communicating individual and the speech community, in this case the virtual speech community. Based on studies that analyse the potential of computer-mediated communication (CMC) to help fight language shift in lesser-used languages, this paper discusses the situation of Low German in Northern Germany. Over the last three decades, Low German has lost more than half of its active speakers. The article raises the question of whether and, if so, how Low German speakers make use of CMC to stem this tide. Following a sociolinguistic approach focussed on the individual speakers who use the Internet as a space for social interaction, it gives an overview of the discursive field of Low German on the Internet and analyses in detail the most popular Low German discussion board. It shows that one of the main obstacles to a more successful use of CMC can be found in speakers' complex attitudes toward written Low German.
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, of a novel user interaction framework, uInteract, for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings not only show how effective and easy to use the uInteract framework is, but also illustrate the value of Information Foraging Theory for interpreting user interaction with CBIR.
Abstract:
In order to bridge the “semantic gap”, a number of relevance feedback (RF) mechanisms have been applied to content-based image retrieval (CBIR). However, current RF techniques in most existing CBIR systems still lack satisfactory user interaction, although some work has been done to improve both the interaction and the search accuracy. In this paper, we propose a four-factor user interaction model and investigate its effects on CBIR through an empirical evaluation. Whilst the model was developed for our research purposes, we believe it could be adapted to any content-based search system.