4 results for Multimodal Interaction
in Aston University Research Archive
Abstract:
Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time, perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
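The abstract does not give an algorithm, but the core of the head-gesture menu is easy to picture. Below is a minimal Python sketch, assuming item cues are spatialised at evenly spaced bearings around the listener's head and selection follows the current head yaw; all names and values are illustrative, not taken from the paper.

```python
# Illustrative sketch (not from the paper): selecting an item from an
# egocentric 3D-audio radial pie menu by head yaw. Item cues are assumed
# to be spatialised at evenly spaced bearings around the listener's head.

def menu_bearings(n_items: int) -> list[float]:
    """Bearings of the audio cues, in degrees clockwise from straight ahead."""
    return [i * 360.0 / n_items for i in range(n_items)]

def select_item(head_yaw_deg: float, n_items: int) -> int:
    """Map the current head yaw to the index of the nearest menu item."""
    sector = 360.0 / n_items
    # Offset by half a sector so each item owns a wedge centred on its cue.
    return int(((head_yaw_deg % 360.0) + sector / 2.0) // sector) % n_items

items = ["play", "pause", "next", "previous"]  # cues at 0, 90, 180, 270 degrees
yaw = 95.0  # degrees, e.g. from a head-mounted orientation sensor
print(items[select_item(yaw, len(items))])     # nearest cue is at 90 -> "pause"
```

A confirmation gesture (such as a nod) would then commit the highlighted item, keeping the whole interaction eyes-free.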
Abstract:
Mobile technologies have yet to be widely adopted by the Architectural, Engineering, and Construction (AEC) industry despite being one of the major growth areas in computing in recent years. This lack of uptake in the AEC industry is likely due, in large part, to the combination of small screen size and inappropriate interaction demands of current mobile technologies. This paper discusses the scope for multimodal interaction design, with a specific focus on speech-based interaction, to enhance the suitability of mobile technology use within the AEC industry by broadening the field data input capabilities of such technologies. To investigate the appropriateness of using multimodal technology for field data collection in the AEC industry, we have developed a prototype Multimodal Field Data Entry (MFDE) application. This application, which allows concrete testing technicians to record quality control data in the field, has been designed to support two different modalities of data input: speech-based data entry and stylus-based data entry. To compare the effectiveness and usability of, and user preference for, the different input options, we have designed a comprehensive lab-based evaluation of the application. To appropriately reflect the anticipated context of use within the study design, careful consideration had to be given to the key elements of a construction site that would potentially influence a test technician's ability to use the input techniques. These considerations and the resultant evaluation design are discussed in detail in this paper.
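As a rough illustration of the two-modality design described above, the following Python sketch shows one plausible shape for a field data record that tags each entry with the modality used to capture it. The class, field, and function names are hypothetical and are not the MFDE application's actual data model.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical sketch: a quality-control record that notes which input
# modality produced it. Names are illustrative, not from the MFDE app.

class InputMode(Enum):
    SPEECH = "speech"
    STYLUS = "stylus"

@dataclass
class SlumpTestEntry:
    test_id: str
    slump_mm: float          # measured concrete slump
    air_content_pct: float   # measured air content
    mode: InputMode          # modality used to enter the record

def parse_speech_value(utterance: str) -> float:
    """Naive recogniser stand-in: extract the first number in an utterance."""
    for token in utterance.split():
        try:
            return float(token)
        except ValueError:
            continue
    raise ValueError(f"no numeric value in: {utterance!r}")

entry = SlumpTestEntry(
    test_id="site42-batch7",
    slump_mm=parse_speech_value("slump 75 millimetres"),
    air_content_pct=4.5,
    mode=InputMode.SPEECH,
)
print(entry)
```

Recording the modality per entry would also let an evaluation like the one described compare error rates between speech and stylus input directly from the logged data.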
Abstract:
Desktop user interface design originates from the fact that users are stationary and can devote all of their visual resource to the application with which they are interacting. In contrast, users of mobile and wearable devices are typically in motion whilst using their device, which means that they cannot devote all, or any, of their visual resource to interaction with the mobile application; it must remain with the primary task, often for safety reasons. Additionally, such devices have limited screen real estate, and traditional input and output capabilities are generally restricted. Consequently, if we are to develop effective applications for use on mobile or wearable technology, we must embrace a paradigm shift with respect to the interaction techniques we employ for communication with such devices. This paper discusses why it is necessary to embrace a paradigm shift in terms of interaction techniques for mobile technology and, as empirical examples of the potential to achieve this shift, presents two novel multimodal interaction techniques that are effective alternatives to traditional, visual-centric interface designs on mobile devices.
Abstract:
The studies in this project have investigated the ongoing neuronal network oscillatory activity found in the sensorimotor cortex using two modalities: magnetoencephalography (MEG) and in vitro slice recordings. The results have established that ongoing sensorimotor oscillations span the mu and beta frequency region both in vitro and in MEG recordings, with distinct frequency profiles for each recorded lamina in vitro, while MI and SI show less difference in humans. In addition, these studies show that connections between MI and SI modulate the ongoing neuronal network activity in these areas. The stimulation studies indicate that specific frequencies of stimulation affect the ongoing activity in the sensorimotor cortex. The continuous theta burst stimulation (cTBS) study demonstrates that cTBS predominantly enhances the power of the local ongoing activity. The stimulation studies in this project allow only limited comparison between modalities, which is itself informative of the role of connectivity in these effects; independently, however, these studies provide novel information on the mechanisms of sensorimotor oscillatory interaction. The pharmacological studies reveal that GABAergic modulation with zolpidem changes the neuronal oscillatory network activity in both healthy and pathological MI. Zolpidem enhances the power of ongoing oscillatory activity both in sensorimotor laminae in vitro and in healthy subjects. In contrast, zolpidem attenuates the "abnormal" beta oscillatory activity in the affected hemisphere in Parkinsonian patients, while restoring the hemispheric beta power ratio and frequency variability and thereby improving motor symptomatology. Finally, we show that independent signals from MI laminae can be integrated in silico to resemble the aggregate MEG MI oscillatory signals. This highlights the usefulness of combining these two methods when elucidating neuronal network oscillations in the sensorimotor cortex and the effects of interventions on them.
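The final point, integrating independent laminar signals in silico into an aggregate MEG-like trace, can be pictured with a toy simulation. The Python sketch below sums independently generated mu- and beta-band oscillations into one composite signal; all frequencies, weights, and noise levels are assumptions for illustration, not parameters from the study.

```python
import numpy as np

# Toy sketch of the in-silico integration idea: sum independently
# generated laminar oscillations (a mu-band and a beta-band component
# here) into one aggregate trace standing in for the MEG MI signal.

fs = 1000                      # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated recording

rng = np.random.default_rng(0)
laminae = {
    "superficial_mu": (10.0, 1.0),   # (centre frequency Hz, weight) - assumed
    "deep_beta":      (22.0, 0.6),
}

aggregate = np.zeros_like(t)
for name, (f0, w) in laminae.items():
    phase = rng.uniform(0, 2 * np.pi)
    # Each lamina contributes an independent oscillation with slight
    # amplitude jitter, weighted before summation.
    amp = 1.0 + 0.1 * rng.standard_normal(t.size)
    aggregate += w * amp * np.sin(2 * np.pi * f0 * t + phase)

aggregate += 0.2 * rng.standard_normal(t.size)  # sensor-level noise

# Inspect the spectrum: peaks should appear near the mu and beta bands.
spectrum = np.abs(np.fft.rfft(aggregate))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(freqs[np.argsort(spectrum)[-5:]])  # dominant frequency bins
```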