1 result for Brian Cadd and Tim Gaze
in Digital Peer Publishing
Abstract:
Tracking a user's visual attention is a fundamental aspect of novel human-computer interaction paradigms found in Virtual Reality. For example, multimodal interfaces or dialogue-based communication with virtual and real agents benefit greatly from the analysis of the user's visual attention as a vital source of deictic references or turn-taking signals. Current approaches to determining visual attention rely primarily on monocular eye trackers; hence, they are restricted to the interpretation of two-dimensional fixations relative to a defined area of projection. The study presented in this article compares the precision, accuracy, and application performance of two binocular eye tracking devices. Two algorithms are compared that derive the depth information required for visual attention-based 3D interfaces. This information is further applied to an improved VR selection task in which a binocular eye tracker and an adaptive neural network algorithm are used to disambiguate partly occluded objects.
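The abstract does not specify how the two compared algorithms derive depth from binocular gaze data. As a purely illustrative sketch, and not the method evaluated in the article, the snippet below shows one common geometric baseline: triangulating a 3D fixation point as the midpoint of the shortest segment between the left- and right-eye gaze rays, with the residual distance between the rays serving as a rough confidence measure. All function names, eye positions, and numeric values are assumptions for illustration only.

```python
import numpy as np

def fixation_depth(origin_l, dir_l, origin_r, dir_r):
    """Illustrative sketch: estimate a 3D fixation point as the midpoint of
    the shortest segment between the left and right gaze rays.

    origin_*: 3D eye positions; dir_*: gaze directions (need not be unit length).
    Returns the estimated fixation point and the ray miss distance, which can
    serve as a crude confidence measure for the depth estimate.
    """
    d_l = dir_l / np.linalg.norm(dir_l)
    d_r = dir_r / np.linalg.norm(dir_r)
    w0 = origin_l - origin_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # near-parallel rays: vergence gives no depth
        return None, np.inf
    s = (b * e - c * d) / denom      # parameter along the left-eye ray
    t = (a * e - b * d) / denom      # parameter along the right-eye ray
    p_l = origin_l + s * d_l         # closest point on the left ray
    p_r = origin_r + t * d_r         # closest point on the right ray
    return (p_l + p_r) / 2.0, float(np.linalg.norm(p_l - p_r))

# Hypothetical example: eyes 6.4 cm apart, both converging on a point ~1 m ahead
left_eye = np.array([-0.032, 0.0, 0.0])
right_eye = np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
point, miss = fixation_depth(left_eye, target - left_eye,
                             right_eye, target - right_eye)
print(point, miss)                   # approx. [0, 0, 1] with near-zero miss
```

With real, noisy gaze samples the two rays are almost always skew, so the miss distance grows with tracker noise and with viewing distance; an adaptive component such as the neural network mentioned in the abstract would presumably compensate for exactly this kind of uncertainty when resolving partly occluded selection targets.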