1 result for eye burning
in Digital Peer Publishing
Filter by publisher
- Aberystwyth University Repository - United Kingdom (2)
- Acceda, el repositorio institucional de la Universidad de Las Palmas de Gran Canaria - Spain (2)
- AMS Tesi di Dottorato - Alm@DL - Università di Bologna (2)
- AMS Tesi di Laurea - Alm@DL - Università di Bologna (10)
- Aquatic Commons (6)
- ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany (2)
- Archive of European Integration (11)
- Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco (1)
- Aston University Research Archive (1)
- Avian Conservation and Ecology - Electronic Scientific Journal - Écologie et conservation des oiseaux (2)
- Biblioteca Digital | Sistema Integrado de Documentación | UNCuyo - UNCUYO. UNIVERSIDAD NACIONAL DE CUYO. (1)
- Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (22)
- Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP) (10)
- BORIS: Bern Open Repository and Information System - Bern, Switzerland (151)
- Boston University Digital Common (6)
- Brock University, Canada (4)
- Bucknell University Digital Commons - Pennsylvania, USA (10)
- CaltechTHESIS (1)
- Cambridge University Engineering Department Publications Database (12)
- CentAUR: Central Archive University of Reading - UK (63)
- Center for Jewish History Digital Collections (2)
- Chinese Academy of Sciences Institutional Repositories Grid Portal (35)
- Cochin University of Science & Technology (CUSAT), India (1)
- Dalarna University College Electronic Archive (3)
- Digital Archives@Colby (4)
- Digital Commons - Michigan Tech (1)
- Digital Commons - Montana Tech (1)
- Digital Commons @ DU | University of Denver Research (1)
- Digital Peer Publishing (1)
- Digital Repository at Iowa State University (1)
- DigitalCommons - The University of Maine Research (1)
- DigitalCommons@The Texas Medical Center (9)
- DigitalCommons@University of Nebraska - Lincoln (2)
- Digitale Sammlungen - Goethe-Universität Frankfurt am Main (1)
- Duke University (14)
- eResearch Archive - Queensland Department of Agriculture, Fisheries and Forestry (11)
- Harvard University (2)
- Helda - Digital Repository of University of Helsinki (7)
- Indian Institute of Science - Bangalore - India (32)
- Instituto Politécnico do Porto, Portugal (4)
- Ministerio de Cultura, Spain (2)
- National Center for Biotechnology Information - NCBI (25)
- Plymouth Marine Science Electronic Archive (PlyMSEA) (9)
- Publishing Network for Geoscientific & Environmental Data (2)
- QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast (93)
- Queensland University of Technology - ePrints Archive (107)
- ReCiL - Repositório Científico Lusófona - Grupo Lusófona, Portugal (7)
- Repositório Digital da UNIVERSIDADE DA MADEIRA - Portugal (1)
- Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho" (57)
- Research Open Access Repository of the University of East London. (1)
- Royal College of Art Research Repository - United Kingdom (1)
- RUN (Repositório da Universidade Nova de Lisboa) - FCT (Faculdade de Ciências e Tecnologia), Universidade Nova de Lisboa (UNL), Portugal (1)
- School of Medicine, Washington University, United States (3)
- Universidad de Alicante (10)
- Universidad del Rosario, Colombia (1)
- Universidad Politécnica de Madrid (5)
- Universidade Federal do Pará (1)
- Université de Lausanne, Switzerland (2)
- Université de Montréal, Canada (7)
- University of Connecticut - USA (2)
- University of Michigan (100)
- University of Queensland eSpace - Australia (25)
- University of Southampton, United Kingdom (3)
Abstract:
Tracking the user's visual attention is a fundamental aspect of novel human-computer interaction paradigms found in Virtual Reality. For example, multimodal interfaces or dialogue-based communication with virtual and real agents benefit greatly from analyzing the user's visual attention as a vital source of deictic references or turn-taking signals. Current approaches to determining visual attention rely primarily on monocular eye trackers and are therefore restricted to interpreting two-dimensional fixations relative to a defined projection area. The study presented in this article compares the precision, accuracy, and application performance of two binocular eye tracking devices. Two algorithms that derive the depth information required for visual attention-based 3D interfaces are compared. This information is then applied to an improved VR selection task in which a binocular eye tracker and an adaptive neural network algorithm are used to disambiguate partly occluded objects.
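The depth component that a binocular tracker adds on top of a 2D fixation can be illustrated with a small geometric sketch: given one gaze ray per eye, the 3D point of regard is approximately the midpoint of the shortest segment connecting the two rays. The Python sketch below assumes gaze data is already available as per-eye origins and unit direction vectors in a common coordinate frame; the function name and the closest-point triangulation are illustrative assumptions, not the specific algorithms evaluated in the study.

```python
import numpy as np

def fixation_point_from_gaze_rays(o_l, d_l, o_r, d_r):
    """Estimate a 3D fixation point from binocular gaze rays.

    o_l, o_r: eye positions (ray origins), shape (3,)
    d_l, d_r: gaze direction vectors, shape (3,)
    Returns the midpoint of the shortest segment between the two rays,
    i.e. the point where the lines of sight (nearly) converge, or None
    if the rays are effectively parallel (no usable vergence).
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)

    # Solve for t, s minimising |(o_l + t*d_l) - (o_r + s*d_r)|^2.
    w0 = o_l - o_r
    a = np.dot(d_l, d_l)          # = 1 for unit vectors
    b = np.dot(d_l, d_r)
    c = np.dot(d_r, d_r)          # = 1 for unit vectors
    d = np.dot(d_l, w0)
    e = np.dot(d_r, w0)
    denom = a * c - b * b
    if denom < 1e-9:              # rays (almost) parallel
        return None
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom

    p_l = o_l + t * d_l           # closest point on the left gaze ray
    p_r = o_r + s * d_r           # closest point on the right gaze ray
    return (p_l + p_r) / 2.0      # estimated 3D point of regard


# Example: eyes 6.4 cm apart, both looking at a target ~1 m ahead.
left_eye = np.array([-0.032, 0.0, 0.0])
right_eye = np.array([0.032, 0.0, 0.0])
target = np.array([0.1, 0.05, 1.0])
p = fixation_point_from_gaze_rays(left_eye, target - left_eye,
                                  right_eye, target - right_eye)
print(p)  # ~ [0.1, 0.05, 1.0]
```

The midpoint is used rather than an exact intersection because, with real tracker noise, the two gaze rays are skew and never intersect exactly; this is also why the study compares dedicated depth-estimation algorithms rather than relying on naive triangulation alone.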