1 result for Virtual Reality (VR)
in Abertay Research Collections - Abertay University’s repository
Filter by publisher
- Repository@Napier (1)
- ABACUS. Repositorio de Producción Científica - Universidad Europea (1)
- Abertay Research Collections - Abertay University’s repository (1)
- Academic Archive On-line (Mid Sweden University; Sweden) (1)
- Acceda, el repositorio institucional de la Universidad de Las Palmas de Gran Canaria, Spain (1)
- AMS Tesi di Dottorato - Alm@DL - Università di Bologna (3)
- AMS Tesi di Laurea - Alm@DL - Università di Bologna (5)
- ArchiMeD - Elektronische Publikationen der Universität Mainz - Germany (1)
- Archive of European Integration (1)
- Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco (2)
- Aston University Research Archive (9)
- B-Digital - Universidade Fernando Pessoa - Portugal (1)
- Biblioteca Digital - Universidad Icesi - Colombia (1)
- Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (2)
- Biblioteca Digital de Teses e Dissertações Eletrônicas da UERJ (8)
- BORIS: Bern Open Repository and Information System - Bern, Switzerland (28)
- Boston University Digital Common (3)
- Bucknell University Digital Commons - Pennsylvania, USA (1)
- Bulgarian Digital Mathematics Library at IMI-BAS (4)
- Cambridge University Engineering Department Publications Database (13)
- CentAUR: Central Archive University of Reading - UK (98)
- Chinese Academy of Sciences Institutional Repositories Grid Portal (16)
- Coffee Science - Universidade Federal de Lavras (1)
- Comissão Econômica para a América Latina e o Caribe (CEPAL) (1)
- CUNY Academic Works (1)
- Department of Computer Science E-Repository - King's College London, Strand, London (1)
- Digital Archives@Colby (1)
- Digital Commons - Michigan Tech (3)
- Digital Commons at Florida International University (6)
- Digital Peer Publishing (100)
- Dipòsit Digital de la UB - University of Barcelona (1)
- Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland (2)
- Duke University (2)
- FUNDAJ - Fundação Joaquim Nabuco (3)
- Glasgow Theses Service (1)
- Greenwich Academic Literature Archive - UK (8)
- Helda - Digital Repository of University of Helsinki (2)
- Indian Institute of Science - Bangalore - India (1)
- Instituto Politécnico do Porto, Portugal (3)
- Lume - Repositório Digital da Universidade Federal do Rio Grande do Sul (1)
- Martin-Luther-Universität Halle-Wittenberg, Germany (1)
- Massachusetts Institute of Technology (3)
- Ministerio de Cultura, Spain (1)
- Portal de Revistas Científicas Complutenses - Spain (2)
- QSpace: Queen's University - Canada (1)
- QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast (65)
- Queensland University of Technology - ePrints Archive (253)
- ReCiL - Repositório Científico Lusófona - Grupo Lusófona, Portugal (4)
- Repositório Aberto da Universidade Aberta de Portugal (2)
- Repositorio Académico de la Universidad Nacional de Costa Rica (1)
- Repositório Científico da Universidade de Évora - Portugal (1)
- Repositório Científico do Instituto Politécnico de Lisboa - Portugal (3)
- Repositório digital da Fundação Getúlio Vargas - FGV (1)
- Repositório Digital da Universidade da Madeira - Portugal (2)
- Repositório Institucional da Universidade de Aveiro - Portugal (1)
- Repositório Institucional da Universidade Federal do Rio Grande - FURG (1)
- Repositório Institucional da Universidade Federal do Rio Grande do Norte (1)
- Repositório Institucional UNESP - Universidade Estadual Paulista "Júlio de Mesquita Filho" (53)
- Repositorio Institucional Universidad de Medellín (1)
- Repositorio Institucional Universidad EAFIT - Medellín, Colombia (3)
- Research Open Access Repository of the University of East London (4)
- Royal College of Art Research Repository - United Kingdom (1)
- RUN (Repositório da Universidade Nova de Lisboa) - FCT (Faculdade de Ciências e Tecnologia), Universidade Nova de Lisboa (UNL), Portugal (1)
- Universidad de Alicante (3)
- Universidad del Rosario, Colombia (1)
- Universidad Politécnica de Madrid (25)
- Universidade de Lisboa - Repositório Aberto (1)
- Universidade de Madeira (1)
- Universidade Federal de Uberlândia (3)
- Universidade Federal do Pará (5)
- Universidade Federal do Rio Grande do Norte (UFRN) (13)
- Universidade Metodista de São Paulo (6)
- Universitat de Girona, Spain (4)
- Universitätsbibliothek Kassel, Universität Kassel, Germany (1)
- Université de Lausanne, Switzerland (1)
- Université de Montréal (4)
- Université de Montréal, Canada (15)
- University of Queensland eSpace - Australia (36)
- University of Southampton, United Kingdom (1)
- University of Washington (1)
- WestminsterResearch - UK (3)
- Worcester Research and Publications - UK (3)
Abstract:
Fully articulated hand tracking promises to enable fundamentally new interactions with virtual and augmented worlds, but the limited accuracy and efficiency of current systems have prevented widespread adoption. Today's dominant paradigm uses machine learning for initialization and recovery, followed by iterative model-fitting optimization to achieve a detailed pose fit. We follow this paradigm, but make several changes to the model fitting, namely using: (1) a more discriminative objective function; (2) a smooth-surface model that provides gradients for non-linear optimization; and (3) joint optimization over both the model pose and the correspondences between observed data points and the model surface. While each of these changes may actually increase the cost per fitting iteration, we find a compensating decrease in the number of iterations. Further, the wide basin of convergence means that fewer starting points are needed for successful model fitting. Our system runs in real time on CPU only, which frees up the commonly over-burdened GPU for experience designers. The hand tracker is efficient enough to run on low-power devices such as tablets. We can track up to several meters from the camera to provide a large working volume for interaction, even using the noisy data from current-generation depth cameras. Quantitative assessments on standard datasets show that the new approach exceeds the state of the art in accuracy. Qualitative results take the form of live recordings of a range of interactive experiences enabled by this new approach.
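The abstract's central algorithmic idea is point (3): rather than alternating between assigning data-to-model correspondences and refitting the pose (as in classic ICP), pose and correspondences are optimized jointly and continuously over a smooth surface model. The sketch below illustrates that idea only and is not the authors' implementation: a toy sphere (centre and radius playing the role of the "pose") stands in for the articulated hand model, and the `surface` and `residuals` functions and the synthetic depth points are illustrative assumptions.

```python
# Minimal sketch of joint optimization over pose AND correspondences,
# with a smooth toy surface (a sphere) standing in for the hand model.
import numpy as np
from scipy.optimize import least_squares

def surface(uv, centre, radius):
    """Smooth parametric model surface S(u, v; pose) -- here a sphere."""
    u, v = uv[:, 0], uv[:, 1]
    return centre + radius * np.stack(
        [np.sin(u) * np.cos(v), np.sin(u) * np.sin(v), np.cos(u)], axis=1
    )

def residuals(z, data):
    """Stacked point-to-surface residuals over pose AND correspondences."""
    centre, radius = z[:3], z[3]
    uv = z[4:].reshape(-1, 2)       # one free (u, v) surface coordinate per point
    return (surface(uv, centre, radius) - data).ravel()

# Synthetic noisy "depth" points sampled from the true surface.
rng = np.random.default_rng(0)
true_centre, true_radius = np.array([0.2, -0.1, 0.5]), 0.8
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
data = true_centre + true_radius * dirs + 0.01 * rng.normal(size=(200, 3))

# Crude initial pose and correspondences (in the paper's pipeline this role
# is played by the machine-learned initializer; a geometric guess stands in).
d0 = data - data.mean(axis=0)
d0 /= np.linalg.norm(d0, axis=1, keepdims=True)
uv0 = np.stack([np.arccos(np.clip(d0[:, 2], -1.0, 1.0)),
                np.arctan2(d0[:, 1], d0[:, 0])], axis=1)
z0 = np.concatenate([data.mean(axis=0), [1.0], uv0.ravel()])

fit = least_squares(residuals, z0, args=(data,))
print("centre:", np.round(fit.x[:3], 3), "radius:", round(fit.x[3], 3))
```

Because the surface is smooth, the solver gets useful gradients everywhere, and because the correspondences (u, v) are free variables rather than fixed nearest-point assignments, each iteration can slide them along the surface while the pose moves; this is the mechanism behind the wider basin of convergence the abstract describes.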