1 result for Jay Sah in Digital Peer Publishing
Abstract:
This manuscript details a technique for estimating gesture accuracy in motion-based health video games using the Microsoft Kinect. We created a physical therapy game that requires players to imitate clinically significant reference gestures. Player performance is represented by the degree of similarity between the performed and reference gestures and is quantified by collecting the Euler angles of the player's gestures, converting them into three-dimensional vectors, and comparing the magnitudes of the performed and reference vectors. Lower difference values indicate greater gestural correspondence and therefore better player performance. A group of thirty-one subjects was tested. Subjects achieved gestural correspondence sufficient to complete the game's objectives while also improving their ability to perform the reference gestures accurately.
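As a rough illustration of the comparison the abstract describes, the sketch below treats each gesture frame as a three-dimensional vector of Euler angles and scores it against the reference pose. The abstract does not specify the exact comparison, so this sketch assumes the error is the Euclidean norm of the difference between the two Euler-angle vectors; the function name gesture_error and the example angle values are hypothetical, not taken from the paper.

```python
import numpy as np

def gesture_error(player_euler_deg, reference_euler_deg):
    """Return a scalar gesture-accuracy error (lower is better).

    Each argument is a (roll, pitch, yaw) triple of Euler angles in
    degrees, e.g. for one tracked joint in a single frame. The error
    is taken here as the Euclidean norm of the difference between the
    player's angle vector and the reference angle vector, so smaller
    values mean closer correspondence to the reference gesture.
    """
    player = np.asarray(player_euler_deg, dtype=float)
    reference = np.asarray(reference_euler_deg, dtype=float)
    return float(np.linalg.norm(player - reference))

# Hypothetical usage: a player's joint orientation for one frame
# versus the clinically defined reference pose for that frame.
player_pose = (42.0, 10.5, -3.0)     # degrees
reference_pose = (45.0, 12.0, 0.0)   # degrees
print(gesture_error(player_pose, reference_pose))
```

In practice, per-frame errors like this could be averaged over a gesture to yield the overall correspondence score the abstract refers to, but that aggregation step is likewise an assumption here.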