Automatic Detection of Relevant Head Gestures in American Sign Language Communication


Author(s): Erdem, Ugur Murat; Sclaroff, Stan
Date(s)

20/10/2011

2002

Abstract

An automated system for detection of head movements is described. The goal is to label relevant head gestures in video of American Sign Language (ASL) communication. In the system, a 3D head tracker recovers head rotation and translation parameters from monocular video. Relevant head gestures are then detected by analyzing the length and frequency of the peaks and valleys in the motion signal. Each parameter is analyzed independently, because a number of relevant head movements in ASL are associated with major changes around a single rotational axis. No explicit training of the system is necessary. Currently, the system can detect "head shakes." In experimental evaluation, classification performance is compared against ground-truth labels obtained from ASL linguists. Initial results are promising, as the system matches the linguists' labels in a significant number of cases.
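The detection strategy above can be illustrated with a minimal sketch: locate local peaks and valleys in a single rotation signal (here, yaw, the axis associated with head shakes) and flag a head shake when enough large alternating swings occur. The function names, amplitude threshold, and oscillation count below are illustrative assumptions, not values from the report.

```python
def find_extrema(signal):
    """Return indices of local peaks and valleys in a 1-D motion signal."""
    peaks, valleys = [], []
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
        elif signal[i] < signal[i - 1] and signal[i] < signal[i + 1]:
            valleys.append(i)
    return peaks, valleys


def detect_head_shake(yaw, min_amplitude=5.0, min_oscillations=2):
    """Flag a head shake when the yaw signal swings between peaks and
    valleys with sufficient amplitude often enough.

    Thresholds are hypothetical; the actual system analyzes the length
    and frequency of the signal's peaks and valleys per rotation axis.
    """
    peaks, valleys = find_extrema(yaw)
    extrema = sorted(peaks + valleys)
    # Count successive extremum-to-extremum swings whose amplitude
    # exceeds the threshold (in degrees of head rotation).
    swings = 0
    for a, b in zip(extrema, extrema[1:]):
        if abs(yaw[b] - yaw[a]) >= min_amplitude:
            swings += 1
    return swings >= min_oscillations
```

For example, an oscillating yaw trace such as `[0, 6, 0, -6, 0, 6, 0]` would be flagged, while a flat or slowly drifting trace would not.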

National Science Foundation (EIA 9809340, IIS 9912573)

Identifier

http://hdl.handle.net/2144/1657

Language(s)

en_US

Publisher

Boston University Computer Science Department

Relation

BUCS Technical Reports; BUCS-TR-2002-011

Keywords #Computer human interaction #Gesture classification #Visual motion #Image and video indexing
Tipo

Technical Report