Saturday, May 3, 2008

American Sign Language Recognition in Game Development for Deaf Children

Brashear et al. combine visual and accelerometer-based hand tracking. The accelerometers provide readings along the x, y, and z axes, while visual tracking provides the hand's x and y center, mass, major and minor axis lengths, eccentricity, and orientation. These features are fed into the Georgia Tech Gesture Toolkit. They achieve fairly high word accuracy but relatively low sentence accuracy.
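
Roughly, the pipeline concatenates per-frame accelerometer and vision features into one vector and trains one HMM per sign. Below is a minimal sketch of that idea (not the authors' code); hmmlearn stands in for the Georgia Tech Gesture Toolkit's HMM back end, and the feature names, dimensions, and state counts are my assumptions.

import numpy as np
from hmmlearn import hmm

def frame_features(accel_xyz, blob):
    # accel_xyz: (3,) accelerometer reading for this frame
    # blob: visual hand-tracking measurements for this frame (assumed keys)
    visual = [blob["cx"], blob["cy"], blob["mass"],
              blob["major_axis"], blob["minor_axis"],
              blob["eccentricity"], blob["orientation"]]
    return np.concatenate([accel_xyz, visual])  # 10-dimensional feature vector

def train_sign_model(sequences, n_states=5):
    # sequences: list of (T_i, 10) arrays, one per training example of this sign
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="diag", n_iter=25)
    model.fit(X, lengths)
    return model

def classify(models, sequence):
    # models: dict mapping sign name -> trained HMM
    # pick the sign whose HMM assigns the highest log-likelihood
    return max(models, key=lambda sign: models[sign].score(sequence))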

Discussion
Another fairly straightforward gesture system: get data from a tracking system and plug it into an HMM. The lower sentence accuracy is easily explained: a sentence contains multiple words, and missing any one of them makes the whole sentence incorrect.
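
As a back-of-the-envelope check (my own, assuming word errors are independent and using illustrative numbers rather than the paper's figures), sentence accuracy falls off roughly as word accuracy raised to the sentence length:

# A sentence is recognized correctly only if every word in it is
# (assuming independent word errors, which is an oversimplification).
word_accuracy = 0.90           # illustrative value, not the paper's figure
words_per_sentence = 5         # illustrative value
sentence_accuracy = word_accuracy ** words_per_sentence
print(f"{sentence_accuracy:.2f}")  # ~0.59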

Reference
Brashear, H., Henderson, V., Park, K., Hamilton, H., Lee, S., and Starner, T. 2006. American sign language recognition in game development for deaf children. In Proceedings of the 8th International ACM SIGACCESS Conference on Computers and Accessibility (Assets '06), Portland, Oregon, USA, October 23-25, 2006. ACM, New York, NY, 79-86.
