Sunday, May 4, 2008

Using Ultrasonic Hand Tracking to Augment Motion Analysis Based Recognition of Manipulative Gestures

Ogris et al. classify actions performed in a bicycle repair shop using a combination of gyroscopic and ultrasonic sensors. They evaluate several methods for classifying the gestures: HMMs and two frame-based methods, kNN and C4.5 (a decision tree). For kNN and C4.5, the classifier votes on each frame within a set of sliding windows, and the majority vote across a window determines the gesture label. They also test several sensor fusion methods: plausibility analysis, joint feature vector classification, and classifier fusion. The fusion techniques improve classification results considerably.
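The frame-based scheme can be sketched roughly as follows: classify every frame independently with kNN, then take a majority vote over a sliding window of per-frame predictions. This is a minimal illustration with made-up toy gesture data (the labels "screw" and "pump" and all feature values are invented here, not taken from the paper), using plain Euclidean-distance kNN:

```python
from collections import Counter

def knn_predict(train, frame, k=3):
    """Classify one frame by k-nearest neighbours (squared Euclidean distance)."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(feat, frame)), label)
        for feat, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

def windowed_vote(train, frames, win=5, k=3):
    """Majority vote of per-frame kNN predictions over each sliding window."""
    preds = [knn_predict(train, f, k) for f in frames]
    return [
        Counter(preds[i:i + win]).most_common(1)[0][0]
        for i in range(len(preds) - win + 1)
    ]

# Toy training data: "screw" frames cluster near (0, 0), "pump" near (5, 5).
train = [((0.0, 0.1), "screw"), ((0.2, 0.0), "screw"), ((0.1, 0.2), "screw"),
         ((5.0, 5.1), "pump"), ((5.2, 4.9), "pump"), ((4.9, 5.0), "pump")]

# A short frame sequence with one noisy outlier frame in the middle.
frames = [(0.1, 0.1), (0.0, 0.2), (5.1, 5.0), (0.2, 0.1), (0.1, 0.0)]
print(windowed_vote(train, frames, win=5, k=3))  # -> ['screw']
```

The point of the window vote is visible in the toy run: the single outlier frame classified as "pump" is outvoted by its neighbours, so the window as a whole is still labeled "screw".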

Discussion
While the authors discuss the limitations of ultrasonics, such as sensitivity to reflections, they simply leave it to the classifier to filter out the noise. Wouldn't a real shop environment produce a great deal of reflections from the surroundings? This isn't really addressed in the paper; perhaps their approach requires a relatively empty room to work in.

Reference
Ogris, G., Stiefmeier, T., Junker, H., Lukowicz, P., and Tröster, G. 2005. Using Ultrasonic Hand Tracking to Augment Motion Analysis Based Recognition of Manipulative Gestures. In Proceedings of the Ninth IEEE International Symposium on Wearable Computers (ISWC, October 18-21, 2005). IEEE Computer Society, Washington, DC, 152-159.