Orientation of the arm used for Gesture Recognition
Dataset posted on 06.01.2020 by Anouk van Diepen, A. (Bert) de Vries, M. (Marco) Cox
Gesture recognition enables a natural extension of the way we currently interact with devices. Commercially available gesture recognition systems are usually pre-trained and offer no option for customization by the user. To improve the user experience, it is desirable to allow end users to define their own gestures. This scenario requires learning from just a few training examples if we want to impose only a light training load on the user. To this end, we propose a gesture classifier based on a hierarchical probabilistic modeling approach. In this framework, high-level features that are shared among different gestures can be extracted from a large labeled dataset, yielding a prior distribution over gestures. When learning new types of gestures, this learned shared prior reduces the number of training examples required for individual gestures. To test our approach we collected a gesture database using the Myo sensor bracelet, which is worn around the forearm. The Myo measures the orientation of the forearm as quaternions using a nine-axis IMU. The dataset contains 17 different types of arm movements from 7 test subjects (students from the Electrical Engineering department).
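As a minimal sketch of working with this kind of data: the Myo reports forearm orientation as quaternions (w, x, y, z), and raw samples can drift slightly from unit length due to sensor noise and quantization, so a common preprocessing step is renormalization. The array shape, sample values, and function name below are illustrative assumptions, not the dataset's actual file format.

```python
import numpy as np

def normalize_quaternions(q):
    """Renormalize an (N, 4) array of quaternion samples to unit length.

    Each row is one orientation sample (w, x, y, z). Assumes no row is
    all-zero, which holds for real orientation readings.
    """
    q = np.asarray(q, dtype=float)
    norms = np.linalg.norm(q, axis=1, keepdims=True)
    return q / norms

# Fabricated two-sample snippet (not taken from the dataset), slightly
# off unit norm as raw IMU output might be:
raw = np.array([[0.99, 0.01, 0.10, 0.02],
                [0.70, 0.71, 0.05, 0.01]])
unit = normalize_quaternions(raw)
```

After normalization every row has Euclidean norm 1, so downstream models can treat each sample as a point on the unit 3-sphere.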