Nov 18, 2019

A wearable interface that translates hand movements into computer interactions.


When we move, much more is happening than what can be observed directly: signals leave the brain and are converted into electrical impulses that muscles can understand and execute. The signal we can record on the skin is electromyography (EMG). Prior research shows that EMG data can be collected and analyzed to identify specific movements. For example, AlterEgo (A. Kapur et al.) can translate imperceptible silent speech into words by applying machine learning to the collected data.

But the concept and technology haven't yet been applied to identifying and interpreting body movements. With the project Handy, I'll collect EMG data for different hand and finger gestures, train machine learning models to identify the movements, and explore possible use scenarios.

Data will be collected with an OpenBCI Cyton board and sticky electrodes. I set up the board and tried it with one channel (one muscle) on:

Early board testing

We can tell clearly when I used the muscle and when I was idle. After the technical development phase, the ultimate goal of the project is to research and develop HCI for this new EMG-based interface.
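The active-vs-idle distinction visible in the recording can be automated with a simple moving-RMS envelope and a threshold. The sketch below is an assumption about how I'd approach it, not the board's API; the sample rate of 250 Hz matches the Cyton's default, and the window size and threshold heuristic would need tuning on real recordings.

```python
import numpy as np

def emg_envelope(signal, fs=250, window_ms=100):
    """Moving-RMS envelope of a raw EMG trace (fs=250 is the Cyton default)."""
    win = max(1, int(fs * window_ms / 1000))
    # Pad so the output has the same length as the input.
    padded = np.pad(np.asarray(signal, dtype=float) ** 2,
                    (win // 2, win - win // 2 - 1), mode="edge")
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(padded, kernel, mode="valid"))

def active_mask(signal, fs=250, threshold=None):
    """Boolean mask: True where the muscle looks active."""
    env = emg_envelope(signal, fs)
    if threshold is None:
        # Heuristic: halfway between the baseline (median) and peak envelope.
        threshold = 0.5 * (np.median(env) + env.max())
    return env > threshold
```

On a synthetic trace of quiet noise with one high-amplitude burst, the mask flags only the burst region.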


Here are several (technical) questions I'll answer during the first stages of the project:

  1. How to collect meaningful data through the electrodes and the board? In other words, how many electrodes are needed, and where should the common ground be set?
  2. What are the best places (on the wrist and hand) to place electrodes? And how to ensure cross-session consistency?
  3. How to preprocess the raw data into formats machine learning models can use?
  4. How to live-stream data from the board to a browser or other applications?
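For question 3, a common preprocessing pipeline for surface EMG is a band-pass filter followed by overlapping feature windows. The sketch below is one plausible approach, not a settled design: the 20–120 Hz cutoffs, window sizes, and feature set (RMS and mean absolute value) are assumptions to be tuned against real Cyton recordings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # Cyton default sample rate (Hz)

def bandpass(signal, low=20.0, high=120.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass. 20-120 Hz keeps most surface-EMG
    energy while rejecting drift and motion artifacts (cutoffs assumed)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def window_features(signal, fs=FS, win_ms=200, step_ms=100):
    """Slice a filtered trace into overlapping windows and compute two
    simple features per window: RMS and mean absolute value (MAV)."""
    win = int(fs * win_ms / 1000)
    step = int(fs * step_ms / 1000)
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([np.sqrt(np.mean(w ** 2)), np.mean(np.abs(w))])
    return np.asarray(feats)
```

Each row of the resulting feature matrix is one training example for the gesture classifier.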

More TODOs and future work related to HCI to be added.
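For question 4 above (live-streaming to other applications), one simple pattern is to push samples as newline-delimited JSON over a socket. The stdlib sketch below is a stand-in to illustrate the push pattern; a real browser front end would need a WebSocket instead (e.g. via a third-party WebSocket library, or the networking widget in the OpenBCI GUI).

```python
import json
import socket
import threading

def serve_samples(samples, host="127.0.0.1", port=0):
    """Push EMG samples as newline-delimited JSON over TCP to one client.

    Returns the bound port (port=0 lets the OS pick a free one).
    Illustrative stand-in only: browsers need WebSockets, not raw TCP.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def run():
        conn, _ = srv.accept()
        with conn:
            for s in samples:
                conn.sendall((json.dumps({"emg": s}) + "\n").encode())
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return srv.getsockname()[1]
```

A consumer connects, reads one JSON object per line, and plots or logs each sample as it arrives.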



One Cyton board can read and send up to 4 channels of EMG data, which means 4 different muscle groups. After learning about hand and arm muscle groups, I decided to use two channels for early-stage development and simple gestures, and all 4 channels for the final implementation and more complex, detailed movement detection. The electrodes will be attached to the upper limb and hand as follows (subject to adjustment):
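Whether two or four channels are used, each classification window is a (channels, samples) array, and per-channel features can be stacked into one flat vector for the model. A minimal sketch of that stacking, assuming the same RMS/MAV features per channel (feature choice is an assumption, not a fixed design):

```python
import numpy as np

def multichannel_features(window):
    """Turn one (channels, samples) EMG window into a flat feature vector:
    RMS then mean absolute value for each channel. With the 4-channel
    setup this yields 8 features per window; with 2 channels, 4."""
    window = np.asarray(window, dtype=float)
    rms = np.sqrt(np.mean(window ** 2, axis=1))   # one value per channel
    mav = np.mean(np.abs(window), axis=1)         # one value per channel
    return np.concatenate([rms, mav])             # shape: (2 * channels,)
```

Keeping the feature extractor channel-count-agnostic means the same pipeline works for both the 2-channel prototype and the 4-channel final setup.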

2-channel implementation (photos from Wikipedia)

4-channel implementation


References

  1. Electromyography - Wikipedia
  2. Muscles of the Upper Limb - Wikipedia
  3. Arm - Wikipedia, Upper Limb - Wikipedia
  4. AlterEgo - MIT Media Lab, Paper
  5. OpenBCI

Technical References

  1. Setting up for EMG - OpenBCI Doc
  2. Python signal filter test by J77M