This project was completed as part of the MSc Applied Machine Learning programme at Imperial College London's Department of Electrical Engineering.
This project develops a data glove, equipped with flex sensors and an inertial measurement unit (IMU), that produces different MIDI sounds for different hand gestures. Machine learning is used to classify the gestures into different parts of a drum kit (kick, snare, hi-hat), allowing the user to "play drums" with the glove.
The model classifies five gestures, each mapped to one of the five main parts of a standard acoustic drum kit: kick, hi-hat, snare, tom, and crash.
| Sound   | Kick | Hi-hat   | Snare     | Tom       | Crash     |
|---------|------|----------|-----------|-----------|-----------|
| Gesture | Fist | 1 finger | 2 fingers | 3 fingers | Open palm |
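These drum sounds correspond naturally to General MIDI percussion notes. The mapping below is a minimal sketch of how predicted gesture classes (numbered as in the data-collection steps later in this README) could be translated into GM note numbers; the note values are the standard GM percussion assignments, but whether the project sends exactly these notes is an assumption.

```python
# Illustrative mapping from predicted gesture class to a General MIDI percussion note
# (GM percussion plays on MIDI channel 10). The class indices follow the gesture
# numbering used in the data-collection steps below; the exact notes the project
# sends may differ.
GESTURE_TO_MIDI_NOTE = {
    0: 36,  # Kick  (fist)       -> Bass Drum 1
    1: 42,  # Hihat (1 finger)   -> Closed Hi-Hat
    2: 38,  # Snare (2 fingers)  -> Acoustic Snare
    3: 45,  # Tom   (3 fingers)  -> Low Tom
    4: 49,  # Crash (open palm)  -> Crash Cymbal 1
}
```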
The gesture determines which drum sound is produced; a quick downward motion of the hand determines when the sound is triggered.
A quick, downward movement to produce a beat
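The README does not spell out how the downward motion is detected; a simple approach is to watch the IMU's vertical acceleration for a sharp spike and debounce it with a short refractory period. The sketch below illustrates that idea only; the axis, threshold, and timing values are assumptions rather than the project's actual parameters.

```python
import time

# Minimal hit-detection sketch (threshold and refractory values are assumptions).
HIT_THRESHOLD_G = 1.5   # downward acceleration spike, in g, that counts as a "hit"
REFRACTORY_S = 0.15     # ignore further hits for a short window after one fires

class HitDetector:
    def __init__(self, threshold: float = HIT_THRESHOLD_G, refractory: float = REFRACTORY_S):
        self.threshold = threshold
        self.refractory = refractory
        self._last_hit = 0.0

    def update(self, accel_z: float) -> bool:
        """Return True when a quick downward motion is detected in the latest IMU sample."""
        now = time.monotonic()
        if accel_z < -self.threshold and (now - self._last_hit) > self.refractory:
            self._last_hit = now
            return True
        return False
```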
Using the data glove, the user can "play drums" in real time, either with manually downloaded drum samples or with MIDI sounds produced through a DAW.
Using one data glove to play drums in Logic Pro
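One way to get the glove's output into a DAW such as Logic Pro is to send MIDI note-on messages to a virtual MIDI port that the DAW listens to (on macOS, an IAC bus). The sketch below uses the `mido` library for this; the port name is a placeholder, and this is an illustration rather than necessarily how predict.py produces its sounds.

```python
import mido

# Open a MIDI output port that the DAW listens to. The port name is a placeholder;
# list the ports available on your machine with mido.get_output_names().
port = mido.open_output("IAC Driver Bus 1")

def play_drum(note: int, velocity: int = 100) -> None:
    """Send a short percussion hit on MIDI channel 10 (zero-indexed channel 9)."""
    port.send(mido.Message("note_on", note=note, velocity=velocity, channel=9))
    port.send(mido.Message("note_off", note=note, velocity=0, channel=9))

play_drum(36)  # e.g. trigger a kick drum (GM note 36)
```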
- Upload the Arduino sketch `code.ino` (in the `arduino/` folder) to your Nano.
- In `collect_data.py`, define your paths. On Windows, `SERIAL_PATH` would be something like `"COM10"`; on Unix systems, it should be something like `"/dev/ttyUSB0"`. `FIGURES_PATH` is optional (the directory doesn't have to exist).
- Run `collect_data.py` (a minimal sketch of the kind of serial-to-CSV loop it performs is shown after this list). When prompted for the gesture, type in the gesture for that trial in the format `<first_gesture_num><second_gesture_num>` (e.g. `13`). Gestures are:
  - (0) Kick - fist
  - (1) Hihat - 1 finger
  - (2) Snare - 2 fingers
  - (3) Tom - 3 fingers
  - (4) Crash - open palm
- Once `Reading...` is printed, start performing your gestures. Collect 100 gestures.
- Once 100 gestures have been performed, press Ctrl-C to stop data collection. When prompted for a CSV filename, enter your desired filename, or just press Enter to name it after the gesture pair you entered earlier.
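As referenced above, the following is a minimal sketch of the kind of serial-to-CSV loop that collect_data.py performs, assuming the Nano prints one comma-separated line of flex-sensor and IMU readings per sample; the baud rate, column layout, and filename are assumptions, not the project's exact format.

```python
import csv
import serial  # pyserial

SERIAL_PATH = "/dev/ttyUSB0"   # "COM10" on Windows
BAUD_RATE = 115200             # assumed; must match the Arduino sketch

def collect(csv_path: str) -> None:
    """Write raw sensor samples from the glove to a CSV file until Ctrl-C."""
    with serial.Serial(SERIAL_PATH, BAUD_RATE, timeout=1) as ser, \
         open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        print("Reading...")
        try:
            while True:
                line = ser.readline().decode(errors="ignore").strip()
                if line:  # assume one comma-separated sample per line
                    writer.writerow(line.split(","))
        except KeyboardInterrupt:
            print(f"Saved to {csv_path}")

collect("13.csv")  # e.g. a trial of transitions between gestures 1 (hihat) and 3 (tom)
```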
- `collect_data.py`: Python script for collecting data
- `create_dataset.ipynb`: Jupyter notebook for creating a combined dataset from the individual CSVs of trials for each gesture transition
- `predict.py`: Python script for real-time gesture prediction. Run with `python3 predict.py [--dev <usb_device_path>] [--hand <l/r>] [--sound <k/p for keyboard or playing>]`.
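For example, on a Unix system with the glove on the right hand and sounds played directly by the script, the invocation might look like `python3 predict.py --dev /dev/ttyUSB0 --hand r --sound p` (the device path is machine-specific).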