This group project arose out of Universitat Pompeu Fabra's Advanced Interface Design course competition and ultimately took the top prize for best design and implementation.
- Guillem Arias Bedmar (Cambridge / UPF)
- Marc Jones (UVA / UPF)
- Jake Phillips (Purdue / UPF)
We embedded a glove with resistive flex sensors (one per finger) whose readings feed a neural network. The network was trained on various finger positions labeled with MIDI data, so each hand pose outputs a prescribed chord via Open Sound Control (OSC). Guitar strums are triggered by accelerometer variations on a mobile device (produced by flicking one's wrist), captured through an accompanying custom OSC template installed on an Android or iOS device. The two OSC sources (glove/neural network and mobile device) are then received by a Max patch for sound generation and piano-roll-style note visualization over time. Minimal sketches of each stage follow.
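
The write-up doesn't name the training framework or data format, so the following is only a rough sketch of the pose-to-chord mapping: it assumes five normalized flex readings per glove sample and uses scikit-learn's `MLPClassifier` as a stand-in network. The example poses and chord voicings are purely illustrative, not the ones we trained on.

```python
# Sketch of the pose-to-chord classifier (framework and data are assumptions).
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical labeled poses: each row is one glove reading,
# [thumb, index, middle, ring, pinky], 0 = straight, 1 = fully bent.
X_train = np.array([
    [0.1, 0.9, 0.9, 0.1, 0.1],   # pose for chord 0
    [0.9, 0.1, 0.1, 0.9, 0.9],   # pose for chord 1
    [0.1, 0.1, 0.9, 0.9, 0.1],   # pose for chord 2
])
y_train = np.array([0, 1, 2])

# Illustrative MIDI note numbers for each prescribed chord.
CHORDS = {
    0: [55, 59, 62],   # G major: G3 B3 D4
    1: [60, 64, 67],   # C major: C4 E4 G4
    2: [62, 66, 69],   # D major: D4 F#4 A4
}

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

def flex_to_chord(reading):
    """Map one glove reading to the MIDI notes of its predicted chord."""
    label = int(clf.predict(np.array(reading).reshape(1, -1))[0])
    return CHORDS[label]
```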
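
Once a chord is predicted, it travels to the Max patch as an OSC message. A minimal sketch using the python-osc library (the library choice, port, and `/glove/chord` address pattern are all assumptions; the write-up only says chords go out via OSC):

```python
# Sending a predicted chord to the Max patch over OSC.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical Max listening port

def send_chord(notes):
    # One message per chord; Max can pick it out with [udpreceive] + [route].
    client.send_message("/glove/chord", notes)

send_chord([60, 64, 67])  # e.g. a C major triad
```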
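
On the strum side, the write-up says wrist-flick accelerometer variations trigger strums but doesn't specify where the detection logic lives (phone template, Max, or elsewhere). This sketch models it as a standalone threshold-and-rearm detector on the incoming accelerometer stream; the `/accxyz` address follows TouchOSC's convention, and the threshold values are guesses to be tuned.

```python
# Sketch of wrist-flick strum detection from an OSC accelerometer stream.
import math

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

STRUM_THRESHOLD = 1.5  # total acceleration (in g) that counts as a flick
_armed = True          # re-arm only after the hand settles back near rest

def on_accel(address, x, y, z):
    global _armed
    magnitude = math.sqrt(x * x + y * y + z * z)
    if _armed and magnitude > STRUM_THRESHOLD:
        _armed = False
        print("strum!")        # here: forward a /strum trigger to Max
    elif magnitude < 1.1:      # near rest (~1 g of gravity)
        _armed = True

dispatcher = Dispatcher()
dispatcher.map("/accxyz", on_accel)  # TouchOSC-style accelerometer address
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```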