MotionMix is a hand-motion-controlled audio mixing interface that uses the stage metaphor, rather than the traditional channel strip, to bring gestural control into the Digital Audio Workstation (DAW) environment.
The interface retrieves hand-position data through the Leap Motion SDK in Max, and its GUI is built with OpenGL in Jitter. Pan and fader data are sent to the Digital Audio Workstation over the Open Sound Control (OSC) protocol; the current implementation uses Max for Live to control Ableton Live.
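To illustrate the kind of messages carried over that link, here is a minimal sketch of the OSC wire format: a null-padded address pattern, a `,f` type-tag string, and a big-endian 32-bit float. The address scheme (`/track/1/fader`) and port are hypothetical examples, not MotionMix's actual namespace; in practice Max would send these messages with its built-in `udpsend` object rather than hand-rolled code.

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ',f' type tag, big-endian float32."""
    def pad(s: str) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

# Hypothetical address: set track 1's fader to 80%.
msg = osc_message("/track/1/fader", 0.8)

# To actually transmit, send the datagram to the DAW's OSC port (port is an assumption):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 9000))
```

Because OSC is transport-agnostic UDP datagrams, the same message layout works whether the sender is Max, a hardware controller, or a script like the one above.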
MotionMix started as my master's thesis and won the Best Graduate Student Project Award in the NYU Music Technology program in 2014.