David Merrill

Adaptive Music Controller: Consider a musical instrument that adapts to your preferences, learning your gestures and mappings rather than requiring you to learn it. This was the idea behind my master's thesis with Joe Paradiso: I built an adaptive music controller to explore how a train-by-example model of input-to-output mapping could be applied to an expressive interface. The controller also proved interesting as a data-collection platform for better understanding how people naturally associate gesture with sound.
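As a rough illustration only (the class, feature encoding, and sound names below are hypothetical, not the actual FlexiGesture implementation described in the thesis), a train-by-example mapping can be sketched as a nearest-neighbor lookup over stored gesture examples: the user demonstrates a few gestures, each paired with a sound, and later gestures trigger the sound of the closest stored example.

```python
import math

class GestureMapper:
    """Toy train-by-example mapper: each demonstrated gesture is stored
    as a feature vector paired with a sound name; a new gesture triggers
    the sound of the nearest stored example."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, sound_name) pairs

    def train(self, features, sound):
        # Record one demonstrated gesture and its associated sound
        self.examples.append((list(features), sound))

    def classify(self, features):
        # Nearest-neighbor lookup by Euclidean distance
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.examples, key=lambda ex: dist(ex[0], features))[1]

# Hypothetical usage: three demonstrated gesture-sound associations
mapper = GestureMapper()
mapper.train([1.0, 0.0, 0.0], "kick")
mapper.train([0.0, 1.0, 0.0], "snare")
mapper.train([0.0, 0.0, 1.0], "hi-hat")
print(mapper.classify([0.9, 0.1, 0.0]))  # → kick
```

In practice the feature vectors would come from the device's sensors and would typically be time series rather than single vectors, but the principle is the same: the mapping is learned from the user's own demonstrations instead of being fixed by the designer.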

This device used the Stack for data acquisition and transmission.

The pictures to the left show conceptual sketches, prototypes, and the finished product.

Click here to see a movie documenting the various input degrees of freedom (the physical affordances) of the device.
Click here to see a movie of the device being trained with three gesture-sound associations, then these gestural bookmarks being used to trigger the sounds.
Click here to see a movie showing some people using the device to train gestural bookmarks, to make input-DOF-to-effect mappings, and to jam with the customized instrument.
Click here to see a movie of the spring-return mechanism which I built to provide haptic feedback.


D. Merrill and J. Paradiso. Personalization, Expressivity, and Learnability of an Implicit Mapping Strategy for Physical Interfaces. In the Extended Abstracts (alt.chi session) of the Conference on Human Factors in Computing Systems (CHI'05). Portland, Oregon. 2005.

D. Merrill. FlexiGesture: A sensor-rich real-time adaptive gesture and affordance learning platform for electronic music control. S.M. Thesis, Massachusetts Institute of Technology. 2004.