David Merrill

Shopping labels from the My-ShoppingGuide application

Invisible Media video, from UbiComp 2005

User Attention informs the system's presentation of information

The simulated supermarket shelves (My-ShoppingGuide)

A transponder, mounted on the engine (Engine-Info)

Transponders on the engine (Engine-Info)

Invisible Media:
With Invisible Media, we augment everyday objects so that they are sensitive to the focus of a user's attention and can respond with relevant content. The system is designed to minimize bulky wearable gear, and it lets the user navigate this extra channel of physically situated information with speech commands, keeping both hands free to manipulate the objects themselves. Information is presented aurally, producing a user-system dialog that mimics a domain expert or recommender who knows which objects are in the user's view and can suggest relevant content. We have built two applications: Engine-Info, a training application that teaches the components of an internal combustion engine, and My-ShoppingGuide, a personalized shopping scenario that suggests appropriate foods in a supermarket based on a person's preferences and health needs.
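The interaction loop above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: object names, content strings, and the command vocabulary ("more", "stop") are hypothetical stand-ins. An attention event selects a tagged object, audio content for that object is "spoken", and short speech commands navigate deeper into the content while the hands stay free.

```python
# Hypothetical sketch of an attention-sensitive audio guide:
# attention events pick an object, speech commands navigate its content.

CONTENT = {
    "piston": ["The piston converts combustion pressure into motion.",
               "It is connected to the crankshaft by a connecting rod."],
    "spark_plug": ["The spark plug ignites the fuel-air mixture."],
}

class AttentionGuide:
    def __init__(self, content):
        self.content = content
        self.focus = None   # object currently holding the user's attention
        self.depth = 0      # how far into that object's content we are
        self.spoken = []    # stand-in for text-to-speech output

    def attend(self, obj_id):
        """Attention event: the user looks at / points to a tagged object."""
        if obj_id in self.content and obj_id != self.focus:
            self.focus, self.depth = obj_id, 0
            self._speak(self.content[obj_id][0])

    def command(self, word):
        """Speech command navigating the current object's content."""
        if self.focus is None:
            return
        if word == "more" and self.depth + 1 < len(self.content[self.focus]):
            self.depth += 1
            self._speak(self.content[self.focus][self.depth])
        elif word == "stop":
            self.focus, self.depth = None, 0

    def _speak(self, text):
        self.spoken.append(text)  # a real system would synthesize audio here

guide = AttentionGuide(CONTENT)
guide.attend("piston")      # user looks at the piston
guide.command("more")       # "tell me more"
guide.attend("spark_plug")  # attention shifts to another component
print(guide.spoken)
```

In the real system the `attend` events would come from transponder/attention sensing and `command` from a speech recognizer; here both are driven directly to show the dialog flow.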


D. Merrill and P. Maes. Augmenting Looking, Pointing and Reaching Gestures to Enhance the Searching and Browsing of Physical Objects. In Proceedings of the 5th International Conference on Pervasive Computing (Pervasive'07), May 13-16, 2007, Toronto, Ontario, Canada.

D. Merrill and P. Maes. Invisible Media: Attention-sensitive informational augmentation for physical objects. Abstract and video (97MB) accepted to the Seventh International Conference on Ubiquitous Computing (UbiComp 2005), Tokyo, Japan, 2005.

Other people involved (thanks!):
Chris Cheng and Daniel Schultz worked on the software (Java, XML, audio, speech recognition) for the Invisible Media project, and Amy Sheng volunteered for the video shoot. Martin Culpepper loaned us the engine that made the Engine-Info application possible.