Vision-Steered Beam Forming and Transaural Rendering for the Artificial Life Interactive Video Environment (ALIVE)
Perceptual Computing Technical Report #352

Michael A. Casey, William G. Gardner, Sumit Basu
MIT Media Laboratory, Cambridge, USA
{mkc,billg,sbasu}@media.mit.edu

Abstract:

This paper describes the audio component of an interactive video system that uses remote sensing to free the user from body-mounted tracking equipment. Position information obtained from a camera is used to constrain a beam-forming microphone array for far-field speech input, and a two-speaker transaural audio system for rendering 3D audio.
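To illustrate the kind of processing the abstract describes, the following sketch shows a simple delay-and-sum beamformer steered by a source position such as one estimated from a camera. This is an illustrative assumption, not the paper's actual implementation: the array geometry, sampling rate, and integer-sample delays are all hypothetical simplifications.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate speed of sound in air

def steering_delays(mic_positions, source_position, fs):
    """Per-microphone delays (in samples) that time-align a source
    at `source_position` across the array.

    `source_position` stands in for the camera-derived user position;
    integer-sample rounding is a simplification of fractional delays.
    """
    mics = np.asarray(mic_positions, dtype=float)
    src = np.asarray(source_position, dtype=float)
    dists = np.linalg.norm(mics - src, axis=1)
    # Delay the closer microphones so every channel lines up
    # with the farthest one.
    delays_sec = (dists.max() - dists) / SPEED_OF_SOUND
    return np.round(delays_sec * fs).astype(int)

def delay_and_sum(channels, delays):
    """Apply integer-sample delays to each channel and average,
    reinforcing sound arriving from the steered direction."""
    n = channels.shape[1] + int(delays.max())
    out = np.zeros(n)
    for ch, d in zip(channels, delays):
        out[d:d + ch.shape[0]] += ch
    return out / channels.shape[0]
```

With this scheme, signals arriving from the steered position add coherently after alignment, while sound from other directions sums incoherently and is attenuated.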





Michael Casey
Mon Mar 4 18:47:28 EST 1996