IMPLEMENTATION
I decided that the best approach was to implement a relatively simple testbed environment for toying with behaviour variables and experiencing their effects. I created two graphical avatars, in the form of decorated golf balls, and made the viewing screen represent the view from the user's avatar (first-person view). At this point I did not want to deal with real-time communication, so a scripted, fake conversation takes place between the two balls. The speaker's lips are animated while the listener nods occasionally. When the user approaches the conversation, the balls react to the user's presence in a way that reflects the settings of three sliders.
SLIDERS: The sliders represent agreement, social mood and awareness. The agreement slider controls the amount of positive visual backchannel feedback to the speaker, realized here as head nods (that's their only body part anyway!). The social mood slider controls the change in their stance (rotation of the head) as you approach the conversation: the more social they are, the more they turn towards you and give you a friendly gaze. The awareness slider changes their definition of a proximity zone, i.e. how close you have to get before certain social actions are taken.
TOOLS: The system was implemented on an SGI Indigo 2, using C++, Open Inventor and the RapidApp interface designer.

SCREENSHOTS: ["get lost"] ["what's up?"]
