

Touch is a fundamental aspect of interpersonal communication. Whether a greeting handshake, an encouraging pat on the back, or a comforting hug, physical contact is a basic means through which people achieve a sense of connection, indicate intention, and express emotion.

Current interpersonal communication technology, such as telephones, video conferencing systems, and email, provides mechanisms for audio-visual and text-based interaction. Communication through touch, however, has been left largely unexplored. In this project, we apply haptic feedback technology to create a physical link between people separated by distance. The aim is to enrich current real-time communication by opening a channel for expression through touch.



the multi-locational object concept
The idea behind inTouch is to create the illusion that two people, separated by distance, are interacting with a shared physical object. We developed the metaphor of a single physical object that exists in multiple places at once. This Multi-Locational Object can be held, felt, and manipulated simultaneously by geographically separated people, thus creating a physical and intimate connection that transcends distance.

In reality, each user is interacting with his/her own object; however, when one of the objects is manipulated, both users' objects are affected. In our current design, the two connected objects each consist of three cylindrical rollers mounted on a base. When one of the rollers is rotated, the corresponding roller on the remote object rotates in the same way. This behavior can be achieved using haptic (force-feedback) technology with sensors to monitor the physical states of the rollers and internal motors to synchronize these states.
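
A rough way to picture this coupling is as a software spring between each pair of corresponding rollers: each side reads its encoder, exchanges positions with the remote side, and commands a motor torque proportional to the mismatch. The C sketch below illustrates only this general idea; the gains, loop rate, and the read_encoder/read_remote/set_motor_torque wrappers are assumptions standing in for the real sensor, network, and motor interfaces, not the actual inTouch control code.

/* Minimal sketch of the control loop for one virtually coupled roller.
 * The hardware/network functions are stubbed placeholders (assumptions)
 * so that the sketch compiles and runs on its own. */
#include <stdio.h>

#define KP 0.8    /* stiffness of the virtual coupling               */
#define KD 0.05   /* damping on the coupling error                   */
#define DT 0.001  /* control period in seconds (1 kHz loop assumed)  */

static double read_encoder(int roller) { (void)roller; return 0.0; } /* local angle, rad     */
static double read_remote(int roller)  { (void)roller; return 0.3; } /* last reported remote */
static void set_motor_torque(int roller, double tau)
{
    printf("roller %d: commanded torque %.3f\n", roller, tau);
}

/* One control step: pull the local roller toward the remote one with a
 * virtual spring-damper so that the two converge to the same angle.   */
static void control_step(int roller, double *prev_error)
{
    double local  = read_encoder(roller);
    double remote = read_remote(roller);
    double error  = remote - local;
    double d_err  = (error - *prev_error) / DT;

    *prev_error = error;
    set_motor_torque(roller, KP * error + KD * d_err);
}

int main(void)
{
    double prev_error[3] = {0.0, 0.0, 0.0};
    for (int r = 0; r < 3; r++)      /* one motor/encoder unit per roller */
        control_step(r, &prev_error[r]);
    return 0;
}

Running the same loop on both devices makes the coupling symmetric: each side feels the other through the torque its own motor applies.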

Two geographically distant people can then cooperatively move the rollers, fight over the state of the rollers, or more passively feel the other person's manipulation of the device. The presence of the other person is thus made tangible through physical interaction with the seemingly shared object. Since the two objects are not mechanically linked in reality, inconsistencies in their states must be resolved by the system agreeing on a single consistent state and then employing the motors to guide the objects into that state.
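
One simple way to picture this reconciliation is to take the midpoint of the two roller angles as the agreed state and let each motor nudge its own roller toward it. The toy C simulation below makes that midpoint assumption explicit; it illustrates the idea of converging to a single consistent state rather than the policy actually used in inTouch.

/* Toy simulation: two remote rollers start in disagreement, and each
 * motor guides its roller toward a shared target (here, the midpoint).
 * The gain and the target rule are illustrative assumptions.          */
#include <stdio.h>

#define GAIN  0.2   /* fraction of the remaining error closed per step */
#define STEPS 20

int main(void)
{
    double angle_a = 0.0;   /* roller angle at site A (radians)          */
    double angle_b = 1.0;   /* roller angle at site B: the states differ */

    for (int step = 0; step < STEPS; step++) {
        double target = 0.5 * (angle_a + angle_b);   /* agreed single state */

        angle_a += GAIN * (target - angle_a);        /* motor at site A */
        angle_b += GAIN * (target - angle_b);        /* motor at site B */

        printf("step %2d: A = %.4f  B = %.4f\n", step, angle_a, angle_b);
    }
    return 0;
}

If one user holds a roller fixed, that side's angle stops tracking the target and the mismatch is felt as a sustained force by the other user.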


objective
Unlike the majority of applications of haptic technology, inTouch is not focused on the simulation of physical forms. Much of haptics research is aimed at creating virtual objects with form, mass, and texture that can be felt through feedback from a haptically augmented device. With inTouch, the idea is not to create a device that represents the physical form of the user at the other end, but rather to create a physical link for expressing that person's movements and gestures. The physical form the user perceives himself or herself to be interacting with is thus not a simulation of the other user, but the device itself. The richness of the interaction then comes not from the representation of form, but from the representation of movement as mediated by the coupled objects. This is interesting in that it places great importance on the physical design of the device.


the design
The choice of rollers as the manipulable part of the object was made for two related reasons. First, rollers can be rotated in either a clockwise or counterclockwise direction indefinitely. Unlike a joystick or throttle, for example, where the motion of the device is bounded, the roller affords more fluid and continuous strokes. Although the roller has the potential to be manipulated aggressively, thrashing between bounds is not possible. For this reason, we felt that the motion of the roller was more appropriate for the expression of subtle emotional states than a bounded motion.

Second, rollers were chosen because they allow both passive and active interaction between users. A user can actively "grab" and manipulate the rollers by applying enough contact force to minimize slippage under the hand. In this way, the motion of the hand is directly translated to the rollers and the interaction is a kinesthetic one. If both users manipulate the rollers in this way, the interaction is fairly equal and mutual, like a handshake or a hug. Alternatively, one user could allow the rollers to slide comfortably beneath the hand, interacting in a more tactile and passive way, feeling but not affecting the motion of the rollers, like getting a pat on the back. Interactions falling between these two extremes, reflecting various levels of engagement with the rollers, are clearly also possible.


mechanical prototype
To get a basic idea of what interaction through a "shared" physical object would ideally feel like, we built a mechanical model of inTouch in which the corresponding rollers were directly connected by flexible drive shafts. This is clearly not a practical method of connection over any reasonable distance; however, it let us experiment with the interaction.



electro-mechanical prototype
We then developed a virtual version of inTouch using force-feedback technology. The goal was to have virtually connected rollers that behave identically to the mechanically connected version. Hewlett-Packard optical position encoders were used to monitor the physical states of the rollers (positions were read directly; other values were interpolated), and high-performance Maxon DC motors were used to synchronize those states. A 200 MHz Pentium PC controlled all motor/encoder units (one unit for each roller) using Immersion Corporation's Impulse Drive Board 1.0 boards and 2-Axis Card 1.0 ISA cards.
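
Because only roller positions are sensed directly, quantities such as velocity have to be estimated from successive encoder readings. The C sketch below shows one plausible way to do this (convert counts to an angle, take a finite difference, and smooth the result); the encoder resolution, sampling period, and filter constant are assumed values rather than the parameters of the actual system.

/* Sketch of velocity estimation from raw encoder counts.
 * COUNTS_PER_REV, DT, and ALPHA are assumptions for illustration. */
#include <stdio.h>

#define PI             3.14159265358979
#define COUNTS_PER_REV 2000.0   /* encoder counts per revolution (assumed) */
#define DT             0.001    /* sampling period in seconds              */
#define ALPHA          0.1      /* low-pass filter weight for velocity     */

int main(void)
{
    /* Fake encoder readings standing in for the real hardware. */
    long samples[] = {0, 4, 9, 15, 22, 30, 39, 49};
    int  n = sizeof(samples) / sizeof(samples[0]);

    long   prev_counts = samples[0];
    double velocity    = 0.0;    /* filtered estimate, rad/s */

    for (int i = 1; i < n; i++) {
        double d_angle = (samples[i] - prev_counts) * 2.0 * PI / COUNTS_PER_REV;
        double raw_vel = d_angle / DT;               /* finite difference */

        velocity    = ALPHA * raw_vel + (1.0 - ALPHA) * velocity;  /* smooth */
        prev_counts = samples[i];

        printf("sample %d: velocity estimate %.2f rad/s\n", i, velocity);
    }
    return 0;
}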


conclusion
The inTouch system provides a physical link between people separated by distance that is unattainable with current interpersonal communication technology. The key idea is to create the illusion that distant users are interacting through a shared physical object. We believe that inTouch suggests a new pathway for the application of haptic technology, one with the potential to enrich interpersonal communication across distance.




Copyright © Andrew M. Dahley.
andyd[AT]media.mit.edu