THE PHD RESEARCH OF JEFF ORKIN
jorkin@alum.mit.edu | http://media.mit.edu/~jorkin
My goal is to simulate open-ended, natural, unscripted dialogue and social interaction in virtual environments. To accomplish this, I have developed a platform for capturing and simulating social behavior from recorded human performances. My method, Collective Artificial Intelligence (CAI), builds on techniques from Artificial Intelligence and crowdsourcing to automate characters from a massive database of recorded content.

Data-driven simulated role-playing has the potential to revolutionize digital entertainment, online education, and consumer engagement. Applications include new experiences in videogames, as well as new classes of training simulations, therapeutic applications, customer service support, tools for recruiting and interviewing, and social robots.

CAI simulates behavior and dialogue using data recorded from thousands of human performances, mined for interrelated patterns. The process combines crowdsourcing, pattern discovery, and case-based planning (a toy sketch of this kind of pattern mining follows below), producing characters capable of open-ended, face-to-face interaction and dialogue with humans. Characters can interact with the 3D virtual environment and converse with humans via typed text or speech.

As a proof of concept, I developed The Restaurant Game and recorded over 16,000 people role-playing as customers and waitresses. The restaurant is a setting everyone can understand, yet it demonstrates the enormous range of behavior and dialogue in everyday interactions. My thesis evaluates interaction between human subjects and an AI waitress driven by this data. Videos show examples of the original human-human interaction and of interaction with the AI waitress via typed text and speech; a visualization of action sequences observed in 5,000 recorded games appears below.

Two additional games have been developed from the same codebase: Improviso records players on the set of a low-budget sci-fi film, and Mars Escape captures human-robot interaction on a space station. Data from Mars Escape has been transferred from the virtual world to power a physical robot.
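The pattern-discovery step can be illustrated with a minimal sketch: mining frequent action n-grams from recorded gameplay logs, then using observed frequencies to pick a plausible next action. All function names, the log format, and the toy data below are illustrative assumptions, not the actual CAI implementation.

```python
# Hypothetical sketch: mine frequent action n-grams from recorded game
# logs, then predict the most commonly observed next action. This is a
# toy stand-in for CAI's pattern discovery, not the real system.
from collections import Counter

def mine_ngrams(logs, n=2):
    """Count how often each n-gram of actions appears across all logs."""
    counts = Counter()
    for log in logs:
        for i in range(len(log) - n + 1):
            counts[tuple(log[i:i + n])] += 1
    return counts

def next_action(counts, context):
    """Given an (n-1)-action context, return the most frequently
    observed following action, or None if the context was never seen."""
    candidates = {ng[-1]: c for ng, c in counts.items() if ng[:-1] == context}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

# Toy logs standing in for recorded customer/waitress games.
logs = [
    ["greet", "seat_customer", "take_order", "serve_food", "bring_bill"],
    ["greet", "seat_customer", "take_order", "serve_food", "serve_dessert"],
    ["seat_customer", "take_order", "serve_food", "bring_bill"],
]

counts = mine_ngrams(logs, n=2)
print(next_action(counts, ("take_order",)))  # -> "serve_food"
```

A frequency table like this only captures surface statistics; the full system goes further, interleaving mined patterns with case-based planning so characters can respond to situations beyond any single recorded game.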
Related Publications:
Understanding Speech in Interactive Narratives with Crowdsourced Data (AIIDE 2012).
Related Press:
Crowdsourced Online Learning Gives Robots Human Skills - New Scientist (July 26, 2011)
Figure: Visualization of all action sequences observed in 5,000 human-human games.