
I graduated with a PhD in June 2018 from the MIT Media Lab, where I worked in the Fluid Interfaces group with Pattie Maes. I work in the area of human-computer interaction (HCI), specifically on interactive devices and immersive technologies. The distance between the real and the digital is clearest at the interface layer: the ways our bodies interact with the physical world are elaborate and rich, while digital interactions are far more limited. My approach is to create systems and devices that use the entire body for input and output, allowing for more implicit and natural interactions. I call this approach perceptual engineering, i.e., altering the user's perception (or, more specifically, the input signals to their perception) and manipulating it in subtle ways. For example, I build systems and devices that modify a user's sense of space, place, body, balance, and orientation, or manipulate their visual attention, all without the user's explicit input, in order to assist or guide their interactive experience in an effortless way.
My recent TEDx talk on remotely controlling human beings using galvanic vestibular stimulation (GVS).
I enjoy doodling, photography and playing video games. I also like glass blowing and kyudo. You can contact me at sra (at) media (dot) mit (dot) edu.
Movio: Electrical stimulation wearable device for providing haptic feedback and reducing motion sickness in VR. CHI 2019.
VR games that use breathing actions to augment controller input, e.g., blowing out makes you a fire-breathing dragon and holding your breath makes you invisible. CHI 2018.
Best Paper Honorable Mention (top 5%)
Project Zanzibar is a flexible, portable mat that can sense and track physical objects, identify what they are, and allow you to interact through multi-touch and hover gestures. CHI 2018.
Best Paper Award (top 1%)
Methods for managing user perception and attention based on the cognitive illusion of Inattentional Blindness. DIS 2018.
Room-scale mapping techniques to support natural interaction in shared VR spaces for remotely located users. DIS 2018.
A collaborative interface that allows remote control of a person's walking trajectory through galvanic vestibular stimulation (GVS). VRST 2017.
A pipeline for the automatic generation of VR worlds from music. It uses a deep neural network for mood-based image generation. VRST 2017.
Automatic generation of VR worlds from 3D reconstructions of real-world places. Object recognition is used to detect furniture and provide full-body haptics. VRST 2016. TVCG 2017.
Best Paper Award (top 0.02%)
Full-body tracking with Vive hand-held controllers and an inverse kinematics system. A mocap hack before Vive trackers became easily available. IAP 2017.
Creating immersive and interactive VR worlds using the real world as a template. New Context 2015.
Asymmetric multiplayer VR with roles based on the size of a user's space. UIST 2016.
Multisensory scuba diving experience in VR. CHI 2016. UIST 2016.
Room-scale SocialVR with object and full-body tracking. It was built before the Vive was released. IEEE VR 2016.
Room-scale SocialVR with full-body tracking. It was built before the Vive was announced. UIST 2015.
A sensor-based bedtime alarm paired with a peripheral display on the wallpaper of the user's mobile phone that promotes sleep awareness through data visualization. MobiHealth 2015.
Handwritten Tamil character recognition with a Convolutional Neural Network (CNN). NEML 2014.
In-air gesture recognition system.
Gestural input for mobile and wearable devices.
A continuously morphing 4D geometric VR world. This was my first VR project, built in 2013.
Three physical microgames for taking short breaks from sitting.
A fast-paced, outdoor, team-based Android game. This was my Master's thesis at the Media Lab, completed in August 2012. PUC 2015.
A location-based personality generator and visualizer. Persuasive 2013.
A digital pen that transmits writing on paper to a whiteboard. UIST 2012.
An interactive tabletop application that displays a health score for a location, neighborhood, or selected area on a map.
I am co-organizing a CHI workshop on Novel Interaction Techniques for Collaboration in VR and MR with Ken Perlin, Luiz Velho, and Mark Bolas.
I co-organized a four-day VR/AR hackathon at the MIT Media Lab in October 2016 and 2017, featuring learning workshops, hacking, and a public expo. Approximately 375 participants, some from as far away as New Zealand and Japan, took part in the event, which was sponsored by Microsoft, Samsung, AT&T, HTC, and others.
Invited speaker at the 2015 New Context Conference in Tokyo.
I co-led a hands-on Intro to Electronics activity for high school students at MIT Splash 2014.