
I graduated with a PhD in 2018 from the MIT Media Lab, where I worked in the Fluid Interfaces group with Pattie Maes. I am now an Assistant Professor in the CS department at UCSB. Check out the beautiful campus in this aerial video.
I am looking for highly motivated PhD students to join my research group at UCSB for Summer and Fall 2020. Please see the lab page for more info: Perceptual Engineering Lab
Over the last 50 years, as devices and technologies have evolved, interaction has become more direct and the things people can do have become more personal. My research asks: what if the whole body were an interface? The goal of my work is to create systems that use the entire body for input and output and that automatically adapt to each user's unique state and context. I call this approach Perceptual Engineering: altering the user's perception (or more specifically, the input signals to their perception) and manipulating it in subtle ways.

My research explores the use of cognitive illusions to manage a user's attention, breathing for direct interaction, machine learning for automated virtual-world generation, embodiment for remote collaboration, tangible interactions for play augmentation, and galvanic vestibular stimulation (GVS) for reducing nausea in immersive experiences. This perceptual engineering approach has been shown to (1) increase the user's sense of presence in VR/MR, (2) provide a novel eyes-, ears-, and hands-free way to communicate with the user through proprioception and other senses, and (3) serve as a platform to question the boundaries of our sense of agency and trust.
TEDx talk (October 2018) on remotely controlling human beings using galvanic vestibular stimulation (GVS).
Article about my work in Forbes India: "Fresh off the (MIT Media) Lab" (April 2019)
I enjoy doodling, photography, and playing video games. I also like glassblowing and kyudo. You can contact me at sra (at) cs (dot) ucsb (dot) edu.
MoveU: an electrical stimulation wearable device for providing haptic feedback and reducing motion sickness in VR. CHI 2019.
An outdoor VR time travel experience that takes you to the MIT of 2016 as well as an envisioned future 100 years from now.
VR games that use breathing actions to augment controller input, e.g., blowing out makes you a fire-breathing dragon and holding your breath makes you invisible. CHI 2018.
Best Paper Honorable Mention (top 5%)
Project Zanzibar is a flexible, portable mat that can sense and track physical objects, identify what they are, and allow you to interact through multi-touch and hover gestures. CHI 2018.
Best Paper Award (top 1%)
Methods for managing user perception and attention based on the cognitive illusion of inattentional blindness. DIS 2018.
Room-scale mapping techniques to support natural interaction in shared VR spaces for remotely located users. DIS 2018.
A collaborative interface that allows remote control of a person's walking trajectory through galvanic vestibular stimulation (GVS). VRST 2017.
A pipeline for the automatic generation of VR worlds from music. It uses a deep neural network for mood-based image generation. VRST 2017.
Automatic generation of VR worlds from 3D reconstructions of real-world places, with object recognition to identify furniture for full-body haptics. VRST 2016. TVCG 2017.
Best Paper Award (top 0.02%)
Full-body tracking with Vive hand-held controllers and an inverse kinematics system. A mocap hack before Vive trackers became easily available. IAP 2017.
Creating immersive and interactive VR worlds using the real world as a template. New Context 2015.
Asymmetric multiplayer VR with roles based on the size of a user's space. UIST 2016.
Multisensory scuba diving experience in VR. CHI 2016. UIST 2016.
Room-scale social VR with object and full-body tracking, built before the Vive was released. IEEE VR 2016.
Room-scale social VR with full-body tracking, built before the Vive was announced. UIST 2015.
A sensor-based bedtime alarm and a connected peripheral display on the wallpaper of the user's mobile phone that promote sleep awareness through data visualization. MobiHealth 2015.
Handwritten Tamil character recognition with a Convolutional Neural Network (CNN). NEML 2014.
In-air gesture recognition system.
Gestural input for mobile and wearable devices.
A continuously morphing 4D geometric VR world; my first VR project, back in 2013.
Three physical microgames for taking short breaks from sitting.
A fast-paced, team-based outdoor Android game. This was my master's thesis at the Media Lab, completed in August 2013. PUC 2015.
A location-based personality generator and visualizer. Persuasive 2013.
A digital pen that transmits what is written on paper to a whiteboard. UIST 2012.
An interactive tabletop application that displays a Health Score for a location, neighborhood, or selected area on a map.
Organized a CHI 2018 workshop on Novel Interaction Techniques for Collaboration in VR and MR. My co-organizers were Ken Perlin, Luiz Velho, and Mark Bolas.
I co-organized a 4-day VR/AR hackathon at the MIT Media Lab in 2016, 2017, and 2019, with learning workshops, hacking of new experiences, and a public expo. Approximately 375 participants attended annually, with sponsors including Microsoft, Samsung, AT&T, and HTC.
Invited speaker at the 2015 New Context Conference in Tokyo.
I co-led an Intro to Electronics hands-on activity for high school students at MIT Splash 2014.