Machine learning, pattern recognition and data mining in the cloud or on IoT devices such as phones and wearables
Algorithms to recognize human activity, location, behavior, emotion, and environment from thousands of sensors
Expertise in electronic design, multi-modal sensors, and bio-sensors as input for intelligent algorithms
HW and SW design optimized for smartphones, smartwatches, wearables, and other industrial IoT devices
Emmanuel Munguia Tapia received his PhD and MS degrees from the Massachusetts Institute of Technology (MIT) and has 15 years of multi-disciplinary expertise combining machine learning, artificial intelligence, context awareness, and novel sensors to make mobile, wearable, and IoT devices smarter. He is presently a senior engineering manager in cognitive computing systems at Intel Corporation and was previously the director of context awareness at Samsung. He received the Samsung Gold Medal Award for creating the most innovative technology company-wide in 2014, as well as the 10-year impact award at UbiComp 2014, the top International Joint Conference on Pervasive and Ubiquitous Computing. Emmanuel holds 36+ international publications and 10+ patents, and earned a degree in Engineering Leadership from the University of California, Berkeley.
Doctor of Philosophy in Media Arts and Sciences
Areas of specialization: machine learning, data mining, pattern recognition, computer vision, and artificial intelligence applied to context awareness and pervasive, ubiquitous, and wearable computing
Master of Science in Media Arts and Sciences
Emphasis: machine learning, pattern recognition, artificial intelligence, and computer vision. Research: algorithms for automatic human activity recognition from sensor data.
Bachelor of Science in Communications and Electronics Engineering
Emphasis: software engineering, analog and digital electronic design, sensors, communication systems, embedded systems, control systems, microprocessors, and micro-controllers.
Associate of Science degree in Electronics and Embedded Systems
Associate of Science degree in Programming Languages and Computer Networks
Tackling the Challenges of Big Data Program
Engineering Leadership Professional Program
I am a citizen of the United States of America
I am a Mexican American, originally born in Cd. Obregon, Sonora, Mexico. I obtained my permanent residency in the United States via the EB-1 (person of extraordinary ability) immigration category after finishing my PhD studies at MIT, and I naturalized as a citizen of the United States of America in 2015.
Research, architect, and build novel cognitive computing systems combining machine learning, context awareness, and sensors to make smartphones, tablets, wearables, and IoT devices more intelligent and personalized. Represent Intel in relationships with Google and Microsoft.
Direct Samsung's Context Awareness team, a group of 30 researchers creating product-quality algorithms for Galaxy and Tizen smartphones and wearables to infer users' activity, context, social interactions, and location.
Create algorithms to infer users' activity, location and context from sensor data collected by mobile devices and use this information to improve algorithms for personalized search, recommendation and behavioral modeling for millions of users based on large-scale analytics.
Project Lead focusing on applying machine learning and data mining techniques to large scale data generated by Oracle products in the domain of discrete manufacturing, telecommunications, and master data management.
Developed a real-time system to infer human behavior in a building equipped with a wireless network of 500 motion-activated sensors that I designed, built, and installed in the ceiling of a two-floor office space, applying hierarchical dynamic Bayesian networks and decision trees.
Developed algorithms based on dynamic Bayesian networks, statistical shrinkage and ontologies of common sense knowledge to recognize human activities from a wearable bracelet equipped with an RFID reader to sense people's interactions with RFID tagged objects.
Eight years of machine learning, artificial intelligence, computer vision and sensors. Designed software and real-time systems to recognize human activity and context from wearable sensors, hundreds of sensors placed in the environment and cameras.
Five years of algorithms, data structures, software engineering, numerical methods, embedded software, analog and digital design, embedded systems, instrumentation, sensors, power electronics, control systems, communication systems and microcontrollers.
MobileMiner: Mining Your Frequent Behavior Patterns on Your Phone, UbiComp
MobileMiner mines user behavioral rules over soft data and sensor data stored on a smartphone. The system was tested on 106 users with 3 months of data each; the algorithm takes only 2.6 minutes to process 3 months of user data on a Samsung Galaxy smartphone and find all the behavioral rules.
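As a rough illustration of this kind of on-device rule mining, here is a minimal Apriori-style sketch over toy context "baskets" (one per time window). The item names and thresholds are hypothetical, and this is a generic frequent-itemset sketch, not MobileMiner's actual algorithm:

```python
from itertools import combinations

def frequent_itemsets(baskets, min_support):
    """Enumerate co-occurring context items and keep those whose
    support (fraction of windows containing them) is high enough."""
    counts = {}
    for basket in baskets:
        for r in range(1, len(basket) + 1):
            for combo in combinations(sorted(basket), r):
                counts[combo] = counts.get(combo, 0) + 1
    n = len(baskets)
    return {items: c / n for items, c in counts.items() if c / n >= min_support}

def rules(freq, min_confidence):
    """Derive behavioral rules lhs -> rhs from frequent itemsets."""
    out = []
    for items, support in freq.items():
        if len(items) < 2:
            continue
        for i in range(1, len(items)):
            for lhs in combinations(items, i):
                conf = support / freq[lhs]  # subsets are always frequent too
                if conf >= min_confidence:
                    rhs = tuple(x for x in items if x not in lhs)
                    out.append((lhs, rhs, conf))
    return out

# Hypothetical context windows mixing sensed and soft data.
baskets = [
    {"evening", "at_home", "wifi:home"},
    {"evening", "at_home", "app:music"},
    {"evening", "at_home", "wifi:home"},
    {"morning", "commuting"},
]
freq = frequent_itemsets(baskets, min_support=0.5)
for lhs, rhs, conf in rules(freq, min_confidence=0.9):
    print(lhs, "->", rhs, round(conf, 2))
```

A production version would use candidate pruning rather than enumerating every subset, but the support/confidence logic is the same.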
Boe: Context-aware Global Power Management for Mobile Devices Balancing Battery Outage and User Experience, MASS
This work takes into account user activities and smartphone usage patterns to dynamically adjust the device's global power management policy, minimizing battery outage time while maximizing user experience.
Crowdsourced Mobile Data Collection: Lessons Learned from a New Study Methodology, HotMobile
This work presents a scalable data collection methodology that simultaneously achieves low cost and a high degree of control. We use crowdsourcing to recruit 63 subjects for a 90-day data collection that resulted in over 75,000 hours of data.
TIPS: Context-Aware Implicit User Identification using Touch Screen in Uncontrolled Environments, HotMobile
TIPS is a touch-based identity protection service that implicitly and unobtrusively authenticates users in the background by continuously analyzing touch-screen gestures in the context of the running application. This is the first work to incorporate contextual app information to improve user authentication on phones.
CommSense: Identify Social Relationship with Phone Contacts via Mining Communications, MDM
This is an on-device mining framework to deeply understand smartphone users' social relationships by mining mobile communication data. Automatically learning such relationships can support useful applications such as automatically categorizing mobile contacts, identifying their relative importance, and managing social communication with little user effort.
mFingerprint: Privacy-Preserving User Modeling with Multimodal Mobile Device Footprints, SBP
In contrast to many user identification studies that try to identify users based on raw sensor data such as raw location, browser history, and application data, mFingerprint computes high-level statistical features that do not disclose sensitive user information. This allows applications to share these privacy-preserving features to enable personalized services on devices.
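The idea of replacing raw footprints with high-level statistics can be sketched as follows. The event kinds, feature names, and data are hypothetical; this shows the general aggregate-statistics approach rather than mFingerprint's actual feature set:

```python
import math
from collections import Counter

def privacy_features(events):
    """Summarize raw (kind, value) events into aggregate statistics
    (counts, distinct values, entropy) so the raw values themselves
    never need to leave the device."""
    feats = {}
    for kind in {k for k, _ in events}:
        values = Counter(v for k, v in events if k == kind)
        total = sum(values.values())
        entropy = -sum((c / total) * math.log2(c / total)
                       for c in values.values())
        feats[f"{kind}_count"] = total
        feats[f"{kind}_distinct"] = len(values)
        feats[f"{kind}_entropy"] = round(entropy, 3)
    return feats

# Hypothetical day of footprint events: app launches and cell-tower sightings.
events = [("app", "mail"), ("app", "mail"), ("app", "maps"),
          ("cell", "tower_17"), ("cell", "tower_17")]
print(privacy_features(events))
```

Note that the feature vector reveals, for example, how varied a user's app usage is, but not which apps were used or which towers were seen.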
Efficient In-Pocket Detection with Mobile Phones, UbiComp
In this work, we use sensor fusion and pattern recognition to detect the common placements of a mobile phone, such as “in pocket”, “in bag” or “out of pocket or bag” using the embedded proximity (IR) and light sensors on smartphones. The detection results are demonstrated on a Samsung Tizen Smartphone.
Smartphone Bluetooth based Social Sensing, UbiComp
This is a framework to infer a smartphone user's context and sociability from Bluetooth data. We employ features such as the number of nearby Bluetooth devices and their semantics (e.g., smartphone or computer) to infer whether a user is in an office, in a meeting, at lunch, commuting, or at home. If a user's Bluetooth device entropy is consistently high, indicating that the user frequently meets new people, we consider the user more "social" than others.
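The entropy feature mentioned above can be illustrated with a short sketch: standard Shannon entropy over the distribution of observed device IDs. The scan logs below are hypothetical, and this is only the entropy feature, not the paper's full pipeline:

```python
import math
from collections import Counter

def device_entropy(sightings):
    """Shannon entropy (in bits) of the distribution of Bluetooth
    device IDs observed during scans; higher entropy means the user
    is surrounded by a more varied set of devices/people."""
    counts = Counter(sightings)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical scan logs: device IDs seen over one day.
office_worker = ["mac:aa"] * 8 + ["mac:bb"] * 8      # same two colleagues
barista = [f"mac:{i:02x}" for i in range(16)]        # 16 distinct strangers

print(device_entropy(office_worker))  # 1.0 bit
print(device_entropy(barista))        # 4.0 bits
```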
Using Machine Learning for Real-time Activity Recognition and Estimation of Energy Expenditure, MIT PhD Thesis
In this PhD thesis work, I created algorithms to recognize 52 human activities from 7 wearable accelerometers and a heart rate monitor as well as algorithms to compute human energy expenditure from the same sensors. I analyzed trade-offs between accuracy, number of sensors to use, and different machine learning algorithms and signal processing techniques to achieve a system that balances usability and real-time performance on smartphones.
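The kind of pipeline described above, windowed features over accelerometer streams feeding a classifier, can be sketched as follows. This is a toy illustration with synthetic data, simple time-domain features, and a decision tree; it is not the thesis's actual feature set, sensors, or classifiers:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def window_features(acc, fs=50, win_s=2.0):
    """Slice a (n, 3) accelerometer stream into fixed-length windows
    and compute simple per-axis time-domain features."""
    step = int(fs * win_s)
    feats = []
    for start in range(0, len(acc) - step + 1, step):
        w = acc[start:start + step]
        feats.append(np.concatenate([
            w.mean(0),                          # mean per axis
            w.std(0),                           # variability per axis
            np.abs(np.diff(w, axis=0)).mean(0), # mean jerk per axis
        ]))
    return np.array(feats)

rng = np.random.default_rng(0)
# Synthetic signals: "resting" is low-variance near gravity,
# "walking" is an oscillatory pattern plus noise.
rest = rng.normal(0, 0.05, (1000, 3)) + [0, 0, 1]
t = np.arange(1000)[:, None]
walk = 0.5 * np.sin(2 * np.pi * 2 * t / 50) + rng.normal(0, 0.2, (1000, 3))

X = np.vstack([window_features(rest), window_features(walk)])
y = [0] * 10 + [1] * 10
clf = DecisionTreeClassifier(random_state=0).fit(X, y)
```

Real deployments add frequency-domain features (e.g., FFT energy bands), overlapping windows, and per-sensor fusion, but the window-then-classify structure is the same one that makes real-time operation on a phone feasible.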
Portable Wireless Sensors for Object Usage Sensing in the Home: Challenges and Practicalities, AMI
In this work, I present the design of tiny sensors that can be easily attached to any object in the environment to detect object usage (e.g., opening a drawer or sitting on a chair). The sensors combine an accelerometer and piezo sensors to run for years on a coin cell battery, and use machine learning algorithms to differentiate the motion usage patterns of different object types (chairs, doors, cups, remotes, etc.).
Real-Time Recognition of Physical Activities and Their Intensities Using Wireless Accelerometers and a Heart Rate Monitor, ISWC
In this work, I present a real-time algorithm for automatic recognition of physical activities and their intensities, using five tri-axial wireless accelerometers and a wireless heart rate monitor. The algorithm was evaluated using datasets of 30 physical gymnasium activities collected from 21 people at two labs.
A Long-Term Evaluation of Sensing Modalities for Activity Recognition, UbiComp
In this work, we identify the most valuable sensors for activity recognition from 104 hours of annotated data collected from one person living in an instrumented home. The home contained over 900 sensor inputs, including wired reed switches, current and water flow inputs, object and person motion detectors, and RFID tags.
Toward Scalable Activity Recognition for Sensor Networks, LOCA
As an intern at Mitsubishi Electric Research Labs, I created this real-time system to infer human behavior in a building equipped with 500 wireless motion activated sensors installed in the ceiling. The system recognizes individual and group activities such as walking, loitering, visiting and meeting in real-time using hierarchical dynamic Bayesian networks and decision trees.
Building Reliable Activity Models Using Hierarchical Shrinkage and Mined Ontology, Pervasive
At Intel Research Seattle, I developed algorithms based on dynamic Bayesian networks, statistical shrinkage, and common sense knowledge ontologies to recognize human activities from people's interactions with RFID-tagged objects. The activity models were automatically mined from the web to minimize the need for training data. The system was tested on 126 examples of human activities performed by 9 subjects, with 108 RFID tags installed on everyday objects in a real home.
The Design of a Portable Kit of Wireless Sensors for Naturalistic Data Collection, Pervasive
I designed MITes as a flexible kit of wireless sensing devices optimized for ease of use, ease of installation, affordability, and robustness. The kit includes six environmental sensors (movement, movement tuned for object-usage detection, light, temperature, proximity, and current sensing) and five wearable sensors (on-body acceleration, heart rate, ultraviolet radiation exposure, an RFID reader wristband, and location beacons).
Using a Live-in Laboratory for Ubiquitous Computing Research, Pervasive
In this work we describe the design and operation of the PlaceLab, a live-in laboratory for the study of ubiquitous technologies in home settings that I collaboratively designed and built while a PhD student at MIT. Volunteer research participants individually live in the PlaceLab for days or weeks at a time, treating it as a temporary home. Meanwhile, sensing devices integrated into the fabric of the architecture record a detailed description of their activities.
The PlaceLab: A Live-in Laboratory for Pervasive Computing Research, Pervasive
In this video, we introduce the PlaceLab, a live-in laboratory for the study of ubiquitous computing technologies in the home that I collaboratively designed and built when I was a PhD student at MIT. The PlaceLab is a real home where the routine activities and interactions of everyday home life can be observed, recorded for later analysis, and experimentally manipulated.
A Living Laboratory for the Design and Evaluation of Ubiquitous Computing Interfaces, CHI
The PlaceLab is a tool for researchers developing context-aware and ubiquitous interaction technologies. It complements more traditional data gathering instruments and methods, such as home ethnography and laboratory studies. We describe the sensor data collection capabilities of this living laboratory or smart home and current examples of its use.
ReachMedia: On-the-move Interaction with Everyday Objects, ISWC
In this paper, we present ReachMedia, a system for seamlessly providing just-in-time information about everyday objects, built around a wireless wristband with an RFID reader to detect objects the user interacts with and accelerometer sensors to detect the user's gestural commands. The system enables hands- and eyes-free interaction with relevant information in a socially acceptable manner.
MITes: Wireless Portable Sensors for Studying Behavior, UbiComp
In this work, I introduce the MIT Environmental Sensors (MITes): a portable kit of ubiquitous wireless sensing devices for real-time data collection of human activities in natural settings. The sensors are designed to be used in two ways: (1) determining people's interaction with objects in the environment, and (2) measuring acceleration on different parts of the body, even when hundreds of these sensors are deployed in real settings.
Activity Recognition in the Home Setting Using Simple and Ubiquitous Sensors, Pervasive
In this master’s thesis work, I created a system for recognizing activities in the home setting using a set of small and simple state-change sensors. The sensors are designed to be “tape on and forget” devices that can be quickly and ubiquitously installed in home environments. The system was tested by installing up to 180 sensors in two real homes for two weeks.
Tools for Studying Behavior and Technology in Natural Settings, UbiComp
Three tools for acquiring data about people, their behavior, and their use of technology in natural settings are presented: (1) a context-aware experience sampling tool, (2) a ubiquitous sensing system that detects environmental changes, and (3) an image-based experience sampling system. These tools can provide researchers with a flexible toolkit for collecting data on activity in homes and workplaces.
Acquiring in Situ Training Data for Context-Aware Ubiquitous Computing Applications, CHI
Algorithms that can automatically detect context from wearable and environmental sensor systems show promise, but many of the most flexible and robust systems use probabilistic detection algorithms that require extensive libraries of training data with labeled examples. In this paper, we describe the need for such training data and some challenges we have identified when trying to collect it while testing three context detection systems for ubiquitous computing and mobile applications.
Ubiquitous Video Communication with the Perception of Eye Contact, UbiComp
This video communication system introduces a strategy for creating a video conferencing system for future ubiquitous computing environments that can guarantee two remote conversants the ability to establish eye contact even as people move about their respective environments. The system uses pin-hole cameras, an everywhere display and computer vision algorithms to achieve this goal.
Activity Recognition from Accelerometer Data for Videogame Applications
This is a system that recognizes 12 karate gestures, allowing players to control Mortal Kombat video game characters with their movements while playing. I built this system on an Xbox three years before the Nintendo Wii was released. We competed in the MIT 100K entrepreneurship competition with this idea and lost; the judges commented: "Video gamers want to be lying on a couch, not moving, this will never work…"
People Tracking Using Multiple Cameras with Easy Calibration
I built this system using six cameras to track and determine people’s semantic location (at desk, at couch, etc.) indoors using computer vision and simple calibration techniques. The idea was to have end users install the cameras themselves to perform health monitoring of elderly people at home via human activity recognition from semantic location.