Imagine if you could look at someone and see their heartbeat.
·  Surgeons could see whether a transplanted organ/tissue has blood flowing to it.
·  Physical trainers would be able to see if athletes are in their "zone".
·  People could increase their awareness of their physiological impact on others.

Using augmented reality, we built a pair of glasses that does exactly that.

Cardiolens Overview.

Cardiolens is a mixed reality application that enables real-time, hands-free measurement and visualization of blood flow and vital signs.
Our application runs on the Microsoft HoloLens without tethering or any additional hardware.

The system combines a front-facing camera, remote imaging photoplethysmography software and a heads-up display, allowing users to view the physiological state of a person simply by looking at them.

Figure 1. Cardiolens is a mixed reality system that allows non-contact physiological measurement of multiple people in real time. The physiological signals are visualized in the real world, allowing the wearer to see magnified blood perfusion and vital signs.

We augment the appearance of the subject with the blood flow signal. Augmenting the real world with physiological signals has key advantages, as holograms can be displayed on the objects or people of interest without interfering with other elements of the scene.

    Figure 2. The wearer is given feedback about a detected face via a white box, overlaid on the real-world environment, that highlights the ROI being analyzed. The heart rate is displayed next to the box.

The measured heart rate is displayed at the top of the facial region. A semi-transparent mesh augments the skin in real time. An optional pulse wave plot can be displayed below the facial region.

Version 2.0 will allow visualization of arousal/stress from changes in heart rate variability (HRV) parameters.
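As a rough illustration of the kind of HRV parameter such a feature could build on, here is a minimal sketch of RMSSD, a common time-domain HRV measure computed from inter-beat intervals. This is an assumed example, not the metric Cardiolens 2.0 necessarily uses.

```python
import numpy as np

def rmssd(ibis):
    """RMSSD: root mean square of successive differences between
    inter-beat intervals (in seconds). Lower HRV values such as RMSSD
    are commonly associated with higher arousal/stress."""
    d = np.diff(np.asarray(ibis, dtype=float))
    return float(np.sqrt(np.mean(d ** 2)))

# Perfectly regular beats have zero variability.
print(rmssd([0.8, 0.8, 0.8, 0.8]))  # → 0.0
```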

How it works.

    Tracking. Face tracking and skin segmentation algorithms are used to locate the regions of interest (ROIs) in incoming frames from the front-facing camera.
    Sensing. Remote imaging photoplethysmography (iPPG) is used to recover the blood volume pulse, heart rate and heart rate variability of the person you are looking at. Imaging PPG is an advanced set of computer vision methods that enables measurement using just the camera and ambient light - it does not require calibration.
    Visualization. We augment the appearance of the subject with the blood flow signal by altering the brightness of the skin on their face via a semi-transparent mask. We developed a holographic overlay that shows the pulse signal superimposed onto the face using a linear image processing pipeline.
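The sensing step above can be sketched in a few lines. This is a simplified, assumed example of how iPPG recovers heart rate from camera frames (spatial averaging of the green channel followed by a spectral peak search), not the production C# pipeline running on the HoloLens.

```python
import numpy as np

def ippg_heart_rate(frames, fps=30.0, lo=0.7, hi=4.0):
    """Estimate heart rate from a stack of skin-ROI frames (T, H, W, 3).

    A minimal iPPG sketch: spatially average the green channel (the one
    most sensitive to blood volume changes), remove the DC component,
    restrict the spectrum to plausible heart rates (42-240 bpm), and
    take the dominant frequency peak.
    """
    # 1. Spatial mean of the green channel per frame -> raw pulse trace.
    trace = frames[:, :, :, 1].reshape(frames.shape[0], -1).mean(axis=1)
    trace = trace - trace.mean()

    # 2. Dominant frequency within the heart-rate band.
    spectrum = np.abs(np.fft.rfft(trace)) ** 2
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz  # beats per minute

# Synthetic check: 10 s of frames pulsing at 1.2 Hz (72 bpm).
t = np.arange(300) / 30.0
frames = np.ones((300, 8, 8, 3)) * 128.0
frames[:, :, :, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(ippg_heart_rate(frames)))  # → 72
```

A real system would additionally track the face across frames, reject motion artifacts, and smooth estimates over time, but the core signal recovery follows this pattern.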

We used the Microsoft HoloLens and implemented the system in C#. All processing is performed on-device using images captured by the front-facing camera; the device does not need to be tethered to a computer. The results demonstrated good performance in capturing users' physiological signals despite camera and head motions.


We validated Cardiolens against gold-standard contact sensor measurements. The mean absolute error in heart rate was 1.62 beats per minute. Using a peak detection algorithm, we were able to identify the inter-beat intervals of the heart.
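Inter-beat intervals can be recovered from the pulse waveform with a simple peak picker. The sketch below is a generic, assumed approach (local maxima above the mean with a refractory gap), not necessarily the exact algorithm used in Cardiolens; the 0.35 s refractory period is a tunable assumption capping detection at roughly 170 bpm.

```python
import numpy as np

def inter_beat_intervals(pulse, fps=30.0, refractory=0.35):
    """Locate systolic peaks in a sampled pulse waveform and return the
    inter-beat intervals in seconds."""
    min_gap = int(refractory * fps)
    # Local maxima strictly above both neighbors and above the mean.
    is_peak = (pulse[1:-1] > pulse[:-2]) & (pulse[1:-1] > pulse[2:]) \
              & (pulse[1:-1] > pulse.mean())
    idx = np.flatnonzero(is_peak) + 1
    # Enforce the refractory gap, keeping the earliest peak in each run.
    kept = []
    for i in idx:
        if not kept or i - kept[-1] >= min_gap:
            kept.append(i)
    return np.diff(kept) / fps

# Synthetic 1 Hz pulse sampled at 30 fps: intervals of 1.0 s each.
t = np.arange(300) / 30.0
ibis = inter_beat_intervals(np.cos(2 * np.pi * 1.0 * t))
print(np.round(ibis, 2))  # → eight intervals of 1.0 s
```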

Figure 3. Examples of pulse waveforms from iPPG measurements and the contact sensor.

Figure 4. Scatter plots of the heart rate estimates from iPPG measurements and the contact sensor.


Cardiolens is based on a large body of scientific research:

Cardiolens: Remote Physiological Monitoring in a Mixed Reality Environment. Christophe Hurter and Daniel McDuff. ACM SIGGRAPH 2017.

Pulse and Vital Sign Measurement in Mixed Reality using a HoloLens. Daniel McDuff, Christophe Hurter and Mar Gonzalez-Franco. Under review, 2017.

The Creators.
Daniel is a researcher at Microsoft Research in Redmond and a visiting scientist at Brigham and Women's Hospital in Boston. Previously, he was Director of Research at the MIT Media Lab spin-out Affectiva. Daniel McDuff received his Ph.D. from the MIT Media Lab while working in the Affective Computing group. He received his bachelor's degree, with first-class honors, and master's degree in engineering from the University of Cambridge. The technology Daniel has helped develop is being commercialized by a number of companies, including the start-ups Affectiva and Cardiio. His work has been reported in many publications.

Christophe is a professor in the Interactive Data Visualization group (part of the DEVI team) of the French Civil Aviation University (ENAC) in Toulouse, France. In 2014, he received his HDR (Habilitation à Diriger des Recherches) and, in 2010, his PhD in Computer Science from the University of Toulouse. Christophe is also an associate researcher at the research center of the French Military Air Force Test Center (CReA). Prof. Hurter has contributed to several books, 10 journal papers, more than 40 peer-reviewed international conference papers, 8 national conference papers, and more than 90 publications in total.
Mar Gonzalez-Franco is a computer scientist working at the forefront of mixed reality at Microsoft Research. Her interests range from devices to perception and neuroscience, an area in which she completed her PhD. Her work has been covered by global media such as Fortune Magazine, TechCrunch, The Verge, ABC News, GeekWire, Inverse, Euronews and Vice. She has held several positions at leading academic institutions, including the Massachusetts Institute of Technology, University College London and Tsinghua University. Wanting to impact larger audiences, she pivoted into industry: she created and led an immersive tech lab at Airbus Group and then moved into the startup world, joining Traity.


Presented at ACM SIGGRAPH Emerging Technologies


Selected for the SXSW Innovation Awards