Systematic Review Article
A Comparative Survey of Methods for Remote Heart Rate Detection From Frontal Face Videos
- 1 Computer Science Department, Université de Genève, Switzerland
- 2 Swiss Center for Affective Sciences, University of Geneva, Switzerland
Remotely measuring physiological activity can provide substantial benefits for both medical and affective computing applications. Recent research has proposed different methodologies for the unobtrusive detection of heart rate from human face recordings. These methods rely on subtle color changes or motions of the face caused by cardiovascular activity, which are invisible to the human eye but can be captured by digital cameras. Several approaches have been proposed, based on signal processing or machine learning. However, these methods have been evaluated on different datasets, so there is no consensus on their relative performance. In this paper, we describe and evaluate several methods from the literature, from 2008 to the present, for the remote detection of heart rate from human face recordings. The general heart rate processing pipeline is divided into three stages: face video processing, face blood volume pulse signal extraction, and heart rate computation. The approaches presented in the paper are classified and grouped according to these stages. At each stage, algorithms are analyzed and compared based on their performance on the public MAHNOB-HCI database. Our results, which are limited to the MAHNOB-HCI dataset, show that the extracted facial skin area contains more blood volume pulse information, and that blind source separation and peak detection methods are more robust to head motion when estimating heart rate.
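To make the three-stage pipeline concrete, the following is a minimal, hypothetical sketch of the final two stages: given a mean green-channel trace already extracted from a detected face region (stage one is assumed done), it isolates the plausible heart rate band and takes the dominant spectral peak as the pulse rate. The function name, parameters, and the synthetic 72 BPM signal are illustrative assumptions, not any specific surveyed method.

```python
import numpy as np

def estimate_heart_rate(green_trace, fps, lo=0.7, hi=4.0):
    """Estimate heart rate (BPM) from a mean green-channel face trace.

    Illustrative sketch of the pipeline's last two stages:
    1. detrend the raw trace (output of face video processing),
    2. restrict to a plausible heart rate band, 0.7-4.0 Hz (42-240 BPM),
    3. take the dominant spectral peak as the heart rate.
    """
    signal = np.asarray(green_trace, dtype=float)
    signal = signal - signal.mean()                      # remove DC component
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)    # frequency axis in Hz
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= lo) & (freqs <= hi)                 # heart-rate band mask
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                              # Hz -> beats per minute

# Synthetic example (hypothetical data): a 72 BPM (1.2 Hz) pulse buried in
# noise, sampled at a typical webcam rate of 30 fps for 20 seconds.
rng = np.random.default_rng(0)
fps, duration = 30.0, 20.0
t = np.arange(0, duration, 1.0 / fps)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.standard_normal(t.size)
print(estimate_heart_rate(trace, fps))  # ≈ 72 BPM
```

Real systems replace the simple spectral peak with the blind source separation or peak detection methods compared in the survey, which handle head motion more robustly.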
Keywords: heart rate, remote sensing, physiological signals, photoplethysmography, human-computer interaction (HCI)
Received: 21 Jul 2017;
Accepted: 13 Mar 2018.
Edited by: Danilo E. De Rossi, Università degli Studi di Pisa, Italy
Reviewed by: Andrea Bonarini, Politecnico di Milano, Italy
Gholamreza Anbarjafari, University of Tartu, Estonia
Copyright: © 2018 Wang, Pun and Chanel. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Ms. Chen Wang, Université de Genève, Computer Science Department, Geneva, Switzerland, Chen.Wang@unige.ch