
Original Research Article | Provisionally accepted. The full text will be published soon.

Front. Psychol. | doi: 10.3389/fpsyg.2019.02626

Development and Validation of the Yonsei Face Database (YFace DB)

  • 1Yonsei University, South Korea
  • 2Chungbuk National University, South Korea
  • 3University of California, Berkeley, United States

The purposes of this study were to develop the Yonsei Face Database (YFace DB), consisting of both static and dynamic face stimuli for the six basic emotions (happiness, sadness, anger, surprise, fear, and disgust), and to test its validity. The database includes selected pictures (static stimuli) and film clips (dynamic stimuli) of 74 models (50% female) aged between 19 and 40. During the validation procedure, 221 undergraduate students assessed the 1,480 selected pictures and film clips for accuracy, intensity, and naturalness. The overall accuracy of the pictures was 76%; film clips showed a higher accuracy of 83%. Across all conditions (static with mouth open, static with mouth closed, and dynamic), accuracy was highest for happiness and lowest for fear. Accuracy was higher in film clips for all emotions except happiness and disgust, while naturalness was higher in pictures than in film clips except for sadness and anger. Intensity varied the most across conditions and emotions. Significant gender effects on perception accuracy were found for both models and raters. Male raters perceived surprise more accurately in static stimuli with mouth open and in dynamic stimuli, whereas female raters perceived fear more accurately in all conditions. Moreover, sadness and anger expressed in static stimuli with mouth open, as well as fear expressed in dynamic stimuli, were perceived more accurately when models were male, whereas disgust expressed in static stimuli with mouth open and in dynamic stimuli, and fear expressed in static stimuli with mouth closed, were perceived more accurately when models were female. The YFace DB is the largest Asian face database to date and the first to include both static and dynamic facial expression stimuli; through the validation procedure, the current study provides researchers with a wealth of information about the validity of each stimulus.

Keywords: face database, picture stimuli, facial expression, validation, dynamic stimuli

Received: 04 Jul 2019; Accepted: 07 Nov 2019.

Copyright: © 2019 Chung, Kim, Jung and Kim. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Mx. Kyong-Mee Chung, Yonsei University, Seoul, South Korea, kmchung@yonsei.ac.kr