AUTHOR=Li Ruixin, Liang Yan, Liu Xiaojian, Wang Bingbing, Huang Wenxin, Cai Zhaoxin, Ye Yaoguang, Qiu Lina, Pan Jiahui
TITLE=MindLink-Eumpy: An Open-Source Python Toolbox for Multimodal Emotion Recognition
JOURNAL=Frontiers in Human Neuroscience
VOLUME=15
YEAR=2021
URL=https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2021.621493
DOI=10.3389/fnhum.2021.621493
ISSN=1662-5161
ABSTRACT=Emotion recognition plays an important role in intelligent human-computer interaction, but related research still suffers from low accuracy and subject-dependent limitations. In this paper, an open-source software toolbox called MindLink-Eumpy is developed to recognize emotions by integrating electroencephalogram (EEG) and facial expression information. MindLink-Eumpy applies a series of tools to automatically obtain physiological data from subjects. For facial expression detection, MindLink-Eumpy uses a multi-task convolutional neural network (CNN) based on transfer learning. For EEG-based detection, MindLink-Eumpy provides two algorithms: a subject-dependent model based on a support vector machine (SVM) and a subject-independent model based on a long short-term memory (LSTM) network. For decision-level fusion, a weight enumerator and the AdaBoost technique are applied to combine the predictions of the SVM and the CNN. Two offline experiments were conducted on the DEAP and MAHNOB-HCI datasets, respectively, and an online experiment was conducted with 15 healthy subjects. The results show that multimodal methods outperform single-modal methods in both offline and online experiments. In the subject-dependent condition, the multimodal method achieved an accuracy of 71.00% for the valence dimension and 72.14% for the arousal dimension. In the subject-independent condition, the LSTM-based method achieved an accuracy of 78.56% for the valence dimension and 77.22% for the arousal dimension. The feasibility and efficiency of MindLink-Eumpy for emotion recognition are thus demonstrated.
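The decision-level fusion described in the abstract (a weight enumerator combining EEG and facial-expression predictions) can be illustrated with a minimal Python sketch. The function name enumerate_fusion_weight, the probability arrays, and the 0.01 grid step below are assumptions for illustration only and are not part of the MindLink-Eumpy API; the sketch simply enumerates candidate weights for a weighted sum of the two models' class probabilities and keeps the weight that maximizes accuracy on held-out binary valence or arousal labels.

    # Minimal sketch of decision-level fusion by weight enumeration.
    # Hypothetical inputs: per-trial class probabilities from an EEG model
    # (e.g., SVM) and a facial-expression model (e.g., CNN). Names and the
    # 0.01 step size are illustrative assumptions, not MindLink-Eumpy code.
    import numpy as np

    def enumerate_fusion_weight(p_eeg, p_face, labels, step=0.01):
        """Grid-search a scalar weight w so that w*p_eeg + (1-w)*p_face
        best matches the validation labels."""
        best_w, best_acc = 0.0, -1.0
        for w in np.arange(0.0, 1.0 + step, step):
            fused = w * p_eeg + (1.0 - w) * p_face      # weighted sum of probabilities
            preds = (fused[:, 1] > 0.5).astype(int)     # binary decision (e.g., high/low valence)
            acc = (preds == labels).mean()
            if acc > best_acc:
                best_w, best_acc = w, acc
        return best_w, best_acc

    # Toy usage with random probabilities for 10 validation trials.
    rng = np.random.default_rng(0)
    p_eeg = rng.dirichlet([1, 1], size=10)     # shape (10, 2): P(low), P(high)
    p_face = rng.dirichlet([1, 1], size=10)
    labels = rng.integers(0, 2, size=10)
    w, acc = enumerate_fusion_weight(p_eeg, p_face, labels)
    print(f"best weight={w:.2f}, validation accuracy={acc:.2f}")

The abstract also mentions an AdaBoost-based fusion strategy, which would replace this single scalar weight with an ensemble of weighted weak decisions; only the simpler weight-enumeration idea is sketched here.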