AUTHOR=Giavina-Bianchi Mara, Vitor William Gois, Fornasiero de Paiva Victor, Okita Aline Lissa, Sousa Raquel Machado, Machado Birajara
TITLE=Explainability agreement between dermatologists and five visual explanation techniques in deep neural networks for melanoma AI classification
JOURNAL=Frontiers in Medicine
VOLUME=10
YEAR=2023
URL=https://www.frontiersin.org/journals/medicine/articles/10.3389/fmed.2023.1241484
DOI=10.3389/fmed.2023.1241484
ISSN=2296-858X
ABSTRACT=The use of deep convolutional neural networks for analyzing skin lesion images has shown promising results. Identifying skin cancer by faster and less expensive means can lead to earlier diagnosis, saving lives and avoiding treatment costs. However, to implement this technology in a clinical context, specialists must understand why a model makes a given prediction; it must be explainable. Explainability techniques can be used to highlight the patterns of interest for a prediction. Our goal was to test five techniques, Grad-CAM, Grad-CAM++, Score-CAM, Eigen-CAM, and LIME, by analyzing the rate of agreement between the features highlighted by the visual explanation maps and three clinical criteria important for melanoma classification: asymmetry, border irregularity, and color heterogeneity (the ABC rule), in 100 melanoma images. Two dermatologists scored the visual maps and the clinical images on a semiquantitative scale, and the results were compared. They also ranked their preferred techniques. We found that the techniques differed in agreement rate and acceptance. In the overall analysis, Grad-CAM showed the best total+partial agreement rate (93.6%), followed by LIME (89.8%), Grad-CAM++ (88.0%), Eigen-CAM (86.4%), and Score-CAM (84.6%). The dermatologists ranked Grad-CAM and Grad-CAM++ as their favorite options, followed by Score-CAM, LIME, and Eigen-CAM.
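For readers wanting to reproduce the kind of saliency maps the study compares, the sketch below shows how the four CAM-style techniques could be generated. It assumes the open-source pytorch-grad-cam package and a torchvision ResNet-50 as a stand-in classifier; the abstract does not name the authors' model or implementation, and the class index and dummy input here are purely illustrative.

```python
# Minimal sketch: generating Grad-CAM, Grad-CAM++, Score-CAM, and Eigen-CAM
# maps with the pytorch-grad-cam package (an assumption; not the paper's code).
import torch
from torchvision.models import resnet50
from pytorch_grad_cam import GradCAM, GradCAMPlusPlus, ScoreCAM, EigenCAM
from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget

model = resnet50(weights="IMAGENET1K_V2").eval()  # placeholder backbone
target_layers = [model.layer4[-1]]                # last convolutional block

# input_tensor: one normalized (1, 3, H, W) lesion image; random here.
input_tensor = torch.randn(1, 3, 224, 224)
# Hypothetical class index 0 standing in for the "melanoma" output.
targets = [ClassifierOutputTarget(0)]

for cam_cls in (GradCAM, GradCAMPlusPlus, ScoreCAM, EigenCAM):
    with cam_cls(model=model, target_layers=target_layers) as cam:
        # grayscale_cam: (H, W) saliency map with values in [0, 1],
        # ready to overlay on the input image as a heatmap.
        grayscale_cam = cam(input_tensor=input_tensor, targets=targets)[0]
        print(cam_cls.__name__, grayscale_cam.shape)
```

The fifth technique, LIME, is perturbation-based rather than CAM-based and would come from the separate lime package (lime.lime_image.LimeImageExplainer); it is omitted from the sketch for brevity.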