CNN Explanation Methods for Ordinal Regression Tasks

Research areas:
Year:
2024
Publication type:
Article
Keywords:
convolutional neural networks, explanation methods, ordinal regression
Authors:
Journal:
Neurocomputing
Volume:
Accepted on 6th November
ISSN:
0925-2312
BibTex:
Note:
JCR(2023): 5.5, Position: 42/197 (Q1) Category: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Abstract:
The use of Convolutional Neural Network (CNN) models for image classification tasks has gained significant popularity. However, the lack of interpretability in CNN models poses challenges for debugging and validation. To address this issue, various explanation methods have been developed to provide insights into CNN models. This paper focuses on the validity of these explanation methods for ordinal regression tasks, where the classes have a predefined order relationship. Different modifications are proposed for two explanation methods to exploit the ordinal relationships between classes: Grad-CAM based on Ordinal Binary Decomposition (GradOBD-CAM) and Ordinal Information Bottleneck Analysis (OIBA). The performance of these modified methods is compared to existing popular alternatives. Experimental results demonstrate that GradOBD-CAM outperforms other methods in terms of interpretability for three out of four datasets, while OIBA achieves superior performance compared to IBA.
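The ordinal binary decomposition (OBD) underlying GradOBD-CAM recasts a K-class ordinal label as a set of cumulative binary questions. The abstract does not give implementation details, so the following is only a minimal sketch of the general OBD encoding; the function names and 0-indexed label convention are illustrative assumptions, not the paper's code.

```python
# Illustrative sketch of ordinal binary decomposition (OBD): a K-class
# ordinal label y in {0, ..., K-1} becomes K-1 cumulative binary targets
# [y > 0, y > 1, ..., y > K-2]. Names are hypothetical, not from the paper.

def obd_targets(y: int, num_classes: int) -> list[int]:
    """Encode ordinal label y (0-indexed) as K-1 binary 'y > k' targets."""
    return [1 if y > k else 0 for k in range(num_classes - 1)]

def obd_decode(targets: list[int]) -> int:
    """Recover the ordinal label by counting satisfied thresholds."""
    return sum(targets)
```

For example, with four ordered classes, label 2 encodes to [1, 1, 0]: the sample exceeds the first two thresholds but not the third. Exploiting this cumulative structure, rather than treating classes as unordered, is what distinguishes the ordinal variants of the explanation methods.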