CoLiDR: Concept Learning using Aggregated Disentangled Representations

S Sinha, G Xiong, A Zhang - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Interpretability of Deep Neural Networks using concept-based models offers a promising
way to explain model behavior through human-understandable concepts. A parallel line of …
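The title suggests a pipeline that first learns disentangled latent factors and then aggregates them into concept predictions. A minimal, hypothetical sketch of that general idea (the VAE-style encoder, the linear aggregator, and all dimensions below are illustrative assumptions, not the paper's actual architecture):

import torch
import torch.nn as nn

class DisentangledConceptModel(nn.Module):
    """Hypothetical sketch: encode input into disentangled latent factors,
    then aggregate them into concept scores and a class prediction."""

    def __init__(self, input_dim=784, latent_dim=16, num_concepts=8, num_classes=10):
        super().__init__()
        # Encoder producing a mean and log-variance per latent factor
        # (a VAE-style head is one common way to encourage disentanglement).
        self.encoder = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        # Aggregation: a linear map from latent factors to concept scores.
        self.aggregator = nn.Linear(latent_dim, num_concepts)
        # Label predictor operating only on the aggregated concepts.
        self.classifier = nn.Linear(num_concepts, num_classes)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        concepts = torch.sigmoid(self.aggregator(z))             # concept activations
        logits = self.classifier(concepts)
        return logits, concepts, mu, logvar

model = DisentangledConceptModel()
logits, concepts, mu, logvar = model(torch.randn(4, 784))
print(concepts.shape)  # torch.Size([4, 8])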

Concept Bottleneck Models Without Predefined Concepts

S Schrodi, J Schur, M Argus, T Brox - arXiv preprint arXiv:2407.03921, 2024 - arxiv.org
There has been considerable recent interest in interpretable concept-based models such as
Concept Bottleneck Models (CBMs), which first predict human-interpretable concepts and …
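For readers unfamiliar with the CBM setup these snippets keep referring to, the two-stage structure (predict human-interpretable concepts, then predict the label from the concepts alone) can be summarized as below; the layer sizes and training targets are illustrative assumptions, not any particular paper's model.

import torch
import torch.nn as nn

class ConceptBottleneckModel(nn.Module):
    """Minimal CBM sketch: features -> concept predictions -> class prediction."""

    def __init__(self, input_dim=512, num_concepts=20, num_classes=10):
        super().__init__()
        self.concept_predictor = nn.Linear(input_dim, num_concepts)
        self.label_predictor = nn.Linear(num_concepts, num_classes)

    def forward(self, features):
        concept_logits = self.concept_predictor(features)
        concepts = torch.sigmoid(concept_logits)        # "is concept k present?"
        class_logits = self.label_predictor(concepts)   # label uses concepts only
        return class_logits, concepts

# Joint training typically combines a concept loss and a label loss.
model = ConceptBottleneckModel()
features = torch.randn(8, 512)                       # e.g., backbone features
concept_targets = torch.randint(0, 2, (8, 20)).float()
labels = torch.randint(0, 10, (8,))
class_logits, concepts = model(features)
loss = nn.functional.binary_cross_entropy(concepts, concept_targets) \
     + nn.functional.cross_entropy(class_logits, labels)
loss.backward()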

Improving Concept Alignment in Vision-Language Concept Bottleneck Models

NM Selvaraj, X Guo, AWK Kong, A Kot - arXiv preprint arXiv:2405.01825, 2024 - arxiv.org
Concept Bottleneck Models (CBM) map images to human-interpretable concepts before
making class predictions. Recent approaches automate CBM construction by prompting …
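Automated, vision-language CBM construction typically scores each image against text embeddings of candidate concepts and uses those similarity scores as the bottleneck. A hedged sketch of that pattern, with random tensors standing in for a real image/text encoder such as CLIP (the concept list, dimensions, and head are all assumptions):

import torch
import torch.nn.functional as F

# Stand-ins for embeddings from a vision-language encoder (e.g., CLIP).
# In practice these would come from the model's image and text towers.
image_embeddings = F.normalize(torch.randn(8, 512), dim=-1)        # 8 images
concept_texts = ["has wings", "has wheels", "is furry", "is metallic"]
concept_embeddings = F.normalize(torch.randn(len(concept_texts), 512), dim=-1)

# Concept scores = cosine similarity between image and concept text embeddings.
concept_scores = image_embeddings @ concept_embeddings.T            # shape (8, 4)

# A linear head on the concept scores plays the role of the bottleneck's
# label predictor; its weights show how each concept contributes to each class.
label_head = torch.nn.Linear(len(concept_texts), 10)
class_logits = label_head(concept_scores)
print(class_logits.shape)  # torch.Size([8, 10])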

Understanding Inter-Concept Relationships in Concept-Based Models

N Raman, ME Zarlenga, M Jamnik - arXiv preprint arXiv:2405.18217, 2024 - arxiv.org
Concept-based explainability methods provide insight into deep learning systems by
constructing explanations using human-understandable concepts. While the literature on …
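One simple way to probe inter-concept relationships, in the spirit of this snippet, is to measure how concept activations co-vary across a dataset; the correlation matrix below is a generic illustration with random activations, not the paper's actual analysis.

import torch

# Suppose a concept-based model produced these activations for 1000 inputs
# and 12 concepts (random here, purely for illustration).
concept_activations = torch.rand(1000, 12)

# Pearson correlation between every pair of concepts.
centered = concept_activations - concept_activations.mean(dim=0)
cov = centered.T @ centered / (centered.shape[0] - 1)
std = centered.std(dim=0, unbiased=True)
corr = cov / (std.unsqueeze(0) * std.unsqueeze(1))

# Highly correlated pairs suggest concepts the model treats as related.
pairs = [(i, j, corr[i, j].item()) for i in range(12) for j in range(i + 1, 12)]
print(sorted(pairs, key=lambda p: -abs(p[2]))[:5])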

Incorporating Expert Rules into Neural Networks in the Framework of Concept-Based Learning

AV Konstantinov, LV Utkin - arXiv preprint arXiv:2402.14726, 2024 - arxiv.org
The paper formulates the problem of incorporating expert rules into machine learning models to extend
concept-based learning. It proposes how to combine logical …
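A common way to combine logical rules with neural concept predictors is to turn each rule into a differentiable penalty on the concept probabilities. The sketch below encodes a hypothetical rule "if concept A then concept B" as a soft constraint; the rule, relaxation, and weighting are illustrative assumptions, not the method proposed in the paper.

import torch

def implication_penalty(p_a, p_b):
    """Soft penalty for the rule A -> B on concept probabilities.
    Product fuzzy-logic relaxation: the rule is violated to the
    degree that A is active while B is not."""
    return (p_a * (1.0 - p_b)).mean()

# Concept probabilities from some concept predictor (random here).
concept_probs = torch.rand(16, 5, requires_grad=True)
p_has_wings, p_can_fly = concept_probs[:, 0], concept_probs[:, 1]

rule_loss = implication_penalty(p_has_wings, p_can_fly)
# total_loss = task_loss + lambda_rule * rule_loss  (rule acts as a regularizer)
rule_loss.backward()
print(rule_loss.item())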

Adversarial Robustness in the Context of Concept-based Bottleneck Models

F Denk - 2024 - epub.jku.at
Carefully perturbed images, which trick well-generalizing neural networks into high-confidence
misclassifications, have gained considerable attention in recent years. A subfield …
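The "carefully perturbed images" referred to here are adversarial examples. A minimal FGSM-style perturbation against a toy concept predictor looks like the following; the toy model, targets, and epsilon are assumptions for illustration, not the thesis's experimental setup.

import torch
import torch.nn as nn

# Toy concept predictor standing in for a trained CBM's concept head.
concept_head = nn.Linear(784, 10)

x = torch.rand(1, 784, requires_grad=True)          # stand-in "image"
concept_targets = torch.randint(0, 2, (1, 10)).float()

# FGSM: one gradient-sign step that increases the concept prediction loss.
loss = nn.functional.binary_cross_entropy_with_logits(concept_head(x), concept_targets)
loss.backward()
epsilon = 0.03
x_adv = (x + epsilon * x.grad.sign()).clamp(0.0, 1.0).detach()

# The perturbed input can flip concept predictions and, through the
# bottleneck, the downstream class prediction.
print((torch.sigmoid(concept_head(x_adv)) - torch.sigmoid(concept_head(x))).abs().max())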

Advancing Model Explainability: Visual Concept Knowledge Distillation for Concept Bottleneck Model

JH Lee, TV Dang, NK Lee, IH Shin, JY Kim - Available at SSRN 4835782 - papers.ssrn.com
This study marks the first attempt to combine the concept bottleneck model
(CBM) with knowledge distillation (KD) to train lightweight and interpretable models …
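Combining a CBM with knowledge distillation usually means a lightweight student is trained to match the teacher's soft concept (and/or class) outputs in addition to the annotated targets. A rough sketch of a concept-level distillation loss, with hypothetical teacher/student modules and loss weighting (not the study's actual training recipe):

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher and student concept predictors (the teacher would
# normally be a large, already-trained CBM; both are random linear maps here).
teacher_concept_head = nn.Linear(512, 20)
student_concept_head = nn.Linear(512, 20)

features = torch.randn(8, 512)
concept_targets = torch.randint(0, 2, (8, 20)).float()

with torch.no_grad():
    teacher_probs = torch.sigmoid(teacher_concept_head(features))

student_logits = student_concept_head(features)
student_probs = torch.sigmoid(student_logits)

# Distillation term: match the teacher's soft concept scores.
# Supervised term: match the annotated concepts.
distill_loss = F.mse_loss(student_probs, teacher_probs)
supervised_loss = F.binary_cross_entropy_with_logits(student_logits, concept_targets)
loss = supervised_loss + 0.5 * distill_loss
loss.backward()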

Enhancing Explainability Through Visual Concept Knowledge Distillation on Concept Bottleneck Model

JH Lee, TV Dang, NK Lee, IH Shin, JY Kim - Available at SSRN 4820883 - papers.ssrn.com
This study presents the first attempt to integrate Concept Bottleneck Model (CBM) with
knowledge distillation (KD) to train lightweight and interpretable models. KD is a promising …