Why am I not seeing it? Understanding users' needs for counterfactual explanations in everyday recommendations
Proceedings of the 2022 ACM Conference on Fairness, Accountability, and …, 2022•dl.acm.org
Intelligent everyday applications typically rely on automated Recommender Systems (RS) to generate recommendations that help users make decisions among a large number of options. Due to the increasing complexity of RS and the lack of transparency in their algorithmic decision-making, researchers have recognized the need to support users with explanations. While many traditional Explainable AI methods fall short in disclosing the internal intricacy of recommender systems, counterfactual explanations offer many desirable explainable features by providing human-like explanations that contrast an existing recommendation with alternatives. However, there is a lack of empirical research on users’ needs for counterfactual explanations in their use of everyday intelligent applications. In this paper, we investigate whether and when to provide counterfactual explanations to support people’s decision-making with everyday recommendations through a question-driven approach. We conducted a preliminary survey study and an interview study to understand how existing explanations might be insufficient to support users and to elicit the triggers that prompt them to ask “why not” questions and seek additional explanations. The findings reveal that the utility of a decision is a primary factor that may affect users’ counterfactual information needs. We then conducted an online scenario-based survey to quantify the relationship between utility and explanation needs and found significant correlations between the measured variables.