Question-Based Explainability in Abstract Argumentation

Report (Research Report) Year: 2022

Abstract

This paper explores the definition of questions and the computation of explanations, both in general and in the specific context of abstract argumentation. We aim to 1) define a methodological way to generate questions asking for explanations in a given context, and 2) define explanations based on the questions they answer and on the context in which they are asked. Applied to abstract argumentation, the explanations we define are designed to be visual, in the sense that they take the form of subgraphs of the argumentation graph, which is part of the context of the questions they apply to. Moreover, these explanations rely on the modular aspects of abstract argumentation semantics and can consequently be either aggregated or decomposed. Finally, we investigate the adequacy of these explanations with respect to several desirable properties, with a particular focus on Grice's maxims of conversation.
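The idea of an explanation as a subgraph of the argumentation graph can be illustrated with a minimal sketch. This is not the report's definition: the framework representation and the choice of extracting all arguments linked to the queried argument by chains of attacks are illustrative assumptions.

```python
from collections import deque

def explanation_subgraph(args, attacks, target):
    """Sketch of a 'visual' explanation for `target`: the sub-framework of
    all arguments reachable from it by following attack edges backwards
    (its attackers, their attackers, and so on), with the attacks between them.
    Illustrative only; not the definition used in the report."""
    # Index attacks by their target so we can walk the graph backwards.
    incoming = {a: set() for a in args}
    for (src, dst) in attacks:
        incoming[dst].add(src)
    # Breadth-first traversal over attackers of the queried argument.
    seen, queue = {target}, deque([target])
    while queue:
        current = queue.popleft()
        for attacker in incoming[current]:
            if attacker not in seen:
                seen.add(attacker)
                queue.append(attacker)
    # Keep only the attacks whose endpoints both belong to the subgraph.
    sub_attacks = {(s, d) for (s, d) in attacks if s in seen and d in seen}
    return seen, sub_attacks

# Example: a -- attacked by b, b attacked by c; d attacks nothing relevant to a.
args = {"a", "b", "c", "d"}
attacks = {("b", "a"), ("c", "b")}
nodes, edges = explanation_subgraph(args, attacks, "a")
# nodes == {"a", "b", "c"}; edges == {("b", "a"), ("c", "b")}
```

Because the result is itself an argumentation framework, such subgraph explanations can be combined or split apart, which echoes the aggregation and decomposition discussed in the abstract.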
Main file: Rapport_IRIT_RR_2022_01_FR.pdf (798.16 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03647896 , version 1 (21-04-2022)

Identifiers

  • HAL Id : hal-03647896 , version 1

Cite

Philippe Besnard, Sylvie Doutre, Théo Duchatelle, Marie-Christine Lagasquie-Schiex. Question-Based Explainability in Abstract Argumentation. [Research Report] IRIT/RR--2022--01--FR, IRIT : Institut de Recherche en Informatique de Toulouse, France. 2022, pp.1-64. ⟨hal-03647896⟩
197 Views
77 Downloads
