Explainable Software Systems: From Requirements Analysis to System Evaluation

Publication: Journal article › Research › Peer-reviewed

Authors

  • Larissa Chazette
  • Wasja Brunotte
  • Timo Speith

Details

Original language: English
Pages (from–to): 457–487
Number of pages: 31
Journal: Requirements Engineering
Volume: 27
Issue number: 4
Early online date: 14 Nov 2022
Publication status: Published - Dec 2022

Abstract

The growing complexity of software systems and the influence of software-supported decisions in our society sparked the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. Accordingly, software engineers need means to assist them in incorporating this NFR into systems. This requires an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. However, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of artifacts that support the requirements engineering process and system design. In this work, we remedy this deficit by proposing four artifacts: a definition of explainability, a conceptual model, a knowledge catalogue, and a reference model for explainable systems. These artifacts should support software and requirements engineers in understanding the definition of explainability and how it interacts with other quality aspects. Besides that, they may be considered a starting point to provide practical value in the refinement of explainability from high-level requirements to concrete design choices, as well as on the identification of methods and metrics for the evaluation of the implemented requirements.

Cite

Explainable Software Systems: From Requirements Analysis to System Evaluation. / Chazette, Larissa; Brunotte, Wasja; Speith, Timo.
In: Requirements Engineering, Vol. 27, No. 4, 12.2022, p. 457-487.


Chazette L, Brunotte W, Speith T. Explainable Software Systems: From Requirements Analysis to System Evaluation. Requirements Engineering. 2022 Dec;27(4):457-487. Epub 2022 Nov 14. doi: 10.1007/s00766-022-00393-5
Chazette, Larissa ; Brunotte, Wasja ; Speith, Timo. / Explainable Software Systems: From Requirements Analysis to System Evaluation. In: Requirements Engineering. 2022 ; Vol. 27, No. 4. p. 457-487.
BibTeX
@article{df1fdf710e69470393dd75acfc30815f,
title = "Explainable Software Systems: From Requirements Analysis to System Evaluation",
abstract = "The growing complexity of software systems and the influence of software-supported decisions in our society sparked the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. Accordingly, software engineers need means to assist them in incorporating this NFR into systems. This requires an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. However, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of artifacts that support the requirements engineering process and system design. In this work, we remedy this deficit by proposing four artifacts: a definition of explainability, a conceptual model, a knowledge catalogue, and a reference model for explainable systems. These artifacts should support software and requirements engineers in understanding the definition of explainability and how it interacts with other quality aspects. Besides that, they may be considered a starting point to provide practical value in the refinement of explainability from high-level requirements to concrete design choices, as well as on the identification of methods and metrics for the evaluation of the implemented requirements.",
keywords = "Conceptual model, Explainability, Explainable artificial intelligence, Knowledge catalogue, Non-functional requirements, Quality aspects, Reference model",
author = "Larissa Chazette and Wasja Brunotte and Timo Speith",
note = "Funding Information: This work was supported by the research initiative Mobilise between the Technical University of Braunschweig and Leibniz University Hannover, funded by the Ministry for Science and Culture of Lower Saxony and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany{\textquoteright}s Excellence Strategy within the Cluster of Excellence PhoenixD (EXC 2122, Project ID 390833453). Work on this paper was also funded by the Volkswagen Foundation grants AZ 98509 and AZ 98514 “Explainable Intelligent Systems” (EIS) and by the DFG grant 389792660 as part of TRR 248. We thank Martin Glinz for his feedback on our research design and our colleague Nils Prenner for his feedback and the fruitful discussions. Furthermore, we thank all workshop participants, the anonymous reviewers, and the colleagues who gave feedback on our manuscript. Special gratitude goes to Dieter Belle, for his immeasurable help in staying organized.",
year = "2022",
month = dec,
doi = "10.1007/s00766-022-00393-5",
language = "English",
volume = "27",
pages = "457--487",
journal = "Requirements Engineering",
issn = "0947-3602",
publisher = "Springer London",
number = "4",

}

RIS

TY - JOUR

T1 - Explainable Software Systems: From Requirements Analysis to System Evaluation

AU - Chazette, Larissa

AU - Brunotte, Wasja

AU - Speith, Timo

N1 - Funding Information: This work was supported by the research initiative Mobilise between the Technical University of Braunschweig and Leibniz University Hannover, funded by the Ministry for Science and Culture of Lower Saxony and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy within the Cluster of Excellence PhoenixD (EXC 2122, Project ID 390833453). Work on this paper was also funded by the Volkswagen Foundation grants AZ 98509 and AZ 98514 “Explainable Intelligent Systems” (EIS) and by the DFG grant 389792660 as part of TRR 248. We thank Martin Glinz for his feedback on our research design and our colleague Nils Prenner for his feedback and the fruitful discussions. Furthermore, we thank all workshop participants, the anonymous reviewers, and the colleagues who gave feedback on our manuscript. Special gratitude goes to Dieter Belle, for his immeasurable help in staying organized.

PY - 2022/12

Y1 - 2022/12

N2 - The growing complexity of software systems and the influence of software-supported decisions in our society sparked the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. Accordingly, software engineers need means to assist them in incorporating this NFR into systems. This requires an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. However, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of artifacts that support the requirements engineering process and system design. In this work, we remedy this deficit by proposing four artifacts: a definition of explainability, a conceptual model, a knowledge catalogue, and a reference model for explainable systems. These artifacts should support software and requirements engineers in understanding the definition of explainability and how it interacts with other quality aspects. Besides that, they may be considered a starting point to provide practical value in the refinement of explainability from high-level requirements to concrete design choices, as well as on the identification of methods and metrics for the evaluation of the implemented requirements.

AB - The growing complexity of software systems and the influence of software-supported decisions in our society sparked the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. Accordingly, software engineers need means to assist them in incorporating this NFR into systems. This requires an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. However, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of artifacts that support the requirements engineering process and system design. In this work, we remedy this deficit by proposing four artifacts: a definition of explainability, a conceptual model, a knowledge catalogue, and a reference model for explainable systems. These artifacts should support software and requirements engineers in understanding the definition of explainability and how it interacts with other quality aspects. Besides that, they may be considered a starting point to provide practical value in the refinement of explainability from high-level requirements to concrete design choices, as well as on the identification of methods and metrics for the evaluation of the implemented requirements.

KW - Conceptual model

KW - Explainability

KW - Explainable artificial intelligence

KW - Knowledge catalogue

KW - Non-functional requirements

KW - Quality aspects

KW - Reference model

UR - http://www.scopus.com/inward/record.url?scp=85141989495&partnerID=8YFLogxK

U2 - 10.1007/s00766-022-00393-5

DO - 10.1007/s00766-022-00393-5

M3 - Article

VL - 27

SP - 457

EP - 487

JO - Requirements Engineering

JF - Requirements Engineering

SN - 0947-3602

IS - 4

ER -