Automatic Generation of Explainability Requirements and Software Explanations From User Reviews

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authorship

Organisational units

External organisations

  • Fachhochschule für die Wirtschaft (FHDW) Hannover
  • Fortiss GmbH
  • Phoenix Contact GmbH & Co. KG

Details

Original language: English
Title of host publication: Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 49-58
Number of pages: 10
ISBN (electronic): 9798331538347
ISBN (print): 979-8-3315-3835-4
Publication status: Published - 1 Sept. 2025
Event: 33rd IEEE International Requirements Engineering Conference Workshops, REW 2025 - Valencia, Spain
Duration: 1 Sept. 2025 - 5 Sept. 2025

Publication series

Name: Proceedings - IEEE International Requirements Engineering Conference Workshops
ISSN (print): 2770-6826
ISSN (electronic): 2770-6834

Abstract

Explainability has become a crucial non-functional requirement to enhance transparency, build user trust, and ensure regulatory compliance. However, translating explanation needs expressed in user feedback into structured requirements and corresponding explanations remains challenging. While existing methods can identify explanation-related concerns in user reviews, there is no established approach for systematically deriving requirements and generating aligned explanations. To contribute toward addressing this gap, we introduce a tool-supported approach that automates this process. To evaluate its effectiveness, we collaborated with an industrial automation manufacturer to create a dataset of 58 user reviews, each annotated with manually crafted explainability requirements and explanations. Our evaluation shows that while AI-generated requirements often lack relevance and correctness compared to human-created ones, the AI-generated explanations are frequently preferred for their clarity and style. Nonetheless, correctness remains an issue, highlighting the importance of human validation. This work contributes to the advancement of explainability requirements in software systems by (1) introducing an automated approach to derive requirements from user reviews and generate corresponding explanations, (2) providing empirical insights into the strengths and limitations of automatically generated artifacts, and (3) releasing a curated dataset to support future research on the automatic generation of explainability requirements.

ASJC Scopus subject areas

Cite

Automatic Generation of Explainability Requirements and Software Explanations From User Reviews. / Obaidi, Martin; Droste, Jakob; Deters, Hannah et al.
Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025. Institute of Electrical and Electronics Engineers Inc., 2025. pp. 49-58 (Proceedings - IEEE International Requirements Engineering Conference Workshops).


Obaidi, M, Droste, J, Deters, H, Herrmann, M, Schneider, K, Klünder, J, Villamizar, H, Fischbach, J & Krätzig, S 2025, Automatic Generation of Explainability Requirements and Software Explanations From User Reviews. in Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025. Proceedings - IEEE International Requirements Engineering Conference Workshops, Institute of Electrical and Electronics Engineers Inc., pp. 49-58, 33rd IEEE International Requirements Engineering Conference Workshops, REW 2025, Valencia, Spain, 1 Sept. 2025. https://doi.org/10.1109/REW66121.2025.00011
Obaidi, M., Droste, J., Deters, H., Herrmann, M., Schneider, K., Klünder, J., Villamizar, H., Fischbach, J., & Krätzig, S. (2025). Automatic Generation of Explainability Requirements and Software Explanations From User Reviews. In Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025 (pp. 49-58). (Proceedings - IEEE International Requirements Engineering Conference Workshops). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/REW66121.2025.00011
Obaidi M, Droste J, Deters H, Herrmann M, Schneider K, Klünder J et al. Automatic Generation of Explainability Requirements and Software Explanations From User Reviews. In: Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025. Institute of Electrical and Electronics Engineers Inc. 2025. pp. 49-58. (Proceedings - IEEE International Requirements Engineering Conference Workshops). doi: 10.1109/REW66121.2025.00011
Obaidi, Martin ; Droste, Jakob ; Deters, Hannah et al. / Automatic Generation of Explainability Requirements and Software Explanations From User Reviews. Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025. Institute of Electrical and Electronics Engineers Inc., 2025. pp. 49-58 (Proceedings - IEEE International Requirements Engineering Conference Workshops).
@inproceedings{d3e225ea7abc4b8ab14b125196ddc39a,
title = "Automatic Generation of Explainability Requirements and Software Explanations From User Reviews",
abstract = "Explainability has become a crucial non-functional requirement to enhance transparency, build user trust, and ensure regulatory compliance. However, translating explanation needs expressed in user feedback into structured requirements and corresponding explanations remains challenging. While existing methods can identify explanation-related concerns in user reviews, there is no established approach for systematically deriving requirements and generating aligned explanations. To contribute toward addressing this gap, we introduce a tool-supported approach that automates this process. To evaluate its effectiveness, we collaborated with an industrial automation manufacturer to create a dataset of 58 user reviews, each annotated with manually crafted explainability requirements and explanations. Our evaluation shows that while AI-generated requirements often lack relevance and correctness compared to human-created ones, the AI-generated explanations are frequently preferred for their clarity and style. Nonetheless, correctness remains an issue, highlighting the importance of human validation. This work contributes to the advancement of explainability requirements in software systems by (1) introducing an automated approach to derive requirements from user reviews and generate corresponding explanations, (2) providing empirical insights into the strengths and limitations of automatically generated artifacts, and (3) releasing a curated dataset to support future research on the automatic generation of explainability requirements.",
keywords = "app reviews, explainability, large language models, requirements engineering, user feedback",
author = "Martin Obaidi and Jakob Droste and Hannah Deters and Marc Herrmann and Kurt Schneider and Jil Kl{\"u}nder and Hugo Villamizar and Jannik Fischbach and Steffen Kr{\"a}tzig",
note = "Publisher Copyright: {\textcopyright} 2025 IEEE.; 33rd IEEE International Requirements Engineering Conference Workshops, REW 2025 ; Conference date: 01-09-2025 Through 05-09-2025",
year = "2025",
month = sep,
day = "1",
doi = "10.1109/REW66121.2025.00011",
language = "English",
isbn = "979-8-3315-3835-4",
series = "Proceedings - IEEE International Requirements Engineering Conference Workshops",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "49--58",
booktitle = "Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025",
address = "United States",

}


TY - GEN

T1 - Automatic Generation of Explainability Requirements and Software Explanations From User Reviews

AU - Obaidi, Martin

AU - Droste, Jakob

AU - Deters, Hannah

AU - Herrmann, Marc

AU - Schneider, Kurt

AU - Klünder, Jil

AU - Villamizar, Hugo

AU - Fischbach, Jannik

AU - Krätzig, Steffen

N1 - Publisher Copyright: © 2025 IEEE.

PY - 2025/9/1

Y1 - 2025/9/1

N2 - Explainability has become a crucial non-functional requirement to enhance transparency, build user trust, and ensure regulatory compliance. However, translating explanation needs expressed in user feedback into structured requirements and corresponding explanations remains challenging. While existing methods can identify explanation-related concerns in user reviews, there is no established approach for systematically deriving requirements and generating aligned explanations. To contribute toward addressing this gap, we introduce a tool-supported approach that automates this process. To evaluate its effectiveness, we collaborated with an industrial automation manufacturer to create a dataset of 58 user reviews, each annotated with manually crafted explainability requirements and explanations. Our evaluation shows that while AI-generated requirements often lack relevance and correctness compared to human-created ones, the AI-generated explanations are frequently preferred for their clarity and style. Nonetheless, correctness remains an issue, highlighting the importance of human validation. This work contributes to the advancement of explainability requirements in software systems by (1) introducing an automated approach to derive requirements from user reviews and generate corresponding explanations, (2) providing empirical insights into the strengths and limitations of automatically generated artifacts, and (3) releasing a curated dataset to support future research on the automatic generation of explainability requirements.

AB - Explainability has become a crucial non-functional requirement to enhance transparency, build user trust, and ensure regulatory compliance. However, translating explanation needs expressed in user feedback into structured requirements and corresponding explanations remains challenging. While existing methods can identify explanation-related concerns in user reviews, there is no established approach for systematically deriving requirements and generating aligned explanations. To contribute toward addressing this gap, we introduce a tool-supported approach that automates this process. To evaluate its effectiveness, we collaborated with an industrial automation manufacturer to create a dataset of 58 user reviews, each annotated with manually crafted explainability requirements and explanations. Our evaluation shows that while AI-generated requirements often lack relevance and correctness compared to human-created ones, the AI-generated explanations are frequently preferred for their clarity and style. Nonetheless, correctness remains an issue, highlighting the importance of human validation. This work contributes to the advancement of explainability requirements in software systems by (1) introducing an automated approach to derive requirements from user reviews and generate corresponding explanations, (2) providing empirical insights into the strengths and limitations of automatically generated artifacts, and (3) releasing a curated dataset to support future research on the automatic generation of explainability requirements.

KW - app reviews

KW - explainability

KW - large language models

KW - requirements engineering

KW - user feedback

UR - http://www.scopus.com/inward/record.url?scp=105020976842&partnerID=8YFLogxK

U2 - 10.1109/REW66121.2025.00011

DO - 10.1109/REW66121.2025.00011

M3 - Conference contribution

AN - SCOPUS:105020976842

SN - 979-8-3315-3835-4

T3 - Proceedings - IEEE International Requirements Engineering Conference Workshops

SP - 49

EP - 58

BT - Proceedings - 2025 IEEE 33rd International Requirements Engineering Conference Workshops, REW 2025

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 33rd IEEE International Requirements Engineering Conference Workshops, REW 2025

Y2 - 1 September 2025 through 5 September 2025

ER -
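The RIS export above follows the standard tagged layout: each line is a two-character tag, a separator (`  - `), and a value, with repeatable tags such as `AU` and `KW` appearing once per entry and `ER` closing the record. A minimal, illustrative way to read such a record into a dictionary (assuming single-line values only; continuation lines and multi-record files are not handled in this sketch):

```python
import re

# One RIS line: two-character tag, whitespace, hyphen, optional space, value.
RIS_LINE = re.compile(r"^([A-Z][A-Z0-9])\s+-\s?(.*)$")
# Tags that may occur multiple times per record and are collected into lists.
REPEATABLE = {"AU", "KW", "N1"}

def parse_ris(text):
    """Parse a single RIS record into a dict; stops at the ER terminator."""
    record = {}
    for line in text.splitlines():
        m = RIS_LINE.match(line.strip())
        if not m:
            continue  # skip blank lines and non-RIS debris
        tag, value = m.group(1), m.group(2).strip()
        if tag == "ER":  # end-of-record marker
            break
        if tag in REPEATABLE:
            record.setdefault(tag, []).append(value)
        else:
            record[tag] = value
    return record

# A few lines from the record above, used as sample input.
sample = """TY  - GEN
T1  - Automatic Generation of Explainability Requirements and Software Explanations From User Reviews
AU  - Obaidi, Martin
AU  - Droste, Jakob
DO  - 10.1109/REW66121.2025.00011
SP  - 49
EP  - 58
ER  -
"""
rec = parse_ris(sample)
```

After parsing, `rec["AU"]` holds the author list and scalar tags such as `DO`, `SP`, and `EP` map to single strings; a full implementation would also handle wrapped continuation lines and files containing several records.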
