Details
Original language | English |
---|---|
Title of host publication | 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) |
Editors | Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 102-111 |
Number of pages | 10 |
ISBN (electronic) | 9798350326918 |
ISBN (print) | 979-8-3503-2692-5 |
Publication status | Published - 2023 |
Event | 31st IEEE International Requirements Engineering Conference Workshops, REW 2023 - Hannover, Germany. Duration: 4 Sept 2023 → 5 Sept 2023. Conference number: 31 |
Publication series
Name | IEEE International Requirements Engineering Conference Workshops |
---|---|
ISSN (print) | 2770-6826 |
ISSN (electronic) | 2770-6834 |
Abstract
Explainability, i.e., the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on which situations and types of behavior should be explained, and there is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
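The detection task the abstract describes is a binary text classification of app reviews, evaluated with a weighted F-score. As a rough illustration only (this is not the paper's actual pipeline, and the toy reviews and labels below are invented), a baseline of this kind can be sketched with scikit-learn's TF-IDF features and logistic regression:

```python
# Hypothetical sketch of the task setup: classify app reviews as
# expressing an Explanation Need (1) or not (0), then report the
# weighted F-score on held-out reviews. Data below is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

train_reviews = [
    "Why did the app suddenly log me out? No idea what happened.",
    "I don't understand why my upload failed, there was no error message.",
    "What does this new icon even mean?",
    "Great app, works perfectly for me.",
    "Love the new design, five stars.",
    "Fast and reliable, no complaints.",
]
train_labels = [1, 1, 1, 0, 0, 0]  # 1 = Explanation Need

# TF-IDF bag-of-words features feeding a logistic regression classifier
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_reviews, train_labels)

test_reviews = [
    "Why was my account blocked without any explanation?",
    "Works great, very happy with it.",
]
test_labels = [1, 0]
pred = clf.predict(test_reviews)

# Weighted F-score: per-class F1 averaged by class support,
# the same metric family the abstract reports (86%).
score = f1_score(test_labels, pred, average="weighted")
print("weighted F1:", score)
```

The paper compares several approaches; this sketch only shows the shape of the task and the evaluation metric, not the method that achieved the reported 86%.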
Keywords
- Explainability, NLP, Requirements
ASJC Scopus subject areas
- Business, Management and Accounting(all)
- Organizational Behavior and Human Resource Management
- Computer Science(all)
- Software
- Engineering(all)
- Safety, Risk, Reliability and Quality
- Psychology(all)
- Developmental and Educational Psychology
- Social Sciences(all)
- Education
Cite this
Unterbusch, M., Sadeghi, M., Fischbach, J., Obaidi, M., & Vogelsang, A. Explanation Needs in App Reviews: Taxonomy and Automated Detection. In: 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. p. 102-111 (IEEE International Requirements Engineering Conference Workshops).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Explanation Needs in App Reviews: Taxonomy and Automated Detection
AU - Unterbusch, Max
AU - Sadeghi, Mersedeh
AU - Fischbach, Jannik
AU - Obaidi, Martin
AU - Vogelsang, Andreas
N1 - Conference code: 31
PY - 2023
Y1 - 2023
N2 - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
AB - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
KW - Explainability
KW - NLP
KW - Requirements
UR - http://www.scopus.com/inward/record.url?scp=85174711553&partnerID=8YFLogxK
U2 - 10.48550/arXiv.2307.04367
DO - 10.48550/arXiv.2307.04367
M3 - Conference contribution
AN - SCOPUS:85174711553
SN - 979-8-3503-2692-5
T3 - IEEE International Requirements Engineering Conference Workshops
SP - 102
EP - 111
BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
A2 - Schneider, Kurt
A2 - Dalpiaz, Fabiano
A2 - Horkoff, Jennifer
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023
Y2 - 4 September 2023 through 5 September 2023
ER -