Explanation Needs in App Reviews: Taxonomy and Automated Detection

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Max Unterbusch
  • Mersedeh Sadeghi
  • Jannik Fischbach
  • Martin Obaidi
  • Andreas Vogelsang

Research Organisations

External Research Organisations

  • University of Cologne
  • Fortiss GmbH

Details

Original language: English
Title of host publication: 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)
Editors: Kurt Schneider, Fabiano Dalpiaz, Jennifer Horkoff
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 102-111
Number of pages: 10
ISBN (electronic): 979-8-3503-2691-8
ISBN (print): 979-8-3503-2692-5
Publication status: Published - 2023
Event: 31st IEEE International Requirements Engineering Conference Workshops, REW 2023 - Hannover, Germany
Duration: 4 Sept 2023 - 5 Sept 2023
Conference number: 31

Publication series

Name: IEEE International Requirements Engineering Conference Workshops
ISSN (print): 2770-6826
ISSN (electronic): 2770-6834

Abstract

Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.
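For readers who want to prototype a detector like the one described in the abstract, the sketch below shows one plausible baseline: a TF-IDF plus logistic regression pipeline evaluated with the weighted F-score, the metric the abstract reports. This is not the authors' classifier; the example reviews, labels, and pipeline settings are illustrative assumptions only, and the actual approaches and best model are described in the paper itself.

# Illustrative baseline only (assumption), not the classifier from the paper.
# Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

# Hypothetical labeled app reviews: 1 = expresses an Explanation Need, 0 = does not.
train_texts = [
    "Why did the app suddenly log me out? No error message, nothing.",
    "What does the new smart sync toggle actually do to my files?",
    "Great app, fast and reliable, five stars.",
    "Love the dark mode update, looks fantastic.",
]
train_labels = [1, 1, 0, 0]
test_texts = [
    "The app drains my battery and I have no idea why.",
    "Works perfectly on my new phone.",
]
test_labels = [1, 0]

# Build and train the pipeline: word/bigram TF-IDF features feeding a logistic regression.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), lowercase=True),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_texts, train_labels)

# Evaluate with the weighted F-score, i.e. the per-class F1 averaged by class support.
pred = clf.predict(test_texts)
print("weighted F1:", f1_score(test_labels, pred, average="weighted"))

A real replication would train on the full set of coded reviews and compare such a baseline against stronger models before reporting results.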

Keywords

    Explainability, NLP, Requirements

Cite this

Explanation Needs in App Reviews: Taxonomy and Automated Detection. / Unterbusch, Max; Sadeghi, Mersedeh; Fischbach, Jannik et al.
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). ed. / Kurt Schneider; Fabiano Dalpiaz; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. p. 102-111 (IEEE International Requirements Engineering Conference Workshops).

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Unterbusch, M, Sadeghi, M, Fischbach, J, Obaidi, M & Vogelsang, A 2023, Explanation Needs in App Reviews: Taxonomy and Automated Detection. in K Schneider, F Dalpiaz & J Horkoff (eds), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). IEEE International Requirements Engineering Conference Workshops, Institute of Electrical and Electronics Engineers Inc., pp. 102-111, 31st IEEE International Requirements Engineering Conference Workshops, REW 2023, Hannover, Lower Saxony, Germany, 4 Sept 2023. https://doi.org/10.48550/arXiv.2307.04367, https://doi.org/10.1109/REW57809.2023.00024
Unterbusch, M., Sadeghi, M., Fischbach, J., Obaidi, M., & Vogelsang, A. (2023). Explanation Needs in App Reviews: Taxonomy and Automated Detection. In K. Schneider, F. Dalpiaz, & J. Horkoff (Eds.), 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW) (pp. 102-111). (IEEE International Requirements Engineering Conference Workshops). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.48550/arXiv.2307.04367, https://doi.org/10.1109/REW57809.2023.00024
Unterbusch M, Sadeghi M, Fischbach J, Obaidi M, Vogelsang A. Explanation Needs in App Reviews: Taxonomy and Automated Detection. In Schneider K, Dalpiaz F, Horkoff J, editors, 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). Institute of Electrical and Electronics Engineers Inc. 2023. p. 102-111. (IEEE International Requirements Engineering Conference Workshops). doi: 10.48550/arXiv.2307.04367, 10.1109/REW57809.2023.00024
Unterbusch, Max ; Sadeghi, Mersedeh ; Fischbach, Jannik et al. / Explanation Needs in App Reviews: Taxonomy and Automated Detection. 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW). editor / Kurt Schneider ; Fabiano Dalpiaz ; Jennifer Horkoff. Institute of Electrical and Electronics Engineers Inc., 2023. pp. 102-111 (IEEE International Requirements Engineering Conference Workshops).
@inproceedings{f9623cacde0448839b8a97d9526a0f07,
title = "Explanation Needs in App Reviews: Taxonomy and Automated Detection",
abstract = "Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.",
keywords = "Explainability, NLP, Requirements",
author = "Max Unterbusch and Mersedeh Sadeghi and Jannik Fischbach and Martin Obaidi and Andreas Vogelsang",
note = "Funding Information: ACKNOWLEDGEMENTS This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Grant No.: 470146331, project softXplain (2022-2025).; 31st IEEE International Requirements Engineering Conference Workshops, REW 2023 ; Conference date: 04-09-2023 Through 05-09-2023",
year = "2023",
doi = "10.48550/arXiv.2307.04367",
language = "English",
isbn = "979-8-3503-2692-5",
series = "IEEE International Requirements Engineering Conference Workshops",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "102--111",
editor = "Kurt Schneider and Fabiano Dalpiaz and Jennifer Horkoff",
booktitle = "2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)",
address = "United States",

}


TY - GEN

T1 - Explanation Needs in App Reviews: Taxonomy and Automated Detection

AU - Unterbusch, Max

AU - Sadeghi, Mersedeh

AU - Fischbach, Jannik

AU - Obaidi, Martin

AU - Vogelsang, Andreas

N1 - Conference code: 31

PY - 2023

Y1 - 2023

N2 - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.

AB - Explainability, i.e. the ability of a system to explain its behavior to users, has become an important quality of software-intensive systems. Recent work has focused on methods for generating explanations for various algorithmic paradigms (e.g., machine learning, self-adaptive systems). There is relatively little work on what situations and types of behavior should be explained. There is also a lack of support for eliciting explainability requirements. In this work, we explore the need for explanation expressed by users in app reviews. We manually coded a set of 1,730 app reviews from 8 apps and derived a taxonomy of Explanation Needs. We also explore several approaches to automatically identify Explanation Needs in app reviews. Our best classifier identifies Explanation Needs in 486 unseen reviews of 4 different apps with a weighted F-score of 86%. Our work contributes to a better understanding of users' Explanation Needs. Automated tools can help engineers focus on these needs and ultimately elicit valid Explanation Needs.

KW - Explainability

KW - NLP

KW - Requirements

UR - http://www.scopus.com/inward/record.url?scp=85174711553&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2307.04367

DO - 10.48550/arXiv.2307.04367

M3 - Conference contribution

AN - SCOPUS:85174711553

SN - 979-8-3503-2692-5

T3 - IEEE International Requirements Engineering Conference Workshops

SP - 102

EP - 111

BT - 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW)

A2 - Schneider, Kurt

A2 - Dalpiaz, Fabiano

A2 - Horkoff, Jennifer

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 31st IEEE International Requirements Engineering Conference Workshops, REW 2023

Y2 - 4 September 2023 through 5 September 2023

ER -
