Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer-reviewed

Authors

  • Lorena Poenaru-Olaru
  • Luis Miranda da Cruz
  • Jan S. Rellermeyer
  • Arie van Deursen

External Research Organisations

  • Delft University of Technology (TU Delft)

Details

Original language: English
Title of host publication: Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-8
Number of pages: 8
ISBN (electronic): 979-8-3315-0190-7
ISBN (print): 979-8-3315-0191-4
Publication status: Published - 3 May 2025
Event: 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025 - Ottawa, Canada
Duration: 3 May 2025 - 3 May 2025

Abstract

Failure prediction models can be significantly beneficial for managing large-scale complex software systems, but their trustworthiness is severely affected by changes in the data over time, also known as concept drift. Thus, monitoring these models against concept drift and retraining them when the data changes becomes crucial in designing reliable failure prediction models. In this work, we evaluate the effects of monitoring failure prediction models over time using label-independent (unsupervised) drift detectors. We show that retraining based on unsupervised drift detectors instead of periodically reduces the cost of acquiring true labels without compromising accuracy. Furthermore, we propose a novel feature reduction for unsupervised drift detectors and an evaluation pipeline that practitioners can employ to select the most suitable unsupervised drift detector for their application.
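The record does not name the specific drift detectors the paper evaluates. As an illustration of the label-independent monitoring idea the abstract describes, the sketch below (our own example, not taken from the paper) flags drift by comparing a model's recent input-feature window against a reference window with a two-sample Kolmogorov-Smirnov test, implemented in pure Python:

```python
import math

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDFs of the two samples."""
    a = sorted(sample_a)
    b = sorted(sample_b)
    n, m = len(a), len(b)
    d = 0.0
    i = j = 0
    for x in sorted(set(a) | set(b)):
        # Advance each index past all values <= x, so i/n and j/m
        # are the empirical CDFs evaluated at x.
        while i < n and a[i] <= x:
            i += 1
        while j < m and b[j] <= x:
            j += 1
        d = max(d, abs(i / n - j / m))
    return d

def drift_detected(reference, window, alpha=0.05):
    """Flag drift when the KS statistic exceeds the asymptotic
    critical value for significance level alpha; no true labels
    are needed, only the feature values themselves."""
    n, m = len(reference), len(window)
    c_alpha = math.sqrt(-0.5 * math.log(alpha / 2))
    threshold = c_alpha * math.sqrt((n + m) / (n * m))
    return ks_statistic(reference, window) > threshold

# Identical windows produce no alarm; a shifted window is flagged.
reference = [i / 100 for i in range(100)]
print(drift_detected(reference, reference))                      # False
print(drift_detected(reference, [x + 1.0 for x in reference]))   # True
```

In a monitoring pipeline, an alarm from such a detector would trigger label acquisition and retraining, rather than retraining on a fixed schedule, which is the cost saving the abstract refers to.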

Keywords

    concept drift, concept drift detection, failure prediction, machine learning monitoring


Cite this

Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring. / Poenaru-Olaru, Lorena; Miranda da Cruz, Luis; Rellermeyer, Jan S. et al.
Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025. Institute of Electrical and Electronics Engineers Inc., 2025. p. 1-8.


Poenaru-Olaru, L, Miranda da Cruz, L, Rellermeyer, JS & van Deursen, A 2025, Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring. in Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025. Institute of Electrical and Electronics Engineers Inc., pp. 1-8, 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025, Ottawa, Ontario, Canada, 3 May 2025. https://doi.org/10.1109/DeepTest66595.2025.00006
Poenaru-Olaru, L., Miranda da Cruz, L., Rellermeyer, J. S., & van Deursen, A. (2025). Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring. In Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025 (pp. 1-8). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/DeepTest66595.2025.00006
Poenaru-Olaru L, Miranda da Cruz L, Rellermeyer JS, van Deursen A. Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring. In Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025. Institute of Electrical and Electronics Engineers Inc. 2025. p. 1-8 doi: 10.1109/DeepTest66595.2025.00006
Poenaru-Olaru, Lorena ; Miranda da Cruz, Luis ; Rellermeyer, Jan S. et al. / Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring. Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025. Institute of Electrical and Electronics Engineers Inc., 2025. pp. 1-8
@inproceedings{ecc0f9bf14ce49e18b0887bdce35d53d,
title = "Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring",
abstract = "Failure prediction models can be significantly beneficial for managing large-scale complex software systems, but their trustworthiness is severely affected by changes in the data over time, also known as concept drift. Thus, monitoring these models against concept drift and retraining them when the data changes becomes crucial in designing reliable failure prediction models. In this work, we evaluate the effects of monitoring failure prediction models over time using label-independent (unsupervised) drift detectors. We show that retraining based on unsupervised drift detectors instead of periodically reduces the cost of acquiring true labels without compromising accuracy. Furthermore, we propose a novel feature reduction for unsupervised drift detectors and an evaluation pipeline that practitioners can employ to select the most suitable unsupervised drift detector for their application.",
keywords = "concept drift, concept drift detection, failure prediction, machine learning monitoring",
author = "Lorena Poenaru-Olaru and {Miranda da Cruz}, Luis and Rellermeyer, {Jan S.} and {van Deursen}, Arie",
note = "Publisher Copyright: {\textcopyright} 2025 IEEE.; 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025, DeepTest 2025 ; Conference date: 03-05-2025 Through 03-05-2025",
year = "2025",
month = may,
day = "3",
doi = "10.1109/DeepTest66595.2025.00006",
language = "English",
isbn = "979-8-3315-0191-4",
pages = "1--8",
booktitle = "Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",
}


TY - GEN

T1 - Improving the Reliability of Failure Prediction Models through Concept Drift Monitoring

AU - Poenaru-Olaru, Lorena

AU - Miranda da Cruz, Luis

AU - Rellermeyer, Jan S.

AU - van Deursen, Arie

N1 - Publisher Copyright: © 2025 IEEE.

PY - 2025/5/3

Y1 - 2025/5/3

N2 - Failure prediction models can be significantly beneficial for managing large-scale complex software systems, but their trustworthiness is severely affected by changes in the data over time, also known as concept drift. Thus, monitoring these models against concept drift and retraining them when the data changes becomes crucial in designing reliable failure prediction models. In this work, we evaluate the effects of monitoring failure prediction models over time using label-independent (unsupervised) drift detectors. We show that retraining based on unsupervised drift detectors instead of periodically reduces the cost of acquiring true labels without compromising accuracy. Furthermore, we propose a novel feature reduction for unsupervised drift detectors and an evaluation pipeline that practitioners can employ to select the most suitable unsupervised drift detector for their application.

AB - Failure prediction models can be significantly beneficial for managing large-scale complex software systems, but their trustworthiness is severely affected by changes in the data over time, also known as concept drift. Thus, monitoring these models against concept drift and retraining them when the data changes becomes crucial in designing reliable failure prediction models. In this work, we evaluate the effects of monitoring failure prediction models over time using label-independent (unsupervised) drift detectors. We show that retraining based on unsupervised drift detectors instead of periodically reduces the cost of acquiring true labels without compromising accuracy. Furthermore, we propose a novel feature reduction for unsupervised drift detectors and an evaluation pipeline that practitioners can employ to select the most suitable unsupervised drift detector for their application.

KW - concept drift

KW - concept drift detection

KW - failure prediction

KW - machine learning monitoring

UR - http://www.scopus.com/inward/record.url?scp=105009125791&partnerID=8YFLogxK

U2 - 10.1109/DeepTest66595.2025.00006

DO - 10.1109/DeepTest66595.2025.00006

M3 - Conference contribution

AN - SCOPUS:105009125791

SN - 979-8-3315-0191-4

SP - 1

EP - 8

BT - Proceedings - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2025 IEEE/ACM International Workshop on Deep Learning for Testing and Testing for Deep Learning, DeepTest 2025

Y2 - 3 May 2025 through 3 May 2025

ER -
