An effective single-model learning for multi-label data

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Sajjad Kamali Siahroudi
  • Daniel Kudenko

Details

Original language: English
Article number: 120887
Journal: Expert systems with applications
Volume: 232
Early online date: 24 Jun 2023
Publication status: Published - 1 Dec 2023

Abstract

Multi-label classification (MLC) has become an increasingly active research area over the past decade. MLC refers to classification problems in which each instance can be associated with more than one class label. Capturing the correlations among labels and handling label imbalance are the main challenges in MLC. Problem transformation is one of the best-known approaches in this area and has become the de facto approach to MLC. Existing methods of this kind treat MLC as a collection of single-label tasks and solve each task separately. To account for label correlations, some of them treat each combination of labels that appears in the training data as a separate label. The main drawback of such methods is their model complexity, which makes them impractical for real-world applications. In this paper, we show how MLC can be tackled efficiently and effectively with a single classifier. Our proposed method maps the training data into a new sub-space for each label, then pools all the mapped data and efficiently trains a single classifier for all labels. Experimental results show that our method successfully handles MLC tasks and outperforms state-of-the-art methods.
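
The abstract describes the method only at a high level: map the instances into a label-specific sub-space, pool the mapped instances across all labels, and fit one classifier on the pooled set. The sketch below is a minimal toy illustration of that pipeline, not the paper's implementation: the random projections, the logistic-regression model, and all variable names are assumptions made here for illustration (the paper learns its label-specific mappings, presumably with the deep-learning machinery suggested by the keywords).

# Toy sketch of the single-model idea from the abstract (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n, d, L, k = 200, 20, 5, 10            # instances, features, labels, sub-space dim
X = rng.normal(size=(n, d))            # toy feature matrix
Y = (rng.random(size=(n, L)) < 0.3).astype(int)  # toy binary label matrix

# One mapping per label; random projections stand in for the learned mappings.
P = [rng.normal(size=(d, k)) for _ in range(L)]

# Pool every instance once per label, each time in that label's sub-space,
# with the corresponding binary relevance target.
X_pool = np.vstack([X @ P[j] for j in range(L)])       # shape (n * L, k)
y_pool = np.concatenate([Y[:, j] for j in range(L)])    # shape (n * L,)

# A single classifier is trained on the pooled data for all labels together.
clf = LogisticRegression(max_iter=1000).fit(X_pool, y_pool)

# Prediction: score a new instance once per label, in that label's sub-space.
x_new = rng.normal(size=(1, d))
scores = np.array([clf.predict_proba(x_new @ P[j])[0, 1] for j in range(L)])
print("per-label relevance scores:", np.round(scores, 3))
print("predicted label set:", np.flatnonzero(scores >= 0.5))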

Keywords

    Deep learning, Imbalanced data, Multi-label learning

Cite this

An effective single-model learning for multi-label data. / Siahroudi, Sajjad Kamali; Kudenko, Daniel.
In: Expert systems with applications, Vol. 232, 120887, 01.12.2023.

Siahroudi SK, Kudenko D. An effective single-model learning for multi-label data. Expert systems with applications. 2023 Dec 1;232:120887. Epub 2023 Jun 24. doi: 10.1016/j.eswa.2023.120887
Siahroudi, Sajjad Kamali ; Kudenko, Daniel. / An effective single-model learning for multi-label data. In: Expert systems with applications. 2023 ; Vol. 232.
BibTeX
@article{e6a8d9fe4ec74695a80804b93aab9afb,
title = "An effective single-model learning for multi-label data",
abstract = "Multi-label data classification (MLC) has become an increasingly active research area over the past decade. MLC refers to a classification problem where each instance can be associated with more than one class label. Capturing the correlation among labels and tackling the label imbalance are the main challenges in MLC. Problem transformation is one of the well-known approaches in this area that became the de-facto approach for MLC. Existing methods in this approach consider MLC as a collection of single-label tasks and solve each of them separately. To consider correlation among labels, some of them consider the combination of labels that appear in the training data as a separate label. The main drawback of these kinds of methods is the complexity of the model, which makes them not applicable in real-world applications. In this paper, we show how MLC can be efficiently and effectively tackled with a single classifier. Our proposed method maps the training data into a new sub-space for each label. Then, it pools all the mapped data together and efficiently trains a single classifier for all the labels together. Experimental results show that our method successfully tackles MLC tasks and outperforms the state-of-the-art methods.",
keywords = "Deep learning, Imbalanced data, Multi-label learning",
author = "Siahroudi, {Sajjad Kamali} and Daniel Kudenko",
year = "2023",
month = dec,
day = "1",
doi = "10.1016/j.eswa.2023.120887",
language = "English",
volume = "232",
journal = "Expert systems with applications",
issn = "0957-4174",
publisher = "Elsevier Ltd.",

}

RIS

TY - JOUR

T1 - An effective single-model learning for multi-label data

AU - Siahroudi, Sajjad Kamali

AU - Kudenko, Daniel

PY - 2023/12/1

Y1 - 2023/12/1

N2 - Multi-label data classification (MLC) has become an increasingly active research area over the past decade. MLC refers to a classification problem where each instance can be associated with more than one class label. Capturing the correlation among labels and tackling the label imbalance are the main challenges in MLC. Problem transformation is one of the well-known approaches in this area that became the de-facto approach for MLC. Existing methods in this approach consider MLC as a collection of single-label tasks and solve each of them separately. To consider correlation among labels, some of them consider the combination of labels that appear in the training data as a separate label. The main drawback of these kinds of methods is the complexity of the model, which makes them not applicable in real-world applications. In this paper, we show how MLC can be efficiently and effectively tackled with a single classifier. Our proposed method maps the training data into a new sub-space for each label. Then, it pools all the mapped data together and efficiently trains a single classifier for all the labels together. Experimental results show that our method successfully tackles MLC tasks and outperforms the state-of-the-art methods.

AB - Multi-label data classification (MLC) has become an increasingly active research area over the past decade. MLC refers to a classification problem where each instance can be associated with more than one class label. Capturing the correlation among labels and tackling the label imbalance are the main challenges in MLC. Problem transformation is one of the well-known approaches in this area that became the de-facto approach for MLC. Existing methods in this approach consider MLC as a collection of single-label tasks and solve each of them separately. To consider correlation among labels, some of them consider the combination of labels that appear in the training data as a separate label. The main drawback of these kinds of methods is the complexity of the model, which makes them not applicable in real-world applications. In this paper, we show how MLC can be efficiently and effectively tackled with a single classifier. Our proposed method maps the training data into a new sub-space for each label. Then, it pools all the mapped data together and efficiently trains a single classifier for all the labels together. Experimental results show that our method successfully tackles MLC tasks and outperforms the state-of-the-art methods.

KW - Deep learning

KW - Imbalanced data

KW - Multi-label learning

UR - http://www.scopus.com/inward/record.url?scp=85164029151&partnerID=8YFLogxK

U2 - 10.1016/j.eswa.2023.120887

DO - 10.1016/j.eswa.2023.120887

M3 - Article

AN - SCOPUS:85164029151

VL - 232

JO - Expert systems with applications

JF - Expert systems with applications

SN - 0957-4174

M1 - 120887

ER -