Partially-federated learning: A new approach to achieving privacy and effectiveness

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Fisichella, Marco
  • Lax, Gianluca
  • Russo, Antonia

Research Organisations

External Research Organisations

  • University “Mediterranea” of Reggio Calabria

Details

Original language: English
Pages (from-to): 534-547
Number of pages: 14
Journal: Information sciences
Volume: 614
Early online date: 20 Oct 2022
Publication status: Published - Oct 2022

Abstract

In Machine Learning, the data for training the model are stored centrally. However, when the data come from different sources and contain sensitive information, we can use federated learning to implement a privacy-preserving distributed machine learning framework. In this case, multiple client devices participate in global model training by sharing only the model updates with the server while keeping the original data local. In this paper, we propose a new approach, called partially-federated learning, that combines machine learning with federated learning. This hybrid architecture can train a unified model across multiple clients, where the individual client can decide whether a sample must remain private or can be shared with the server. This decision is made by a privacy module that can enforce various techniques to protect the privacy of client data. The proposed approach improves the performance compared to classical federated learning.
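The per-sample decision described in the abstract — a client-side privacy module deciding whether each sample may be shared with the server or must stay local — can be sketched with a simple k-anonymity test. This is an illustrative sketch only; the function name, data layout, and rule below are assumptions, not the paper's actual privacy module.

```python
from collections import Counter

def partition_samples(records, quasi_identifiers, k=3):
    """Split a client's batch into shareable and private samples.

    A record may be shared only if its combination of quasi-identifier
    values occurs at least k times in the batch (a simple k-anonymity
    test); otherwise it is kept local and used only for local updates.
    """
    key = lambda r: tuple(r[q] for q in quasi_identifiers)
    counts = Counter(key(r) for r in records)  # group sizes per QI combination
    shareable = [r for r in records if counts[key(r)] >= k]
    private = [r for r in records if counts[key(r)] < k]
    return shareable, private

# Example: age and zip are quasi-identifiers; with k=2 only samples
# whose (age, zip) pair appears at least twice may leave the client.
records = [
    {"age": 30, "zip": "10115", "label": 1},
    {"age": 30, "zip": "10115", "label": 0},
    {"age": 45, "zip": "20095", "label": 1},
]
shared, kept_local = partition_samples(records, ["age", "zip"], k=2)
# The two (30, "10115") samples are shareable; the unique (45, "20095")
# sample remains private on the client.
```

The paper lists k-anonymity and l-diversity among the techniques the privacy module can enforce; an l-diversity variant would additionally require each shareable group to contain at least l distinct sensitive values.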

Keywords

    Collaborative learning, Distributed databases, k-anonymity, l-diversity, Machine learning


Cite this

Partially-federated learning: A new approach to achieving privacy and effectiveness. / Fisichella, Marco; Lax, Gianluca; Russo, Antonia.
In: Information sciences, Vol. 614, 10.2022, p. 534-547.


Fisichella M, Lax G, Russo A. Partially-federated learning: A new approach to achieving privacy and effectiveness. Information sciences. 2022 Oct;614:534-547. Epub 2022 Oct 20. doi: 10.1016/j.ins.2022.10.082
@article{16eb22e64b7547528eb02460ef947083,
title = "Partially-federated learning: A new approach to achieving privacy and effectiveness",
abstract = "In Machine Learning, the data for training the model are stored centrally. However, when the data come from different sources and contain sensitive information, we can use federated learning to implement a privacy-preserving distributed machine learning framework. In this case, multiple client devices participate in global model training by sharing only the model updates with the server while keeping the original data local. In this paper, we propose a new approach, called partially-federated learning, that combines machine learning with federated learning. This hybrid architecture can train a unified model across multiple clients, where the individual client can decide whether a sample must remain private or can be shared with the server. This decision is made by a privacy module that can enforce various techniques to protect the privacy of client data. The proposed approach improves the performance compared to classical federated learning.",
keywords = "Collaborative learning, Distributed databases, k-anonymity, l-diversity, Machine learning",
author = "Marco Fisichella and Gianluca Lax and Antonia Russo",
year = "2022",
month = oct,
doi = "10.1016/j.ins.2022.10.082",
language = "English",
volume = "614",
pages = "534--547",
journal = "Information sciences",
issn = "0020-0255",
publisher = "Elsevier Inc.",

}


TY  - JOUR
T1  - Partially-federated learning
T2  - A new approach to achieving privacy and effectiveness
AU  - Fisichella, Marco
AU  - Lax, Gianluca
AU  - Russo, Antonia
PY  - 2022/10
Y1  - 2022/10
N2  - In Machine Learning, the data for training the model are stored centrally. However, when the data come from different sources and contain sensitive information, we can use federated learning to implement a privacy-preserving distributed machine learning framework. In this case, multiple client devices participate in global model training by sharing only the model updates with the server while keeping the original data local. In this paper, we propose a new approach, called partially-federated learning, that combines machine learning with federated learning. This hybrid architecture can train a unified model across multiple clients, where the individual client can decide whether a sample must remain private or can be shared with the server. This decision is made by a privacy module that can enforce various techniques to protect the privacy of client data. The proposed approach improves the performance compared to classical federated learning.
AB  - In Machine Learning, the data for training the model are stored centrally. However, when the data come from different sources and contain sensitive information, we can use federated learning to implement a privacy-preserving distributed machine learning framework. In this case, multiple client devices participate in global model training by sharing only the model updates with the server while keeping the original data local. In this paper, we propose a new approach, called partially-federated learning, that combines machine learning with federated learning. This hybrid architecture can train a unified model across multiple clients, where the individual client can decide whether a sample must remain private or can be shared with the server. This decision is made by a privacy module that can enforce various techniques to protect the privacy of client data. The proposed approach improves the performance compared to classical federated learning.
KW  - Collaborative learning
KW  - Distributed databases
KW  - k-anonymity
KW  - l-diversity
KW  - Machine learning
UR  - http://www.scopus.com/inward/record.url?scp=85140979213&partnerID=8YFLogxK
U2  - 10.1016/j.ins.2022.10.082
DO  - 10.1016/j.ins.2022.10.082
M3  - Article
AN  - SCOPUS:85140979213
VL  - 614
SP  - 534
EP  - 547
JO  - Information sciences
JF  - Information sciences
SN  - 0020-0255
ER  -
