Emergent Discrimination: Should We Protect Algorithmic Groups?

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authorship

  • Jannik Zeiser


Details

Original language: English
Pages (from - to): 910-928
Number of pages: 19
Journal: Journal of Applied Philosophy
Volume: 42
Issue number: 3
Publication status: Published - 17 July 2025

Abstract

Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these algorithmic groups, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is systematically disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should prevent algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.

ASJC Scopus subject areas

Cite

Emergent Discrimination: Should We Protect Algorithmic Groups? / Zeiser, Jannik.
In: Journal of Applied Philosophy, Vol. 42, No. 3, 17.07.2025, pp. 910-928.


Zeiser J. Emergent Discrimination: Should We Protect Algorithmic Groups? Journal of applied philosophy. 2025 Jul 17;42(3):910-928. doi: 10.1111/japp.12793
Zeiser, Jannik. / Emergent Discrimination: Should We Protect Algorithmic Groups?. In: Journal of Applied Philosophy. 2025; Vol. 42, No. 3. pp. 910-928.
BibTeX
@article{be137a6eaaa44f42bc76e868c795840c,
title = "Emergent Discrimination: Should We Protect Algorithmic Groups?",
abstract = "Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as {\textquoteleft}being born on a Tuesday{\textquoteright}. Should this prospect change how we theorize about discrimination, and should we protect these algorithmic groups, as some have suggested? I argue that the phenomenon is adequately described as {\textquoteleft}discrimination{\textquoteright} when a group is systematically disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should prevent algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.",
author = "Jannik Zeiser",
note = "Publisher Copyright: {\textcopyright} 2025 The Author(s). Journal of Applied Philosophy published by John Wiley & Sons Ltd on behalf of Society for Applied Philosophy.",
year = "2025",
month = jul,
day = "17",
doi = "10.1111/japp.12793",
language = "English",
volume = "42",
pages = "910--928",
journal = "Journal of applied philosophy",
issn = "0264-3758",
publisher = "John Wiley and Sons Inc.",
number = "3",
}

RIS

TY - JOUR

T1 - Emergent Discrimination

T2 - Should We Protect Algorithmic Groups?

AU - Zeiser, Jannik

N1 - Publisher Copyright: © 2025 The Author(s). Journal of Applied Philosophy published by John Wiley & Sons Ltd on behalf of Society for Applied Philosophy.

PY - 2025/7/17

Y1 - 2025/7/17

N2 - Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these algorithmic groups, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is systematically disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should prevent algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.

AB - Discrimination is usually thought of in terms of socially salient groups, such as race or gender. Some scholars argue that the rise of algorithmic decision-making poses a challenge to this notion. Algorithms are not bound by a social view of the world. Therefore, they may not only inherit pre-existing social biases and injustices but may also discriminate based on entirely new categories that have little or no meaning to humans at all, such as ‘being born on a Tuesday’. Should this prospect change how we theorize about discrimination, and should we protect these algorithmic groups, as some have suggested? I argue that the phenomenon is adequately described as ‘discrimination’ when a group is systematically disadvantaged. At present, we lack information about whether any algorithmic group meets this criterion, so it is difficult to protect such groups. Instead, we should prevent algorithms from disproportionately disadvantaging certain individuals, and I outline strategies for doing so.

UR - http://www.scopus.com/inward/record.url?scp=85216974172&partnerID=8YFLogxK

U2 - 10.1111/japp.12793

DO - 10.1111/japp.12793

M3 - Article

AN - SCOPUS:85216974172

VL - 42

SP - 910

EP - 928

JO - Journal of applied philosophy

JF - Journal of applied philosophy

SN - 0264-3758

IS - 3

ER -