Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing

Publication: Contribution to journal › Article › Research › Peer review

Authors

Allard Oelen, Markus Stocker, Sören Auer

External organisations

  • Technische Informationsbibliothek (TIB) Leibniz-Informationszentrum Technik und Naturwissenschaften und Universitätsbibliothek

Details

Original language: English
Number of pages: 13
Journal: International Journal on Digital Libraries
Publication status: Published - 5 Apr 2023
Externally published: Yes

Abstract

Due to the growing number of scholarly publications, finding relevant articles becomes increasingly difficult. Scholarly knowledge graphs can be used to organize the scholarly knowledge presented within those publications and represent them in machine-readable formats. Natural language processing (NLP) provides scalable methods to automatically extract knowledge from articles and populate scholarly knowledge graphs. However, NLP extraction is generally not sufficiently accurate and, thus, fails to generate high granularity quality data. In this work, we present TinyGenius, a methodology to validate NLP-extracted scholarly knowledge statements using microtasks performed with crowdsourcing. TinyGenius is employed to populate a paper-centric knowledge graph, using five distinct NLP methods. We extend our previous work of the TinyGenius methodology in various ways. Specifically, we discuss the NLP tasks in more detail and include an explanation of the data model. Moreover, we present a user evaluation where participants validate the generated NLP statements. The results indicate that employing microtasks for statement validation is a promising approach despite the varying participant agreement for different microtasks.
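
The abstract describes TinyGenius only at a high level; the paper itself covers the data model and the five NLP tasks in detail. Purely as an illustration (not the authors' actual data model), the following minimal Python sketch shows one way such a pipeline could represent an NLP-extracted statement in a paper-centric graph and aggregate microtask accept/reject votes; all class names, fields, and thresholds here are assumptions.

# Hypothetical sketch: an NLP-extracted statement linked to its source paper,
# validated through crowdsourced microtask votes. Names and thresholds are
# illustrative assumptions, not the data model from the paper.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExtractedStatement:
    paper_doi: str          # paper the statement was extracted from
    subject: str            # e.g. the paper itself or a named entity
    predicate: str          # e.g. "mentions", "has research field"
    obj: str                # extracted value or linked entity
    nlp_method: str         # which extraction method produced the statement
    votes: List[bool] = field(default_factory=list)  # microtask accept/reject votes

    def record_vote(self, accept: bool) -> None:
        """Store one crowdworker's accept/reject decision for this statement."""
        self.votes.append(accept)

    def is_validated(self, min_votes: int = 3, min_agreement: float = 0.66) -> bool:
        """Statement counts as validated once enough voters agree it is correct."""
        if len(self.votes) < min_votes:
            return False
        return sum(self.votes) / len(self.votes) >= min_agreement

# Example: a statement produced by a (hypothetical) entity-linking method.
stmt = ExtractedStatement(
    paper_doi="10.1007/s00799-023-00360-7",
    subject="paper",
    predicate="mentions",
    obj="knowledge graph",
    nlp_method="entity_linking",
)
for vote in (True, True, False, True):
    stmt.record_vote(vote)
print(stmt.is_validated())  # True: 3 of 4 voters accepted the statement

With a structure like this, each statement stays linked to its source paper and extraction method, and a simple agreement threshold over the collected votes decides whether it enters the validated knowledge graph.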

ASJC Scopus subject areas

Cite this

Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing. / Oelen, Allard; Stocker, Markus; Auer, Sören.
In: International Journal on Digital Libraries, 05.04.2023.


Oelen A, Stocker M, Auer S. Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing. International Journal on Digital Libraries. 2023 Apr 5. doi: 10.1007/s00799-023-00360-7
Oelen, Allard ; Stocker, Markus ; Auer, Sören. / Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing. In: International Journal on Digital Libraries. 2023.
BibTeX
@article{fe755a80bf434665af45f47d39bfd10d,
title = "Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing",
abstract = "Due to the growing number of scholarly publications, finding relevant articles becomes increasingly difficult. Scholarly knowledge graphs can be used to organize the scholarly knowledge presented within those publications and represent them in machine-readable formats. Natural language processing (NLP) provides scalable methods to automatically extract knowledge from articles and populate scholarly knowledge graphs. However, NLP extraction is generally not sufficiently accurate and, thus, fails to generate high granularity quality data. In this work, we present TinyGenius, a methodology to validate NLP-extracted scholarly knowledge statements using microtasks performed with crowdsourcing. TinyGenius is employed to populate a paper-centric knowledge graph, using five distinct NLP methods. We extend our previous work of the TinyGenius methodology in various ways. Specifically, we discuss the NLP tasks in more detail and include an explanation of the data model. Moreover, we present a user evaluation where participants validate the generated NLP statements. The results indicate that employing microtasks for statement validation is a promising approach despite the varying participant agreement for different microtasks.",
keywords = "Crowdsourcing microtasks, Knowledge graph validation, Scholarly knowledge graphs, User interface evaluation",
author = "Allard Oelen and Markus Stocker and S{\"o}ren Auer",
note = "Funding Information: This work was co-funded by the European Research Council for the project ScienceGRAPH (Grant agreement ID: 819536) and the TIB Leibniz Information Centre for Science and Technology. We would like to thank Mohamad Yaser Jaradeh and Jennifer D{\textquoteright}Souza for their contributions to this work. Open Access funding enabled and organized by Projekt DEAL. ",
year = "2023",
month = apr,
day = "5",
doi = "10.1007/s00799-023-00360-7",
language = "English",

}

RIS

TY - JOUR

T1 - Creating and validating a scholarly knowledge graph using natural language processing and microtask crowdsourcing

AU - Oelen, Allard

AU - Stocker, Markus

AU - Auer, Sören

N1 - Funding Information: This work was co-funded by the European Research Council for the project ScienceGRAPH (Grant agreement ID: 819536) and the TIB Leibniz Information Centre for Science and Technology. We would like to thank Mohamad Yaser Jaradeh and Jennifer D’Souza for their contributions to this work. Open Access funding enabled and organized by Projekt DEAL.

PY - 2023/4/5

Y1 - 2023/4/5

N2 - Due to the growing number of scholarly publications, finding relevant articles becomes increasingly difficult. Scholarly knowledge graphs can be used to organize the scholarly knowledge presented within those publications and represent them in machine-readable formats. Natural language processing (NLP) provides scalable methods to automatically extract knowledge from articles and populate scholarly knowledge graphs. However, NLP extraction is generally not sufficiently accurate and, thus, fails to generate high granularity quality data. In this work, we present TinyGenius, a methodology to validate NLP-extracted scholarly knowledge statements using microtasks performed with crowdsourcing. TinyGenius is employed to populate a paper-centric knowledge graph, using five distinct NLP methods. We extend our previous work of the TinyGenius methodology in various ways. Specifically, we discuss the NLP tasks in more detail and include an explanation of the data model. Moreover, we present a user evaluation where participants validate the generated NLP statements. The results indicate that employing microtasks for statement validation is a promising approach despite the varying participant agreement for different microtasks.

AB - Due to the growing number of scholarly publications, finding relevant articles becomes increasingly difficult. Scholarly knowledge graphs can be used to organize the scholarly knowledge presented within those publications and represent them in machine-readable formats. Natural language processing (NLP) provides scalable methods to automatically extract knowledge from articles and populate scholarly knowledge graphs. However, NLP extraction is generally not sufficiently accurate and, thus, fails to generate high granularity quality data. In this work, we present TinyGenius, a methodology to validate NLP-extracted scholarly knowledge statements using microtasks performed with crowdsourcing. TinyGenius is employed to populate a paper-centric knowledge graph, using five distinct NLP methods. We extend our previous work of the TinyGenius methodology in various ways. Specifically, we discuss the NLP tasks in more detail and include an explanation of the data model. Moreover, we present a user evaluation where participants validate the generated NLP statements. The results indicate that employing microtasks for statement validation is a promising approach despite the varying participant agreement for different microtasks.

KW - Crowdsourcing microtasks

KW - Knowledge graph validation

KW - Scholarly knowledge graphs

KW - User interface evaluation

UR - http://www.scopus.com/inward/record.url?scp=85151531559&partnerID=8YFLogxK

U2 - 10.1007/s00799-023-00360-7

DO - 10.1007/s00799-023-00360-7

M3 - Article

AN - SCOPUS:85151531559

JO - International Journal on Digital Libraries

JF - International Journal on Digital Libraries

SN - 1432-5012

ER -
