
Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Isaiah Onando Mulang
  • Kuldeep Singh
  • Akhilesh Vyas
  • Saeedeh Shekarpour
  • Maria Esther Vidal
  • Jens Lehmann
  • Sören Auer

External Research Organisations

  • Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS)
  • Zerotha Research and Cerence GmbH
  • University of Dayton
  • German National Library of Science and Technology (TIB)

Details

Original language: English
Title of host publication: Web Information Systems Engineering – WISE 2020
Subtitle of host publication: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I
Editors: Zhisheng Huang, Wouter Beek, Hua Wang, Yanchun Zhang, Rui Zhou
Pages: 328-342
Number of pages: 15
ISBN (electronic): 978-3-030-62005-9
Publication status: Published - 2020
Externally published: Yes
Event: 21st International Conference on Web Information Systems Engineering - Amsterdam, Netherlands
Duration: 20 Oct 2020 – 24 Oct 2020
Conference number: 21

Publication series

Name: Lecture Notes in Computer Science
Volume: 12342
ISSN (Print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

Collaborative knowledge graphs such as Wikidata rely heavily on the crowd to author their information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated with non-standard, noisy, long, or sometimes awkward titles. Such long, implicit, and non-standard entity representations make it challenging for Entity Linking (EL) approaches to achieve high precision and recall. The underlying KG is, in general, the source of target entities for EL approaches; however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context in an attentive neural network approach for entity linking on Wikidata. Our approach exploits sufficient context from the KG as a source of background knowledge, which is then fed into the neural network. It demonstrates merit in addressing challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows an 8% improvement over the baseline approach and significantly outperforms an end-to-end approach for Wikidata entity linking.
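To make the idea of KG-derived context concrete, the sketch below (not taken from the paper; the function names and the [SEP] formatting are illustrative assumptions) shows one way to pull a candidate entity's label and aliases from the public Wikidata wbgetentities API and concatenate them with the mention and its sentence, producing the kind of alias-enriched input an attentive ranking model could consume.

```python
# Minimal sketch, assuming a Wikidata Q-ID for each candidate entity.
# Not the authors' implementation: `build_candidate_context` and the
# "[SEP]"-joined layout are hypothetical choices for illustration only.
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def fetch_label_and_aliases(qid: str, lang: str = "en") -> tuple[str, list[str]]:
    """Return the label and alias strings of one Wikidata item."""
    params = {
        "action": "wbgetentities",
        "ids": qid,
        "props": "labels|aliases",
        "languages": lang,
        "format": "json",
    }
    entity = requests.get(WIKIDATA_API, params=params, timeout=10).json()["entities"][qid]
    label = entity.get("labels", {}).get(lang, {}).get("value", "")
    aliases = [a["value"] for a in entity.get("aliases", {}).get(lang, [])]
    return label, aliases

def build_candidate_context(mention: str, sentence: str, qid: str) -> str:
    """Concatenate the surface mention, its sentence, and the candidate's
    label plus aliases into one sequence for an attention-based scorer."""
    label, aliases = fetch_label_and_aliases(qid)
    kg_context = " ; ".join([label] + aliases)  # e.g. "Barack Obama ; Obama ; Barack Hussein Obama"
    return f"{mention} [SEP] {sentence} [SEP] {kg_context}"

if __name__ == "__main__":
    print(build_candidate_context(
        mention="Obama",
        sentence="Obama visited Berlin in 2013.",
        qid="Q76",  # Wikidata item for Barack Obama
    ))
```

In such a setup, one context string would be built per candidate entity, and the attentive model would score each mention-candidate pair; the aliases are what let a short or non-standard mention still match a long or implicit entity title.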

Keywords

    cs.CL, Entity linking, Wikidata, Knowledge graph context

Cite this

Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking. / Mulang, Isaiah Onando; Singh, Kuldeep; Vyas, Akhilesh et al.
Web Information Systems Engineering – WISE 2020: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I. ed. / Zhisheng Huang; Wouter Beek; Hua Wang; Yanchun Zhang; Rui Zhou. 2020. p. 328-342 (Lecture Notes in Computer Science; Vol. 12342).

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Mulang, IO, Singh, K, Vyas, A, Shekarpour, S, Vidal, ME, Lehmann, J & Auer, S 2020, Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking. in Z Huang, W Beek, H Wang, Y Zhang & R Zhou (eds), Web Information Systems Engineering – WISE 2020: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I. Lecture Notes in Computer Science, vol. 12342, pp. 328-342, 21st International Conference on Web Information Systems Engineering, Amsterdam, Netherlands, 20 Oct 2020. https://doi.org/10.1007/978-3-030-62005-9_24
Mulang, I. O., Singh, K., Vyas, A., Shekarpour, S., Vidal, M. E., Lehmann, J., & Auer, S. (2020). Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking. In Z. Huang, W. Beek, H. Wang, Y. Zhang, & R. Zhou (Eds.), Web Information Systems Engineering – WISE 2020: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I (pp. 328-342). (Lecture Notes in Computer Science; Vol. 12342). https://doi.org/10.1007/978-3-030-62005-9_24
Mulang IO, Singh K, Vyas A, Shekarpour S, Vidal ME, Lehmann J et al. Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking. In Huang Z, Beek W, Wang H, Zhang Y, Zhou R, editors, Web Information Systems Engineering – WISE 2020: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I. 2020. p. 328-342. (Lecture Notes in Computer Science). Epub 2020 Oct 18. doi: 10.1007/978-3-030-62005-9_24
Mulang, Isaiah Onando ; Singh, Kuldeep ; Vyas, Akhilesh et al. / Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking. Web Information Systems Engineering – WISE 2020: 21st International Conference, Amsterdam, The Netherlands, October 20–24, 2020, Proceedings, Part I. editor / Zhisheng Huang ; Wouter Beek ; Hua Wang ; Yanchun Zhang ; Rui Zhou. 2020. pp. 328-342 (Lecture Notes in Computer Science).
BibTeX
@inproceedings{cdd4ad517a604a98be0ebe4b37c19c26,
title = "Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking",
abstract = "The collaborative knowledge graphs such as Wikidata excessively rely on the crowd to author the information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated by non-standard, noisy, long or even sometimes awkward titles. The issue of long, implicit, and nonstandard entity representations is a challenge in Entity Linking (EL) approaches for gaining high precision and recall. Underlying KG in general is the source of target entities for EL approaches, however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context on an attentive neural network approach for entity linking on Wikidata. Our approach contributes by exploiting the sufficient context from a KG as a source of background knowledge, which is then fed into the neural network. This approach demonstrates merit to address challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows 8% improvements over the baseline approach, and significantly outperform an end to end approach for Wikidata entity linking.",
keywords = "cs.CL, Entity linking, Wikidata, Knowledge graph context",
author = "Mulang, {Isaiah Onando} and Kuldeep Singh and Akhilesh Vyas and Saeedeh Shekarpour and Vidal, {Maria Esther} and Jens Lehmann and S{\"o}ren Auer",
note = "Funding Information: Acknowledgments. This work is Co-funded by the European Union{\textquoteright}s Horizon 2020 research and innovation programme under the QualiChain Project, Grant Agreement No. 822404; and the IASIS project, Grant Agreement No. 727658.; 21st International Conference on Web Information Systems Engineering ; Conference date: 20-10-2020 Through 24-10-2020",
year = "2020",
doi = "10.1007/978-3-030-62005-9_24",
language = "English",
isbn = "978-3-030-62004-2",
series = "Lecture Notes in Computer Science",
pages = "328--342",
editor = "Zhisheng Huang and Wouter Beek and Hua Wang and Yanchun Zhang and Rui Zhou",
booktitle = "Web Information Systems Engineering – WISE 2020",

}

RIS

TY - GEN

T1 - Encoding Knowledge Graph Entity Aliases in Attentive Neural Network for Wikidata Entity Linking

AU - Mulang, Isaiah Onando

AU - Singh, Kuldeep

AU - Vyas, Akhilesh

AU - Shekarpour, Saeedeh

AU - Vidal, Maria Esther

AU - Lehmann, Jens

AU - Auer, Sören

N1 - Conference code: 21

PY - 2020

Y1 - 2020

N2 - Collaborative knowledge graphs such as Wikidata rely heavily on the crowd to author their information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated with non-standard, noisy, long, or sometimes awkward titles. Such long, implicit, and non-standard entity representations make it challenging for Entity Linking (EL) approaches to achieve high precision and recall. The underlying KG is, in general, the source of target entities for EL approaches; however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context in an attentive neural network approach for entity linking on Wikidata. Our approach exploits sufficient context from the KG as a source of background knowledge, which is then fed into the neural network. It demonstrates merit in addressing challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows an 8% improvement over the baseline approach and significantly outperforms an end-to-end approach for Wikidata entity linking.

AB - Collaborative knowledge graphs such as Wikidata rely heavily on the crowd to author their information. Since the crowd is not bound to a standard protocol for assigning entity titles, the knowledge graph is populated with non-standard, noisy, long, or sometimes awkward titles. Such long, implicit, and non-standard entity representations make it challenging for Entity Linking (EL) approaches to achieve high precision and recall. The underlying KG is, in general, the source of target entities for EL approaches; however, it often contains other relevant information, such as aliases of entities (e.g., Obama and Barack Hussein Obama are aliases for the entity Barack Obama). EL models usually ignore such readily available entity attributes. In this paper, we examine the role of knowledge graph context in an attentive neural network approach for entity linking on Wikidata. Our approach exploits sufficient context from the KG as a source of background knowledge, which is then fed into the neural network. It demonstrates merit in addressing challenges associated with entity titles (multi-word, long, implicit, case-sensitive). Our experimental study shows an 8% improvement over the baseline approach and significantly outperforms an end-to-end approach for Wikidata entity linking.

KW - cs.CL

KW - Entity linking

KW - Wikidata

KW - Knowledge graph context

UR - http://www.scopus.com/inward/record.url?scp=85096624316&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-62005-9_24

DO - 10.1007/978-3-030-62005-9_24

M3 - Conference contribution

SN - 978-3-030-62004-2

T3 - Lecture Notes in Computer Science

SP - 328

EP - 342

BT - Web Information Systems Engineering – WISE 2020

A2 - Huang, Zhisheng

A2 - Beek, Wouter

A2 - Wang, Hua

A2 - Zhang, Yanchun

A2 - Zhou, Rui

T2 - 21st International Conference on Web Information Systems Engineering

Y2 - 20 October 2020 through 24 October 2020

ER -
