SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Disha Purohit
  • Yashrajsinh Chudasama
  • Ariam Rivas
  • Maria Esther Vidal

External Research Organisations

  • German National Library of Science and Technology (TIB)

Details

Original language: English
Title of host publication: K-CAP '23
Subtitle of host publication: Proceedings of the 12th Knowledge Capture Conference 2023
Pages: 44-52
Number of pages: 9
Publication status: Published - 5 Dec 2023
Event: 12th ACM International Conference on Knowledge Capture, K-CAP 2023 - Pensacola, United States
Duration: 5 Dec 2023 - 7 Dec 2023

Abstract

Knowledge graphs (KGs) naturally capture the convergence of data and knowledge, making them expressive frameworks for describing and integrating heterogeneous data in a coherent and interconnected manner. However, under the Open World Assumption (OWA), the absence of information within KGs does not indicate falsity or non-existence; it merely reflects incompleteness. Inductive learning over KGs involves predicting new relationships based on existing statements in the KG, using either numerical or symbolic learning models. The Partial Completeness Assumption (PCA) heuristic efficiently guides inductive learning methods for Link Prediction (LP) by refining predictions about absent KG relationships. Nevertheless, numeric techniques, like KG embedding models, alone may fall short in accurately predicting missing information, particularly when it comes to capturing implicit knowledge and complex relationships. We propose a hybrid method named SPaRKLE that seamlessly integrates symbolic and numerical techniques, leveraging the PCA heuristic to capture implicit knowledge and enrich KGs. We empirically compare SPaRKLE with state-of-the-art KG embedding and symbolic models, using established benchmarks. Our experimental outcomes underscore the efficacy of this hybrid approach, as it harnesses the strengths of both paradigms. SPaRKLE is publicly available on GitHub.
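
The Partial Completeness Assumption mentioned in the abstract treats a candidate fact (s, r, o) as a counter-example only if the subject s already has at least one known object for relation r in the KG; predictions about subjects with no known objects for r remain unknown rather than false. The Python sketch below illustrates PCA-style confidence scoring for rule predictions over a toy triple set. It is a minimal illustration only: the toy data, the function pca_confidence, and all identifiers are assumptions made here and do not reflect SPaRKLE's actual implementation.

# Minimal, hypothetical sketch of the PCA heuristic over a toy KG.
# Not taken from SPaRKLE; names and data are illustrative only.
from collections import defaultdict

# Toy KG as (subject, relation, object) triples.
kg = {
    ("ann", "worksAt", "tib"),
    ("ann", "livesIn", "hannover"),
    ("bob", "worksAt", "tib"),
    ("tib", "locatedIn", "hannover"),
}

# Index: known objects per (subject, relation) pair.
objects_of = defaultdict(set)
for s, r, o in kg:
    objects_of[(s, r)].add(o)

def pca_confidence(predictions, relation):
    # Correct predictions divided by the predictions PCA lets us judge:
    # only subjects with at least one known object for `relation` can
    # yield counter-examples; all other predictions stay unknown.
    correct, judged = 0, 0
    for s, o in predictions:
        known = objects_of[(s, relation)]
        if not known:
            continue  # no facts for (s, relation): unknown, not false
        judged += 1
        if o in known:
            correct += 1
    return correct / judged if judged else 0.0

# Rule worksAt(x, y) & locatedIn(y, z) => livesIn(x, z) predicts:
predicted = {("ann", "hannover"), ("bob", "hannover")}
print(pca_confidence(predicted, "livesIn"))  # 1.0; bob has no livesIn facts and is ignored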

Keywords

    Inductive Learning, Knowledge Graphs, Symbolic Learning

Cite this

SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning. / Purohit, Disha; Chudasama, Yashrajsinh; Rivas, Ariam et al.
K-CAP '23: Proceedings of the 12th Knowledge Capture Conference 2023. 2023. p. 44-52.

Purohit, D, Chudasama, Y, Rivas, A & Vidal, ME 2023, SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning. in K-CAP '23: Proceedings of the 12th Knowledge Capture Conference 2023. pp. 44-52, 12th ACM International Conference on Knowledge Capture, K-CAP 2023, Pensacola, United States, 5 Dec 2023. https://doi.org/10.1145/3587259.3627547
Purohit, D., Chudasama, Y., Rivas, A., & Vidal, M. E. (2023). SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning. In K-CAP '23: Proceedings of the 12th Knowledge Capture Conference 2023 (pp. 44-52). https://doi.org/10.1145/3587259.3627547
Purohit D, Chudasama Y, Rivas A, Vidal ME. SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning. In K-CAP '23: Proceedings of the 12th Knowledge Capture Conference 2023. 2023. p. 44-52. doi: 10.1145/3587259.3627547
Purohit, Disha ; Chudasama, Yashrajsinh ; Rivas, Ariam et al. / SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning. K-CAP '23: Proceedings of the 12th Knowledge Capture Conference 2023. 2023. pp. 44-52
BibTeX
@inproceedings{1e1da79199df42e6bd93167499c444ef,
title = "SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning",
abstract = "Knowledge graphs (KGs) naturally capture the convergence of data and knowledge, making them expressive frameworks for describing and integrating heterogeneous data in a coherent and interconnected manner. However, based on the Open World Assumption (OWA), the absence of information within KGs does not indicate falsity or non-existence; it merely reflects incompleteness. Inductive learning over KGs involves predicting new relationships based on existing statements in the KG, using either numerical or symbolic learning models. The Partial Completeness Assumption (PCA) heuristic efficiently guides inductive learning methods for Link Prediction (LP) by refining predictions about absent KG relationships. Nevertheless, numeric techniques- like KG embedding models- alone may fall short in accurately predicting missing information, particularly when it comes to capturing implicit knowledge and complex relationships. We propose a hybrid method named SPaRKLE that seamlessly integrates symbolic and numerical techniques, leveraging the PCA heuristic to capture implicit knowledge and enrich KGs. We empirically compare SPaRKLE with state-of-the-art KG embedding and symbolic models, using established benchmarks. Our experimental outcomes underscore the efficacy of this hybrid approach, as it harnesses the strengths of both paradigms. SPaRKLE is publicly available on GitHub1.",
keywords = "Inductive Learning, Knowledge Graphs, Symbolic Learning",
author = "Disha Purohit and Yashrajsinh Chudasama and Ariam Rivas and Vidal, {Maria Esther}",
note = "Funding Information: This work has been supported by TrustKG - Transforming Data in Trustable Insights (GA No. P99/2020) funded by the Leibniz Association and the EraMed project P4-LUCAT (GA No. 53000015). ; 12th ACM International Conference on Knowledge Capture, K-CAP 2023 ; Conference date: 05-12-2023 Through 07-12-2023",
year = "2023",
month = dec,
day = "5",
doi = "10.1145/3587259.3627547",
language = "English",
isbn = "979-8-4007-0141-2",
pages = "44--52",
booktitle = "K-CAP '23",

}

RIS

TY - GEN

T1 - SPaRKLE: Symbolic caPtuRing of knowledge for Knowledge graph enrichment with LEarning

T2 - 12th ACM International Conference on Knowledge Capture, K-CAP 2023

AU - Purohit, Disha

AU - Chudasama, Yashrajsinh

AU - Rivas, Ariam

AU - Vidal, Maria Esther

N1 - Funding Information: This work has been supported by TrustKG - Transforming Data in Trustable Insights (GA No. P99/2020) funded by the Leibniz Association and the EraMed project P4-LUCAT (GA No. 53000015).

PY - 2023/12/5

Y1 - 2023/12/5

N2 - Knowledge graphs (KGs) naturally capture the convergence of data and knowledge, making them expressive frameworks for describing and integrating heterogeneous data in a coherent and interconnected manner. However, based on the Open World Assumption (OWA), the absence of information within KGs does not indicate falsity or non-existence; it merely reflects incompleteness. Inductive learning over KGs involves predicting new relationships based on existing statements in the KG, using either numerical or symbolic learning models. The Partial Completeness Assumption (PCA) heuristic efficiently guides inductive learning methods for Link Prediction (LP) by refining predictions about absent KG relationships. Nevertheless, numeric techniques- like KG embedding models- alone may fall short in accurately predicting missing information, particularly when it comes to capturing implicit knowledge and complex relationships. We propose a hybrid method named SPaRKLE that seamlessly integrates symbolic and numerical techniques, leveraging the PCA heuristic to capture implicit knowledge and enrich KGs. We empirically compare SPaRKLE with state-of-the-art KG embedding and symbolic models, using established benchmarks. Our experimental outcomes underscore the efficacy of this hybrid approach, as it harnesses the strengths of both paradigms. SPaRKLE is publicly available on GitHub1.

AB - Knowledge graphs (KGs) naturally capture the convergence of data and knowledge, making them expressive frameworks for describing and integrating heterogeneous data in a coherent and interconnected manner. However, based on the Open World Assumption (OWA), the absence of information within KGs does not indicate falsity or non-existence; it merely reflects incompleteness. Inductive learning over KGs involves predicting new relationships based on existing statements in the KG, using either numerical or symbolic learning models. The Partial Completeness Assumption (PCA) heuristic efficiently guides inductive learning methods for Link Prediction (LP) by refining predictions about absent KG relationships. Nevertheless, numeric techniques- like KG embedding models- alone may fall short in accurately predicting missing information, particularly when it comes to capturing implicit knowledge and complex relationships. We propose a hybrid method named SPaRKLE that seamlessly integrates symbolic and numerical techniques, leveraging the PCA heuristic to capture implicit knowledge and enrich KGs. We empirically compare SPaRKLE with state-of-the-art KG embedding and symbolic models, using established benchmarks. Our experimental outcomes underscore the efficacy of this hybrid approach, as it harnesses the strengths of both paradigms. SPaRKLE is publicly available on GitHub1.

KW - Inductive Learning

KW - Knowledge Graphs

KW - Symbolic Learning

UR - http://www.scopus.com/inward/record.url?scp=85180366887&partnerID=8YFLogxK

U2 - 10.1145/3587259.3627547

DO - 10.1145/3587259.3627547

M3 - Conference contribution

AN - SCOPUS:85180366887

SN - 979-8-4007-0141-2

SP - 44

EP - 52

BT - K-CAP '23

Y2 - 5 December 2023 through 7 December 2023

ER -