
Knowledge Graph Embedding: An Overview

Xiou Ge, University of Southern California, USA (xiouge@usc.edu); Yun Cheng Wang, University of Southern California, USA; Bin Wang, Institute for Infocomm Research (I2R), A*STAR, Singapore; C.-C. Jay Kuo, University of Southern California, USA
 
Suggested Citation
Xiou Ge, Yun Cheng Wang, Bin Wang and C.-C. Jay Kuo (2024), "Knowledge Graph Embedding: An Overview", APSIPA Transactions on Signal and Information Processing: Vol. 13: No. 1, e1. http://dx.doi.org/10.1561/116.00000065

Publication Date: 12 Feb 2024
© 2024 X. Ge, Y. C. Wang, B. Wang and C. C. J. Kuo
 
Keywords
Knowledge graph embedding, Knowledge graph completion, Link prediction
 

Open Access

This article is published under the terms of the CC BY-NC license.

In this article:
Introduction 
Existing Models 
Unified Framework for Distance-based KGE with Affine Transformations: CompoundE and CompoundE3D 
Dataset and Evaluation 
Emerging Direction 
Conclusion 
Acknowledgments 
References 

Abstract

Many mathematical models have been leveraged to design embeddings that represent Knowledge Graph (KG) entities and relations for link prediction and many downstream tasks. These mathematically inspired models are not only highly scalable for inference in large KGs, but also offer explainable advantages in modeling different relation patterns that can be validated through both formal proofs and empirical results. In this paper, we provide a comprehensive overview of the current state of research in KG completion. In particular, we focus on two main branches of KG embedding (KGE) design: 1) distance-based methods and 2) semantic matching-based methods. We uncover connections between recently proposed models and identify an underlying trend that may help researchers devise novel and more effective models. Next, we delve into CompoundE and CompoundE3D, which draw inspiration from 2D and 3D affine operations, respectively, and encompass a broad spectrum of distance-based embedding techniques. We also discuss an emerging approach to KG completion that leverages pre-trained language models (PLMs) and textual descriptions of entities and relations, and offer insights into the integration of KGE methods with PLMs for KG completion.
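To make the distance-based family concrete, the sketch below contrasts a classical translation-based score with a compound affine score in the spirit of CompoundE. The head-side form and the exact parameterization of the relation-specific translation, rotation, and scaling operators ($\mathbf{T}_r$, $\mathbf{R}_r$, $\mathbf{S}_r$) are shown for illustration only and should not be read as the paper's definitive formulation.

$f_{\mathrm{TransE}}(h, r, t) = -\lVert \mathbf{h} + \mathbf{r} - \mathbf{t} \rVert$, where the relation acts as a translation of the head embedding toward the tail embedding.

$f_{\mathrm{CompoundE}}(h, r, t) = -\lVert \mathbf{T}_r \cdot \mathbf{R}_r \cdot \mathbf{S}_r \cdot \mathbf{h} - \mathbf{t} \rVert$, where the head embedding is transformed by a cascade of relation-specific translation, rotation, and scaling operators before being compared with the tail embedding; setting the rotation and scaling operators to the identity recovers the TransE score above.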

DOI: 10.1561/116.00000065