New attention strategy for negative sampling in knowledge graph embedding

Si CEN, Xizhao WANG*, Xiaoying ZOU, Chao LIU, Guoquan DAI

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

Abstract

In knowledge graph embedding (KGE), self-adversarial negative sampling is a recently proposed technique based on the attention mechanism that pays more attention to negative triplets with higher embedding scores. Unfortunately, it also pays more attention to false negative triplets with high embedding scores, which is clearly unreasonable and often degrades the performance of the KGE model. To alleviate this degradation, this paper proposes a new attention strategy that gradually decreases the attention paid to high-score false negative triplets. Under the new strategy, the attention difference between high-score and low-score negative triplets narrows as the KGE model improves, which is more reasonable during training. Experimental results on TransE, DistMult, RotatE, and PairRE show that the proposed strategy yields significant performance improvements for KGE models on the tasks of link prediction and triplet classification.
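The abstract contrasts standard self-adversarial negative sampling, which weights each negative triplet by a softmax over embedding scores, with a strategy that narrows that weighting gap as training progresses. A minimal sketch of both ideas is below; the baseline weighting follows the usual self-adversarial formulation, while `narrowed_weights` is only a hypothetical illustration of the paper's idea (the linear temperature decay and the `progress` parameter are assumptions, not the paper's actual schedule).

```python
import math

def self_adversarial_weights(neg_scores, alpha=1.0):
    # Softmax attention over negative-triplet embedding scores:
    # higher-scoring negatives receive larger weights in the loss,
    # including (problematically) high-score false negatives.
    z = [alpha * s for s in neg_scores]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    total = sum(e)
    return [v / total for v in e]

def narrowed_weights(neg_scores, progress, alpha0=1.0):
    # Hypothetical sketch of the paper's strategy: shrink the attention
    # gap between high- and low-score negatives as training progresses.
    # `progress` in [0, 1] tracks training progress; a linear decay of
    # the softmax temperature is assumed here for illustration only.
    alpha = alpha0 * (1.0 - progress)
    return self_adversarial_weights(neg_scores, alpha)
```

With scores `[4.0, 2.0, 0.5]`, the weights at `progress=0.0` are strongly concentrated on the highest-scoring negative, while at `progress=0.9` they approach uniform, so a high-score false negative contributes less to the loss late in training.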

Original language: English
Pages (from-to): 26418-26438
Number of pages: 21
Journal: Applied Intelligence
Volume: 53
Issue number: 22
Early online date: 23 Aug 2023
DOIs
Publication status: Published - Nov 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Attention mechanism
  • Embedding
  • Knowledge graph
  • Negative sampling

