Abstract
In the study of knowledge graph embedding (KGE), self-adversarial negative sampling is a recently proposed technique based on the attention mechanism that pays more attention to negative triplets with higher embedding scores. Unfortunately, the technique also pays more attention to false negative triplets with high embedding scores, which is unreasonable and often degrades the performance of the KGE model. To alleviate this degradation, this paper proposes a new attention strategy that gradually decreases the attention paid to high-score false negative triplets. Under the new strategy, the attention difference between high-score and low-score negative triplets narrows as the KGE model's performance improves, which is more reasonable during training. Experimental results on TransE, DistMult, RotatE, and PairRE show that the proposed strategy yields significant performance improvements for KGE models on link prediction and triplet classification.
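For readers unfamiliar with the mechanism: self-adversarial negative sampling (introduced alongside RotatE) weights each sampled negative triplet by a softmax over the scores of the negatives in the batch, scaled by a temperature. The sketch below is a minimal PyTorch illustration of that weighting; the `annealed_alpha` schedule is a hypothetical stand-in for the attention-narrowing idea, since the abstract does not specify the paper's actual strategy.

```python
import torch

def self_adversarial_weights(neg_scores: torch.Tensor, alpha: float) -> torch.Tensor:
    """Self-adversarial weights: a softmax over the scores of the sampled
    negative triplets, with temperature alpha. Higher-scoring (harder)
    negatives get more attention; detached so the weights carry no gradient."""
    return torch.softmax(alpha * neg_scores, dim=-1).detach()

def annealed_alpha(alpha0: float, step: int, total_steps: int) -> float:
    """Hypothetical linear annealing of the temperature toward zero, so the
    attention gap between high- and low-score negatives narrows as training
    progresses (at alpha = 0 the weights become uniform)."""
    return alpha0 * (1.0 - step / total_steps)

# Toy usage: one batch of four negative-triplet scores (higher = more plausible).
neg_scores = torch.tensor([[3.0, 1.0, 0.5, -2.0]])
for step in (0, 5_000, 10_000):
    alpha = annealed_alpha(alpha0=1.0, step=step, total_steps=10_000)
    print(step, self_adversarial_weights(neg_scores, alpha))
# The printed weights flatten toward 0.25 each as alpha decays to 0.
```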
Original language | English |
---|---|
Pages (from-to) | 26418-26438 |
Number of pages | 21 |
Journal | Applied Intelligence |
Volume | 53 |
Issue number | 22 |
Early online date | 23 Aug 2023 |
DOIs | |
Publication status | Published - Nov 2023 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- Attention mechanism
- Embedding
- Knowledge graph
- Negative sampling