Abstract
Research on ensemble learning in knowledge graph embedding (KGE) shows that combining multiple individual KGE models can improve performance on knowledge graph completion. However, existing KGE ensemble methods neglect model diversity because they train the individual models independently, without any interaction during training. To create rich model diversity, we propose a novel training method for ensemble bilinear models (EBM) for knowledge graph completion. EBM uses a weighted loss that lets the individual KGE models interact during training, so that each relation in the knowledge graph is automatically modeled by the most suitable individual model. Experiments on knowledge graph completion show that EBM achieves richer diversity and outperforms both single bilinear models and ensemble methods trained without interaction.
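The abstract does not give the exact formulation, but a minimal PyTorch sketch of the idea might look as follows: an ensemble of bilinear scorers (here DistMult and RESCAL, chosen as illustrative examples) trained jointly under a relation-weighted loss. The class name `EnsembleBilinear`, the choice of scorers, and the softmax weighting scheme are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EnsembleBilinear(nn.Module):
    """Sketch of an ensemble of two bilinear KGE scorers with a weighted loss."""

    def __init__(self, n_entities, n_relations, dim=100):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        # Two bilinear scorers: DistMult (diagonal relation matrix) and RESCAL (full matrix).
        self.rel_diag = nn.Embedding(n_relations, dim)        # DistMult parameters
        self.rel_full = nn.Embedding(n_relations, dim * dim)  # RESCAL parameters
        # One mixing weight per relation and per scorer; softmax keeps them normalized.
        self.mix = nn.Embedding(n_relations, 2)
        self.dim = dim

    def scores(self, h, r, t):
        eh, et = self.ent(h), self.ent(t)
        s_distmult = (eh * self.rel_diag(r) * et).sum(-1)
        W = self.rel_full(r).view(-1, self.dim, self.dim)
        s_rescal = (torch.bmm(eh.unsqueeze(1), W).squeeze(1) * et).sum(-1)
        return torch.stack([s_distmult, s_rescal], dim=-1)    # shape: (batch, 2)

    def weighted_loss(self, h, r, t, labels):
        # Per-scorer binary cross-entropy, combined with relation-specific weights,
        # so the scorer better suited to each relation receives a larger share.
        per_model = self.scores(h, r, t)                      # (batch, 2)
        target = labels.float().unsqueeze(-1).expand_as(per_model)
        losses = F.binary_cross_entropy_with_logits(per_model, target, reduction="none")
        weights = F.softmax(self.mix(r), dim=-1)              # (batch, 2)
        return (weights * losses).sum(-1).mean()
```

Training on (head, relation, tail) triples with binary labels would then minimize `weighted_loss`; because the scorers share entity embeddings (an assumption of this sketch) and the mixing weights are learned per relation, the individual models interact during training rather than being fit independently.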
Original language | English |
---|---|
Title of host publication | Proceedings of 2023 International Conference on Machine Learning and Cybernetics, ICMLC 2023 |
Publisher | IEEE Computer Society |
Pages | 26-30 |
Number of pages | 5 |
ISBN (Electronic) | 9798350303780 |
DOIs | |
Publication status | Published - 2023 |
Externally published | Yes |
Event | 2023 International Conference on Machine Learning and Cybernetics, ICMLC 2023 - Adelaide, Australia; Duration: 9 Jul 2023 → 11 Jul 2023 |
Publication series
Name | Proceedings - International Conference on Machine Learning and Cybernetics |
---|---|
ISSN (Print) | 2160-133X |
ISSN (Electronic) | 2160-1348 |
Conference
Conference | 2023 International Conference on Machine Learning and Cybernetics, ICMLC 2023 |
---|---|
Country/Territory | Australia |
City | Adelaide |
Period | 9/07/23 → 11/07/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- Ensemble learning
- Knowledge graph completion
- Knowledge graph embedding
- Weighted loss