Multimodal Data Enhanced Representation Learning for Knowledge Graphs

Zikang Wang, Linjing Li, Qiudan Li, Daniel Zeng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Knowledge graphs, or knowledge bases, play an important role in a variety of artificial intelligence applications. In both research on and application of knowledge graphs, knowledge representation learning is one of the fundamental tasks. Existing representation learning approaches are mainly based on the structural knowledge between entities and relations, while knowledge about the entities themselves is largely ignored. Although a few approaches integrate entity knowledge while learning representations, they lack the flexibility to handle multiple modalities. To tackle this problem, we propose a new representation learning method, TransAE, which combines a multimodal autoencoder with TransE, a simple and effective representation learning model for knowledge graphs. In TransAE, the hidden layer of the autoencoder serves as the entity representation in the TransE model, so the final representation encodes not only structural knowledge but also multimodal knowledge, such as visual and textual knowledge. Compared with traditional methods based only on structural knowledge, TransAE significantly improves performance on link prediction and triplet classification. In addition, TransAE can learn representations for entities outside the knowledge base in a zero-shot manner. Experiments on various tasks demonstrate the effectiveness of the proposed TransAE method.
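To make the described architecture concrete, the following is a minimal PyTorch sketch of the idea in the abstract, not the authors' reference implementation: an autoencoder over concatenated visual and textual entity features whose hidden layer doubles as the entity embedding in a TransE-style translation score. Feature dimensions, the activation, and the margin value are illustrative assumptions.

```python
# Sketch of the TransAE idea (assumed details; not the authors' code):
# the autoencoder's hidden layer is reused as the entity embedding for TransE.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransAE(nn.Module):
    def __init__(self, num_relations, visual_dim=4096, text_dim=300, embed_dim=100):
        super().__init__()
        input_dim = visual_dim + text_dim                 # concatenated multimodal features
        self.encoder = nn.Linear(input_dim, embed_dim)    # hidden layer = entity embedding
        self.decoder = nn.Linear(embed_dim, input_dim)    # reconstructs the multimodal input
        self.rel_emb = nn.Embedding(num_relations, embed_dim)

    def encode(self, visual_feat, text_feat):
        x = torch.cat([visual_feat, text_feat], dim=-1)
        return torch.sigmoid(self.encoder(x))             # entity representation

    def forward(self, head_feats, rel_idx, tail_feats):
        h = self.encode(*head_feats)
        t = self.encode(*tail_feats)
        r = self.rel_emb(rel_idx)
        # TransE score: smaller ||h + r - t|| means a more plausible triple
        score = torch.norm(h + r - t, p=2, dim=-1)
        # Reconstruction loss keeps multimodal information in the embedding
        recon = self.decoder(h)
        recon_loss = F.mse_loss(recon, torch.cat(head_feats, dim=-1))
        return score, recon_loss

def margin_loss(pos_score, neg_score, margin=1.0):
    # Margin-based ranking loss over corrupted (negative) triples, as in TransE
    return F.relu(margin + pos_score - neg_score).mean()
```

Because entity embeddings are computed from multimodal features rather than looked up by index, the same encoder can in principle produce representations for entities never seen during training, which is the zero-shot capability mentioned in the abstract.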

Original language: English (US)
Title of host publication: 2019 International Joint Conference on Neural Networks, IJCNN 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728119854
DOIs: https://doi.org/10.1109/IJCNN.2019.8852079
State: Published - Jul 2019
Externally published: Yes
Event: 2019 International Joint Conference on Neural Networks, IJCNN 2019 - Budapest, Hungary
Duration: Jul 14 2019 - Jul 19 2019

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2019-July

Conference

Conference: 2019 International Joint Conference on Neural Networks, IJCNN 2019
Country: Hungary
City: Budapest
Period: 7/14/19 - 7/19/19

Fingerprint

Knowledge representation
Artificial intelligence
Experiments

Keywords

  • knowledge graph
  • multimodal
  • representation learning

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Wang, Z., Li, L., Li, Q., & Zeng, D. (2019). Multimodal Data Enhanced Representation Learning for Knowledge Graphs. In 2019 International Joint Conference on Neural Networks, IJCNN 2019 [8852079] (Proceedings of the International Joint Conference on Neural Networks; Vol. 2019-July). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2019.8852079

Multimodal Data Enhanced Representation Learning for Knowledge Graphs. / Wang, Zikang; Li, Linjing; Li, Qiudan; Zeng, Daniel.

2019 International Joint Conference on Neural Networks, IJCNN 2019. Institute of Electrical and Electronics Engineers Inc., 2019. 8852079 (Proceedings of the International Joint Conference on Neural Networks; Vol. 2019-July).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Wang, Z, Li, L, Li, Q & Zeng, D 2019, Multimodal Data Enhanced Representation Learning for Knowledge Graphs. in 2019 International Joint Conference on Neural Networks, IJCNN 2019., 8852079, Proceedings of the International Joint Conference on Neural Networks, vol. 2019-July, Institute of Electrical and Electronics Engineers Inc., 2019 International Joint Conference on Neural Networks, IJCNN 2019, Budapest, Hungary, 7/14/19. https://doi.org/10.1109/IJCNN.2019.8852079
Wang Z, Li L, Li Q, Zeng D. Multimodal Data Enhanced Representation Learning for Knowledge Graphs. In 2019 International Joint Conference on Neural Networks, IJCNN 2019. Institute of Electrical and Electronics Engineers Inc. 2019. 8852079. (Proceedings of the International Joint Conference on Neural Networks). https://doi.org/10.1109/IJCNN.2019.8852079
Wang, Zikang ; Li, Linjing ; Li, Qiudan ; Zeng, Daniel. / Multimodal Data Enhanced Representation Learning for Knowledge Graphs. 2019 International Joint Conference on Neural Networks, IJCNN 2019. Institute of Electrical and Electronics Engineers Inc., 2019. (Proceedings of the International Joint Conference on Neural Networks).
@inproceedings{f843447e96ca49c28c25d23a2275886f,
title = "Multimodal Data Enhanced Representation Learning for Knowledge Graphs",
abstract = "Knowledge graphs, or knowledge bases, play an important role in a variety of artificial intelligence applications. In both research on and application of knowledge graphs, knowledge representation learning is one of the fundamental tasks. Existing representation learning approaches are mainly based on the structural knowledge between entities and relations, while knowledge about the entities themselves is largely ignored. Although a few approaches integrate entity knowledge while learning representations, they lack the flexibility to handle multiple modalities. To tackle this problem, we propose a new representation learning method, TransAE, which combines a multimodal autoencoder with TransE, a simple and effective representation learning model for knowledge graphs. In TransAE, the hidden layer of the autoencoder serves as the entity representation in the TransE model, so the final representation encodes not only structural knowledge but also multimodal knowledge, such as visual and textual knowledge. Compared with traditional methods based only on structural knowledge, TransAE significantly improves performance on link prediction and triplet classification. In addition, TransAE can learn representations for entities outside the knowledge base in a zero-shot manner. Experiments on various tasks demonstrate the effectiveness of the proposed TransAE method.",
keywords = "knowledge graph, multimodal, representation learning",
author = "Zikang Wang and Linjing Li and Qiudan Li and Daniel Zeng",
year = "2019",
month = "7",
doi = "10.1109/IJCNN.2019.8852079",
language = "English (US)",
series = "Proceedings of the International Joint Conference on Neural Networks",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
booktitle = "2019 International Joint Conference on Neural Networks, IJCNN 2019",

}

TY - GEN

T1 - Multimodal Data Enhanced Representation Learning for Knowledge Graphs

AU - Wang, Zikang

AU - Li, Linjing

AU - Li, Qiudan

AU - Zeng, Daniel

PY - 2019/7

Y1 - 2019/7

N2 - Knowledge graphs, or knowledge bases, play an important role in a variety of artificial intelligence applications. In both research on and application of knowledge graphs, knowledge representation learning is one of the fundamental tasks. Existing representation learning approaches are mainly based on the structural knowledge between entities and relations, while knowledge about the entities themselves is largely ignored. Although a few approaches integrate entity knowledge while learning representations, they lack the flexibility to handle multiple modalities. To tackle this problem, we propose a new representation learning method, TransAE, which combines a multimodal autoencoder with TransE, a simple and effective representation learning model for knowledge graphs. In TransAE, the hidden layer of the autoencoder serves as the entity representation in the TransE model, so the final representation encodes not only structural knowledge but also multimodal knowledge, such as visual and textual knowledge. Compared with traditional methods based only on structural knowledge, TransAE significantly improves performance on link prediction and triplet classification. In addition, TransAE can learn representations for entities outside the knowledge base in a zero-shot manner. Experiments on various tasks demonstrate the effectiveness of the proposed TransAE method.

AB - Knowledge graphs, or knowledge bases, play an important role in a variety of artificial intelligence applications. In both research on and application of knowledge graphs, knowledge representation learning is one of the fundamental tasks. Existing representation learning approaches are mainly based on the structural knowledge between entities and relations, while knowledge about the entities themselves is largely ignored. Although a few approaches integrate entity knowledge while learning representations, they lack the flexibility to handle multiple modalities. To tackle this problem, we propose a new representation learning method, TransAE, which combines a multimodal autoencoder with TransE, a simple and effective representation learning model for knowledge graphs. In TransAE, the hidden layer of the autoencoder serves as the entity representation in the TransE model, so the final representation encodes not only structural knowledge but also multimodal knowledge, such as visual and textual knowledge. Compared with traditional methods based only on structural knowledge, TransAE significantly improves performance on link prediction and triplet classification. In addition, TransAE can learn representations for entities outside the knowledge base in a zero-shot manner. Experiments on various tasks demonstrate the effectiveness of the proposed TransAE method.

KW - knowledge graph

KW - multimodal

KW - representation learning

UR - http://www.scopus.com/inward/record.url?scp=85073242679&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85073242679&partnerID=8YFLogxK

U2 - 10.1109/IJCNN.2019.8852079

DO - 10.1109/IJCNN.2019.8852079

M3 - Conference contribution

AN - SCOPUS:85073242679

T3 - Proceedings of the International Joint Conference on Neural Networks

BT - 2019 International Joint Conference on Neural Networks, IJCNN 2019

PB - Institute of Electrical and Electronics Engineers Inc.

ER -