TY - JOUR

T1 - Quantum probability-inspired graph neural network for document representation and classification

AU - Yan, Peng

AU - Li, Linjing

AU - Jin, Miaotianzi

AU - Zeng, Daniel

N1 - Funding Information:
This work was supported in part by the National Key Research and Development Program of China under Grant 2020AAA0103405, the National Natural Science Foundation of China under Grant 71621002, the Strategic Priority Research Program of Chinese Academy of Sciences under Grant XDA27030100, as well as Shenzhen Longhua District Science and Technology Innovation Fund under Grant 10162a20200617b70da63.
Publisher Copyright:
© 2021 Elsevier B.V.

PY - 2021/7/20

Y1 - 2021/7/20

N2 - Recent studies have found that text can be represented in a Hilbert space through a neural network driven by quantum probability, which provides a unified representation of texts with different granularities without losing performance on downstream tasks. However, these quantum probability-inspired methods focus only on intra-document semantics and do not model global structural information. In this paper, we explore the potential of combining quantum probability with graph neural networks, and propose a quantum probability-inspired graph neural network model that captures global structural information about interactions between documents for document representation and classification. We build a document interaction graph for a given corpus based on document-word relations and frequency information, then learn a graph neural network driven by quantum probability on the defined graph. First, the proposed model represents each document node in the graph as a superposition state in a Hilbert space. The model then computes density matrix representations for nodes to encode document interactions as mixed states. Finally, it computes classification probabilities by performing quantum measurement on the mixed states. Experiments on four document classification benchmarks show that the proposed model outperforms a variety of classical neural network models as well as the previous quantum probability-inspired model, with a much smaller parameter size. Extended analyses also demonstrate the robustness of the proposed model with limited training data and its ability to learn semantically distinguishable document representations.

KW - Document classification

KW - Document representation

KW - Graph neural network

KW - Natural language processing

KW - Quantum probability

UR - http://www.scopus.com/inward/record.url?scp=85103673229&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85103673229&partnerID=8YFLogxK

U2 - 10.1016/j.neucom.2021.02.060

DO - 10.1016/j.neucom.2021.02.060

M3 - Article

AN - SCOPUS:85103673229

VL - 445

SP - 276

EP - 286

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

ER -