A multi-level text representation model within background knowledge based on human cognitive process for big data analysis

Xiao Wei, Jun Zhang, Dajun Zeng, Qing Li

Research output: Contribution to journal › Article

4 Scopus citations

Abstract

Text representation is among the most fundamental tasks in text comprehension, processing, and search. Various approaches have been proposed to mine the semantics in texts and represent them. However, most of them focus only on mining semantics from the text itself, while few take background knowledge into consideration, even though it is very important to text understanding. In this paper, on the basis of the human cognitive process, we propose a multi-level text representation model within background knowledge, called TRMBK. It is composed of three levels: machine surface code, machine text base, and machine situational model. All three levels can be constructed automatically to acquire semantics both inside and outside the texts. We also propose a method to establish background knowledge automatically and to support the text comprehension at hand. Finally, experiments and comparisons are presented to demonstrate the improved performance of TRMBK.
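
The abstract names three representation levels (machine surface code, machine text base, machine situational model) plus an automatically built background knowledge store. The sketch below is only an illustrative Python container for those levels, not the paper's actual implementation; the field contents (tokens, propositions, background links) are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SurfaceCode:
    """Machine surface code: the literal wording/structure of the text (assumed to be tokens here)."""
    tokens: List[str] = field(default_factory=list)


@dataclass
class TextBase:
    """Machine text base: semantics mined from the text itself (assumed to be propositions here)."""
    propositions: List[str] = field(default_factory=list)


@dataclass
class SituationalModel:
    """Machine situational model: text semantics linked to background knowledge (assumed mapping)."""
    background_links: Dict[str, List[str]] = field(default_factory=dict)


@dataclass
class TRMBKRepresentation:
    """Container bundling the three levels described in the abstract."""
    surface_code: SurfaceCode
    text_base: TextBase
    situational_model: SituationalModel


# Example usage with toy values (purely illustrative)
doc = TRMBKRepresentation(
    surface_code=SurfaceCode(tokens=["text", "representation", "model"]),
    text_base=TextBase(propositions=["model(text representation)"]),
    situational_model=SituationalModel(
        background_links={"text representation": ["NLP", "semantics"]}
    ),
)
```
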

Original language: English (US)
Pages (from-to): 1475-1487
Number of pages: 13
Journal: Cluster Computing
Volume: 19
Issue number: 3
DOIs
State: Published - Sep 1 2016
Externally published: Yes

Keywords

  • Background knowledge
  • Human cognitive process
  • Semantics
  • Situational model
  • Surface code
  • Text base
  • Text comprehension
  • Text representation

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
