TY - GEN
T1 - Gated Graph Neural Networks (GG-NNs) for Abstractive Multi-Comment Summarization
AU - Zhan, Huixin
AU - Zhang, Kun
AU - Hu, Chenyi
AU - Sheng, Victor S.
N1 - Funding Information:
Chenyi Hu is partially supported by US National Science Foundation through the grant award OIA 1946391.
Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Summarization of long sequences into a concise statement is a core problem in natural language processing, as it requires a non-trivial understanding of weakly structured text. Integrating crowdsourced comments from multiple users into a concise summary is even harder because (1) it requires transferring the weakly structured comments into structured knowledge, and (2) users' comments are informal and noisy. To capture the long-distance relationships in staggered long sentences, we propose a neural multi-comment summarization (MCS) system that incorporates sentence relationships via graph heuristics built on relation knowledge graphs, i.e., sentence relation graphs (SRG) and approximate discourse graphs (ADG). Motivated by the promising results of gated graph neural networks (GG-NNs) on highly structured data, we develop GG-NNs with a sequence encoder that incorporate an SRG or ADG to capture sentence relationships. Specifically, we employ the GG-NNs on both relation knowledge graphs, with sentence embeddings as the input node features and the graph heuristics as the edge weights. Through multiple layer-wise propagations, the GG-NNs generate a salience score for each sentence from high-level hidden sentence features. We then use a greedy heuristic to extract salient users' comments while avoiding the noise in comments. The experimental results show that the proposed MCS improves summarization performance both quantitatively and qualitatively.
AB - Summarization of long sequences into a concise statement is a core problem in natural language processing, as it requires a non-trivial understanding of weakly structured text. Integrating crowdsourced comments from multiple users into a concise summary is even harder because (1) it requires transferring the weakly structured comments into structured knowledge, and (2) users' comments are informal and noisy. To capture the long-distance relationships in staggered long sentences, we propose a neural multi-comment summarization (MCS) system that incorporates sentence relationships via graph heuristics built on relation knowledge graphs, i.e., sentence relation graphs (SRG) and approximate discourse graphs (ADG). Motivated by the promising results of gated graph neural networks (GG-NNs) on highly structured data, we develop GG-NNs with a sequence encoder that incorporate an SRG or ADG to capture sentence relationships. Specifically, we employ the GG-NNs on both relation knowledge graphs, with sentence embeddings as the input node features and the graph heuristics as the edge weights. Through multiple layer-wise propagations, the GG-NNs generate a salience score for each sentence from high-level hidden sentence features. We then use a greedy heuristic to extract salient users' comments while avoiding the noise in comments. The experimental results show that the proposed MCS improves summarization performance both quantitatively and qualitatively.
KW - Graph data structure
KW - Graph neural network
KW - Multi-comment summarization
UR - http://www.scopus.com/inward/record.url?scp=85125058953&partnerID=8YFLogxK
U2 - 10.1109/ICKG52313.2021.00050
DO - 10.1109/ICKG52313.2021.00050
M3 - Conference contribution
AN - SCOPUS:85125058953
T3 - Proceedings - 12th IEEE International Conference on Big Knowledge, ICBK 2021
SP - 323
EP - 330
BT - Proceedings - 12th IEEE International Conference on Big Knowledge, ICBK 2021
A2 - Gong, Zhiguo
A2 - Li, Xue
A2 - Oguducu, Sule Gunduz
A2 - Chen, Lei
A2 - Manjon, Baltasar Fernandez
A2 - Wu, Xindong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 12th IEEE International Conference on Big Knowledge, ICBK 2021
Y2 - 7 December 2021 through 8 December 2021
ER -