TY - JOUR
T1 - Multi-label graph node classification with label attentive neighborhood convolution
AU - Zhou, Cangqi
AU - Chen, Hui
AU - Zhang, Jing
AU - Li, Qianmu
AU - Hu, Dianming
AU - Sheng, Victor S.
N1 - Funding Information:
This work was supported by the National Natural Science Foundation of China [Grant No. 61902186, No. 62076130 and No. 91846104], the Natural Science Foundation of Jiangsu Province [Grant No. BK20180463], the Fundamental Research Funds for the Central Universities [Grant No. 30920010008, No. 30918012204 and No. 30920041112], the National Key R&D Program of China [Grant No. 2020YFB1804604 and No. 2020YFB1805503], Industrial Internet Innovation and Development Project from Ministry of Industry and Information Technology of China, Jiangsu Province Modern Education Technology Research Project [Grant No. 84365], Jiangsu Province Major Technical Research Project "Information Security Simulation System", National Vocational Education Teacher Enterprise Practice Base "Integration of Industry and Education" Special Project, and the Scientific Research Project of Nanjing Vocational University of Industry Technology [Grant No. 2020SKYJ03].
Publisher Copyright:
© 2021 Elsevier Ltd
PY - 2021/10/15
Y1 - 2021/10/15
N2 - Learning with graph-structured data is of great significance for many practical applications. A crucial and fundamental task in graph learning is node classification. In reality, graph nodes are often encoded with various attributes. In addition, the task is usually multi-labeled in nature. In this paper, we tackle the problem of multi-label graph node classification by leveraging structure, attribute and label information simultaneously. Specifically, to obtain rational node feature representations, we propose an intuitive yet effective graph convolution module to aggregate local attribute information of a given node. Moreover, the homophily hypothesis motivates us to build a label attention module. By exploiting both input and output contextual representations, we utilize the additive attention mechanism and build a label-aware representation learning framework to measure the compatibility between pairs of node embeddings and label embeddings. The proposed novel neural network-based multi-label classification method has been verified by extensive experiments conducted on five publicly available benchmark datasets, including both attributed and non-attributed networks. The results demonstrate the effectiveness of the proposed model with respect to micro-F1, macro-F1 and Hamming loss, compared with several state-of-the-art methods, including two relational neighbor classifiers and several popular graph neural network models.
AB - Learning with graph-structured data is of great significance for many practical applications. A crucial and fundamental task in graph learning is node classification. In reality, graph nodes are often encoded with various attributes. In addition, the task is usually multi-labeled in nature. In this paper, we tackle the problem of multi-label graph node classification by leveraging structure, attribute and label information simultaneously. Specifically, to obtain rational node feature representations, we propose an intuitive yet effective graph convolution module to aggregate local attribute information of a given node. Moreover, the homophily hypothesis motivates us to build a label attention module. By exploiting both input and output contextual representations, we utilize the additive attention mechanism and build a label-aware representation learning framework to measure the compatibility between pairs of node embeddings and label embeddings. The proposed novel neural network-based multi-label classification method has been verified by extensive experiments conducted on five publicly available benchmark datasets, including both attributed and non-attributed networks. The results demonstrate the effectiveness of the proposed model with respect to micro-F1, macro-F1 and Hamming loss, compared with several state-of-the-art methods, including two relational neighbor classifiers and several popular graph neural network models.
KW - Attention mechanism
KW - Graph convolution
KW - Graph node classification
KW - Multi-label classification
UR - http://www.scopus.com/inward/record.url?scp=85105888722&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2021.115063
DO - 10.1016/j.eswa.2021.115063
M3 - Article
AN - SCOPUS:85105888722
VL - 180
JO - Expert Systems with Applications
JF - Expert Systems with Applications
SN - 0957-4174
M1 - 115063
ER -