AUTHOR=Ye Zhonglin, Li Zhuoran, Li Gege, Zhao Haixing
TITLE=Dual-channel deep graph convolutional neural networks
JOURNAL=Frontiers in Artificial Intelligence
VOLUME=Volume 7 - 2024
YEAR=2024
URL=https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2024.1290491
DOI=10.3389/frai.2024.1290491
ISSN=2624-8212
ABSTRACT=Dual-channel graph convolutional neural networks based on hybrid features jointly model different features of a network so that the features can learn from each other, improving the performance of downstream machine learning tasks. However, current dual-channel graph convolutional neural networks are limited in the number of convolution layers they can use, which hinders further performance gains. When graph convolutional neural networks stack multiple graph convolution operations, over-smoothing occurs, so performance degrades as the number of graph convolutional layers increases. Inspired by the success of residual connections in convolutional neural networks, this paper applies residual connections to dual-channel graph convolutional neural networks and thereby increases their depth. The result is a dual-channel deep graph convolutional neural network (D2GCN), which effectively avoids over-smoothing and improves model performance. D2GCN is evaluated on the CiteSeer, DBLP, and SDBLP datasets; the results show that it outperforms the comparison algorithms on node classification tasks.

Owing to their ability to model dependencies in graph data, Graph Convolutional Networks (GCNs) have become one of the most active research fields. With the wide application of GCNs in data mining, such as recommendation systems [4, 5] and point cloud segmentation [6, 7], researchers are paying increasing attention to improving GCN performance. Another key factor behind the success of CNNs is the ability to design and train deeper models. However, increasing the number of graph convolutional layers in a GCN can cause vanishing gradients during backpropagation as well as over-smoothing, i.e., the features of all nodes in the graph converging to the same value. Thus, GCNs are generally shallow structures containing two to three graph convolutional layers. Shallow structures limit model performance because they cannot mine higher-order node information. Vanishing gradients therefore pose a challenge for designing deep GCNs.
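To make the over-smoothing claim concrete, here is a minimal sketch (not from the paper; NumPy and a randomly generated graph are assumed) of what happens when the symmetric normalized propagation matrix used by GCNs, D^-1/2 (A+I) D^-1/2, is applied repeatedly: node representations become nearly collinear, so their pairwise cosine similarity approaches 1 and nodes become indistinguishable.

```python
# Minimal sketch of over-smoothing (illustrative, not the paper's code).
# Repeatedly applying the GCN propagation matrix P = D^-1/2 (A+I) D^-1/2
# makes node representations nearly collinear: pairwise cosine
# similarity between nodes approaches 1 as the number of hops grows.
import numpy as np

rng = np.random.default_rng(0)
n, d = 12, 8
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # random undirected graph
A_hat = A + np.eye(n)                         # add self-loops
deg = A_hat.sum(axis=1)
P = A_hat / np.sqrt(np.outer(deg, deg))       # D^-1/2 (A+I) D^-1/2

X = rng.standard_normal((n, d))               # initial node features
for k in (1, 2, 4, 8, 16, 32):
    H = np.linalg.matrix_power(P, k) @ X      # k propagation steps
    Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
    sim = Hn @ Hn.T                           # pairwise cosine similarity
    print(f"k={k:2d}  mean pairwise cosine similarity = "
          f"{sim[~np.eye(n, dtype=bool)].mean():.3f}")
```

For a connected graph, the mean pairwise cosine similarity climbs toward 1 as k grows, which is exactly the smoothing phenomenon the abstract describes.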
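The paper's remedy is residual connections. Below is a hedged sketch, assuming PyTorch; the class and parameter names are illustrative, this is not the authors' D2GCN code, and the dual-channel fusion step is omitted for brevity. Each layer adds its input back to its output, so deeper stacks keep per-layer information from washing out and gradients have a short path during backpropagation.

```python
# Hedged sketch (assumed PyTorch; illustrative names, not the authors'
# D2GCN implementation; the dual-channel fusion step is omitted).
# A residual connection adds each layer's input back to its output,
# the ingredient that lets GCN stacks grow deeper without
# over-smoothing or vanishing gradients.
import torch
import torch.nn as nn

class ResidualGCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        # p is the precomputed normalized adjacency D^-1/2 (A+I) D^-1/2
        h = torch.relu(self.lin(p @ x))       # one graph convolution
        return h + x                          # residual connection

class DeepGCN(nn.Module):
    """A stack of residual GCN layers; depth is a free hyperparameter."""
    def __init__(self, in_dim: int, hidden: int, n_layers: int, n_classes: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, hidden)
        self.layers = nn.ModuleList(
            [ResidualGCNLayer(hidden) for _ in range(n_layers)]
        )
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, x: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        x = self.proj(x)
        for layer in self.layers:
            x = layer(x, p)                   # depth no longer capped at 2-3
        return self.out(x)

# usage (hypothetical shapes): logits = DeepGCN(1433, 64, 16, 6)(features, P)
```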