Feature overcorrelation in deep graph neural networks: A new perspective

W Jin, X Liu, Y Ma, C Aggarwal, J Tang - arXiv preprint arXiv:2206.07743, 2022 - arxiv.org
Recent years have witnessed remarkable success achieved by graph neural networks (GNNs) in many real-world applications such as recommendation and drug discovery. Despite this success, oversmoothing has been identified as one of the key issues that limit the performance of deep GNNs: as aggregation layers are stacked, the learned node representations become nearly indistinguishable. In this paper, we propose a new perspective on the performance degradation of deep GNNs, i.e., feature overcorrelation. Through empirical and theoretical study, we demonstrate the existence of feature overcorrelation in deeper GNNs and reveal potential reasons for this issue. To reduce feature correlation, we propose a general framework, DeCorr, which encourages GNNs to encode less redundant information. Extensive experiments demonstrate that DeCorr helps enable deeper GNNs and is complementary to existing techniques that tackle the oversmoothing issue.
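The abstract does not spell out how overcorrelation is quantified or how DeCorr's regularizer is formulated. Below is a minimal sketch, assuming overcorrelation is measured as the mean absolute Pearson correlation between pairs of embedding dimensions and that decorrelation is imposed as a weighted auxiliary loss added to the task objective; the names feature_correlation, gnn, and lam are hypothetical, not the paper's actual API.

```python
import torch

def feature_correlation(h: torch.Tensor) -> torch.Tensor:
    """Mean absolute Pearson correlation over all pairs of feature
    dimensions of node embeddings h (shape: num_nodes x num_features).
    Values near 1 indicate the dimensions encode redundant information."""
    h = h - h.mean(dim=0, keepdim=True)            # center each dimension
    h = h / (h.norm(dim=0, keepdim=True) + 1e-8)   # unit-normalize each column
    corr = h.t() @ h                               # d x d correlation matrix
    d = corr.size(0)
    off_diag_sum = corr.abs().sum() - corr.diagonal().abs().sum()
    return off_diag_sum / (d * (d - 1))            # average over d*(d-1) pairs

# Hypothetical usage: penalize overcorrelation alongside the task loss.
# h = gnn(x, edge_index)                           # embeddings from any deep GNN
# loss = task_loss + lam * feature_correlation(h)  # lam weights the regularizer
```

Under these assumptions, driving the off-diagonal correlations toward zero pushes the hidden dimensions toward encoding non-redundant information, which is the stated goal of DeCorr.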