Degree-based stratification of nodes in Graph Neural Networks

Ameen Ali, Lior Wolf, Hakan Cevikalp

Research output: Contribution to journal › Conference article › peer-review


Despite much research, Graph Neural Networks (GNNs) still do not display the favorable scaling properties of other deep neural networks, such as Convolutional Neural Networks and Transformers. Previous work has identified issues such as oversmoothing of the latent representation and has suggested solutions such as skip connections and sophisticated normalization schemes. Here, we propose a different approach based on a stratification of the graph nodes. We motivate the view that the nodes of a graph can be stratified into those with a low degree and those with a high degree, and that the two groups are likely to behave differently. Based on this motivation, we modify the GNN architecture so that the weight matrices are learned separately for the nodes in each group. This simple-to-implement modification seems to improve performance across datasets and GNN methods. To verify that this increase in performance is not due merely to the added capacity, we also perform the same modification for random splits of the nodes, which does not lead to any improvement.
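The stratified update described in the abstract can be illustrated as a single message-passing layer in which each node's aggregated representation is transformed by one of two weight matrices, chosen by comparing the node's degree to a threshold. The following is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the function name, the mean-style aggregation with self-loops, and the degree threshold are all illustrative choices.

```python
import numpy as np

def degree_stratified_gcn_layer(A, X, W_low, W_high, threshold):
    """One GNN layer with degree-based weight stratification (illustrative).

    A:      (n, n) symmetric adjacency matrix (0/1 entries)
    X:      (n, d_in) node features
    W_low:  (d_in, d_out) weights for low-degree nodes
    W_high: (d_in, d_out) weights for high-degree nodes
    """
    deg = A.sum(axis=1)                      # node degrees (without self-loops)
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d_hat = A_hat.sum(axis=1, keepdims=True)
    H = (A_hat @ X) / d_hat                  # mean-aggregate neighbor features

    # Stratify: nodes with degree >= threshold use W_high, the rest use W_low.
    high = (deg >= threshold)[:, None]       # (n, 1) boolean mask, broadcasts
    out = np.where(high, H @ W_high, H @ W_low)
    return np.maximum(out, 0.0)              # ReLU nonlinearity
```

In a trained model, `W_low` and `W_high` would be learned parameters; the sketch only shows how the per-node choice between the two matrices replaces the single shared weight matrix of a standard GNN layer.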

Original language: English
Pages (from-to): 15-27
Number of pages: 13
Journal: Proceedings of Machine Learning Research
State: Published - 2023
Externally published: Yes
Event: 15th Asian Conference on Machine Learning, ACML 2023 - Istanbul, Turkey
Duration: 11 Nov 2023 - 14 Nov 2023


Funders: Tel Aviv University


Keywords:

    • graph neural networks
    • message passing
    • node degree


