Neural (Tangent Kernel) Collapse

Mariia Seleznova*, Dana Weitzner, Raja Giryes, Gitta Kutyniok, Hung Hsu Chou

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

This work bridges two important concepts: the Neural Tangent Kernel (NTK), which captures the evolution of deep neural networks (DNNs) during training, and the Neural Collapse (NC) phenomenon, which refers to the emergence of symmetry and structure in the last-layer features of well-trained classification DNNs. We adopt the natural assumption that the empirical NTK develops a block structure aligned with the class labels, i.e., samples within the same class have stronger correlations than samples from different classes. Under this assumption, we derive the dynamics of DNNs trained with mean squared error (MSE) loss and break them into interpretable phases. Moreover, we identify an invariant that captures the essence of the dynamics, and use it to prove the emergence of NC in DNNs with block-structured NTK. We provide large-scale numerical experiments on three common DNN architectures and three benchmark datasets to support our theory.
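The block-structure assumption and the resulting training dynamics can be illustrated with a toy numerical sketch (this is an illustrative assumption-driven example, not the paper's code): we build a kernel whose entries take three levels (same sample, same class, different class) and run the kernel gradient-flow dynamics for MSE loss, df/dt = -K(f - y). The correlation levels `kappa_same`, `kappa_class`, `kappa_diff` are hypothetical values chosen only for demonstration.

```python
import numpy as np

# Toy block-structured empirical NTK, as assumed in the abstract:
# stronger correlations within a class than across classes.
C, m = 3, 4                       # number of classes, samples per class
n = C * m
kappa_same, kappa_class, kappa_diff = 2.0, 1.0, 0.1  # assumed levels

labels = np.repeat(np.arange(C), m)
K = np.full((n, n), kappa_diff)               # different-class entries
K[labels[:, None] == labels[None, :]] = kappa_class  # same-class entries
np.fill_diagonal(K, kappa_same)               # diagonal (same sample)

# Kernel gradient flow for MSE loss, df/dt = -K (f - y),
# discretized with a small step size.
Y = np.eye(C)[labels]             # one-hot targets, shape (n, C)
F = np.zeros((n, C))              # outputs at initialization
lr = 0.05
for _ in range(2000):
    F -= lr * K @ (F - Y)

# With this positive-definite block kernel the outputs converge to the
# targets, so same-class outputs coincide (a collapse-like structure).
print(np.allclose(F, Y, atol=1e-3))
```

Because the block kernel here is positive definite, the linearized dynamics contract toward the targets; the paper's analysis decomposes such dynamics into interpretable phases and relates them to NC.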

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 36
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023, New Orleans, United States
Duration: 10 Dec 2023 – 16 Dec 2023

Funding

Funders (funder numbers where available):
Munich Center for Machine Learning
Deutscher Akademischer Austauschdienst
Konrad Zuse School of Excellence in Reliable AI
Bundesministerium für Bildung und Forschung
LMU-TAU
International Key Cooperation Tel Aviv University
Deutsche Forschungsgemeinschaft: KU 1446/32-1, KU 1446/31-1, DFG-SFB/TR 109, DFG-SPP-2298
ERC-stg SPADE: 757497
