Neural (Tangent Kernel) Collapse

Mariia Seleznova*, Dana Weitzner, Raja Giryes, Gitta Kutyniok, Hung Hsu Chou

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

This work bridges two important concepts: the Neural Tangent Kernel (NTK), which captures the evolution of deep neural networks (DNNs) during training, and the Neural Collapse (NC) phenomenon, which refers to the emergence of symmetry and structure in the last-layer features of well-trained classification DNNs. We adopt the natural assumption that the empirical NTK develops a block structure aligned with the class labels, i.e., samples within the same class have stronger correlations than samples from different classes. Under this assumption, we derive the dynamics of DNNs trained with mean squared error (MSE) loss and break them into interpretable phases. Moreover, we identify an invariant that captures the essence of the dynamics, and use it to prove the emergence of NC in DNNs with block-structured NTK. We provide large-scale numerical experiments on three common DNN architectures and three benchmark datasets to support our theory.
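To make the block-structure assumption concrete, the following is a minimal, illustrative sketch (not the authors' code) that estimates an empirical NTK for a tiny scalar-output MLP in JAX and compares average within-class and between-class kernel entries on toy two-class data. The network sizes, the Gaussian toy clusters, and the scalar-output simplification are assumptions made here for brevity; the paper's kernel is matrix-valued over output coordinates, and the block structure it studies concerns networks during and after training, not only toy data at initialization.

```python
# Illustrative sketch: empirical NTK of a tiny MLP and its within-/between-class averages.
# All sizes and the toy data are assumptions for this example, not the paper's setup.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

key = jax.random.PRNGKey(0)
kw1, kw2, kx0, kx1 = jax.random.split(key, 4)

d_in, width, n_per_class = 10, 64, 20

# Tiny two-layer ReLU MLP with a scalar output head.
params = {
    "w1": jax.random.normal(kw1, (d_in, width)) / jnp.sqrt(d_in),
    "w2": jax.random.normal(kw2, (width, 1)) / jnp.sqrt(width),
}

def forward(params, x):
    h = jax.nn.relu(x @ params["w1"])
    return (h @ params["w2"]).squeeze(-1)

# Toy two-class data: two Gaussian clusters standing in for real inputs.
x0 = jax.random.normal(kx0, (n_per_class, d_in)) + 2.0
x1 = jax.random.normal(kx1, (n_per_class, d_in)) - 2.0
X = jnp.concatenate([x0, x1])
labels = jnp.array([0] * n_per_class + [1] * n_per_class)

# Empirical NTK: Theta[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>.
flat_params, unravel = ravel_pytree(params)
grad_f = jax.grad(lambda p, x: forward(unravel(p), x))
J = jax.vmap(lambda x: grad_f(flat_params, x))(X)   # (N, num_params)
ntk = J @ J.T                                       # (N, N)

# Compare average kernel values within vs. between classes (off-diagonal only).
same = labels[:, None] == labels[None, :]
off_diag = ~jnp.eye(len(labels), dtype=bool)
within = (ntk * (same & off_diag)).sum() / (same & off_diag).sum()
between = (ntk * ~same).sum() / (~same).sum()
print("mean within-class NTK :", within)
print("mean between-class NTK:", between)
```

A kernel satisfying the paper's assumption would show markedly larger within-class than between-class averages; on real architectures and datasets this is the empirical pattern the authors verify.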

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Editors: A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Publisher: Neural Information Processing Systems Foundation
ISBN (Electronic): 9781713899921
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: 10 Dec 2023 – 16 Dec 2023

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 36
ISSN (Print): 1049-5258

Conference

Conference: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Country/Territory: United States
City: New Orleans
Period: 10/12/23 – 16/12/23

Funding

Funders and funder numbers:
Munich Center for Machine Learning
Deutscher Akademischer Austauschdienst
Konrad Zuse School of Excellence in Reliable AI
Bundesministerium für Bildung und Forschung
LMU-TAU International Key Cooperation Tel Aviv University
Deutsche Forschungsgemeinschaft: KU 1446/32-1, KU 1446/31-1, DFG-SFB/TR 109, DFG-SPP-2298
ERC-stg SPADE: 757497
