NATURAL STATISTICS OF NETWORK ACTIVATIONS AND IMPLICATIONS FOR KNOWLEDGE DISTILLATION

Michael Rotman, Lior Wolf

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In a manner analogous to the study of natural image statistics, we study the natural statistics of deep neural network activations at various layers. As we show, these statistics, like image statistics, follow a power law. We also show, both analytically and empirically, that the exponent of this power law grows linearly with depth. As a direct implication of our findings, we present a method for performing Knowledge Distillation (KD). While classical KD methods consider the logits of the teacher network, more recent methods obtain a leap in performance by considering the activation maps; these methods, however, rely on metrics that are suited to comparing images. We propose to employ two additional loss terms that are based on the spectral properties of the intermediate activation maps. The proposed method obtains state-of-the-art results on multiple image recognition KD benchmarks.
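The power-law behaviour the abstract describes can be illustrated with a small sketch. This is not the authors' implementation; it only shows the general idea of estimating a power-law exponent from an activation map's singular-value spectrum (assuming singular values decay as s_k ∝ k^(-α) and fitting α by least squares in log-log space):

```python
import numpy as np

def spectral_exponent(activation, eps=1e-12):
    """Estimate the power-law exponent of a 2D activation map's spectrum.

    Assumes singular values follow s_k ~ k^(-alpha); alpha is recovered
    by a least-squares line fit in log-log coordinates.
    """
    s = np.linalg.svd(activation, compute_uv=False)
    k = np.arange(1, len(s) + 1)
    # Fit log s_k = -alpha * log k + c; the slope gives -alpha.
    slope, _ = np.polyfit(np.log(k), np.log(s + eps), 1)
    return -slope

# Synthetic "activation" with a known power-law spectrum (alpha = 1).
rng = np.random.default_rng(0)
n = 64
u, _ = np.linalg.qr(rng.normal(size=(n, n)))
v, _ = np.linalg.qr(rng.normal(size=(n, n)))
s = 1.0 / np.arange(1, n + 1)      # s_k = k^(-1)
act = (u * s) @ v.T                # matrix with singular values s
alpha = spectral_exponent(act)
print(round(alpha, 2))             # recovers alpha close to 1.0
```

A spectral KD loss term in the spirit of the abstract would then compare such spectral statistics between teacher and student activation maps, rather than comparing the maps pixel-wise.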

Original language: English
Title of host publication: 2021 IEEE International Conference on Image Processing, ICIP 2021 - Proceedings
Publisher: IEEE Computer Society
Pages: 399-403
Number of pages: 5
ISBN (Electronic): 9781665441155
DOIs
State: Published - 2021
Event: 2021 IEEE International Conference on Image Processing, ICIP 2021 - Anchorage, United States
Duration: 19 Sep 2021 – 22 Sep 2021

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2021-September
ISSN (Print): 1522-4880

Conference

Conference: 2021 IEEE International Conference on Image Processing, ICIP 2021
Country/Territory: United States
City: Anchorage
Period: 19/09/21 – 22/09/21

Keywords

  • Image statistics
  • Knowledge distillation
