TY - JOUR
T1 - MetAdapt: Meta-learned task-adaptive architecture for few-shot classification
AU - Doveh, Sivan
AU - Schwartz, Eli
AU - Xue, Chao
AU - Feris, Rogerio
AU - Bronstein, Alex
AU - Giryes, Raja
AU - Karlinsky, Leonid
N1 - Publisher Copyright:
© 2021
PY - 2021/9
Y1 - 2021/9
N2 - Recently, great progress has been made in the field of Few-Shot Learning (FSL). While many different methods have been proposed, one of the key factors leading to higher FSL performance is surprisingly simple: the backbone network architecture used to embed the images of the few-shot tasks. While early works on FSL resorted to small architectures with just a few convolution layers, recent works show that large architectures pre-trained on the training portion of FSL datasets produce strong features that are more easily transferable to novel few-shot tasks, thus yielding significant gains for the methods using them. Despite these observations, little to no work has been done on finding the right backbone for FSL. In this paper we propose MetAdapt, which not only meta-searches for an optimized architecture for FSL using Neural Architecture Search (NAS), but also results in a model that can adaptively ‘re-wire’ itself, predicting a better architecture for a given novel few-shot task. Using the proposed approach we observe strong results on two popular few-shot benchmarks: miniImageNet and FC100.
UR - http://www.scopus.com/inward/record.url?scp=85108943680&partnerID=8YFLogxK
DO - 10.1016/j.patrec.2021.05.010
M3 - Article
AN - SCOPUS:85108943680
SN - 0167-8655
VL - 149
SP - 130
EP - 136
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
ER -