TY - CHAP
T1 - Regularized Classification-Aware Quantization
AU - Severo, Daniel
AU - Domanovitz, Elad
AU - Khisti, Ashish
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
AB - Traditionally, quantization is designed to minimize the reconstruction error of a data source. When considering downstream classification tasks, other measures of distortion can be of interest, such as the 0-1 classification loss. Furthermore, it is desirable that the performance of these quantizers does not deteriorate once they are deployed into production, as re-learning the scheme online is not always possible. In this chapter, we present a class of algorithms that learn distributed quantization schemes for binary classification tasks. Our method performs well on unseen data and is faster than previous methods by a factor proportional to a quadratic term in the dataset size. It works by regularizing the 0-1 loss with the reconstruction error. We present experiments on synthetic mixture and bivariate Gaussian data and compare training, testing, and generalization errors with a family of benchmark quantization schemes from the literature. Our method is called Regularized Classification-Aware Quantization.
KW - Classification
KW - Distributed quantization
KW - Generalization
KW - Regularization
UR - http://www.scopus.com/inward/record.url?scp=85140709448&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-06947-5_5
DO - 10.1007/978-3-031-06947-5_5
M3 - Chapter
AN - SCOPUS:85140709448
T3 - Signals and Communication Technology
SP - 61
EP - 73
BT - Signals and Communication Technology
PB - Springer Science and Business Media Deutschland GmbH
ER -