TY - GEN
T1 - Rate-Distortion in Non-Convex Families
AU - Ratson, Hila
AU - Zamir, Ram
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Iterative constrained optimization often requires convexity conditions on the argument set in order to converge to the global optimum. One such instance is the parametric version of the Blahut algorithm for rate-distortion function computation. However, there are many interesting cases for which the parametric set is not convex, e.g., a discrete reproduction alphabet at unknown (parametric) locations for a continuous source. In this paper we show examples of non-convex families for which the parametric Blahut algorithm does not converge to the global optimum, and suggest a combined parametric Blahut and random annealing (A) method to overcome this problem.
AB - Iterative constrained optimization often requires convexity conditions on the argument set in order to converge to the global optimum. One such instance is the parametric version of the Blahut algorithm for rate-distortion function computation. However, there are many interesting cases for which the parametric set is not convex, e.g., a discrete reproduction alphabet at unknown (parametric) locations for a continuous source. In this paper we show examples of non-convex families for which the parametric Blahut algorithm does not converge to the global optimum, and suggest a combined parametric Blahut and random annealing (A) method to overcome this problem.
UR - http://www.scopus.com/inward/record.url?scp=85165010536&partnerID=8YFLogxK
U2 - 10.1109/ITW55543.2023.10160233
DO - 10.1109/ITW55543.2023.10160233
M3 - Conference contribution
AN - SCOPUS:85165010536
T3 - 2023 IEEE Information Theory Workshop, ITW 2023
SP - 87
EP - 91
BT - 2023 IEEE Information Theory Workshop, ITW 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE Information Theory Workshop, ITW 2023
Y2 - 23 April 2023 through 28 April 2023
ER -