We propose a method that combines a non-Euclidean gradient descent step with a generic matrix sketching procedure for solving unconstrained, nonconvex matrix optimization problems in which the decision variable is too large to store in memory and the objective function is the composition of a vector function with a linear operator. The method updates the sketch directly, without ever forming or storing the decision variable. We establish subsequence convergence, global convergence under the Kurdyka–Łojasiewicz property, and rates of convergence.
- Convergence analysis
- Matrix minimization
- Matrix sketching
- Non-Euclidean gradient method
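The idea of updating a sketch directly, without storing the full decision variable, can be illustrated with a minimal NumPy example. All sizes, the choice of a Gaussian test matrix, and the rank-one update form are assumptions made for illustration only; the paper's method applies to a more general class of updates.

```python
import numpy as np

# Illustrative only: maintain a randomized sketch Y = X @ Omega of a large
# matrix X without ever storing X. A rank-one update X <- X + u v^T
# translates into the cheap sketch update Y <- Y + u (v^T Omega).
rng = np.random.default_rng(0)
m, n, k = 1000, 800, 25              # ambient sizes and sketch dimension (assumed)
Omega = rng.standard_normal((n, k))  # fixed random test matrix (assumed Gaussian)

Y = np.zeros((m, k))                 # sketch of the (implicit) decision variable
X = np.zeros((m, n))                 # full matrix kept here ONLY to verify the identity

for _ in range(5):                   # a few rank-one updates, e.g. gradient-like steps
    u = rng.standard_normal(m)
    v = rng.standard_normal(n)
    Y += np.outer(u, v @ Omega)      # O(mk + nk) work, no m-by-n storage needed
    X += np.outer(u, v)              # reference copy, used only for the check below

# The directly updated sketch matches the sketch of the fully formed matrix.
print(np.allclose(Y, X @ Omega))     # True
```

The point of the example is that `Y` costs O((m + n)k) memory versus O(mn) for `X`, so the iteration remains feasible when `X` itself cannot be saved.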