## Abstract

Interior point methods (IPMs) are a common approach for solving linear programs (LPs) with strong theoretical guarantees and solid empirical performance. The time complexity of these methods is dominated by the cost of solving a linear system of equations at each iteration. In common applications of linear programming, particularly in machine learning and scientific computing, the size of this linear system can become prohibitively large, requiring the use of iterative solvers, which provide an approximate solution to the linear system. However, approximately solving the linear system at each iteration of an IPM invalidates the theoretical guarantees of common IPM analyses. To remedy this, we theoretically and empirically analyze (slightly modified) predictor-corrector IPMs when using approximate linear solvers: our approach guarantees that, when certain conditions are satisfied, the number of IPM iterations does not increase and that the final solution remains feasible. We also provide practical instantiations of approximate linear solvers that satisfy these conditions for special classes of constraint matrices using randomized linear algebra.
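To make the abstract's setting concrete, the following is a minimal sketch (not the paper's method) of the linear-algebra kernel it refers to: each IPM iteration solves a normal-equations system of the form $A D A^\top \, \Delta y = r$, where $D$ is a positive diagonal matrix built from the current iterate, and an iterative solver such as conjugate gradient returns only an approximate solution. All names, the random problem data, and the right-hand side below are hypothetical illustrations.

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=200):
    """Approximately solve M x = b for SPD M, given only matvec(v) = M @ v."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Mp = matvec(p)
        alpha = rs / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Hypothetical LP data: min c^T x  s.t.  A x = b, x >= 0.
rng = np.random.default_rng(0)
m, n = 5, 12
A = rng.standard_normal((m, n))
x = rng.uniform(1.0, 2.0, n)   # strictly positive primal iterate
s = rng.uniform(1.0, 2.0, n)   # strictly positive dual slacks

# Normal-equations operator M = A D A^T with D = diag(x / s); an IPM
# solves one such system per iteration for the dual step dy.
d = x / s
matvec = lambda v: A @ (d * (A.T @ v))
rhs = rng.standard_normal(m)   # stands in for the IPM residual vector

dy = conjugate_gradient(matvec, rhs)
rel_residual = np.linalg.norm(matvec(dy) - rhs) / np.linalg.norm(rhs)
```

The point the abstract makes is that `dy` here carries a nonzero residual, which standard IPM analyses do not account for; the paper's contribution is conditions on that residual under which the iteration count and final feasibility are preserved.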

Original language | English
---|---
Pages (from-to) | 5007-5038
Number of pages | 32
Journal | Proceedings of Machine Learning Research
Volume | 162
State | Published - 2022
Event | 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States; Duration: 17 Jul 2022 → 23 Jul 2022

### Funding

Funders | Funder number
---|---
BSF | 2017698
Department of Statistics |
NSF |
Office of Advanced Scientific Computing Research |
U.S. Department of Energy |
UT-Battelle LLC | DE-AC05-00OR22725
National Science Foundation | AF 1814041, FRG 1760353, DOE-SC0022085
Bloom's Syndrome Foundation |
Advanced Scientific Computing Research |
Purdue University |
UT-Battelle |