Conference papers

Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization

Abstract: We present the first accelerated randomized algorithm for solving linear systems in Euclidean spaces. An essential problem of this type is matrix inversion. In particular, our algorithm can be specialized to invert positive definite matrices in such a way that all iterates (approximate solutions) generated by the algorithm are themselves positive definite. This opens the way to many applications in optimization and machine learning. As an application of our general theory, we develop the first accelerated (deterministic and stochastic) quasi-Newton updates. Our updates yield provably more aggressive approximations of the inverse Hessian and deliver speed-ups over classical non-accelerated rules in numerical experiments. Experiments with empirical risk minimization show that our rules can accelerate the training of machine learning models.
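The abstract gives only a high-level description; the paper's accelerated update rules are not reproduced in this record. As a rough illustration of the kind of non-accelerated baselines the paper speeds up, here is a minimal NumPy sketch of (i) a classical randomized sketch-and-project iteration for solving A x = b with a symmetric positive definite A, and (ii) the classical BFGS inverse-Hessian update. The Gaussian sketch, sketch size, and iteration count are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def sketch_and_project_solve(A, b, sketch_size=5, iters=500, seed=0):
    """Classical non-accelerated baseline: the randomized sketch-and-project
    iteration for A x = b with A symmetric positive definite. Each step
    projects the iterate, in the A-norm, onto the sketched constraint
    S^T A x = S^T b."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = np.zeros(n)
    for _ in range(iters):
        # Gaussian sketching matrix (an illustrative choice, not the paper's).
        S = rng.standard_normal((n, sketch_size))
        AS = A @ S
        # Solve the small sketched system (S^T A S) lam = S^T (A x - b).
        lam = np.linalg.solve(S.T @ AS, S.T @ (A @ x - b))
        x = x - S @ lam
    return x

def bfgs_inverse_update(H, s, y):
    """Classical (non-accelerated) BFGS update of an inverse-Hessian
    approximation H from a step s and gradient difference y.
    It preserves symmetric positive definiteness of H whenever y^T s > 0."""
    rho = 1.0 / (y @ s)
    I = np.eye(H.shape[0])
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Tiny usage check on a random symmetric positive definite system.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)  # diagonal shift keeps A well conditioned
b = rng.standard_normal(20)
x = sketch_and_project_solve(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```

Note that bfgs_inverse_update keeps the approximation positive definite when yᵀs > 0, the same structural property the abstract highlights for the paper's specialized matrix inversion algorithm.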

Cited literature: 38 references

https://hal.telecom-paris.fr/hal-02365327
Contributor: Robert Gower
Submitted on: Friday, November 15, 2019 - 1:16:19 PM
Last modification on: Monday, October 12, 2020 - 3:28:43 AM
Long-term archiving on: Sunday, February 16, 2020 - 4:57:57 PM

File

7434-accelerated-stochastic-ma...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02365327, version 1

Citation

Robert Gower, Filip Hanzely, Peter Richtárik, Sebastian Stich. Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization. Advances in Neural Information Processing Systems (NeurIPS), Dec 2018, Montréal, Canada. ⟨hal-02365327⟩


Metrics

Record views: 30
File downloads: 24