Infinite-dimensional gradient-based descent for alpha-divergence minimisation

Abstract: This paper introduces the $(\alpha, \Gamma)$-descent, an iterative algorithm that operates on measures and performs $\alpha$-divergence minimisation in a Bayesian framework. This gradient-based procedure extends the commonly used variational approximation by adding a prior on the variational parameters in the form of a measure. We prove that for a rich family of functions $\Gamma$, this algorithm leads at each step to a systematic decrease in the $\alpha$-divergence. Our framework recovers the Entropic Mirror Descent (MD) algorithm with improved $O(1/N)$ convergence results, and it provides an alternative to the Entropic MD, which we call the Power descent and for which we prove convergence to an optimum. Moreover, the $(\alpha, \Gamma)$-descent makes it possible to optimise the mixture weights of any given mixture model without any information on the underlying distribution of the variational parameters. This renders our method compatible with many choices of parameter updates and applicable to a wide range of Machine Learning tasks. We demonstrate empirically, on both toy and real-world examples, the benefit of using the Power descent and of going beyond the Entropic MD framework, which fails as the dimension grows.
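
The abstract states the updates only at a high level. Below is a minimal Python sketch of a Γ-driven multiplicative update of the weights of a finite mixture, assuming the two Γ choices named above take the forms Γ(v) = exp(−ηv) for the Entropic MD and Γ(v) = ((α−1)v + 1)^(η/(1−α)) for the Power descent. The names b, eta, and kappa are illustrative placeholders, not the authors' code; in practice each entry of b would be a Monte Carlo estimate of the α-divergence gradient term attached to one mixture component.

    import numpy as np

    # Hedged sketch, not the authors' implementation: one multiplicative
    # update of the mixture weights under a choice of Gamma. `b` holds
    # per-component estimates of the alpha-divergence gradient term;
    # `eta` is a step size and `kappa` a shift constant, both
    # illustrative defaults.

    def entropic_md_step(weights, b, eta=0.5, kappa=0.0):
        # Entropic Mirror Descent: Gamma(v) = exp(-eta * v).
        new = weights * np.exp(-eta * (b + kappa))
        return new / new.sum()

    def power_descent_step(weights, b, alpha=0.5, eta=0.5, kappa=0.0):
        # Power descent: Gamma(v) = ((alpha - 1) * v + 1) ** (eta / (1 - alpha)).
        # Requires (alpha - 1) * (b + kappa) + 1 > 0 for every component.
        base = (alpha - 1.0) * (b + kappa) + 1.0
        new = weights * base ** (eta / (1.0 - alpha))
        return new / new.sum()

    # Usage: start from uniform weights and iterate, re-estimating b
    # at each step from fresh samples.
    weights = np.full(4, 0.25)
    b = np.array([0.8, 1.2, 0.5, 1.0])  # placeholder gradient estimates
    weights = power_descent_step(weights, b)

In both variants, components with a smaller gradient term receive a larger multiplicative factor, so each normalised step moves mass toward components that lower the α-divergence; this is the mechanism behind the systematic decrease the paper proves for a rich family of functions Γ.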
Document type: Preprints, Working Papers, ...

Cited literature: 49 references

https://hal.telecom-paris.fr/hal-02614605
Contributor: Kamélia Daudel
Submitted on: Thursday, May 21, 2020 - 11:46:41 AM
Last modification on: Wednesday, June 24, 2020 - 4:19:56 PM

File

ddpr2019_revisited.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02614605, version 1
  • arXiv: 2005.10618

Citation

Kamélia Daudel, Randal Douc, François Portier. Infinite-dimensional gradient-based descent for alpha-divergence minimisation. 2020. ⟨hal-02614605⟩
