# Infinite-dimensional gradient-based descent for alpha-divergence minimisation

S2A - Signal, Statistique et Apprentissage
LTCI - Laboratoire Traitement et Communication de l'Information
TIPIC-SAMOVAR - Traitement de l'Information Pour Images et Communications
SAMOVAR - Services répartis, Architectures, MOdélisation, Validation, Administration des Réseaux
Abstract : This paper introduces the $(\alpha, \Gamma)$-descent, an iterative algorithm which operates on measures and performs $\alpha$-divergence minimisation in a Bayesian framework. This gradient-based procedure extends the commonly used variational approximation by adding a prior on the variational parameters in the form of a measure. We prove that for a rich family of functions $\Gamma$, this algorithm leads at each step to a systematic decrease in the $\alpha$-divergence. Our framework recovers the Entropic Mirror Descent (MD) algorithm with improved $O(1/N)$ convergence results and provides an alternative to the Entropic MD that we call the Power descent, for which we prove convergence to an optimum. Moreover, the $(\alpha, \Gamma)$-descent makes it possible to optimise the mixture weights of any given mixture model without any information on the underlying distribution of the variational parameters. This renders our method compatible with many choices of parameter updates and applicable to a wide range of Machine Learning tasks. We demonstrate empirically, on both toy and real-world examples, the benefit of using the Power descent and going beyond the Entropic MD framework, which fails as the dimension grows.
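The abstract mentions the Entropic Mirror Descent as a multiplicative update on mixture weights. As a rough illustration only, here is a minimal sketch of a generic entropic mirror-descent step on the probability simplex applied to a toy quadratic objective; the objective, step size `eta`, and `target` are illustrative assumptions, and this is not the paper's exact $(\alpha, \Gamma)$-descent or Power descent.

```python
import numpy as np

def entropic_md_step(w, grad, eta):
    """One entropic mirror-descent step on the probability simplex:
    a multiplicative (exponentiated-gradient) update, then renormalisation."""
    w_new = w * np.exp(-eta * grad)
    return w_new / w_new.sum()

# Toy objective: f(w) = 0.5 * ||w - target||^2, whose minimiser over the
# simplex is `target` itself (since `target` already lies on the simplex).
target = np.array([0.5, 0.3, 0.2])
w = np.full(3, 1.0 / 3.0)  # uniform initial mixture weights
for _ in range(5000):
    w = entropic_md_step(w, grad=w - target, eta=0.5)
```

Each step keeps `w` on the simplex by construction, which is what makes mirror descent with the entropy mirror map natural for mixture weights.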
Document type :
Preprints, Working Papers, ...

Cited literature [49 references]

https://hal.telecom-paris.fr/hal-02614605
Contributor : Kamélia Daudel
Submitted on : Thursday, May 21, 2020 - 11:46:41 AM
Last modification on : Wednesday, June 24, 2020 - 4:19:56 PM

### File

ddpr2019_revisited.pdf
Files produced by the author(s)

### Identifiers

• HAL Id : hal-02614605, version 1
• ARXIV : 2005.10618

### Citation

Kamélia Daudel, Randal Douc, François Portier. Infinite-dimensional gradient-based descent for alpha-divergence minimisation. 2020. ⟨hal-02614605⟩
