Journal article, Journal of Machine Learning Research, 2023

Monotonic Alpha-divergence Minimisation for Variational Inference

Abstract

In this paper, we introduce a novel family of iterative algorithms which carry out $\alpha$-divergence minimisation in a Variational Inference context. They do so by ensuring a systematic decrease at each step in the $\alpha$-divergence between the variational and the posterior distributions. In its most general form, the variational distribution is a mixture model and our framework allows us to simultaneously optimise the weights and component parameters of this mixture model. Our approach permits us to build on various methods previously proposed for $\alpha$-divergence minimisation, such as Gradient or Power Descent schemes, and we also shed new light on an integrated Expectation Maximization algorithm. Lastly, we provide empirical evidence that our methodology yields improved results on several multimodal target distributions and on a real data example.
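To make the setting concrete, here is a minimal sketch of $\alpha$-divergence minimisation over the weights of a fixed-component Gaussian mixture. It is not the paper's algorithm: the monotone updates the abstract refers to are derived in the paper, whereas this sketch uses a generic exponentiated-gradient step with an assumed step size `eta`. The toy bimodal target, the component means `mus`, and all hyper-parameters are likewise illustrative assumptions. For $\alpha \in (0,1)$, minimising $D_\alpha(q \| p)$ is equivalent to maximising the surrogate $\mathbb{E}_q[(\tilde p(X)/q(X))^{1-\alpha}]$, where $\tilde p$ is the unnormalised target, which is what the sketch estimates by Monte Carlo.

```python
# Illustrative sketch only (not the paper's monotone updates): exponentiated-
# gradient ascent on the surrogate E_q[(p~(X)/q(X))^{1-alpha}] over the
# weights of a Gaussian mixture with fixed components.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
alpha, eta, n_samples = 0.5, 1.0, 2000   # assumed hyper-parameters

def log_p_tilde(x):
    # Unnormalised bimodal target: two unit-variance Gaussian bumps at +-3.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

mus = np.array([-4.0, -1.0, 1.0, 4.0])   # fixed component means (assumed)
lam = np.full(mus.size, 1.0 / mus.size)  # uniform initial mixture weights

def log_q(x, lam):
    # log q_lambda(x) with q_lambda = sum_j lam_j N(mu_j, 1)
    comp = -0.5 * (x[:, None] - mus) ** 2 - 0.5 * np.log(2.0 * np.pi)
    return logsumexp(comp + np.log(lam), axis=1)

for step in range(50):
    # d/d lam_j of int q^alpha p~^(1-alpha) dx is, up to a positive constant,
    # E_{k_j}[(p~(X)/q(X))^{1-alpha}], estimated with samples from component j.
    grad = np.empty_like(lam)
    for j, mu in enumerate(mus):
        x = rng.normal(mu, 1.0, n_samples)
        grad[j] = np.mean(np.exp((1.0 - alpha) * (log_p_tilde(x) - log_q(x, lam))))
    # Multiplicative update keeps the weights positive and on the simplex;
    # the paper derives updates/step sizes with a guaranteed monotone decrease,
    # which this fixed eta does not provide.
    lam *= np.exp(eta * grad / grad.max())
    lam /= lam.sum()

print("final weights:", np.round(lam, 3))
# Expect mass to concentrate on the components nearest the target modes at +-3.
```

The multiplicative form of the update is deliberate: like the Power Descent schemes mentioned in the abstract, it reweights the mixture components without ever leaving the probability simplex, so no projection step is needed.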
Main file: ddr_jmlr_final.pdf (3.69 MB)
Origin: Publisher files allowed on an open archive
License: CC BY (Attribution)

Dates and versions

hal-03164338 , version 1 (09-03-2021)
hal-03164338 , version 2 (27-04-2023)

Identifiers

HAL Id: hal-03164338

Cite

Kamélia Daudel, Randal Douc, François Roueff. Monotonic Alpha-divergence Minimisation for Variational Inference. Journal of Machine Learning Research, 2023, 24 (62), pp.1-76. ⟨hal-03164338v2⟩