Journal article in Journal of Machine Learning Research, 2022

Vector-Valued Least-Squares Regression under Output Regularity Assumptions

Abstract

We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional outputs. We derive learning bounds for our method and study the settings in which its statistical performance improves upon that of the full-rank method. Our analysis extends the interest of reduced-rank regression beyond the standard low-rank setting to more general output regularity assumptions. We illustrate our theoretical insights on synthetic least-squares problems. Then, we propose a surrogate structured prediction method derived from this reduced-rank method. We assess its benefits on three different problems: image reconstruction, multi-label classification, and metabolite identification.
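For readers unfamiliar with the general idea, the sketch below shows the classical finite-dimensional reduced-rank (rank-constrained) least-squares estimator with a small ridge term, fitted on synthetic low-rank data. This is only an illustrative assumption, not the paper's infinite-dimensional estimator or learning-theoretic setting: the function name reduced_rank_regression, the ridge parameter reg, and the synthetic data are invented for the example.

    # Minimal sketch of classical reduced-rank ridge regression (an assumed,
    # finite-dimensional illustration; not the estimator analysed in the paper).
    import numpy as np

    def reduced_rank_regression(X, Y, rank, reg=1e-3):
        """Approximately solve min_B ||Y - X B||_F^2 + reg ||B||_F^2
        subject to rank(B) <= rank."""
        d = X.shape[1]
        # Full-rank ridge solution.
        B_full = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ Y)
        # Project fitted outputs onto their top-`rank` singular directions
        # (the classical reduced-rank regression construction).
        _, _, Vt = np.linalg.svd(X @ B_full, full_matrices=False)
        P = Vt[:rank].T @ Vt[:rank]      # projection in output space
        return B_full @ P                # rank-constrained coefficient matrix

    # Usage on synthetic data with a rank-3 input-output mapping.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    B_true = rng.normal(size=(10, 3)) @ rng.normal(size=(3, 50))
    Y = X @ B_true + 0.1 * rng.normal(size=(200, 50))
    B_hat = reduced_rank_regression(X, Y, rank=3)
    print("relative error:", np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))

When the outputs are (approximately) low rank, constraining the rank reduces variance relative to the full-rank fit; the paper's contribution is to analyse when such gains persist under more general output regularity assumptions.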

Dates and versions

hal-03888760, version 1 (08-12-2022)

Identifiers

  • HAL Id: hal-03888760, version 1

Cite

Luc Brogat-Motte, Alessandro Rudi, Celine Brouard, Juho Rousu, Florence d'Alché-Buc. Vector-Valued Least-Squares Regression under Output Regularity Assumptions. Journal of Machine Learning Research, 2022. ⟨hal-03888760⟩