A Unified Framework for Training Neural Networks - Equipe Communications numériques
Preprint, Working Paper. Year: 2021

A Unified Framework for Training Neural Networks

Abstract

The lack of mathematical tractability of Deep Neural Networks (DNNs) has hindered progress toward a unified convergence analysis of training algorithms in the general setting. We propose a unified optimization framework for training different types of DNNs, and establish its convergence for arbitrary loss, activation, and regularization functions, assumed to be smooth. We show that the framework generalizes well-known first- and second-order training methods, which allows us to establish the convergence of these methods for various DNN architectures and learning tasks as special cases of our approach. We discuss applications to training various DNN architectures (e.g., feed-forward, convolutional, and linear networks) for regression and classification tasks.
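To make the abstract's setting concrete, the following is an illustrative sketch (not the paper's framework) of first-order training of a one-hidden-layer feed-forward network for regression, with all components smooth as the analysis assumes: a tanh activation, a squared loss, and an L2 regularizer. The dimensions, step size, and regularization weight are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 3))            # 64 samples, 3 features
y = np.sin(X @ np.array([1.0, -2.0, 0.5]))  # smooth synthetic regression target

W1 = rng.standard_normal((3, 8)) * 0.1      # hidden-layer weights
W2 = rng.standard_normal((8, 1)) * 0.1      # output-layer weights
lam, lr = 1e-3, 0.05                        # L2 weight, gradient step size

def loss(W1, W2):
    """Smooth objective: squared loss + L2 regularization."""
    H = np.tanh(X @ W1)                     # smooth activation
    err = H @ W2 - y[:, None]
    return 0.5 * np.mean(err ** 2) + 0.5 * lam * (np.sum(W1**2) + np.sum(W2**2))

l0 = loss(W1, W2)                           # initial objective value

for _ in range(500):                        # plain gradient descent (first-order)
    H = np.tanh(X @ W1)
    err = (H @ W2 - y[:, None]) / X.shape[0]
    gW2 = H.T @ err + lam * W2              # gradient w.r.t. output weights
    gH = err @ W2.T * (1 - H ** 2)          # backprop through tanh
    gW1 = X.T @ gH + lam * W1               # gradient w.r.t. hidden weights
    W1, W2 = W1 - lr * gW1, W2 - lr * gW2

print(loss(W1, W2))                         # objective after training
```

Because every component is smooth, the objective is differentiable everywhere and the gradient steps above are well defined; a second-order method would replace the update with one using (approximate) curvature information on the same smooth objective.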

Dates and versions

hal-03275771 , version 1 (01-07-2021)

Identifiers

Cite

Hadi Ghauch, Hossein Shokri-Ghadikolaei, Carlo Fischione, Mikael Skoglund. A Unified Framework for Training Neural Networks. 2021. ⟨hal-03275771⟩