Preprint / Working Paper, Year: 2024

A Full Adagrad algorithm with O(Nd) operations

Abstract

A novel approach is proposed to overcome the computational challenges of the full-matrix Adaptive Gradient algorithm (Full AdaGrad) in stochastic optimization. By developing a recursive method that estimates the inverse square root of the covariance matrix of the gradients, together with a streaming variant for the parameter updates, the study offers efficient and practical algorithms for large-scale applications. This strategy significantly reduces the complexity and resource demands typically associated with full-matrix methods, enabling more effective optimization. Moreover, the convergence rates of the proposed estimators and their asymptotic efficiency are established, and their effectiveness is demonstrated through numerical studies.
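To make the abstract's core idea concrete, here is a minimal NumPy sketch, assuming a Robbins-Monro-style fixed-point recursion A_{n+1} = A_n + γ_{n+1}(I - A_n g g^T A_n), whose root for a symmetric positive-definite covariance S = E[g g^T] is A = S^{-1/2}. The recursion, the step-size schedules, the symmetrization step, and the `full_adagrad_sketch` name are illustrative assumptions, not the authors' exact algorithm; this naive version still costs O(d^2) per iteration, and it is the paper's streaming variant that achieves the O(Nd) total operation count advertised in the title.

```python
import numpy as np

def full_adagrad_sketch(grad_fn, theta0, n_steps, eta=0.1, gamma=0.05, eps=1e-8):
    """Full-matrix AdaGrad with a recursive estimate of S^{-1/2}.

    Rather than forming S_n = (1/n) * sum_k g_k g_k^T and computing its
    inverse square root by eigendecomposition at every step (O(d^3)),
    maintain A_n ~ S^{-1/2} via the stochastic fixed-point recursion
        A_{n+1} = A_n + gamma_{n+1} * (I - A_n g g^T A_n),
    whose root (A symmetric, S positive definite) is A = S^{-1/2}.
    All schedules and safeguards here are illustrative assumptions,
    not the algorithm from the paper.
    """
    d = theta0.size
    theta = theta0.astype(float).copy()
    A = np.eye(d)   # running estimate of S^{-1/2}
    I = np.eye(d)
    for n in range(1, n_steps + 1):
        g = grad_fn(theta)
        Ag = A @ g                                        # matrix-vector product, O(d^2)
        A += (gamma / n**0.75) * (I - np.outer(Ag, Ag))   # rank-one correction
        A = 0.5 * (A + A.T)                               # re-symmetrize for stability
        theta -= (eta / np.sqrt(n)) * (A @ g + eps * g)   # preconditioned step
    return theta

# Toy usage: noisy gradients of the quadratic f(x) = 0.5 x^T H x,
# whose minimizer is x = 0.
rng = np.random.default_rng(0)
H = np.diag([10.0, 1.0, 0.1])
grad_fn = lambda x: H @ x + 0.1 * rng.standard_normal(3)
print(full_adagrad_sketch(grad_fn, np.ones(3), n_steps=5000))
```

The point of the recursion is that each update uses only matrix-vector products and a rank-one outer product, avoiding any eigendecomposition or matrix inversion inside the optimization loop.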
Main file: AdaGrad_hal_arxiv.pdf (2.38 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04560729, version 1 (03-05-2024)

Identifiers

  • HAL Id: hal-04560729, version 1

Cite

Antoine Godichon-Baggioni, Wei Lu, Bruno Portier. A Full Adagrad algorithm with O(Nd) operations. 2024. ⟨hal-04560729⟩
