For HyperBFs AGOP is a greedy approximation to gradient descent
Author(s)
Gan, Yulu; Poggio, Tomaso
Download: CBMM-Memo-148.pdf (1.056 MB)
Abstract
The Average Gradient Outer Product (AGOP) provides a novel approach to feature learning in neural networks. We applied both AGOP and gradient descent to learn the matrix M in the Hyper Basis Function Network (HyperBF) and observed very similar performance. We show formally that AGOP is a greedy approximation of gradient descent.
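As a concrete illustration of the abstract's setup, here is a minimal sketch of how the AGOP can be computed for a HyperBF-style predictor, assuming a Gaussian basis f(x) = Σ_i c_i exp(−‖W(x − t_i)‖²) with metric M = WᵀW and JAX autodiff; the names hyperbf, agop, c, t, and W are illustrative assumptions, not taken from the memo.

```python
import jax
import jax.numpy as jnp

def hyperbf(params, x):
    """Hypothetical HyperBF predictor: f(x) = sum_i c_i * exp(-||W (x - t_i)||^2).

    The learned metric is M = W^T W; the names c, t, W are illustrative.
    """
    c, t, W = params["c"], params["t"], params["W"]
    d2 = jnp.sum(((x - t) @ W.T) ** 2, axis=-1)  # squared M-distances to the centers t_i
    return c @ jnp.exp(-d2)

def agop(f, X):
    """Average Gradient Outer Product: (1/n) * sum_i grad f(x_i) grad f(x_i)^T."""
    grads = jax.vmap(jax.grad(f))(X)     # (n, d) matrix of per-example input gradients
    return grads.T @ grads / X.shape[0]  # (d, d)

# Usage sketch with random data.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
d, k, n = 5, 8, 100
params = {"c": jax.random.normal(k1, (k,)),
          "t": jax.random.normal(k2, (k, d)),
          "W": jnp.eye(d)}
X = jax.random.normal(k3, (n, d))
M = agop(lambda x: hyperbf(params, x), X)  # candidate update for the metric M
```

In AGOP-based feature learning this matrix is typically plugged back in as the new metric (e.g. setting W to a square root of M) and the fit repeated; that iterative procedure is what the memo compares against learning M directly by gradient descent.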
Date issued
2024-07-13
Publisher
Center for Brains, Minds and Machines (CBMM)
Series/Report no.
CBMM Memo No. 148