How Important is Weight Symmetry in Backpropagation?
Author(s)
Liao, Qianli; Leibo, Joel Z.; Poggio, Tomaso
Abstract
Gradient backpropagation (BP) requires symmetric feedforward and feedback connections: the same weights must be used for the forward and backward passes. This "weight transport problem" [1] is thought to be one of the main reasons for BP's biological implausibility. Using 15 different classification datasets, we systematically study to what extent BP really depends on weight symmetry. In a study that turned out to be surprisingly similar in spirit to Lillicrap et al.'s demonstration [2] but orthogonal in its results, our experiments indicate that: (1) the magnitudes of feedback weights do not matter to performance; (2) the signs of feedback weights do matter: the more feedback weights whose signs agree with those of their corresponding feedforward connections, the better the performance; (3) with feedback weights having random magnitudes and 100% concordant signs, we were able to achieve the same or even better performance than SGD; and (4) some normalizations or stabilizations are indispensable for such asymmetric BP to work, namely Batch Normalization (BN) [3] and/or a "Batch Manhattan" (BM) update rule.
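The abstract's key manipulation, sign-concordant feedback with random magnitudes, and the Batch Manhattan update can be illustrated in a few lines. Below is a minimal NumPy sketch of a single linear layer: the layer sizes, step size eta, and function names are illustrative assumptions, not the paper's code (which trained deep networks on 15 datasets). The backward pass propagates error through a fixed matrix B whose signs are copied from W but whose magnitudes are random, and the update keeps only the sign of the batch gradient.

import numpy as np

rng = np.random.default_rng(0)

# Single linear layer. Forward pass uses W; backward pass uses a fixed
# feedback matrix B with random magnitudes but signs copied from W
# (the "sign-concordant" feedback of result (3)).
n_in, n_out = 8, 4                      # illustrative layer sizes (assumption)
W = 0.1 * rng.standard_normal((n_out, n_in))
B = np.sign(W) * np.abs(rng.standard_normal((n_out, n_in)))

def forward(x):
    return W @ x                        # x: (n_in, batch)

def backward(x, delta_out):
    # Asymmetric step: error is propagated through B instead of W.T.
    delta_in = B.T @ delta_out          # delta_out: (n_out, batch)
    grad_W = delta_out @ x.T            # weight gradient, computed as usual
    return delta_in, grad_W

def batch_manhattan_step(W, grad_W, eta=0.01):
    # "Batch Manhattan"-style update: keep only the sign of the batch
    # gradient and take a fixed-size step (eta is an assumed value).
    return W - eta * np.sign(grad_W)

Because B is fixed and shares only its signs with W, any learning that survives this substitution cannot depend on transporting exact weight values to the feedback path, which is what the experiments probe.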
Date issued
2015-11-29
Publisher
Center for Brains, Minds and Machines (CBMM), arXiv
Citation
arXiv:1510.05067v3
Series/Report no.
CBMM Memo Series; 036
Keywords
Gradient backpropagation (BP), Batch Normalization (BN)