CBMM Memo Series: Recent submissions
Now showing items 1-3 of 149
- Multiplicative Regularization Generalizes Better Than Additive Regularization
  (Center for Brains, Minds and Machines (CBMM), 2025-07-02) We investigate the effectiveness of multiplicative versus additive (L2) regularization in deep neural networks, focusing on convolutional neural networks for classification. While additive methods constrain the sum of ...
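The abstract above contrasts additive (L2) regularization, which penalizes a sum over layer weight norms, with a multiplicative alternative. The listing truncates the abstract, so the paper's exact formulation is not shown here; the sketch below is a hypothetical illustration of the distinction, assuming the multiplicative penalty couples layers through a product of per-layer squared norms rather than a sum:

```python
import numpy as np

def additive_l2(weights, lam=1e-2):
    # Additive (L2) regularization: the penalty is the SUM of squared
    # weight norms across layers (standard weight decay).
    return lam * sum(float(np.sum(w ** 2)) for w in weights)

def multiplicative_l2(weights, lam=1e-2):
    # Multiplicative regularization (illustrative assumption, not the
    # paper's verified formula): the penalty is the PRODUCT of per-layer
    # squared norms, so layers are coupled -- shrinking one layer's norm
    # lets another grow without increasing the penalty.
    return lam * float(np.prod([np.sum(w ** 2) for w in weights]))

# Toy two-layer weights to compare the two penalties.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 4)), rng.standard_normal((4, 2))]
print(additive_l2(weights), multiplicative_l2(weights))
```

Under the multiplicative penalty a uniform rescaling of one layer by c and another by 1/c leaves the product unchanged, which matches the scale symmetry of ReLU networks; the additive penalty has no such invariance.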
- Position: A Theory of Deep Learning Must Include Compositional Sparsity
  (Center for Brains, Minds and Machines (CBMM), 2025-07-02) Overparametrized Deep Neural Networks (DNNs) have demonstrated remarkable success in a wide variety of domains that are too high-dimensional for classical shallow networks, which are subject to the curse of dimensionality. However, open ...
- On efficiently computable functions, deep networks and sparse compositionality
  (Center for Brains, Minds and Machines (CBMM), 2025-02-01) In previous papers [4, 6] we have claimed that for each function which is efficiently Turing computable there exists a deep and sparse network which approximates it arbitrarily well. We also claimed a key role for ...
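Both memos above turn on compositional sparsity: a high-dimensional target built by composing constituent functions, each depending on only a few variables. The abstracts are truncated here, so the following is only a minimal illustrative example (the function and its DAG are invented for illustration, not taken from the papers) of a function of eight variables whose constituents each take two inputs:

```python
# A compositionally sparse target f: R^8 -> R. Every constituent
# function (h1..h4, g1, g2, and the root) depends on exactly two
# arguments, so the composition DAG has bounded fan-in even though
# f itself is a function of eight variables.
def f(x):
    h1 = max(x[0], x[1])   # leaf constituents: 2 inputs each
    h2 = x[2] * x[3]
    h3 = max(x[4], x[5])
    h4 = x[6] + x[7]
    g1 = h1 + h2           # intermediate constituents: 2 inputs each
    g2 = h3 * h4
    return max(g1, g2)     # root: 2 inputs

print(f([1, 2, 3, 4, 5, 6, 7, 8]))
```

A deep network can mirror this DAG with one small block per constituent, so its size grows with the number of constituents rather than exponentially in the ambient dimension, whereas a generic shallow approximator has no such structure to exploit.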