Theoretical Issues in Deep Networks
(Center for Brains, Minds and Machines (CBMM), 2019-08-17)
While deep learning is successful in a number of applications, it is not yet well understood theoretically. A theoretical characterization of deep learning should answer questions about its approximation power, the ...
Double descent in the condition number
(Center for Brains, Minds and Machines (CBMM), 2019-12-04)
In solving a system of n linear equations in d variables, Ax = b, the condition number of the n × d matrix A measures how much errors in the data b affect the solution x. Bounds of this type are important in many inverse ...
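The bound referred to here is, in its standard form, ‖δx‖/‖x‖ ≤ κ(A)·‖δb‖/‖b‖, where κ(A) = σ_max(A)/σ_min(A). A minimal NumPy sketch of that amplification on a square, notoriously ill-conditioned Hilbert system (an illustrative example, not an experiment from the memo):

import numpy as np

rng = np.random.default_rng(0)

n = 8
# Hilbert matrix: a classic ill-conditioned square system Ax = b.
A = 1.0 / (np.arange(1, n + 1)[:, None] + np.arange(n))

x_true = np.ones(n)
b = A @ x_true

# Perturb the data b slightly and re-solve.
db = 1e-10 * rng.standard_normal(n)
x_hat = np.linalg.solve(A, b + db)

kappa = np.linalg.cond(A)  # sigma_max(A) / sigma_min(A)
rel_err_b = np.linalg.norm(db) / np.linalg.norm(b)
rel_err_x = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)

print(f"cond(A)              = {kappa:.2e}")
print(f"relative error in b  = {rel_err_b:.2e}")
print(f"relative error in x  = {rel_err_x:.2e}")
print(f"bound kappa * err_b  = {kappa * rel_err_b:.2e}")

For a nearly singular A the factor κ(A) is enormous, which is why even tiny measurement noise in b can destroy the solution.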
Hierarchically Local Tasks and Deep Convolutional Networks
(Center for Brains, Minds and Machines (CBMM), 2020-06-24)
The main success stories of deep learning, starting with ImageNet, depend on convolutional networks, which on certain tasks perform significantly better than traditional shallow classifiers, such as support vector machines. ...
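A hierarchically local task, in the sense used here, is one whose target function composes constituent functions of few variables along a tree. A toy sketch (the particular function below is invented for illustration, not taken from the memo):

import numpy as np

def local(u, v):
    # A local constituent function of just two variables.
    return np.tanh(u + 2.0 * v)

def hierarchically_local(x):
    """Binary-tree composition of 2-variable functions on 8 inputs:
    f(x1..x8) = g(g(h(x1,x2), h(x3,x4)), g(h(x5,x6), h(x7,x8)))."""
    assert len(x) == 8
    level1 = [local(x[i], x[i + 1]) for i in range(0, 8, 2)]
    level2 = [local(level1[0], level1[1]), local(level1[2], level1[3])]
    return local(level2[0], level2[1])

x = np.linspace(-1.0, 1.0, 8)
print(hierarchically_local(x))

Convolutional networks with local receptive fields mirror this tree structure, which is roughly why deep local architectures can represent such targets far more efficiently than shallow classifiers.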
Biologically Inspired Mechanisms for Adversarial Robustness
(Center for Brains, Minds and Machines (CBMM), 2020-06-23)
No convolutional neural network has yet been demonstrated to be strongly robust to adversarial perturbations at reasonable computational and performance cost. The primate visual ventral stream seems to be robust to small ...
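For context (not from the abstract): an adversarial perturbation is a small, deliberately crafted input change that flips a classifier's decision. A minimal sketch of the fast gradient sign method on a toy linear model, one standard way such perturbations are generated (the memo's own mechanisms are biological defenses, not this attack):

import numpy as np

rng = np.random.default_rng(1)

# Toy linear classifier f(x) = sign(w . x) on 100-dim inputs.
w = rng.standard_normal(100)
x = rng.standard_normal(100)
label = np.sign(w @ x)

# FGSM: move each coordinate by eps in the sign direction that
# increases the loss, i.e. against the correct class.
eps = 2.0 * abs(w @ x) / np.abs(w).sum()  # just enough to cross the boundary
x_adv = x - eps * label * np.sign(w)

print("clean margin:        ", label * (w @ x))      # positive: correct
print("perturbed margin:    ", label * (w @ x_adv))  # negative: misclassified
print("max per-pixel change:", eps)

When the weight vector has many nonzero coordinates, eps is tiny relative to the margin, which is what makes such perturbations nearly imperceptible yet effective.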
Dreaming with ARC
(Center for Brains, Minds and Machines (CBMM), 2020-11-23)
Current machine learning algorithms are highly specialized to the task they are meant to perform, e.g. playing chess, picking up objects, or recognizing objects. How can we extend this to a system that could solve a ...
Cross-validation Stability of Deep Networks
(Center for Brains, Minds and Machines (CBMM), 2021-02-09)
Recent theoretical results show that gradient descent on deep neural networks under exponential loss functions locally maximizes classification margin, which is equivalent to minimizing the norm of the weight matrices under ...
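The equivalence mentioned here can be stated, roughly following the margin-maximization literature (a paraphrase, not the memo's exact formulation): for the exponential loss L = Σ_i exp(−y_i f(x_i)) on a homogeneous deep network with weight matrices W_1, …, W_K, late-stage gradient descent locally maximizes the normalized margin, which corresponds to the constrained problem

\min_{W_1,\dots,W_K} \ \sum_{k=1}^{K} \|W_k\|_F^2
\quad \text{subject to} \quad y_i\, f(W_1,\dots,W_K;\, x_i) \ge 1 \quad \text{for all } i.

Minimizing the weight norms under a fixed-margin constraint and maximizing the margin under a fixed-norm constraint are two views of the same optimum.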