Musings on Deep Learning: Properties of SGD
(Center for Brains, Minds and Machines (CBMM), 2017-04-04)
[previously titled "Theory of Deep Learning III: Generalization Properties of SGD"] In Theory III we characterize with a mix of theory and experiments the generalization properties of Stochastic Gradient Descent in ...
(Center for Brains, Minds and Machines (CBMM), 2017-05-26)
The properties of a representation, such as smoothness, adaptability, generality, equivariance/invariance, depend on restrictions imposed during learning. In this paper, we propose using data symmetries, in the sense of ...
3D Object-Oriented Learning: An End-to-end Transformation-Disentangled 3D Representation
We provide a more detailed explanation of the ideas behind a recent paper on “Object-Oriented Deep Learning” and extend it to handle 3D inputs/outputs. Similar to , every layer of the system takes in a list of ...
Theory II: Landscape of the Empirical Risk in Deep Learning
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-30)
Previous theoretical work on deep learning and neural network optimization tends to focus on avoiding saddle points and local minima. However, the practical observation is that, at least for the most successful Deep ...
Spatial IQ Test for AI
We introduce SITD (Spatial IQ Test Dataset), a dataset used to evaluate the capabilities of computational models for pattern recognition and visual reasoning. SITD is a generator of images in the style of the Raven Progressive ...
Human-like Learning: A Research Proposal
We propose Human-like Learning, a new machine learning paradigm aiming at training generalist AI systems in a human-like manner with a focus on human-unique skills.
Theory of Deep Learning IIb: Optimization Properties of SGD
(Center for Brains, Minds and Machines (CBMM), 2017-12-27)
In Theory IIb we characterize with a mix of theory and experiments the optimization of deep convolutional networks by Stochastic Gradient Descent. The main new result in this paper is theoretical and experimental evidence ...
Do Deep Neural Networks Suffer from Crowding?
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-06-26)
Crowding is a visual effect suffered by humans, in which an object that can be recognized in isolation can no longer be recognized when other objects, called flankers, are placed close to it. In this work, we study the ...
Object-Oriented Deep Learning
(Center for Brains, Minds and Machines (CBMM), 2017-10-31)
We investigate an unconventional direction of research that aims at converting neural networks, a class of distributed, connectionist, sub-symbolic models, into a symbolic level with the ultimate goal of achieving AI ...
Exact Equivariance, Disentanglement and Invariance of Transformations
Invariance, equivariance and disentanglement of transformations are important topics in the field of representation learning. Previous models like Variational Autoencoders and Generative Adversarial Networks attempted ...