3D Object-Oriented Learning: An End-to-end Transformation-Disentangled 3D Representation
We provide a more detailed explanation of the ideas behind a recent paper on “Object-Oriented Deep Learning” and extend it to handle 3D inputs/outputs. Similar to , every layer of the system takes in a list of ...
Theory II: Landscape of the Empirical Risk in Deep Learning
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-30)
Previous theoretical work on deep learning and neural network optimization has tended to focus on avoiding saddle points and local minima. However, the practical observation is that, at least for the most successful Deep ...
A Deep Representation for Invariance And Music Classification
(Center for Brains, Minds and Machines (CBMM), arXiv, 2014-03-17)
Representations in the auditory cortex might be based on mechanisms similar to the visual ventral stream; modules for building invariance to transformations and multiple layers for compositionality and selectivity. In this ...
Biologically-Plausible Learning Algorithms Can Scale to Large Datasets
(Center for Brains, Minds and Machines (CBMM), 2018-09-27)
The backpropagation (BP) algorithm is often thought to be biologically implausible in the brain. One of the main reasons is that BP requires symmetric weight matrices in the feedforward and feedback pathways. To address ...
Spatial IQ Test for AI
We introduce SITD (Spatial IQ Test Dataset), a dataset used to evaluate the capabilities of computational models for pattern recognition and visual reasoning. SITD is a generator of images in the style of the Raven Progressive ...
Human-like Learning: A Research Proposal
We propose Human-like Learning, a new machine learning paradigm aiming at training generalist AI systems in a human-like manner with a focus on human-unique skills.
When Is Handcrafting Not a Curse?
Recently, with the proliferation of deep learning, there is a strong trend of abandoning handcrafted systems/features in machine learning and AI by replacing them with “end-to-end” systems “learned from scratch”. These ...
Representations That Learn vs. Learning Representations
During the last decade, we have witnessed tremendous progress in Machine Learning and especially the area of Deep Learning, a.k.a. “Learning Representations” (LearnRep for short). There is even an International Conference ...
Theory IIIb: Generalization in Deep Networks
(Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-06-29)
The general features of the optimization problem for the case of overparametrized nonlinear networks have been clear for a while: SGD selects global minima over local minima with high probability. In the overparametrized ...
Theory of Deep Learning IIb: Optimization Properties of SGD
(Center for Brains, Minds and Machines (CBMM), 2017-12-27)
In Theory IIb we characterize, with a mix of theory and experiments, the optimization of deep convolutional networks by Stochastic Gradient Descent. The main new result in this paper is theoretical and experimental evidence ...