Now showing items 11-20 of 22
Biologically-Plausible Learning Algorithms Can Scale to Large Datasets
(Center for Brains, Minds and Machines (CBMM), 2018-09-27)
The backpropagation (BP) algorithm is often thought to be biologically implausible in the brain. One of the main reasons is that BP requires symmetric weight matrices in the feedforward and feedback pathways. To address ...
Spatial IQ Test for AI
(2017-12-31)
We introduce SITD (Spatial IQ Test Dataset), a dataset used to evaluate the capabilities of computational models for pattern recognition and visual reasoning. SITD is a generator of images in the style of the Raven Progressive ...
Human-like Learning: A Research Proposal
(2017-09-28)
We propose Human-like Learning, a new machine learning paradigm aiming at training generalist AI systems in a human-like manner with a focus on human-unique skills.
When Is Handcrafting Not a Curse?
(2018-12-31)
Recently, with the proliferation of deep learning, there is a strong trend of abandoning handcrafted systems/features in machine learning and AI by replacing them with “end-to-end” systems “learned from scratch”. These ...
Representations That Learn vs. Learning Representations
(2018-12-31)
During the last decade, we have witnessed tremendous progress in Machine Learning and especially the area of Deep Learning, a.k.a. “Learning Representations” (LearnRep for short). There is even an International Conference ...
Theory IIIb: Generalization in Deep Networks
(Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-06-29)
The general features of the optimization problem for the case of overparametrized nonlinear networks have been clear for a while: SGD selects global minima rather than local minima with high probability. In the overparametrized ...
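As a toy illustration of the overparametrized regime (a hypothetical Python sketch, not code from the paper): with many more hidden units than training points, plain SGD on a two-layer ReLU network typically drives the training loss to near zero, that is, to a global minimum of the empirical risk. All sizes and hyperparameters below are made up for illustration.

```python
# Hypothetical sketch (not from the paper): overparametrized two-layer ReLU
# network fit with single-sample SGD on a tiny random regression set. With far
# more hidden units than data points, the training loss typically ends up near
# zero, i.e. SGD reaches a global minimum of the empirical risk.
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 10, 3, 200                      # 10 samples, 200 hidden units
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

W1 = rng.standard_normal((d, h)) / np.sqrt(d)
W2 = rng.standard_normal(h) / np.sqrt(h)
lr = 2e-3

for step in range(20000):
    i = rng.integers(n)                   # pick one sample (stochastic gradient)
    a = X[i] @ W1                         # pre-activations, shape (h,)
    z = np.maximum(a, 0.0)                # ReLU hidden activity
    err = z @ W2 - y[i]                   # scalar residual
    grad_W2 = err * z                     # gradient of 0.5*err^2 w.r.t. W2
    grad_W1 = np.outer(X[i], err * W2 * (a > 0))   # backprop through the ReLU
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

loss = 0.5 * np.mean((np.maximum(X @ W1, 0.0) @ W2 - y) ** 2)
print(f"final training loss: {loss:.2e}")  # usually very small
```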
Theory of Deep Learning IIb: Optimization Properties of SGD
(Center for Brains, Minds and Machines (CBMM), 2017-12-27)
In Theory IIb we characterize with a mix of theory and experiments the optimization of deep convolutional networks by Stochastic Gradient Descent. The main new result in this paper is theoretical and experimental evidence ...
How Important is Weight Symmetry in Backpropagation?
(Center for Brains, Minds and Machines (CBMM), arXiv, 2015-11-29)
Gradient backpropagation (BP) requires symmetric feedforward and feedback connections—the same weights must be used for forward and backward passes. This “weight transport problem” [1] is thought to be one of the main ...
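To make the weight transport problem concrete, the following is a minimal hypothetical sketch (not code from the paper) of a one-hidden-layer network: standard BP sends the output error back through the transpose of the forward weights, while an asymmetric alternative, in the spirit of feedback alignment, substitutes a fixed random feedback matrix. Names and sizes are illustrative.

```python
# Hypothetical sketch (not from the paper): the error signal reaching the hidden
# layer uses either the transpose of the forward weights (standard BP, i.e.
# symmetric / "transported" weights) or a fixed random feedback matrix B
# (an asymmetric alternative in the spirit of feedback alignment).
import numpy as np

rng = np.random.default_rng(0)
d, h, k = 10, 32, 3
W1 = rng.standard_normal((d, h)) * 0.1    # forward: input -> hidden
W2 = rng.standard_normal((h, k)) * 0.1    # forward: hidden -> output
B  = rng.standard_normal((k, h)) * 0.1    # fixed random feedback, never trained

def hidden_error(output_error, pre_activation, symmetric=True):
    # Standard BP reuses W2.T (weight transport); the asymmetric variant uses B.
    feedback = W2.T if symmetric else B
    return (output_error @ feedback) * (pre_activation > 0)   # ReLU derivative

x = rng.standard_normal((1, d))
a = x @ W1                                 # pre-activations, shape (1, h)
z = np.maximum(a, 0.0)                     # hidden activity
out = z @ W2                               # output, shape (1, k)
err = out - np.array([[1.0, 0.0, 0.0]])    # error against a toy target

delta_bp = hidden_error(err, a, symmetric=True)    # needs the symmetric W2.T
delta_fa = hidden_error(err, a, symmetric=False)   # uses B: no symmetry required
print(delta_bp.shape, delta_fa.shape)              # both (1, 32)
```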
Object-Oriented Deep Learning
(Center for Brains, Minds and Machines (CBMM), 2017-10-31)
We investigate an unconventional direction of research that aims at converting neural networks, a class of distributed, connectionist, sub-symbolic models, into a symbolic level with the ultimate goal of achieving AI ...
Exact Equivariance, Disentanglement and Invariance of Transformations
(2017-12-31)
Invariance, equivariance and disentanglement of transformations are important topics in the field of representation learning. Previous models like Variational Autoencoder [1] and Generative Adversarial Networks [2] attempted ...
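For reference, a map f is equivariant to a transformation T when f(T(x)) = T(f(x)), and invariant when f(T(x)) = f(x). The small sketch below (illustrative, not from the paper) checks both properties with circular shifts: a circular convolution is shift-equivariant, while a global sum is shift-invariant.

```python
# Illustrative sketch (not from the paper): circular shift as the transformation T.
# A circular convolution is equivariant: f(T(x)) == T(f(x)).
# A global sum is invariant: f(T(x)) == f(x).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(16)
kernel = rng.standard_normal(5)

def circ_conv(signal, k):
    # Circular convolution via the FFT; output has the same length as the input.
    k_padded = np.concatenate([k, np.zeros(len(signal) - len(k))])
    return np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(k_padded)))

shift = lambda s: np.roll(s, 3)            # the transformation T: shift by 3

# Equivariance: applying T before or after the convolution gives the same result.
assert np.allclose(circ_conv(shift(x), kernel), shift(circ_conv(x, kernel)))

# Invariance: a global sum is unchanged by T.
assert np.isclose(np.sum(shift(x)), np.sum(x))
print("shift-equivariance (convolution) and shift-invariance (sum) hold")
```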