Deep vs. shallow networks : An approximation theory perspective
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-08-12)
The paper briefly reviews several recent results on hierarchical architectures for learning from examples that may formally explain the conditions under which Deep Convolutional Neural Networks perform much better in ...
Learning An Invariant Speech Representation
(Center for Brains, Minds and Machines (CBMM), arXiv, 2014-06-15)
Recognition of speech, and in particular the ability to generalize and learn from small sets of labelled examples like humans do, depends on an appropriate representation of the acoustic input. We formulate the problem of ...
I-theory on depth vs width: hierarchical function composition
(Center for Brains, Minds and Machines (CBMM), 2015-12-29)
Deep learning networks with convolution, pooling and subsampling are a special case of hierarchical architectures, which can be represented by trees (such as binary trees). Hierarchical as well as shallow networks can ...
When Is Handcrafting Not a Curse?
(2018-12-31)
Recently, with the proliferation of deep learning, there is a strong trend of abandoning handcrafted systems/features in machine learning and AI by replacing them with “end-to-end” systems “learned from scratch”. These ...
Spatial IQ Test for AI
(2017-12-31)
We introduce SITD (Spatial IQ Test Dataset), a dataset used to evaluate the capabilities of computational models for pattern recognition and visual reasoning. SITD is a generator of images in the style of the Raven Progressive ...
The Invariance Hypothesis Implies Domain-Specific Regions in Visual Cortex
(Center for Brains, Minds and Machines (CBMM), bioRxiv, 2015-04-26)
Is visual cortex made up of general-purpose information processing machinery, or does it consist of a collection of specialized modules? If prior knowledge, acquired from learning a set of objects, is only transferable to ...
Biologically Inspired Mechanisms for Adversarial Robustness
(Center for Brains, Minds and Machines (CBMM), 2020-06-23)
A convolutional neural network strongly robust to adversarial perturbations at reasonable computational and performance cost has not yet been demonstrated. The primate visual ventral stream seems to be robust to small ...
Stable Foundations for Learning: a foundational framework for learning theory in both the classical and modern regime.
(Center for Brains, Minds and Machines (CBMM), 2020-03-25)
We consider here the class of supervised learning algorithms known as Empirical Risk Minimization (ERM). The classical theory by Vapnik and others characterizes universal consistency of ERM in the classical regime in which ...
Representations That Learn vs. Learning Representations
(2018-12-31)
During the last decade, we have witnessed tremendous progress in Machine Learning and especially the area of Deep Learning, a.k.a. “Learning Representations” (LearnRep for short). There is even an International Conference ...
On Invariance and Selectivity in Representation Learning
(Center for Brains, Minds and Machines (CBMM), arXiv, 2015-03-23)
We discuss data representations which can be learned automatically from data, are invariant to transformations, and at the same time selective, in the sense that two points have the same representation only if they are one ...