I-theory on depth vs width: hierarchical function composition
(Center for Brains, Minds and Machines (CBMM), 2015-12-29)
Deep learning networks with convolution, pooling and subsampling are a special case of hierarchical architectures, which can be represented by trees (such as binary trees). Hierarchical as well as shallow networks can ...
Complexity of Representation and Inference in Compositional Models with Part Sharing
(Center for Brains, Minds and Machines (CBMM), arXiv, 2015-05-05)
This paper performs a complexity analysis of a class of serial and parallel compositional models of multiple objects and shows that they enable efficient representation and rapid inference. Compositional models are generative ...
Deep Convolutional Networks are Hierarchical Kernel Machines
(Center for Brains, Minds and Machines (CBMM), arXiv, 2015-08-05)
We extend i-theory to incorporate not only pooling but also rectifying nonlinearities in an extended HW module (eHW) designed for supervised learning. The two operations roughly correspond to invariance and selectivity, ...
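The abstract above pairs pooling (invariance) with rectification (selectivity) in an eHW module. As a rough illustration only, not the paper's actual construction, the following sketch combines rectified template matching with max-pooling over shifted copies of the input; the function name, template set, and shift range are all assumptions made for this example:

```python
import numpy as np

def ehw_module(x, templates, shift_range=2):
    """Illustrative sketch of an extended HW-style module.

    For each template, compute rectified dot products with shifted
    copies of the input (rectification gives selectivity), then
    max-pool the responses over the shifts (pooling gives a degree
    of invariance to translation within the shift range).
    """
    outputs = []
    for t in templates:
        responses = []
        for s in range(-shift_range, shift_range + 1):
            shifted = np.roll(x, s)
            # Rectifying nonlinearity applied to the template response
            responses.append(max(0.0, float(shifted @ t)))
        # Pooling over the transformation (shift) group
        outputs.append(max(responses))
    return np.array(outputs)
```

With this toy setup, an input and a copy shifted within the pooling range produce the same output vector, which is the invariance property the pooling stage is meant to capture.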