DSpace@MIT — MIT Libraries
Center for Brains, Minds & Machines
Now showing items 31-40 of 55


Holographic Embeddings of Knowledge Graphs 

Nickel, Maximilian; Rosasco, Lorenzo; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv, 2015-11-16)
Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs. In this work, we propose holographic embeddings (HolE) to learn ...
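The truncated abstract refers to HolE's central operation: a triple is scored by circularly correlating the subject and object embeddings and matching the result against the relation embedding, with the correlation computable in O(d log d) via the FFT. A minimal NumPy sketch (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def circular_correlation(a, b):
    # [a * b]_k = sum_i a_i * b_{(k + i) mod d}; the cross-correlation
    # theorem lets us compute this as IFFT(conj(FFT(a)) * FFT(b)).
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(e_s, r_p, e_o):
    # HolE scores a triple (s, p, o) by applying a sigmoid to the inner
    # product of the relation embedding with the correlated pair.
    return 1.0 / (1.0 + np.exp(-r_p @ circular_correlation(e_s, e_o)))

rng = np.random.default_rng(0)
d = 8
e_s, r_p, e_o = rng.normal(size=(3, d))
print(hole_score(e_s, r_p, e_o))
```

Because circular correlation is non-commutative, the same compressed representation can distinguish the roles of subject and object.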

How Important is Weight Symmetry in Backpropagation? 

Liao, Qianli; Leibo, Joel Z.; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv, 2015-11-29)
Gradient backpropagation (BP) requires symmetric feedforward and feedback connections—the same weights must be used for forward and backward passes. This “weight transport problem” [1] is thought to be one of the main ...

Deep vs. shallow networks: An approximation theory perspective

Mhaskar, Hrushikesh; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv, 2016-08-12)
The paper briefly reviews several recent results on hierarchical architectures for learning from examples that may formally explain the conditions under which Deep Convolutional Neural Networks perform much better in ...

Do Deep Neural Networks Suffer from Crowding? 

Volokitin, Anna; Roig, Gemma; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv, 2017-06-26)
Crowding is a visual effect suffered by humans, in which an object that can be recognized in isolation can no longer be recognized when other objects, called flankers, are placed close to it. In this work, we study the ...

Object-Oriented Deep Learning 

Liao, Qianli; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), 2017-10-31)
We investigate an unconventional direction of research that aims at converting neural networks, a class of distributed, connectionist, sub-symbolic models, into a symbolic level with the ultimate goal of achieving AI ...

Exact Equivariance, Disentanglement and Invariance of Transformations 

Liao, Qianli; Poggio, Tomaso (2017-12-31)
Invariance, equivariance and disentanglement of transformations are important topics in the field of representation learning. Previous models like Variational Autoencoder [1] and Generative Adversarial Networks [2] attempted ...

Biologically-plausible learning algorithms can scale to large datasets 

Xiao, Will; Chen, Honglin; Liao, Qianli; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-11-08)
The backpropagation (BP) algorithm is often thought to be biologically implausible in the brain. One of the main reasons is that BP requires symmetric weight matrices in the feedforward and feedback pathways. To address ...

Deep Convolutional Networks are Hierarchical Kernel Machines 

Anselmi, Fabio; Rosasco, Lorenzo; Tan, Cheston; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv, 2015-08-05)
We extend i-theory to incorporate not only pooling but also rectifying nonlinearities in an extended HW module (eHW) designed for supervised learning. The two operations roughly correspond to invariance and selectivity, ...

An analysis of training and generalization errors in shallow and deep networks 

Mhaskar, Hrushikesh; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-02-20)
An open problem around deep networks is the apparent absence of over-fitting despite large over-parametrization which allows perfect fitting of the training data. In this paper, we explain this phenomenon when each unit ...

Classical generalization bounds are surprisingly tight for Deep Networks 

Liao, Qianli; Miranda, Brando; Hidary, Jack; Poggio, Tomaso (Center for Brains, Minds and Machines (CBMM), 2018-07-11)
Deep networks are usually trained and tested in a regime in which the training classification error is not a good predictor of the test error. Thus the consensus has been that generalization, defined as convergence of the ...
Content created by the MIT Libraries, CC BY-NC unless otherwise noted. Notify us about copyright concerns.