Notice

This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/132317.2

Show simple item record

dc.contributor.author: Alet, F
dc.contributor.author: Weng, E
dc.contributor.author: Pérez, TL
dc.contributor.author: Kaelbling, LP
dc.date.accessioned: 2021-09-20T18:21:48Z
dc.date.available: 2021-09-20T18:21:48Z
dc.identifier.uri: https://hdl.handle.net/1721.1/132317
dc.description.abstract: © 2019 Neural Information Processing Systems Foundation. All rights reserved. Graph neural networks (GNNs) are effective models for many dynamical systems consisting of entities and relations. Although most GNN applications assume a single type of entity and relation, many situations involve multiple types of interactions. Relational inference is the problem of inferring these interactions and learning the dynamics from observational data. We frame relational inference as a modular meta-learning problem, where neural modules are trained to be composed in different ways to solve many tasks. This meta-learning framework allows us to implicitly encode time invariance and infer relations in context of one another rather than independently, which increases inference capacity. Framing inference as the inner-loop optimization of meta-learning leads to a model-based approach that is more data-efficient and capable of estimating the state of entities that we do not observe directly, but whose existence can be inferred from their effect on observed entities. To address the large search space of graph neural network compositions, we meta-learn a proposal function that speeds up the inner-loop simulated annealing search within the modular meta-learning algorithm, providing two orders of magnitude increase in the size of problems that can be addressed. [en_US]
dc.language.iso: en
dc.relation.isversionof: https://papers.nips.cc/paper/2019/hash/b294504229c668e750dfcc4ea9617f0a-Abstract.html [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Neural Information Processing Systems (NIPS) [en_US]
dc.title: Neural relational inference with fast modular meta-learning [en_US]
dc.type: Article [en_US]
dc.relation.journal: Advances in Neural Information Processing Systems [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2020-12-22T18:33:53Z
dspace.orderedauthors: Alet, F; Weng, E; Pérez, TL; Kaelbling, LP [en_US]
dspace.date.submission: 2020-12-22T18:33:57Z
mit.journal.volume: 32 [en_US]
mit.license: PUBLISHER_POLICY
mit.metadata.status: Authority Work and Publication Information Needed
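The abstract describes an inner-loop simulated annealing search over compositions of neural modules. As a loose, self-contained illustration of that search pattern (not the authors' implementation: the paper's modules are small neural networks and the proposal distribution is itself meta-learned, whereas this sketch uses plain functions and uniform random proposals; all names here are hypothetical):

```python
import math
import random

# Toy pool of candidate "modules" assignable to each edge of the graph.
# In the paper these are neural networks; plain functions are used here
# purely for illustration.
MODULES = {
    "none":   lambda x: 0.0,        # no interaction
    "spring": lambda x: -0.5 * x,   # restoring force
    "push":   lambda x: 0.2,        # constant push
}

def loss(structure, data):
    """Squared error of the composed model on (input, target) pairs.
    `structure` assigns one module name per edge; edge effects sum."""
    total = 0.0
    for x, y in data:
        pred = sum(MODULES[name](x) for name in structure)
        total += (pred - y) ** 2
    return total

def simulated_annealing(data, n_edges, steps=2000, t0=1.0, seed=0):
    """Inner-loop structure search: propose a single-edge module swap,
    accept by the Metropolis rule under a linearly decaying temperature."""
    rng = random.Random(seed)
    names = list(MODULES)
    structure = [rng.choice(names) for _ in range(n_edges)]
    cur = loss(structure, data)
    best, best_loss = list(structure), cur
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-6
        proposal = list(structure)
        proposal[rng.randrange(n_edges)] = rng.choice(names)
        new = loss(proposal, data)
        if new < cur or rng.random() < math.exp((cur - new) / temp):
            structure, cur = proposal, new
            if cur < best_loss:
                best, best_loss = list(structure), cur
    return best, best_loss

# Data generated by one "spring" edge plus one inactive edge.
data = [(x, -0.5 * x) for x in [-2.0, -1.0, 1.0, 2.0]]
found, err = simulated_annealing(data, n_edges=2)
print(found, err)
```

The paper's key speedup replaces the uniform proposal (`rng.choice(names)`) with a learned proposal function that predicts promising module assignments, which is what makes the inner-loop search tractable at scale.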

