Show simple item record

dc.contributor.author: Kim, Samuel
dc.contributor.author: Lu, Peter Y
dc.contributor.author: Mukherjee, Srijon
dc.contributor.author: Gilbert, Michael
dc.contributor.author: Jing, Li
dc.contributor.author: Ceperic, Vladimir
dc.contributor.author: Soljacic, Marin
dc.date.accessioned: 2022-05-02T15:20:13Z
dc.date.available: 2022-05-02T15:20:13Z
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/1721.1/142227
dc.description.abstract: Symbolic regression is a powerful technique to discover analytic equations that describe data, which can lead to explainable models and the ability to predict unseen data. In contrast, neural networks have achieved amazing levels of accuracy on image recognition and natural language processing tasks, but they are often seen as black-box models that are difficult to interpret and typically extrapolate poorly. In this article, we use a neural network-based architecture for symbolic regression called the equation learner (EQL) network and integrate it with other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. To demonstrate the power of such systems, we study their performance on several substantially different tasks. First, we show that the neural network can perform symbolic regression and learn the form of several functions. Next, we present an MNIST arithmetic task where a convolutional network extracts the digits. Finally, we demonstrate the prediction of dynamical systems where an unknown parameter is extracted through an encoder. We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared with a standard neural network-based architecture, paving the way for deep learning to be applied in scientific exploration and discovery.
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof: 10.1109/TNNLS.2020.3017010
dc.rights: Creative Commons Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0
dc.source: IEEE
dc.title: Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery
dc.type: Article
dc.identifier.citation: Kim, Samuel, Lu, Peter Y, Mukherjee, Srijon, Gilbert, Michael, Jing, Li et al. 2020. "Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery." IEEE Transactions on Neural Networks and Learning Systems, 32 (9).
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.department: Massachusetts Institute of Technology. Department of Physics
dc.relation.journal: IEEE Transactions on Neural Networks and Learning Systems
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dc.date.updated: 2022-05-02T14:19:46Z
dspace.orderedauthors: Kim, S; Lu, PY; Mukherjee, S; Gilbert, M; Jing, L; Ceperic, V; Soljacic, M
dspace.date.submission: 2022-05-02T14:19:48Z
mit.journal.volume: 32
mit.journal.issue: 9
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed
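
To illustrate the idea described in the abstract, here is a minimal NumPy sketch of one EQL-style layer. This is a hypothetical illustration, not the authors' implementation: the function name `eql_layer`, the choice and ordering of primitives, and the shapes are all assumptions. The key point from the abstract is that each layer applies a linear transform followed by a bank of analytic primitives (identity, sine, cosine, multiplication); because every primitive is differentiable, a stack of such layers can be trained end-to-end by backpropagation, and the learned weights can be read off as a symbolic expression.

```python
import numpy as np

def eql_layer(x, W, b):
    """One EQL-style layer (illustrative sketch, not the paper's code).

    x : (n_samples, n_in) input batch
    W : (n_in, 6) weight matrix, b : (6,) bias
    Returns (n_samples, 5): outputs of the analytic primitives
    [identity, sin, cos, identity, pairwise product].
    """
    z = x @ W + b                  # linear pre-activations, shape (n, 6)
    return np.column_stack([
        z[:, 0],                   # identity primitive
        np.sin(z[:, 1]),           # sine primitive
        np.cos(z[:, 2]),           # cosine primitive
        z[:, 3],                   # second identity unit
        z[:, 4] * z[:, 5],         # multiplication primitive (pair of units)
    ])
```

Stacking two or three such layers and fitting `W` and `b` by gradient descent (with a sparsity penalty on the weights, as is common for EQL-type networks) leaves most weights near zero, so the surviving path through the primitives spells out a compact analytic equation.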

