Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery

Author(s)
Kim, Samuel; Lu, Peter Y; Mukherjee, Srijon; Gilbert, Michael; Jing, Li; Ceperic, Vladimir; Soljacic, Marin
Download
Published version (2.240 MB)
Terms of use
Creative Commons Attribution 4.0 International License https://creativecommons.org/licenses/by/4.0
Abstract
Symbolic regression is a powerful technique to discover analytic equations that describe data, which can lead to explainable models and the ability to predict unseen data. In contrast, neural networks have achieved amazing levels of accuracy on image recognition and natural language processing tasks, but they are often seen as black-box models that are difficult to interpret and typically extrapolate poorly. In this article, we use a neural network-based architecture for symbolic regression called the equation learner (EQL) network and integrate it with other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. To demonstrate the power of such systems, we study their performance on several substantially different tasks. First, we show that the neural network can perform symbolic regression and learn the form of several functions. Next, we present an MNIST arithmetic task where a convolutional network extracts the digits. Finally, we demonstrate the prediction of dynamical systems where an unknown parameter is extracted through an encoder. We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared with a standard neural network-based architecture, paving the way for deep learning to be applied in scientific exploration and discovery.
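
For orientation, the sketch below shows one way an EQL-style layer could be written in PyTorch: a linear map followed by a small bank of primitive functions (identity, sine, cosine, and a pairwise product), with layers stacked and trained under a sparsity penalty so that the surviving connections can be read off as a symbolic expression. This is an illustrative reconstruction based on the abstract, not the authors' released code; the class names (EQLLayer, EQLNetwork) and the choice of primitives are assumptions.

```python
# Minimal sketch of an EQL-style network (illustration only, not the paper's code).
import torch
import torch.nn as nn

class EQLLayer(nn.Module):
    """Linear pre-activations followed by a bank of primitive functions."""
    def __init__(self, in_dim, n_units=4):
        super().__init__()
        # Five groups of pre-activations: identity, sin, cos, and two for a product unit.
        self.linear = nn.Linear(in_dim, 5 * n_units)
        self.n_units = n_units

    def forward(self, x):
        z = self.linear(x)
        u = torch.split(z, self.n_units, dim=-1)
        # Apply the primitive functions and concatenate their outputs.
        return torch.cat([u[0], torch.sin(u[1]), torch.cos(u[2]), u[3] * u[4]], dim=-1)

class EQLNetwork(nn.Module):
    """A stack of EQL layers with a linear readout."""
    def __init__(self, in_dim, hidden_layers=2, n_units=4):
        super().__init__()
        layers, dim = [], in_dim
        for _ in range(hidden_layers):
            layers.append(EQLLayer(dim, n_units))
            dim = 4 * n_units  # each layer emits 4 * n_units features
        self.layers = nn.ModuleList(layers)
        self.readout = nn.Linear(dim, 1)

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return self.readout(x)

    def sparsity_penalty(self):
        # L1 penalty as a simple stand-in for the sparsity regularization
        # that prunes the network down to a readable symbolic expression.
        return sum(p.abs().sum() for p in self.parameters())
```

Because the whole module is differentiable, it can be placed after (for example) a convolutional digit extractor or a parameter encoder and trained end-to-end with backpropagation, which is the integration the article describes.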
Date issued
2020
URI
https://hdl.handle.net/1721.1/142227
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Department of Physics
Journal
IEEE Transactions on Neural Networks and Learning Systems
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Kim, Samuel, Lu, Peter Y, Mukherjee, Srijon, Gilbert, Michael, Jing, Li et al. 2020. "Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery." IEEE Transactions on Neural Networks and Learning Systems, 32 (9).
Version: Final published version

Collections
  • MIT Open Access Articles
