
Simple item record

dc.contributor.author: Rackauckas, C
dc.contributor.author: Edelman, A
dc.contributor.author: Fischer, K
dc.contributor.author: Innes, M
dc.contributor.author: Saba, E
dc.contributor.author: Shah, VB
dc.contributor.author: Tebbutt, W
dc.date.accessioned: 2021-11-04T11:58:19Z
dc.date.available: 2021-11-04T11:58:19Z
dc.identifier.uri: https://hdl.handle.net/1721.1/137320
dc.description.abstract: Copyright © 2020 for this paper by its authors. Scientific computing is increasingly incorporating the advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general-purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that is able to take gradients of full Julia programs, making Automatic Differentiation a first-class language feature and compatibility with deep learning pervasive. Our system utilizes the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all language constructs (control flow, recursion, mutation, etc.) while generating high-performance code without requiring any user intervention or refactoring to stage computations. We showcase several examples of physics-informed learning which directly utilize this extension to existing simulation code: neural surrogate models, machine learning on simulated quantum hardware, and data-driven stochastic dynamical model discovery with neural stochastic differential equations. [en_US]
dc.language.iso: en
dc.relation.isversionof: http://ceur-ws.org/Vol-2587/ [en_US]
dc.rights: Creative Commons Attribution 4.0 International license [en_US]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [en_US]
dc.source: MIT web domain [en_US]
dc.title: Generalized physics-informed learning through language-wide differentiable programming [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Rackauckas, C, Edelman, A, Fischer, K, Innes, M, Saba, E et al. "Generalized physics-informed learning through language-wide differentiable programming." CEUR Workshop Proceedings, 2587.
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics
dc.relation.journal: CEUR Workshop Proceedings [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2021-05-19T17:42:11Z
dspace.orderedauthors: Rackauckas, C; Edelman, A; Fischer, K; Innes, M; Saba, E; Shah, VB; Tebbutt, W [en_US]
dspace.date.submission: 2021-05-19T17:42:13Z
mit.journal.volume: 2587 [en_US]
mit.license: PUBLISHER_CC
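
The abstract describes a ∂P system that differentiates unmodified Julia programs, including code that uses control flow, recursion, and mutation, with no refactoring to stage computations. As a minimal sketch of that workflow, assuming Zygote.jl as the underlying ∂P system (the record itself does not name the package), the snippet below takes the gradient of an ordinary recursive Julia function:

    # A minimal sketch; Zygote.jl is assumed here, since the record
    # does not name the ∂P system it describes.
    using Zygote

    # Plain Julia with control flow and recursion; nothing about the
    # function needs to be rewritten before it can be differentiated.
    function mypow(x, n)
        n == 0 && return one(x)
        return x * mypow(x, n - 1)
    end

    # gradient returns a tuple with one derivative per argument;
    # d/dx x^3 = 3x^2, so this yields 12.0 at x = 2.0.
    dx, = gradient(x -> mypow(x, 3), 2.0)
    println(dx)  # 12.0

Because differentiation here is a language-level feature rather than a property of a restricted graph DSL, the same gradient call applies to existing simulation code, which is what enables the physics-informed examples listed in the abstract.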

