# Generalized physics-informed learning through language-wide differentiable programming
| Field | Value | Language |
| --- | --- | --- |
| dc.contributor.author | Rackauckas, C | |
| dc.contributor.author | Edelman, A | |
| dc.contributor.author | Fischer, K | |
| dc.contributor.author | Innes, M | |
| dc.contributor.author | Saba, E | |
| dc.contributor.author | Shah, VB | |
| dc.contributor.author | Tebbutt, W | |
| dc.date.accessioned | 2021-11-04T11:58:19Z | |
| dc.date.available | 2021-11-04T11:58:19Z | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/137320 | |
| dc.description.abstract | Copyright © 2020 for this paper by its authors. Scientific computing is increasingly incorporating the advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general-purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that is able to take gradients of full Julia programs, making Automatic Differentiation a first-class language feature and compatibility with deep learning pervasive. Our system utilizes the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all language constructs (control flow, recursion, mutation, etc.) while generating high-performance code without requiring any user intervention or refactoring to stage computations. We showcase several examples of physics-informed learning which directly utilize this extension to existing simulation code: neural surrogate models, machine learning on simulated quantum hardware, and data-driven stochastic dynamical model discovery with neural stochastic differential equations. | en_US |
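The abstract's central claim is that automatic differentiation can act as a language-wide feature, differentiating ordinary programs containing control flow and recursion without staging or refactoring. As a minimal illustration of that idea (in Python rather than the paper's Julia system, using forward-mode AD via dual numbers; all names below are illustrative, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """Dual number: carries a primal value and its derivative together."""
    val: float  # primal value
    der: float  # derivative w.r.t. the seeded input

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(float(other), 0.0)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        # Product rule applied alongside the primal computation.
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    # Ordinary code with branching and a loop; nothing is staged or traced
    # ahead of time -- differentiation rides along with normal execution.
    if x.val > 0:
        y = x * x
        for _ in range(2):
            y = y + x          # f(x) = x^2 + 2x on the positive branch
    else:
        y = 3 * x              # f(x) = 3x on the non-positive branch
    return y

def derivative(f, x0):
    """Seed the input with derivative 1 and read off df/dx at x0."""
    return f(Dual(x0, 1.0)).der
```

For example, `derivative(f, 2.0)` follows the positive branch (f(x) = x² + 2x, so f′(2) = 6), while `derivative(f, -1.0)` follows the other branch (f′ = 3). The paper's system goes far beyond this sketch, compiling reverse-mode gradients for full Julia programs including mutation, but the sketch shows why per-branch, per-iteration differentiation needs no user refactoring.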
| dc.language.iso | en | |
| dc.relation.isversionof | http://ceur-ws.org/Vol-2587/ | en_US |
| dc.rights | Creative Commons Attribution 4.0 International license | en_US |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en_US |
| dc.source | MIT web domain | en_US |
| dc.title | Generalized physics-informed learning through language-wide differentiable programming | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Rackauckas, C, Edelman, A, Fischer, K, Innes, M, Saba, E et al. "Generalized physics-informed learning through language-wide differentiable programming." CEUR Workshop Proceedings, 2587. | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Mathematics | |
| dc.relation.journal | CEUR Workshop Proceedings | en_US |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2021-05-19T17:42:11Z | |
| dspace.orderedauthors | Rackauckas, C; Edelman, A; Fischer, K; Innes, M; Saba, E; Shah, VB; Tebbutt, W | en_US |
| dspace.date.submission | 2021-05-19T17:42:13Z | |
| mit.journal.volume | 2587 | en_US |
| mit.license | PUBLISHER_CC | |
