
dc.contributor.author: Chang, Yue
dc.contributor.author: Chen, Peter Yichen
dc.contributor.author: Wang, Zhecheng
dc.contributor.author: Chiaramonte, Maurizio M.
dc.contributor.author: Carlberg, Kevin
dc.contributor.author: Grinspun, Eitan
dc.date.accessioned: 2024-01-02T20:08:15Z
dc.date.available: 2024-01-02T20:08:15Z
dc.date.issued: 2023-12-10
dc.identifier.isbn: 979-8-4007-0315-7
dc.identifier.uri: https://hdl.handle.net/1721.1/153261
dc.description.abstract: Linear reduced-order modeling (ROM) simplifies complex simulations by approximating the behavior of a system using a simplified kinematic representation. Typically, ROM is trained on input simulations created with a specific spatial discretization, and then serves to accelerate simulations with the same discretization. This discretization-dependence is restrictive. Becoming independent of a specific discretization would provide flexibility to mix and match mesh resolutions, connectivity, and type (tetrahedral, hexahedral) in training data; to accelerate simulations with novel discretizations unseen during training; and to accelerate adaptive simulations that temporally or parametrically change the discretization. We present a flexible, discretization-independent approach to reduced-order modeling. Like traditional ROM, we represent the configuration as a linear combination of displacement fields. Unlike traditional ROM, our displacement fields are continuous maps from every point on the reference domain to a corresponding displacement vector; these maps are represented as implicit neural fields. With linear continuous ROM (LiCROM), our training set can include multiple geometries undergoing multiple loading conditions, independent of their discretization. This opens the door to novel applications of reduced order modeling. We can now accelerate simulations that modify the geometry at runtime, for instance via cutting, hole punching, and even swapping the entire mesh. We can also accelerate simulations of geometries unseen during training. We demonstrate one-shot generalization, training on a single geometry and subsequently simulating various unseen geometries. (en_US)
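The abstract's core construction — the configuration as a linear combination of continuous displacement fields, each a neural field mapping any reference point to a displacement vector — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`mlp_field`, `licrom_displacement`) and the untrained random weights are stand-ins for the trained implicit neural fields described in the abstract.

```python
import numpy as np

def mlp_field(params, X):
    # A tiny MLP mapping reference points X of shape (n, 3) to
    # displacement vectors of shape (n, 3): a continuous map, so it can
    # be queried at arbitrary points regardless of discretization.
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def make_field(rng, hidden=16):
    # Random weights stand in for one trained basis field.
    return (rng.standard_normal((3, hidden)) * 0.1,
            np.zeros(hidden),
            rng.standard_normal((hidden, 3)) * 0.1,
            np.zeros(3))

def licrom_displacement(q, fields, X):
    # Displacement u(X) = sum_r q_r * g_r(X): linear in the reduced
    # coordinates q, continuous in the reference position X.
    u = np.zeros_like(X)
    for q_r, f in zip(q, fields):
        u += q_r * mlp_field(f, X)
    return u

rng = np.random.default_rng(0)
fields = [make_field(rng) for _ in range(4)]   # 4 reduced coordinates
X = rng.standard_normal((5, 3))                # arbitrary query points
q = np.array([1.0, -0.5, 0.2, 0.0])
u = licrom_displacement(q, fields, X)
print(u.shape)  # (5, 3)
```

Because the basis fields are continuous maps, the same reduced coordinates `q` can be evaluated on any set of points — a different mesh resolution, connectivity, or element type — which is the discretization independence the abstract emphasizes.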
dc.publisher: ACM|SIGGRAPH Asia 2023 Conference Papers (en_US)
dc.relation.isversionof: https://doi.org/10.1145/3610548.3618158 (en_US)
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. (en_US)
dc.source: Association for Computing Machinery (en_US)
dc.title: LiCROM: Linear-Subspace Continuous Reduced Order Modeling with Neural Fields (en_US)
dc.type: Article (en_US)
dc.identifier.citation: Chang, Yue, Chen, Peter Yichen, Wang, Zhecheng, Chiaramonte, Maurizio M., Carlberg, Kevin et al. 2023. "LiCROM: Linear-Subspace Continuous Reduced Order Modeling with Neural Fields."
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.identifier.mitlicense: PUBLISHER_POLICY
dc.eprint.version: Final published version (en_US)
dc.type.uri: http://purl.org/eprint/type/ConferencePaper (en_US)
eprint.status: http://purl.org/eprint/status/NonPeerReviewed (en_US)
dc.date.updated: 2024-01-01T08:45:59Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2024-01-01T08:46:00Z
mit.license: PUBLISHER_POLICY
mit.metadata.status: Authority Work and Publication Information Needed (en_US)

