DEEP-squared: deep learning powered De-scattering with Excitation Patterning
Author(s)
Wijethilake, Navodini; Anandakumar, Mithunjha; Zheng, Cheng; So, Peter T. C.; Yildirim, Murat; Wadduwage, Dushan N.; ...
Publisher with Creative Commons License
Terms of use
Creative Commons Attribution
Abstract
Limited throughput is a key challenge in in vivo deep tissue imaging using nonlinear optical microscopy. Point-scanning multiphoton microscopy, the current gold standard, is slow, especially compared to the widefield imaging modalities used for optically cleared or thin specimens. We recently introduced “De-scattering with Excitation Patterning” or “DEEP” as a widefield alternative to point-scanning geometries. Using patterned multiphoton excitation, DEEP encodes spatial information inside tissue before scattering. However, to de-scatter at typical depths, hundreds of such patterned excitations were needed. In this work, we present DEEP², a deep learning-based model that can de-scatter images from just tens of patterned excitations instead of hundreds. Consequently, we improve DEEP’s throughput by almost an order of magnitude. We demonstrate our method in multiple numerical and experimental imaging studies, including in vivo cortical vasculature imaging up to 4 scattering lengths deep in live mice.
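The record describes DEEP² only at a high level: a deep learning model that reconstructs a de-scattered image from a stack of tens of patterned-excitation widefield measurements. As a minimal sketch under that description, assuming a simple convolutional encoder-decoder in PyTorch (the class name DescatterNet, the choice of 32 patterns, and all layer widths are illustrative assumptions, not the authors' published DEEP² architecture), such a measurements-to-image mapping could look like:

```python
# Minimal sketch (not the authors' released code): a convolutional
# encoder-decoder that maps a stack of patterned-excitation measurements
# to a single de-scattered image. All architectural details here
# (num_patterns=32, layer counts, channel widths) are illustrative
# assumptions, not taken from the paper.
import torch
import torch.nn as nn


class DescatterNet(nn.Module):
    """Map (B, num_patterns, H, W) scattered measurements to a (B, 1, H, W) image."""

    def __init__(self, num_patterns: int = 32, width: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(num_patterns, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width * 2, kernel_size=3, stride=2, padding=1),  # downsample by 2
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(width * 2, width, kernel_size=4, stride=2, padding=1),  # upsample by 2
            nn.ReLU(inplace=True),
            nn.Conv2d(width, 1, kernel_size=3, padding=1),  # single de-scattered output channel
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


if __name__ == "__main__":
    # Example: 32 patterned-excitation frames over a 128x128 field of view.
    model = DescatterNet(num_patterns=32)
    measurements = torch.randn(1, 32, 128, 128)   # simulated scattered measurements
    descattered = model(measurements)             # (1, 1, 128, 128) reconstruction
    print(descattered.shape)
```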
Date issued
2023-09-13
Department
Massachusetts Institute of Technology. Department of Mechanical Engineering; Massachusetts Institute of Technology. Laser Biomedical Research Center; Massachusetts Institute of Technology. Department of Biological Engineering; Picower Institute for Learning and Memory
Journal
Light: Science & Applications
Publisher
Springer Science and Business Media LLC
Citation
Wijethilake, N., Anandakumar, M., Zheng, C. et al. DEEP-squared: deep learning powered De-scattering with Excitation Patterning. Light Sci Appl 12, 228 (2023).
Version: Final published version
ISSN
2047-7538