Phase imaging with an untrained neural network
Author(s)
Wang, Fei; Bian, Yaoming; Wang, Haichao; Lyu, Meng; Pedrini, Giancarlo; Osten, Wolfgang; Barbastathis, George; Situ, Guohai
Terms of use
Creative Commons Attribution
Abstract
Most neural networks proposed so far for computational imaging (CI) in optics employ a supervised training strategy, and thus need a large training set to optimize their weights and biases. Setting aside the environmental and system stability required during many hours of data acquisition, in many practical applications it is simply not possible to obtain enough ground-truth images for training. Here, we propose to overcome this limitation by incorporating into a conventional deep neural network a complete physical model that represents the process of image formation. The most significant advantage of the resulting physics-enhanced deep neural network (PhysenNet) is that it can be used without training beforehand, thus eliminating the need for tens of thousands of labeled examples. We take single-beam phase imaging as an example for demonstration. We experimentally show that one needs only to feed PhysenNet a single diffraction pattern of a phase object, and it can automatically optimize the network and eventually produce the object phase through the interplay between the neural network and the physical model. This opens up a new paradigm of neural network design, in which the concept of incorporating a physical model into a neural network can be generalized to solve many other CI problems. ©2020, The Author(s).
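To illustrate the scheme described in the abstract, the sketch below pairs an untrained network with a differentiable free-space propagation model and fits the network weights to a single diffraction pattern. This is a minimal sketch, not the authors' implementation: the optical parameters (wavelength, pixel pitch, propagation distance), the toy CNN standing in for the paper's deeper generator network, and the random placeholder measurement are all assumptions for illustration.

```python
import math
import torch
import torch.nn as nn

# Placeholder parameters -- actual values depend on the optical setup (assumed).
wavelength = 632.8e-9   # illumination wavelength in meters (assumed)
pixel_size = 8e-6       # camera pixel pitch in meters (assumed)
distance = 0.02         # object-to-sensor distance in meters (assumed)
n = 256                 # image size in pixels (assumed)

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field over distance z via the angular-spectrum method."""
    m = field.shape[-1]
    fx = torch.fft.fftfreq(m, d=dx)
    FX, FY = torch.meshgrid(fx, fx, indexing="ij")
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * math.pi * torch.sqrt(torch.clamp(arg, min=0.0))
    H = torch.exp(1j * kz * z)  # transfer function; evanescent components dropped
    return torch.fft.ifft2(torch.fft.fft2(field) * H)

class PhaseNet(nn.Module):
    """Toy CNN mapping the measurement to a phase map (a stand-in for the
    deeper generator network described in the paper)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),  # output in [0, 1]
        )
    def forward(self, x):
        return self.body(x) * 2 * math.pi  # scale to phase range [0, 2*pi]

# Random placeholder for the single recorded diffraction pattern.
measurement = torch.rand(1, 1, n, n)

net = PhaseNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    phase = net(measurement)
    field = torch.exp(1j * phase.squeeze())              # unit-amplitude phase object
    sensor = angular_spectrum(field, wavelength, pixel_size, distance)
    intensity = sensor.abs() ** 2                        # camera records intensity only
    loss = nn.functional.mse_loss(intensity, measurement.squeeze())
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the loss compares the simulated diffraction pattern with the measured one, no ground-truth phase image is ever needed; the recovered phase is simply the network output after the fit converges.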
Date issued
2020-05
Department
Massachusetts Institute of Technology. Department of Mechanical Engineering
Journal
Light: Science & Applications
Publisher
Springer Science and Business Media LLC
Citation
Wang, Fei, et al. "Phase imaging with an untrained neural network." Light: Science & Applications 9 (May 2020): no. 77. doi:10.1038/s41377-020-0302-3. ©2020 The Author(s).
Version: Final published version
ISSN
2047-7538