Modularization of deep networks allows cross-modality reuse: lesson learnt
Author(s)
Husvogt, Lennart; Fujimoto, James G
Terms of use
Open Access Policy; Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Fundus photography and Optical Coherence Tomography Angiography (OCT-A) are two commonly used modalities in ophthalmic imaging. With the development of deep learning algorithms, fundus image processing, especially retinal vessel segmentation, has been extensively studied. Built upon the known operator theory, interpretable deep network pipelines with well-defined modules have been constructed on fundus images. In this work, we first train a modularized network pipeline for the task of retinal vessel segmentation on the fundus database DRIVE. The pretrained preprocessing module from the pipeline is then directly transferred onto OCT-A data for image quality enhancement without further fine-tuning. Output images show that the preprocessing net balances contrast, suppresses noise, and thereby produces vessel trees with improved connectivity in both imaging modalities. The visual impression is confirmed by an observer study with five OCT-A experts. Statistics of the grades given by the experts indicate that the transferred module improves both the image quality and the diagnostic quality. Our work provides an example of how modules within network pipelines built upon the known operator theory facilitate cross-modality reuse without additional training or transfer learning.
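The cross-modality reuse described in the abstract amounts to lifting a trained submodule out of a larger pipeline and applying it, frozen, to data from a different modality. The following is a minimal PyTorch sketch of that idea; the module names (PreprocessingNet, VesselSegmentationPipeline) and layer choices are illustrative assumptions, not the architectures used in the paper.

```python
import torch
import torch.nn as nn

class PreprocessingNet(nn.Module):
    """Hypothetical trainable preprocessing module
    (e.g., contrast balancing / noise suppression)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class VesselSegmentationPipeline(nn.Module):
    """Modularized pipeline: a well-defined preprocessing module
    followed by a segmentation module."""
    def __init__(self, preprocess: nn.Module, segment: nn.Module):
        super().__init__()
        self.preprocess = preprocess
        self.segment = segment

    def forward(self, x):
        return self.segment(self.preprocess(x))

# Stand-in segmentation head; the real pipeline would use a proper network.
pipeline = VesselSegmentationPipeline(PreprocessingNet(), nn.Conv2d(1, 1, 1))
# ... train the full pipeline end-to-end on DRIVE fundus images ...

# Cross-modality reuse: extract the pretrained preprocessing module and
# apply it to OCT-A data with no further fine-tuning.
preprocess = pipeline.preprocess.eval()
with torch.no_grad():
    octa_image = torch.rand(1, 1, 304, 304)  # dummy OCT-A en-face image
    enhanced = preprocess(octa_image)        # image quality enhancement only
```

Because the module is used purely for inference on the new modality, no gradient updates or transfer learning are involved, which is the point the paper makes about well-defined modules built on the known operator theory.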
Date issued
2020-02
Department
Massachusetts Institute of Technology. Research Laboratory of Electronics; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Informatik aktuell
Publisher
Springer Fachmedien Wiesbaden
Citation
Wu, Weilin et al. “Modularization of deep networks allows cross-modality reuse: lesson learnt.” Informatik aktuell (February 2020): 274–279. © 2020 The Author(s).
Version: Original manuscript
ISBN
9783658292676
ISSN
2628-8958