Recent Developments in the Sparse Fourier Transform: A compressed Fourier transform for big data
Author(s)
Iwen, Mark; Gilbert, Anna Rebecca; Indyk, Piotr; Schmidt, Ludwig
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike (Open Access Policy)
Abstract
The discrete Fourier transform (DFT) is a fundamental component of numerous computational techniques in signal processing and scientific computing. The most popular means of computing the DFT is the fast Fourier transform (FFT). However, with the emergence of big data problems, in which the size of the processed data sets can easily exceed terabytes, the "fast" in FFT is often no longer fast enough. In addition, in many big data applications it is hard to acquire a sufficient amount of data to compute the desired Fourier transform in the first place. The sparse Fourier transform (SFT) addresses the big data setting by computing a compressed Fourier transform using only a subset of the input data, in time smaller than the data set size. The goal of this article is to survey these recent developments, explain the basic techniques with examples and applications in big data, demonstrate tradeoffs in empirical performance of the algorithms, and discuss the connection between the SFT and other techniques for massive data analysis such as streaming algorithms and compressive sensing.
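The aliasing-and-bucketing idea underlying many SFT algorithms can be illustrated with a short sketch (a simplified illustration, not the specific algorithms surveyed in the article): a signal whose spectrum has only k significant frequencies is subsampled twice, each length-B subsample is given a small FFT so that the N frequencies alias into B buckets, and the phase difference between the two subsamples identifies the frequency in each (mostly single-occupant) bucket. All names and parameters below (N, B, k, the use of NumPy) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the aliasing/bucketing idea used by sparse Fourier transforms.
# Illustrative only: real SFT algorithms add random permutations, filters, and
# multiple rounds to handle bucket collisions and noise.
rng = np.random.default_rng(0)

N = 2**20   # nominal signal length (never fully materialized)
B = 256     # number of buckets = length of each small FFT; B must divide N
k = 5       # sparsity: number of significant frequencies

# Synthetic k-sparse spectrum: k random frequencies with random coefficients.
freqs = rng.choice(N, size=k, replace=False)
coeffs = rng.standard_normal(k) + 1j * rng.standard_normal(k)

def sample(n):
    # Evaluate the time-domain signal only at the requested sample positions.
    n = np.asarray(n)[:, None]
    return (coeffs * np.exp(2j * np.pi * freqs * n / N)).sum(axis=1)

# Two length-B subsamplings (the second shifted by one sample): only 2*B of the
# N time samples are touched, and each FFT costs O(B log B), not O(N log N).
idx = np.arange(B) * (N // B)
Y0 = np.fft.fft(sample(idx)) / B       # frequency f lands in bucket f mod B
Y1 = np.fft.fft(sample(idx + 1)) / B   # same buckets, rotated by exp(2j*pi*f/N)

# In a bucket holding a single frequency, the phase of Y1/Y0 identifies it and
# Y0 gives its coefficient; collisions (several frequencies in one bucket) would
# need the extra machinery mentioned above.
recovered = {}
for b in np.argsort(-np.abs(Y0))[:k]:
    f = int(round(np.angle(Y1[b] / Y0[b]) * N / (2 * np.pi))) % N
    recovered[f] = Y0[b]

print(sorted(recovered))        # recovered frequencies
print(sorted(freqs.tolist()))   # true frequencies (should match w.h.p.)
```

Under these assumptions the two small FFTs recover all k frequencies whenever no two of them collide modulo B; the algorithms surveyed in the article achieve this reliably, and cope with noise, by combining randomized bucketing with repetition.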
Date issued
2014-08
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
IEEE Signal Processing Magazine
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Gilbert, Anna C., et al. “Recent Developments in the Sparse Fourier Transform: A Compressed Fourier Transform for Big Data.” IEEE Signal Processing Magazine, vol. 31, no. 5, Sept. 2014, pp. 91–100.
Version: Author's final manuscript
ISSN
1053-5888