Guessing noise, not code-words
Author(s)
Duffy, Ken R.; Li, Jiange; Médard, Muriel
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
© 2018 IEEE. We introduce a new algorithm for Maximum Likelihood (ML) decoding for channels with memory. The algorithm is based on the principle that the receiver rank-orders noise sequences from most likely to least likely. Subtracting each noise sequence from the received signal in that order, the first instance that results in an element of the code-book is the ML decoding. In contrast to traditional approaches, this novel scheme has the desirable property that it becomes more efficient as the code-book rate increases. We establish that the algorithm is capacity-achieving for randomly selected code-books. When the code-book rate is less than capacity, we identify asymptotic error exponents as the block length becomes large. When the code-book rate is beyond capacity, we identify asymptotic success exponents. We characterize the complexity of the scheme in terms of the number of computations the receiver must perform per block symbol. Worked examples are presented for binary memoryless and Markovian noise. These demonstrate that block lengths offering a good complexity-rate trade-off are typically smaller than the reciprocal of the bit error rate.
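To make the guessing principle concrete, here is a minimal Python sketch for the simplest setting the abstract mentions: a binary memoryless channel, modeled as a binary symmetric channel with crossover probability below one half, where ranking noise sequences from most likely to least likely reduces to enumerating bit-flip patterns by increasing Hamming weight. The function name `grand_decode`, the set-based code-book membership test, and the toy code-book are illustrative assumptions, not part of the paper.

```python
from itertools import combinations

def grand_decode(received, codebook, n):
    """Noise-guessing ML decoding sketch for a binary symmetric channel
    with crossover probability p < 1/2.

    On such a channel a noise sequence is less likely the more bit flips
    it contains, so enumerating putative noise patterns by increasing
    Hamming weight queries them from most to least likely.
    `received` is a tuple of n bits; `codebook` is a set of such tuples.
    """
    for weight in range(n + 1):                  # 0 flips, then 1 flip, ...
        for flips in combinations(range(n), weight):
            candidate = list(received)
            for i in flips:                      # subtract (XOR) the guessed noise
                candidate[i] ^= 1
            candidate = tuple(candidate)
            if candidate in codebook:            # first hit is the ML decoding
                return candidate
    return None                                  # unreachable for a non-empty code-book

# Toy usage: a two-word code-book of length 4, with one bit flipped in transit.
codebook = {(0, 0, 0, 0), (1, 1, 1, 1)}
print(grand_decode((0, 1, 0, 0), codebook, 4))   # -> (0, 0, 0, 0)
```

The sketch also illustrates the complexity property claimed in the abstract: a denser (higher-rate) code-book makes the membership test succeed after fewer guesses on average, so the scheme becomes more efficient as the code-book rate increases. For channels with memory, only the guessing order changes, e.g. for Markovian noise the patterns would be enumerated by their likelihood under the Markov noise model rather than by Hamming weight.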
Date issued
2018-06
Department
Massachusetts Institute of Technology. Research Laboratory of Electronics
Publisher
IEEE
Citation
Duffy, Ken R., Li, Jiange and Médard, Muriel. 2018. "Guessing noise, not code-words."
Version: Author's final manuscript