Necessary and sufficient conditions for high-dimensional salient feature subset recovery
Author(s)
Tan, Vincent Yan Fu; Johnson, Matthew James; Willsky, Alan S.
Download: Willsky_Necessary and sufficient.pdf (156.5 KB)
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
We consider the problem of recovering the salient feature subset for distinguishing between two probability models from i.i.d. samples. Identifying the salient set improves discrimination performance and reduces complexity. The focus in this work is on the high-dimensional regime, where the number of variables d, the number of salient variables k, and the number of samples n all grow. The definition of saliency is motivated by error exponents in a binary hypothesis test and is stated in terms of relative entropies. It is shown that if n grows faster than max{ck log((d-k)/k), exp(c'k)} for constants c and c', then the error probability in selecting the salient set can be made arbitrarily small. Thus, n can be much smaller than d. The exponential rate of decay of the error probability and converse theorems are also provided. An efficient and consistent algorithm is proposed for the case where the distributions are graphical models that are Markov on trees.
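As an informal illustration of the stated sample-complexity condition, the short Python sketch below checks whether a given sample size n exceeds the threshold max{ck log((d-k)/k), exp(c'k)}. The constants c and c_prime are hypothetical placeholders; the abstract only asserts that such constants exist, and the function name is an illustrative choice, not part of the paper.

import math

def sample_size_sufficient(n, d, k, c=1.0, c_prime=1.0):
    # Threshold from the abstract: max{c*k*log((d-k)/k), exp(c_prime*k)}.
    # c and c_prime are hypothetical placeholder values for unspecified constants.
    threshold = max(c * k * math.log((d - k) / k), math.exp(c_prime * k))
    return n > threshold

# Example with k much smaller than d: the threshold here is about 148 samples,
# far below d = 1,000,000, consistent with n being much smaller than d.
print(sample_size_sufficient(n=1_000, d=1_000_000, k=5))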
Date issued
2010-07
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2010
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Tan, Vincent Y. F., Matthew Johnson, and Alan S. Willsky. “Necessary and Sufficient Conditions for High-dimensional Salient Feature Subset Recovery.” IEEE International Symposium on Information Theory Proceedings (ISIT), 2010. 1388–1392. ©2010 IEEE
Version: Final published version
ISBN
978-1-4244-7891-0
978-1-4244-7890-3