Volume 55, Issue 12, 2009, Pages 5728-5741

Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting

Author keywords

ℓ1 relaxation; Compressed sensing; Fano's method; High dimensional statistical inference; Information theoretic bounds; Lasso; Model selection; Signal denoising; Sparsity pattern; Sparsity recovery; Subset selection; Support recovery

Indexed keywords

COMPRESSED SENSING; FANO'S METHOD; HIGH-DIMENSIONAL; INFORMATION THEORETIC BOUNDS; LASSO; MODEL SELECTION; SIGNAL DENOISING; SPARSITY PATTERNS; SPARSITY RECOVERY; SUBSET SELECTION;

EID: 73849097267     PISSN: 00189448     EISSN: None     Source Type: Journal    
DOI: 10.1109/TIT.2009.2032816     Document Type: Article
Times cited : (326)

References (36)
  • 1
    • S. Aeron, M. Zhao, and S. Venkatesh, "Information-theoretic bounds to sensing capacity of sensor networks under fixed SNR," in Proc. IEEE Information Theory Workshop, San Diego, CA, Sep. 2007.
  • 4
    • L. Birgé, "An alternative point of view on Lepski's method," in State of the Art in Probability and Statistics, ser. IMS Lecture Notes, no. 37. Beachwood, OH: Inst. Math. Statist., 2001, pp. 113-133.
  • 5
    • E. Candès and T. Tao, "Decoding by linear programming," IEEE Trans. Inf. Theory, vol. 51, no. 12, pp. 4203-4215, Dec. 2005.
  • 6
    • E. Candès and T. Tao, "The Dantzig selector: Statistical estimation when p is much larger than n," Ann. Statist., vol. 35, no. 6, pp. 2313-2351, 2007.
  • 7
    • S. Chen, D. L. Donoho, and M. A. Saunders, "Atomic decomposition by basis pursuit," SIAM J. Sci. Comput., vol. 20, no. 1, pp. 33-61, 1998.
  • 9
    • K. R. Davidson and S. J. Szarek, "Local operator theory, random matrices, and Banach spaces," in Handbook of Banach Spaces, vol. 1. Amsterdam, The Netherlands: Elsevier, 2001, pp. 317-336.
  • 11
    • D. Donoho, "Compressed sensing," IEEE Trans. Inf. Theory, vol. 52, no. 4, pp. 1289-1306, Apr. 2006.
  • 12
    • D. L. Donoho, "For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution," Commun. Pure and Appl. Math., vol. 59, no. 6, pp. 797-829, Jun. 2006. DOI 10.1002/cpa.20132
  • 13
    • D. L. Donoho, "For most large underdetermined systems of equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution," Commun. Pure and Appl. Math., vol. 59, no. 7, pp. 907-934, Jul. 2006.
  • 14
    • D. L. Donoho and J. M. Tanner, "Counting faces of randomly-projected polytopes when the projection radically lowers dimension," J. Amer. Math. Soc., vol. 22, pp. 1-53, Jul. 2009.
  • 15
    • A. K. Fletcher, S. Rangan, and V. K. Goyal, "Necessary and sufficient conditions on sparsity pattern recovery," Univ. Calif., Berkeley, Tech. Rep., Apr. 2008. [Online]. Available: arXiv:cs.IT/0804.1839
  • 16
    • A. K. Fletcher, S. Rangan, V. K. Goyal, and K. Ramchandran, "Denoising by sparse approximation: Error bounds based on rate-distortion theory," J. Appl. Signal Process., vol. 10, pp. 1-19, 2006.
  • 17
    • J. J. Fuchs, "Recovery of exact sparse representations in the presence of noise," in Proc. Int. Conf. Acoustics, Speech and Signal Processing, Montreal, QC, Canada, 2004, vol. 2, pp. 533-536.
  • 18
    • R. Z. Has'minskii, "A lower bound on the risks of nonparametric estimates of densities in the uniform metric," Theory Prob. Appl., vol. 23, pp. 794-798, 1978.
  • 21
    • B. Laurent and P. Massart, "Adaptive estimation of a quadratic functional by model selection," Ann. Statist., vol. 28, no. 5, pp. 1303-1338, 1998.
  • 22
    • N. Meinshausen and P. Bühlmann, "High-dimensional graphs and variable selection with the Lasso," Ann. Statist., vol. 34, pp. 1436-1462, 2006.
  • 23
    • N. Meinshausen and B. Yu, "Lasso-type recovery of sparse representations for high-dimensional data," Ann. Statist., to be published.
  • 25
    • B. K. Natarajan, "Sparse approximate solutions to linear systems," SIAM J. Comput., vol. 24, no. 2, pp. 227-234, 1995.
  • 26
    • G. Reeves and M. Gastpar, "Sampling bounds for sparse support recovery in the presence of noise," in Proc. IEEE Int. Symp. Information Theory, Toronto, ON, Canada, Jul. 2008, pp. 2187-2191.
  • 28
    • R. Tibshirani, "Regression shrinkage and selection via the lasso," J. Roy. Statist. Soc., ser. B, vol. 58, no. 1, pp. 267-288, 1996.
  • 29
    • J. Tropp, "Just relax: Convex programming methods for identifying sparse signals in noise," IEEE Trans. Inf. Theory, vol. 52, no. 3, pp. 1030-1051, Mar. 2006.
  • 30
    • M. J. Wainwright, "Information-theoretic bounds for sparsity recovery in the high-dimensional and noisy setting," Dep. Statist., Univ. Calif., Berkeley, Tech. Rep. 725, Jan. 2007. [Online]. Available: arXiv:math.ST/0702301. Presented at the IEEE Int. Symp. Information Theory, Nice, France, Jun. 2007.
  • 31
    • M. J. Wainwright, "Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (Lasso)," IEEE Trans. Inf. Theory, vol. 55, no. 5, pp. 2183-2202, May 2009.
  • 33
    • L. Wasserman and K. Roeder, "Multi-stage variable selection: Screen and clean," Ann. Statist., to be published.
  • 34
    • Y. Yang and A. Barron, "Information-theoretic determination of minimax rates of convergence," Ann. Statist., vol. 27, no. 5, pp. 1564-1599, 1999.
  • 36
    • P. Zhao and B. Yu, "On model selection consistency of Lasso," J. Mach. Learn. Res., vol. 7, pp. 2541-2567, 2006.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.