



Volume 13, 2012, Pages 1839-1864

Estimation and selection via absolute penalized convex minimization and its multistage adaptive applications

Author keywords

Generalized linear models; Oracle inequality; Penalized estimation; Selection consistency; Sparsity; Variable selection

Indexed keywords

GENERALIZED LINEAR MODEL; ORACLE INEQUALITY; PENALIZED ESTIMATION; SELECTION CONSISTENCY; SPARSITY; VARIABLE SELECTION;

EID: 84864948194     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 62

References (38)
  • 1
    • P. J. Bickel, Y. Ritov, and A. Tsybakov. Simultaneous analysis of Lasso and Dantzig selector. Annals of Statistics, 37(4):1705-1732, 2009.
  • 2
    • L. M. Bregman. The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Computational Mathematics and Mathematical Physics, 7:200-217, 1967.
  • 4
    • E. J. Candes and T. Tao. Decoding by linear programming. IEEE Transactions on Information Theory, 51(12):4203-4215, 2005. DOI: 10.1109/TIT.2005.858979.
  • 5
    • E. J. Candes and T. Tao. The Dantzig selector: statistical estimation when p is much larger than n (with discussion). Annals of Statistics, 35:2313-2404, 2007.
  • 7
    • J. Fan and R. Li. Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96(456):1348-1360, 2001.
  • 8
    • E. Greenshtein and Y. Ritov. Persistence in high-dimensional linear predictor selection and the virtue of overparametrization. Bernoulli, 10(6):971-988, 2004. DOI: 10.3150/bj/1106314846.
  • 9
    • J. Huang, S. Ma, and C.-H. Zhang. Adaptive Lasso for sparse high-dimensional regression models. Statistica Sinica, 18:1603-1618, 2008.
  • 10
    • D. R. Hunter and R. Li. Variable selection using MM algorithms. Annals of Statistics, 33(4):1617-1642, 2005. DOI: 10.1214/009053605000000200.
  • 11
    • V. Koltchinskii. The Dantzig selector and sparsity oracle inequalities. Bernoulli, 15:799-828, 2009.
  • 13
    • L. Meier and P. Bühlmann. Smoothing l1-penalized estimators for high-dimensional time-course data. Electronic Journal of Statistics, 1:597-615, 2007.
  • 14
    • N. Meinshausen and P. Bühlmann. High-dimensional graphs and variable selection with the Lasso. Annals of Statistics, 34(3):1436-1462, 2006. DOI: 10.1214/009053606000000281.
  • 15
    • N. Meinshausen and B. Yu. Lasso-type recovery of sparse representations for high-dimensional data. Annals of Statistics, 37:246-270, 2009.
  • 20
    • N. Städler, P. Bühlmann, and S. van de Geer. l1-penalization for mixture regression models (with discussion). Test, 19(2):209-285, 2010.
  • 23
    • R. Tibshirani and J. Taylor. The solution path of the generalized lasso. The Annals of Statistics, 39:1335-1371, 2011.
  • 24
    • J. A. Tropp. Just relax: convex programming methods for identifying sparse signals in noise. IEEE Transactions on Information Theory, 52(3):1030-1051, 2006. DOI: 10.1109/TIT.2005.864420.
  • 25
    • S. van de Geer. The deterministic Lasso. Technical Report 140, ETH Zurich, Switzerland, 2007.
  • 26
    • S. van de Geer. High-dimensional generalized linear models and the lasso. Annals of Statistics, 36:614-645, 2008.
  • 27
    • S. van de Geer and P. Bühlmann. On the conditions used to prove oracle results for the lasso. Electronic Journal of Statistics, 3:1360-1392, 2009.
  • 28
    • M. J. Wainwright. Sharp thresholds for noisy and high-dimensional recovery of sparsity using l1-constrained quadratic programming (Lasso). IEEE Transactions on Information Theory, 55:2183-2202, 2009.
  • 29
    • F. Ye and C.-H. Zhang. Rate minimaxity of the Lasso and Dantzig selector for the lq loss in lr balls. Journal of Machine Learning Research, 11:3481-3502, 2010.
  • 31
    • C.-H. Zhang. Nearly unbiased variable selection under minimax concave penalty. The Annals of Statistics, 38:894-942, 2010a.
  • 32
    • C.-H. Zhang and J. Huang. The sparsity and bias of the Lasso selection in high-dimensional linear regression. Annals of Statistics, 36(4):1567-1594, 2008.
  • 33
    • T. Zhang. Analysis of multi-stage convex relaxation for sparse regularization. Journal of Machine Learning Research, 11:1087-1107, 2010b.
  • 34
    • T. Zhang. Adaptive forward-backward greedy algorithm for learning sparse representations. IEEE Transactions on Information Theory, 57:4689-4708, 2011a.
  • 36
  • 38
    • H. Zou and R. Li. One-step sparse estimates in nonconcave penalized likelihood models. Annals of Statistics, 36(4):1509-1533, 2008.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.