Annual Review of Statistics and Its Application, Volume 1, 2014, Pages 233-253

Structured regularizers for high-dimensional problems: Statistical and computational issues

Author keywords

Algorithms; High dimensional statistics; M estimation; Regularization; Statistical machine learning

EID: 84906567304     PISSN: 2326-8298     EISSN: 2326-831X     Source Type: Journal
DOI: 10.1146/annurev-statistics-022513-115643     Document Type: Article
Times cited: 83

References (101)
  • 1. Agarwal A, Negahban S, Wainwright MJ. 2012a. Fast global convergence of gradient methods for high-dimensional statistical recovery. Ann. Stat. 40(5):2452-82
  • 2. Agarwal A, Negahban S, Wainwright MJ. 2012b. Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions. Ann. Stat. 40(2):1171-97
  • 3. Amini AA, Wainwright MJ. 2009. High-dimensional analysis of semidefinite relaxations for sparse principal component analysis. Ann. Stat. 37(5B):2877-921
  • 4. Bach F. 2008. Consistency of trace norm minimization. J. Mach. Learn. Res. 9:1019-48
  • 7. Berthet Q, Rigollet P. 2013. Computational lower bounds for sparse PCA. Tech. Rep., Princeton Univ., Princeton, NJ. arXiv:1304.0828
  • 9. Bickel P, Li B. 2006. Regularization in statistics. TEST 15(2):271-344
  • 10. Bickel P, Ritov Y, Tsybakov A. 2009. Simultaneous analysis of Lasso and Dantzig selector. Ann. Stat. 37(4):1705-32
  • 12. Bredies K, Lorenz DA. 2008. Linear convergence of iterative soft thresholding. J. Fourier Anal. Appl. 14:813-37
  • 15. Cai T, Liu W, Luo X. 2011. A constrained ℓ1-minimization approach to sparse precision matrix estimation. J. Am. Stat. Assoc. 106:594-607
  • 16. Cai TT, Liu W, Zhou HH. 2012. Estimating sparse precision matrices: optimal rates of convergence and adaptive estimation. Tech. Rep., Wharton Sch., Univ. Pa., Philadelphia, PA. arXiv:1212.2882
  • 17
  • 18. Candès EJ, Plan Y. 2011. Tight oracle bounds for low-rank matrix recovery from a minimal number of random measurements. IEEE Trans. Inf. Theory 57(4):2342-59
  • 19. Candès EJ, Recht B. 2009. Exact matrix completion via convex optimization. Found. Comput. Math. 9(6):717-72
  • 20. Candès EJ, Tao T. 2007. The Dantzig selector: statistical estimation when p is much larger than n. Ann. Stat. 35(6):2313-51
  • 21. Chandrasekaran V, Parrilo PA, Willsky AS. 2012. Latent variable graphical model selection via convex optimization. Ann. Stat. 40(4):1935-67
  • 24. Cohen A, Dahmen W, DeVore R. 2009. Compressed sensing and best k-term approximation. J. Am. Math. Soc. 22(1):211-31
  • 26. Donoho DL. 2006. Compressed sensing. IEEE Trans. Inf. Theory 52(4):1289-306
  • 27. Donoho DL, Tanner JM. 2008. Counting faces of randomly-projected polytopes when the projection radically lowers dimension. J. Am. Math. Soc. 22:1-53
  • 29. Fan J, Li R. 2001. Variable selection via non-concave penalized likelihood and its oracle properties. J. Am. Stat. Assoc. 96(456):1348-60
  • 30. Fazel M. 2002. Matrix rank minimization with applications. PhD Thesis, Stanford Univ., Stanford, CA. http://faculty.washington.edu/mfazel/thesis-final.pdf
  • 31. Friedman J, Hastie T, Tibshirani R. 2008. Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9:432-41
  • 32. Fu WJ. 2001. Penalized regression: the bridge versus the Lasso. J. Comput. Graph. Stat. 7(3):397-416
  • 33. Greenshtein E, Ritov Y. 2004. Persistency in high dimensional linear predictor-selection and the virtue of over-parametrization. Bernoulli 10:971-88
  • 34. Gross D. 2011. Recovering low-rank matrices from few coefficients in any basis. IEEE Trans. Inf. Theory 57(3):1548-66
  • 36. Hoerl AE, Kennard RW. 1970. Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12:55-67
  • 37. Hsu D, Kakade SM, Zhang T. 2011. Robust matrix decomposition with sparse corruptions. IEEE Trans. Inf. Theory 57(11):7221-34
  • 38. Huang J, Zhang T. 2010. The benefit of group sparsity. Ann. Stat. 38(4):1978-2004
  • 41. Javanmard A, Montanari A. 2013. Hypothesis testing in high-dimensional regression under the Gaussian random design model: asymptotic theory. Tech. Rep., Stanford Univ., Stanford, CA. arXiv:1301.4240
  • 42. Kim Y, Kim J, Kim Y. 2006. Blockwise sparse regression. Stat. Sin. 16(2):375-90
  • 43. Koltchinskii V, Lounici K, Tsybakov AB. 2011. Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion. Ann. Stat. 39:2302-29
  • 44. Koltchinskii V, Yuan M. 2008. Sparse recovery in large ensembles of kernel machines. Presented at Annu. Conf. Learn. Theory, 21st, Helsinki, Finland
  • 45. Koltchinskii V, Yuan M. 2010. Sparsity in multiple kernel learning. Ann. Stat. 38:3660-95
  • 46. Lam C, Fan J. 2009. Sparsistency and rates of convergence in large covariance matrix estimation. Ann. Stat. 37:4254-78
  • 47. Lin Y, Zhang HH. 2006. Component selection and smoothing in multivariate nonparametric regression. Ann. Stat. 34:2272-97
  • 48. Liu H, Lafferty J, Wasserman L. 2009. The nonparanormal: semiparametric estimation of high-dimensional undirected graphs. J. Mach. Learn. Res. 10:1-37
  • 49. Loh P, Wainwright MJ. 2012. High-dimensional regression with noisy and missing data: provable guarantees with non-convexity. Ann. Stat. 40(3):1637-64
  • 50. Lounici K, Pontil M, Tsybakov AB, van de Geer S. 2011. Oracle inequalities and optimal inference under group sparsity. Ann. Stat. 39(4):2164-204
  • 51. Mazumder R, Hastie T, Tibshirani R. 2010. Spectral regularization algorithms for learning large incomplete matrices. J. Mach. Learn. Res. 11:2287-322
  • 53. Meinshausen N, Bühlmann P. 2006. High-dimensional graphs and variable selection with the Lasso. Ann. Stat. 34:1436-62
  • 55. Negahban S, Ravikumar P, Wainwright MJ, Yu B. 2012. A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers. Stat. Sci. 27(4):538-57
  • 56. Negahban S, Wainwright MJ. 2011a. Estimation of (near) low-rank matrices with noise and high-dimensional scaling. Ann. Stat. 39(2):1069-97
  • 57. Negahban S, Wainwright MJ. 2011b. Simultaneous support recovery in high-dimensional regression: benefits and perils of ℓ1,∞-regularization. IEEE Trans. Inf. Theory 57(6):3841-63
  • 58. Negahban S, Wainwright MJ. 2012. Restricted strong convexity and (weighted) matrix completion: optimal bounds with noise. J. Mach. Learn. Res. 13:1665-97
  • 60. Nesterov Y. 2007. Gradient methods for minimizing composite objective function. Tech. Rep. 76, Cent. Oper. Res. Econom., Catholic Univ., Louvain, Belg.
  • 61. Nesterov Y. 2012. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optim. 22(2):341-62
  • 62. Obozinski G, Wainwright MJ, Jordan MI. 2011. Union support recovery in high-dimensional multivariate regression. Ann. Stat. 39(1):1-47
  • 63. Oymak S, Jalali A, Fazel M, Eldar YC, Hassibi B. 2012. Simultaneously structured models with applications to sparse and low-rank matrices. Tech. Rep., Calif. Inst. Technol., Pasadena, CA. arXiv:1212.3753
  • 64. Raskutti G, Wainwright MJ, Yu B. 2010. Restricted eigenvalue conditions for correlated Gaussian designs. J. Mach. Learn. Res. 11:2241-59
  • 65. Raskutti G, Wainwright MJ, Yu B. 2011. Minimax rates of estimation for high-dimensional linear regression over ℓq-balls. IEEE Trans. Inf. Theory 57(10):6976-94
  • 66. Raskutti G, Wainwright MJ, Yu B. 2012. Minimax-optimal rates for sparse additive models over kernel classes via convex programming. J. Mach. Learn. Res. 12:389-427
  • 68. Ravikumar P, Wainwright MJ, Raskutti G, Yu B. 2011. High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence. Electron. J. Stat. 5:935-80
  • 69. Recht B. 2011. A simpler approach to matrix completion. J. Mach. Learn. Res. 12:3413-30
  • 70. Recht B, Fazel M, Parrilo P. 2010. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Rev. 52(3):471-501
  • 71. Rohde A, Tsybakov A. 2011. Estimation of high-dimensional low-rank matrices. Ann. Stat. 39(2):887-930
  • 73. Rudelson M, Zhou S. 2012. Reconstruction from anisotropic random measurements. IEEE Trans. Inf. Theory 59:3434-47
  • 74. Srebro N, Alon N, Jaakkola TS. 2005. Generalization error bounds for collaborative prediction with low-rank matrices. Presented at Neural Inf. Proc. Syst., 17th, Vancouver
  • 76. Stojnic M, Parvaresh F, Hassibi B. 2009. On the reconstruction of block-sparse signals with an optimal number of measurements. IEEE Trans. Signal Process. 57(8):3075-85
  • 77. Stone CJ. 1982. Optimal global rates of convergence for non-parametric regression. Ann. Stat. 10(4):1040-53
  • 78. Stone CJ. 1985. Additive regression and other non-parametric models. Ann. Stat. 13(2):689-705
  • 79. Tibshirani R. 1996. Regression shrinkage and selection via the Lasso. J. R. Stat. Soc. Ser. B 58(1):267-88
  • 80. Tikhonov AN. 1943. On the stability of inverse problems. C. R. (Doklady) Acad. Sci. SSSR 39:176-79
  • 81. Tropp JA. 2006. Just relax: convex programming methods for identifying sparse signals in noise. IEEE Trans. Inf. Theory 52(3):1030-51
  • 82. Tropp JA, Gilbert AC, Strauss MJ. 2006. Algorithms for simultaneous sparse approximation. Part I: greedy pursuit. Signal Process. 86:572-88
  • 83. Tseng P. 2001. Convergence of block coordinate descent method for nondifferentiable minimization. J. Opt. Theory Appl. 109(3):474-94
  • 84. Tseng P, Yun S. 2009. A block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization. J. Optim. Theory Appl. 140:513-35
  • 87. van de Geer S. 2008. High-dimensional generalized linear models and the Lasso. Ann. Stat. 36:614-45
  • 88. van de Geer S. 2012. Weakly decomposable regularization penalties and structured sparsity. Tech. Rep., ETH Zurich, Switz. arXiv:1204.4813v2
  • 89. van de Geer S, Bühlmann P. 2009. On the conditions used to prove oracle results for the Lasso. Electron. J. Stat. 3:1360-92
  • 90. van de Geer S, Bühlmann P, Ritov Y. 2013. On asymptotically optimal confidence regions and tests for high-dimensional models. Tech. Rep., ETH Zurich, Switz. arXiv:1303.0518
  • 91. Wainwright MJ. 2009. Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (Lasso). IEEE Trans. Inf. Theory 55:2183-202
  • 92. Wu TT, Lange K. 2008. Coordinate descent algorithms for Lasso-penalized regression. Ann. Appl. Stat. 2(1):224-44
  • 94. Yuan M. 2010. High-dimensional inverse covariance matrix estimation via linear programming. J. Mach. Learn. Res. 11:2261-86
  • 95. Yuan M, Lin Y. 2006. Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. B 68:49-67
  • 96. Zhang CH, Huang J. 2008. The sparsity and bias of the Lasso selection in high-dimensional linear regression. Ann. Stat. 36(4):1567-94
  • 97. Zhang CH, Zhang SS. 2011. Confidence intervals for low-dimensional parameters with high-dimensional data. Tech. Rep., Rutgers Univ., New Brunswick, NJ. arXiv:1110.2563
  • 98. Zhang CH, Zhang T. 2012. A general theory of concave regularization for high-dimensional sparse estimation problems. Stat. Sci. 27(4):576-93
  • 99. Zhao P, Rocha G, Yu B. 2009. Grouped and hierarchical model selection through composite absolute penalties. Ann. Stat. 37(6A):3468-97
  • 100. Zhao P, Yu B. 2006. On model selection consistency of Lasso. J. Mach. Learn. Res. 7:2541-67


* This information was extracted by KISTI from analysis of Elsevier's SCOPUS database.