



Volume 2, January 2014, Pages 1008-1016

Constant nullspace strong convexity and fast convergence of proximal methods under high-dimensional settings

Author keywords

[No Author keywords available]

Indexed keywords

CONVEX OPTIMIZATION; INFORMATION SCIENCE; NEWTON-RAPHSON METHOD;

EID: 84937930992     PISSN: 10495258     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: (12)

References (21)
  • 1
    • EID: 84899013349
    • J. D. Lee, Y. Sun, and M. A. Saunders. Proximal Newton-type methods for minimizing composite functions. In NIPS, 2012.
  • 2
    • EID: 85162490550
    • C.-J. Hsieh, M. A. Sustik, I. S. Dhillon, and P. Ravikumar. Sparse inverse covariance estimation using quadratic approximation. In NIPS, 2011.
  • 4
    • EID: 84886577495
    • P.-W. Wang and C.-J. Lin. Iteration complexity of feasible descent methods for convex optimization. Technical report, Department of Computer Science, National Taiwan University, Taipei, Taiwan, 2013.
  • 5
    • EID: 85162018878
    • A. Agarwal, S. Negahban, and M. Wainwright. Fast global convergence rates of gradient methods for high-dimensional statistical recovery. In NIPS, 2010.
  • 7
    • EID: 84867128532
    • L. Xiao and T. Zhang. A proximal-gradient homotopy method for the l1-regularized least-squares problem. In ICML, 2012.
  • 8
    • EID: 46749146509
    • P. Tseng and S. Yun. A coordinate gradient descent method for nonsmooth separable minimization. Math. Prog. B, vol. 117, 2009.
  • 9
    • EID: 84864920041
    • G.-X. Yuan, C.-H. Ho, and C.-J. Lin. An improved GLMNET for l1-regularized logistic regression. Journal of Machine Learning Research, 13:1999-2030, 2012.
  • 12
    • EID: 85162449444
    • A. Tewari, P. Ravikumar, and I. S. Dhillon. Greedy algorithms for structurally constrained high dimensional problems. In NIPS, 2011.
  • 13
    • EID: 84858717588
    • S. Negahban, P. Ravikumar, M. J. Wainwright, and B. Yu. A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers. In NIPS, 2009.
  • 14
    • EID: 85014561619
    • A. Beck and M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183-202, 2009.
  • 16
    • EID: 21344480786
    • Z. Q. Luo and P. Tseng. Error bounds and convergence analysis of feasible descent methods: a general approach. Annals of Operations Research, 46-47:157-178, 1993.
  • 17
    • EID: 71149117997
    • R. Garg and R. Khandekar. Gradient descent with sparsification: an iterative algorithm for sparse recovery with restricted isometry property. In ICML, 2009.
  • 18
    • EID: 71149103464
    • S. Ji and J. Ye. An accelerated gradient method for trace norm minimization. In ICML, 2009.
  • 21
    • EID: 84937945271
    • K. Scheinberg and X. Tang. Practical inexact proximal quasi-Newton method with global complexity analysis. COR@L Technical Report, Lehigh University. arXiv:1311.6547, 2013.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.