Volume 25, Issue 4, 2015, Pages 1997-2023

Accelerated, Parallel, and PROXimal coordinate descent

Author keywords

Acceleration; Big data; Complexity; Convex optimization; Parallel methods; Partial separability; Proximal methods; Randomized coordinate descent

Indexed keywords

ACCELERATION; BIG DATA; CONVEX OPTIMIZATION; FUNCTIONS; OPTIMIZATION; RATE CONSTANTS;

EID: 84953234319     PISSN: 1052-6234     EISSN: None     Source Type: Journal
DOI: 10.1137/130949993     Document Type: Article
Times cited: 291

References (30)
  • 1
    • A. BECK AND M. TEBOULLE, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., 2 (2009), pp. 183-202.
  • 7
    • D. LEVENTHAL AND A. S. LEWIS, Randomized methods for linear constraints: Convergence rates and conditioning, Math. Oper. Res., 35 (2010), pp. 641-654.
  • 11
    • I. NECOARA AND D. CLIPICI, Efficient parallel coordinate descent algorithm for convex optimization problems with separable constraints: Application to distributed MPC, J. Process Control, 23 (2013), pp. 243-253.
  • 13
    • Y. NESTEROV, A method of solving a convex programming problem with convergence rate O(1/k^2), Soviet Math. Dokl., 27 (1983), pp. 372-376.
  • 14
    • Y. NESTEROV, Smooth minimization of non-smooth functions, Math. Program., 103 (2005), pp. 127-152.
  • 15
    • Y. NESTEROV, Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim., 22 (2012), pp. 341-362.
  • 16
    • J. C. PLATT, Fast training of support vector machines using sequential minimal optimization, in Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. Burges, and A. Smola, eds., MIT Press, Cambridge, MA, 1999, pp. 185-208.
  • 19
    • P. RICHTÁRIK AND M. TAKÁČ, On optimal probabilities in stochastic coordinate descent methods, Optim. Lett., 2015, DOI: 10.1007/s11590-015-0916-1.
  • 20
    • P. RICHTÁRIK AND M. TAKÁČ, Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, Math. Program., 144 (2014), pp. 1-38.
  • 21
    • P. RICHTÁRIK AND M. TAKÁČ, Parallel coordinate descent methods for big data optimization problems, Math. Program., Ser. A, 2015, DOI: 10.1007/s10107-015-0901-6.
  • 28
    • P. TSENG, On accelerated proximal gradient methods for convex-concave optimization, SIAM J. Optim., 2008, submitted.
  • 29
    • T. T. WU AND K. LANGE, Coordinate descent algorithms for lasso penalized regression, Ann. Appl. Stat., (2008), pp. 224-244.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.