Volume 17, 2016

Distributed coordinate descent method for learning with big data

Author keywords

Boosting; Distributed algorithms; Parallel coordinate descent; Stochastic methods

Indexed keywords

ITERATIVE METHODS; PARALLEL ALGORITHMS; STOCHASTIC SYSTEMS

EID: 84979902257     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 173

References (21)
  • 1
    • J. Bradley, A. Kyrola, D. Bickson, and C. Guestrin. Parallel coordinate descent for l1-regularized loss minimization. In ICML, 2011.
  • 2
    • O. Fercoq. Parallel coordinate descent for the AdaBoost problem. In ICMLA, 2013.
  • 4
    • O. Fercoq and P. Richtárik. Accelerated, parallel and proximal coordinate descent. SIAM Journal on Optimization, 25(4):1997-2023, 2015.
  • 6
    • LIBSVM datasets. http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary.html.
  • 7
    • Z. Lu and L. Xiao. On the complexity analysis of randomized block-coordinate descent methods. Mathematical Programming, 152(1):615-642, 2015.
  • 10
    • Yu. Nesterov. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 22(2):341-362, 2012.
  • 11
    • Yu. Nesterov. Gradient methods for minimizing composite objective function. Mathematical Programming, 140(1):125-161, 2013.
  • 12
    • P. Richtárik and M. Takáč. Parallel coordinate descent methods for big data optimization. Mathematical Programming, pages 1-52, 2015.
  • 13
    • P. Richtárik and M. Takáč. Efficient serial and parallel coordinate descent methods for huge-scale truss topology design. In Operations Research Proceedings, pages 27-32. Springer, 2012.
  • 14
    • P. Richtárik and M. Takáč. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Mathematical Programming, 144(2):1-38, 2014.
  • 16
    • S. Shalev-Shwartz and T. Zhang. Accelerated mini-batch stochastic dual coordinate ascent. In NIPS, pages 378-385, 2013a.
  • 17
    • S. Shalev-Shwartz and T. Zhang. Stochastic dual coordinate ascent methods for regularized loss minimization. Journal of Machine Learning Research, 14:567-599, 2013b.
  • 21
    • R. Tappenden, P. Richtárik, and B. Büke. Separable approximations and decomposition methods for the augmented Lagrangian. Optimization Methods and Software, 30(3):643-668, 2015.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.