Journal of Machine Learning Research, Volume 16, 2015, Pages 285-322

An asynchronous parallel stochastic coordinate descent algorithm

Author keywords

Asynchronous parallel optimization; Stochastic coordinate descent

Indexed keywords

ALGORITHMS; CONSTRAINED OPTIMIZATION; FUNCTIONS; NONLINEAR EQUATIONS; OPTIMIZATION; PARALLEL PROCESSING SYSTEMS

EID: 84930680874     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 207

References (32)
  • 1 A. Agarwal and J. C. Duchi. Distributed delayed stochastic optimization. In Advances in Neural Information Processing Systems 24, pages 873-881, 2011. URL http://papers.nips.cc/paper/4247-distributed-delayed-stochastic-optimization.pdf.
  • 2 H. Avron, A. Druinsky, and A. Gupta. Revisiting asynchronous linear solvers: Provable convergence rate through randomization. In IPDPS, 2014.
  • 3 A. Beck and M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183-202, 2009.
  • 4 A. Beck and L. Tetruashvili. On the convergence of block coordinate descent type methods. SIAM Journal on Optimization, 23(4):2037-2060, 2013.
  • 6 S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1-122, 2011.
  • 8
  • 9 A. Cotter, O. Shamir, N. Srebro, and K. Sridharan. Better mini-batch algorithms via accelerated gradient methods. In Advances in Neural Information Processing Systems 24, pages 1647-1655, 2011. URL http://papers.nips.cc/paper/4432-better-mini-batch-algorithms-via-accelerated-gradient-methods.pdf.
  • 10 J. C. Duchi, A. Agarwal, and M. J. Wainwright. Dual averaging for distributed optimization: Convergence analysis and network scaling. IEEE Transactions on Automatic Control, 57(3):592-606, 2012.
  • 12 D. Goldfarb and S. Ma. Fast multiple-splitting algorithms for convex optimization. SIAM Journal on Optimization, 22(2):533-556, 2012.
  • 14 Z. Q. Luo and P. Tseng. On the convergence of the coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72:7-35, 1992.
  • 15 O. L. Mangasarian. Parallel gradient distribution in unconstrained optimization. SIAM Journal on Control and Optimization, 33(6):1916-1925, 1995.
  • 18 Y. Nesterov. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 22(2):341-362, 2012.
  • 21 P. Richtárik and M. Takáč. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Mathematical Programming, 144:1-38, 2012a.
  • 27 P. Tseng. Convergence of a block coordinate descent method for nondifferentiable minimization. Journal of Optimization Theory and Applications, 109:475-494, 2001.
  • 28 P. Tseng and S. Yun. A coordinate gradient descent method for nonsmooth separable minimization. Mathematical Programming, Series B, 117:387-423, June 2009.
  • 29 P. Tseng and S. Yun. A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training. Computational Optimization and Applications, 47(2):179-206, 2010.
  • 30 P.-W. Wang and C.-J. Lin. Iteration complexity of feasible descent methods for convex optimization. Journal of Machine Learning Research, 15:1523-1548, 2014.
  • 31 S. J. Wright. Accelerated block-coordinate relaxation for regularized optimization. SIAM Journal on Optimization, 22(1):159-186, 2012.
  • 32 T. Yang. Trading computation for communication: Distributed stochastic dual coordinate ascent. In Advances in Neural Information Processing Systems 26, pages 629-637, 2013.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.