Volume 2, 2014, Pages 1840-1848

An asynchronous parallel stochastic coordinate descent algorithm

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; ARTIFICIAL INTELLIGENCE; CONSTRAINED OPTIMIZATION; FUNCTIONS; LEARNING SYSTEMS; OPTIMIZATION; PARALLEL PROCESSING SYSTEMS;

EID: 84919932688     PISSN: None     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 51
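
For orientation only, the sketch below illustrates the generic asynchronous parallel stochastic coordinate descent pattern named in the title: several worker threads repeatedly pick a random coordinate, read a possibly stale copy of a shared iterate, and update that single coordinate without locking. This is an illustrative sketch under assumed choices (a quadratic least-squares objective, a conservative fixed step size, and the helper name async_coordinate_descent), not the algorithm or analysis of the indexed paper.

# Minimal sketch of asynchronous stochastic coordinate descent (illustrative only;
# not the indexed paper's method). Threads minimize f(x) = 0.5*x^T A x - b^T x by
# updating randomly chosen coordinates of a shared iterate without locks.
import threading
import numpy as np

def async_coordinate_descent(A, b, num_threads=4, steps_per_thread=20000, gamma=None):
    n = A.shape[0]
    x = np.zeros(n)                       # shared iterate, updated in place by all threads
    if gamma is None:
        gamma = 1.0 / np.max(np.diag(A))  # conservative step from coordinate-wise curvature

    def worker(seed):
        rng = np.random.default_rng(seed)
        for _ in range(steps_per_thread):
            i = rng.integers(n)           # sample a coordinate uniformly at random
            grad_i = A[i] @ x - b[i]      # gradient component, read from possibly stale x
            x[i] -= gamma * grad_i        # lock-free single-coordinate update

    threads = [threading.Thread(target=worker, args=(s,)) for s in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((100, 20))
    A = M.T @ M + 0.1 * np.eye(20)        # positive definite quadratic test problem
    b = rng.standard_normal(20)
    x = async_coordinate_descent(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))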

References (34)
  • 1. Agarwal, A. and Duchi, J. C. Distributed delayed stochastic optimization. CDC, pp. 5451-5452, 2012.
  • 2. Avron, H., Druinsky, A., and Gupta, A. Revisiting asynchronous linear solvers: Provable convergence rate through randomization. IPDPS, 2014.
  • 3. Beck, A. and Teboulle, M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183-202, 2009.
  • 4. Beck, A. and Tetruashvili, L. On the convergence of block coordinate descent type methods, 2013. To appear in SIAM Journal on Optimization.
  • 6. Boyd, S., Parikh, N., Chu, E., Peleato, B., and Eckstein, J. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1-122, 2011.
  • 10. Duchi, J. C., Agarwal, A., and Wainwright, M. J. Dual averaging for distributed optimization: Convergence analysis and network scaling. IEEE Transactions on Automatic Control, 57(3):592-606, 2012.
  • 12. Goldfarb, D. and Ma, S. Fast multiple-splitting algorithms for convex optimization. SIAM Journal on Optimization, 22(2):533-556, 2012.
  • 14. Lu, Z. and Xiao, L. On the complexity analysis of randomized block-coordinate descent methods. Technical report, 2013.
  • 15. Luo, Z. Q. and Tseng, P. On the convergence of the coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72:7-35, 1992.
  • 16. Mangasarian, O. L. Parallel gradient distribution in unconstrained optimization. SIAM Journal on Control and Optimization, 33(6):1916-1925, 1995.
  • 19. Nesterov, Y. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 22(2):341-362, 2012.
  • 24. Scherrer, C., Tewari, A., Halappanavar, M., and Haglin, D. Feature clustering for accelerating parallel coordinate descent. NIPS, pp. 28-36, 2012.
  • 26. Shamir, O. and Zhang, T. Stochastic gradient descent for non-smooth optimization: Convergence results and optimal averaging schemes. ICML, 2013.
  • 28. Tseng, P. Convergence of a block coordinate descent method for nondifferentiable minimization. Journal of Optimization Theory and Applications, 109:475-494, 2001.
  • 29. Tseng, P. and Yun, S. A coordinate gradient descent method for nonsmooth separable minimization. Mathematical Programming, Series B, 117:387-423, June 2009.
  • 30. Tseng, P. and Yun, S. A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training. Computational Optimization and Applications, 47(2):179-206, 2010.
  • 31. Wang, P.-W. and Lin, C.-J. Iteration complexity of feasible descent methods for convex optimization. Technical report, 2013.
  • 32. Wright, S. J. Accelerated block-coordinate relaxation for regularized optimization. SIAM Journal on Optimization, 22(1):159-186, 2012.
  • 33. Yang, T. Trading computation for communication: Distributed stochastic dual coordinate ascent. NIPS, pp. 629-637, 2013.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.