Volume 3, 2014, Pages 2665-2681

Communication-efficient distributed optimization using an approximate Newton-type method

Author keywords

[No Author keywords available]

Indexed keywords

ARTIFICIAL INTELLIGENCE; OPTIMIZATION;

EID: 84919935839     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 204

References (22)
  • 1. Agarwal, A., Chapelle, O., Dudík, M., and Langford, J. A reliable effective terascale linear learning system. CoRR, abs/1110.4198, 2011.
  • 2. Beck, A. and Teboulle, M. Mirror descent and nonlinear projected subgradient methods for convex optimization. Oper. Res. Lett., 31(3):167-175, 2003.
  • 4. Boyd, S.P., Parikh, N., Chu, E., Peleato, B., and Eckstein, J. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1-122, 2011.
  • 5. Cotter, A., Shamir, O., Srebro, N., and Sridharan, K. Better mini-batch algorithms via accelerated gradient methods. In NIPS, 2011.
  • 7. Deng, W. and Yin, W. On the global and linear convergence of the generalized alternating direction method of multipliers. Technical Report TR12-14, Rice University, 2012.
  • 8. Duchi, J., Agarwal, A., and Wainwright, M. Dual averaging for distributed optimization: Convergence analysis and network scaling. IEEE Trans. Automat. Contr., 57(3):592-606, 2012.
  • 9. Hong, M. and Luo, Z.-Q. On the linear convergence of the alternating direction method of multipliers. CoRR, abs/1208.3922, 2012.
  • 10. Mahajan, D., Keerthi, S., Sundararajan, S., and Bottou, L. A parallel SGD method with strong convergence. CoRR, abs/1311.0636, 2013.
  • 11. Nemirovski, A. and Yudin, D. On Cesaro's convergence of the gradient descent method for finding saddle points of convex-concave functions. Doklady Akademii Nauk SSSR, 239(4), 1978.
  • 13. Rakhlin, A., Shamir, O., and Sridharan, K. Making gradient descent optimal for strongly convex stochastic optimization. In ICML, 2012.
  • 14. Recht, B., Ré, C., Wright, S., and Niu, F. Hogwild!: A lock-free approach to parallelizing stochastic gradient descent. In NIPS, 2011.
  • 15. Richtarik, P. and Takac, M. Distributed coordinate descent method for learning with big data. CoRR, abs/1310.2059, 2013.
  • 16. Shalev-Shwartz, S. and Zhang, T. Stochastic dual coordinate ascent methods for regularized loss. Journal of Machine Learning Research, 14(1):567-599, 2013.
  • 19. Tropp, J. User-friendly tail bounds for sums of random matrices. Foundations of Computational Mathematics, 12(4):389-434, 2012.
  • 20. Yang, T. Trading computation for communication: Distributed stochastic dual coordinate ascent. In NIPS, 2013.
  • 21. Zhang, Y., Duchi, J., and Wainwright, M. Communication-efficient algorithms for statistical optimization. Journal of Machine Learning Research, 14:3321-3363, 2013.
  • 22. Zinkevich, M., Weimer, M., Smola, A., and Li, L. Parallelized stochastic gradient descent. In NIPS, pp. 2595-2603, 2010.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.