Volume 2017-December, 2017, Pages 1710-1721

QSGD: Communication-efficient SGD via gradient quantization and encoding

Author keywords

[No Author keywords available]

Indexed keywords

BANDWIDTH; DEEP NEURAL NETWORKS; INFORMATION DISSEMINATION; INFORMATION THEORY; ITERATIVE METHODS; SIGNAL ENCODING; SPEECH RECOGNITION; STOCHASTIC SYSTEMS;

EID: 85042820163     PISSN: 10495258     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 1868
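
The title names the two ingredients this record indexes: stochastic quantization of gradients and a compact encoding of the quantized values. As a rough illustration only (not the authors' reference implementation; the function name, the choice of s levels, and the NumPy usage are assumptions of this sketch), an unbiased stochastic quantizer can look like this:

    import numpy as np

    def qsgd_quantize(v, s=4, rng=None):
        """Stochastically quantize a gradient vector to s levels per sign.

        Illustrative sketch of the quantization idea named in the title,
        not the paper's reference implementation. Rounding is randomized
        so the estimate stays unbiased: E[Q(v) * norm] = v.
        """
        rng = np.random.default_rng() if rng is None else rng
        norm = np.linalg.norm(v)
        if norm == 0.0:
            return np.zeros_like(v), norm
        scaled = np.abs(v) / norm * s          # each entry lands in [0, s]
        lower = np.floor(scaled)               # nearest level from below
        prob = scaled - lower                  # chance of rounding up
        levels = lower + (rng.random(v.shape) < prob)
        return np.sign(v) * levels / s, norm

    # Usage: ship the norm plus small (sign, level) integers per coordinate.
    g = np.array([0.3, -1.2, 0.05, 0.0])
    q, norm = qsgd_quantize(g, s=4)
    g_hat = q * norm                           # unbiased estimate of g

Each worker transmits the norm plus (sign, level) pairs instead of full-precision floats; because most levels are small integers, they compress well with universal integer codes such as the Elias codes cited in reference 14 below.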

References (45)
  • 5. Yossi Arjevani and Ohad Shamir. Communication complexity of distributed convex learning and optimization. In NIPS, 2015.
  • 7. Sébastien Bubeck. Convex optimization: Algorithms and complexity. Foundations and Trends® in Machine Learning, 8(3-4):231-357, 2015.
  • 8. Trishul Chilimbi, Yutaka Suzue, Johnson Apacible, and Karthik Kalyanaraman. Project Adam: Building an efficient and scalable deep learning training system. In OSDI, October 2014.
  • 9. CNTK BrainScript file for AlexNet. https://github.com/Microsoft/CNTK/tree/master/Examples/Image/Classification/AlexNet/BrainScript. Accessed: 2017-02-24.
  • 10. Christopher M. De Sa, Ce Zhang, Kunle Olukotun, and Christopher Ré. Taming the wild: A unified analysis of Hogwild-style algorithms. In NIPS, 2015.
  • 13. John C. Duchi, Sorathan Chaturapruek, and Christopher Ré. Asynchronous stochastic convex optimization. In NIPS, 2015.
  • 14. Peter Elias. Universal codeword sets and representations of the integers. IEEE Transactions on Information Theory, 21(2):194-203, 1975. (An Elias gamma coding sketch follows this reference list.)
  • 15. Saeed Ghadimi and Guanghui Lan. Stochastic first- and zeroth-order methods for nonconvex stochastic programming. SIAM Journal on Optimization, 23(4):2341-2368, 2013.
  • 16. Suyog Gupta, Ankur Agrawal, Kailash Gopalakrishnan, and Pritish Narayanan. Deep learning with limited numerical precision. In ICML, pages 1737-1746, 2015.
  • 23. Rie Johnson and Tong Zhang. Accelerating stochastic gradient descent using predictive variance reduction. In NIPS, 2013.
  • 29. Xiangru Lian, Yijun Huang, Yuncheng Li, and Ji Liu. Asynchronous parallel stochastic gradient for nonconvex optimization. In NIPS, 2015.
  • 31. CNTK implementation of QSGD. https://gitlab.com/demjangrubic/QSGD. Accessed: 2017-11-04.
  • 32. Benjamin Recht, Christopher Re, Stephen Wright, and Feng Niu. Hogwild: A lock-free approach to parallelizing stochastic gradient descent. In NIPS, 2011.
  • 35. Frank Seide, Hao Fu, Jasha Droppo, Gang Li, and Dong Yu. 1-bit stochastic gradient descent and its application to data-parallel distributed training of speech DNNs. In INTERSPEECH, 2014.
  • 37. Nikko Strom. Scalable distributed DNN training using commodity GPU cloud computing. In INTERSPEECH, 2015.
  • 40. John N. Tsitsiklis and Zhi-Quan Luo. Communication complexity of convex optimization. Journal of Complexity, 3(3), 1987.
  • 44. Yuchen Zhang, John Duchi, Michael I. Jordan, and Martin J. Wainwright. Information-theoretic lower bounds for distributed statistical estimation with communication constraints. In NIPS, 2013.
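
Reference 14 (Elias, 1975) covers universal codeword sets for the integers; quantized-gradient schemes lean on codes from this family so that the frequent small quantization levels cost only a few bits. A minimal sketch of Elias gamma coding, using a plain string of bits and function names of my own (the paper's encoder may differ in detail):

    def elias_gamma_encode(n: int) -> str:
        """Elias gamma code: N zeros followed by the (N+1)-bit binary
        form of n, where N = floor(log2 n). Defined for n >= 1."""
        if n < 1:
            raise ValueError("Elias gamma codes positive integers only")
        binary = bin(n)[2:]                      # binary digits of n
        return "0" * (len(binary) - 1) + binary

    def elias_gamma_decode(bits: str) -> int:
        """Invert elias_gamma_encode: count the leading zeros, then read
        that many further bits after the first '1'."""
        zeros = len(bits) - len(bits.lstrip("0"))
        return int(bits[zeros:2 * zeros + 1], 2)

    # Small integers get short codewords; large ones grow only logarithmically.
    assert elias_gamma_encode(1) == "1"
    assert elias_gamma_encode(5) == "00101"
    assert elias_gamma_decode("00101") == 5

This is one concrete member of the code family from that reference; the point is simply that small values dominate after quantization, so a universal prefix code keeps the transmitted gradient short.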


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.