Volume, Issue PART 1, 2013, Pages 71-79

Stochastic gradient descent for non-smooth optimization: Convergence results and optimal averaging schemes

Author keywords

[No Author keywords available]

Indexed keywords

LEARNING SYSTEMS; TIME VARYING NETWORKS;

EID: 84897554805    PISSN: None    EISSN: None    Source Type: Conference Proceeding
DOI: None    Document Type: Conference Paper
Times cited: 573

References (13)
  • 1. Agarwal, A., Bartlett, P., Ravikumar, P., and Wainwright, M. Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization. IEEE Transactions on Information Theory, 58(5):3235-3249, 2012.
  • 2. Bach, F. and Moulines, E. Non-asymptotic analysis of stochastic approximation algorithms for machine learning. In NIPS, 2011.
  • 3. Hazan, E. and Kale, S. Beyond the regret minimization barrier: An optimal algorithm for stochastic strongly-convex optimization. In COLT, 2011.
  • 4. Hazan, E., Agarwal, A., and Kale, S. Logarithmic regret algorithms for online convex optimization. Machine Learning, 69(2-3):169-192, 2007. DOI 10.1007/s10994-007-5016-8.
  • 6. Lacoste-Julien, S., Schmidt, M., and Bach, F. A simpler approach to obtaining an O(1/t) convergence rate for projected stochastic subgradient descent. CoRR, abs/1212.2002, 2012.
  • 7. Ouyang, H. and Gray, A. Stochastic smoothing for nonsmooth minimizations: Accelerating SGD by exploiting structure. In ICML, 2012.
  • 8. Rakhlin, A., Shamir, O., and Sridharan, K. Making gradient descent optimal for strongly convex stochastic optimization. CoRR, abs/1109.5647, 2011.
  • 11. Shamir, O. Is averaging needed for strongly convex stochastic gradient descent? Open problem presented at COLT, 2012.
  • 12. Zhang, T. Solving large scale linear prediction problems using stochastic gradient descent algorithms. In ICML, 2004.
  • 13. Zinkevich, M. Online convex programming and generalized infinitesimal gradient ascent. In ICML, 2003.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.