Volume 5, 2009, Pages 560-566

Variable metric stochastic approximation theory

Author keywords

[No Author keywords available]

Indexed keywords

BFGS ALGORITHM; CONVERGENCE THEORY; EXPERT ADVICE; ON-LINE VARIABLES; ONLINE VERSIONS; STOCHASTIC APPROXIMATIONS;

EID: 84862282880     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Conference Paper
Times cited: 12

References (17)
  • 2
  • 4
    • S. Amari. Natural gradient works efficiently in learning. Neural Computation, 10(2):251-276, 1998.
  • 6
    • J. Kivinen and M. K. Warmuth. Exponentiated gradient versus gradient descent for linear predictors. Information and Computation, 132(1):1-64, 1997.
  • 7
    • N. N. Schraudolph. Fast curvature matrix-vector products for second-order gradient descent. Neural Computation, 14(7):1723-1738, 2002.
  • 8
    • K. Azoury and M. K. Warmuth. Relative loss bounds for online density estimation with the exponential family of distributions. Machine Learning, 43(3):211-246, 2001. Special issue on Theoretical Advances in Online Learning, Game Theory and Boosting.
  • 10
    • E. Hazan, A. Agarwal, and S. Kale. Logarithmic regret algorithms for online convex optimization. Machine Learning, 69(2-3):169-192, 2007.
  • 11
    • M. Zinkevich. Online convex programming and generalized infinitesimal gradient ascent. In Proc. Intl. Conf. Machine Learning, pages 928-936, 2003.
  • 13
    • J. Blum. Multidimensional stochastic approximation methods. Annals of Mathematical Statistics, 25:737-744, 1954.
  • 14
    • H. E. Robbins and D. O. Siegmund. A convergence theorem for nonnegative almost supermartingales and some applications. In Proc. Sympos. Optimizing Methods in Statistics, pages 233-257, Ohio State Univ., Columbus, Ohio, 1971. Academic Press, New York.
  • 15
    • L. Bottou. Online algorithms and stochastic approximations. In D. Saad, editor, Online Learning and Neural Networks. Cambridge University Press, Cambridge, UK, 1998.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.