Volume, Issue, 2013, Pages

Adaptive learning rates and parallelization for stochastic, sparse, non-smooth gradients

Author keywords

[No Author keywords available]

Indexed keywords

FINITE DIFFERENCE METHOD; GRADIENT METHODS; ORTHOGONAL FUNCTIONS;

EID: 85083952393     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 19

References (12)
  • 2
    • Jacobs, R. A. Increased rates of convergence through learning rate adaptation. Neural Networks, 1(4):295–307, January 1988.
  • 4
    • George, A. P. and Powell, W. B. Adaptive stepsizes for recursive estimation with applications in approximate dynamic programming. Machine Learning, 65(1):167–198, May 2006.
  • 7
    • Amari, S., Park, H., and Fukumizu, K. Adaptive method of realizing natural gradient learning for multilayer perceptrons. Neural Computation, 12(6):1399–1409, 2000.
  • 9
    • Byrd, R., Chin, G., Nocedal, J., and Wu, Y. Sample size selection in optimization methods for machine learning. Mathematical Programming, 2012.
  • 10
    • Niu, F., Recht, B., Re, C., and Wright, S. J. Hogwild!: A lock-free approach to parallelizing stochastic gradient descent. Matrix, (1):21, 2011.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.