2016, Pages 4069-4077

Adaptive Newton method for empirical risk minimization to statistical accuracy

Author keywords

[No Author keywords available]

Indexed keywords

ADAPTIVE ALGORITHMS; NEWTON-RAPHSON METHOD; SAMPLING; SELF ORGANIZING MAPS

EID: 85019113979     PISSN: 1049-5258     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 41

References (26)
  • 2
    • Scopus EID: 85014561619
    • Amir Beck and Marc Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1): 183-202, 2009.
  • 4
    • Scopus EID: 0004055894
    • Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, New York, NY, USA, 2004.
  • 6
    • Scopus EID: 84937908747
    • Aaron Defazio, Francis R. Bach, and Simon Lacoste-Julien. SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives. In Advances in Neural Information Processing Systems 27, Montreal, Quebec, Canada, pages 1646-1654, 2014.
  • 10
    • Scopus EID: 84898963415
    • Rie Johnson and Tong Zhang. Accelerating stochastic gradient descent using predictive variance reduction. In Advances in Neural Information Processing Systems 26, Lake Tahoe, Nevada, United States, pages 315-323, 2013.
  • 12
  • 13
    • Scopus EID: 84961737543
    • Aryan Mokhtari and Alejandro Ribeiro. Global convergence of online limited memory BFGS. Journal of Machine Learning Research, 16: 3151-3181, 2015.
  • 20
    • Scopus EID: 84877725219
    • Nicolas Le Roux, Mark W. Schmidt, and Francis R. Bach. A stochastic gradient method with an exponential convergence rate for finite training sets. In Advances in Neural Information Processing Systems 25, Lake Tahoe, Nevada, United States, pages 2672-2680, 2012.
  • 24
    • Scopus EID: 84953283129
    • Shai Shalev-Shwartz and Tong Zhang. Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization. Mathematical Programming, 155(1-2): 105-145, 2016.
  • 26
    • Scopus EID: 84919793228
    • Lin Xiao and Tong Zhang. A proximal stochastic gradient method with progressive variance reduction. SIAM Journal on Optimization, 24(4): 2057-2075, 2014.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.