Volume 40, 2015

Competing with the empirical risk minimizer in a single pass

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; COMPUTATION THEORY; SAMPLING;

EID: 84984710701     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Conference Paper
Times cited: 42

References (36)
  • 2
    • A. Agarwal, P. L. Bartlett, P. Ravikumar, and M. J. Wainwright. Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization. IEEE Transactions on Information Theory, 58(5):3235-3249, May 2012.
  • 4
    • F. Bach. Self-concordant analysis for logistic regression. Electronic Journal of Statistics, 4:384-414, 2010.
  • 5
    • F. Bach and E. Moulines. Non-asymptotic analysis of stochastic approximation algorithms for machine learning. In Neural Information Processing Systems (NIPS), 2011.
  • 6
    • F. Bach and E. Moulines. Non-strongly-convex smooth stochastic approximation with convergence rate O(1/n). In Neural Information Processing Systems (NIPS), 2013.
  • 8
  • 11
    • V. Fabian. Asymptotically efficient stochastic approximation; the RM case. Annals of Statistics, 1(3), 1973.
  • 12
    • E. Hazan and S. Kale. Beyond the regret minimization barrier: Optimal algorithms for stochastic strongly-convex optimization. Journal of Machine Learning Research, 15:2489-2512, 2014. URL http://jmlr.org/papers/v15/hazan14a.html.
  • 14
    • D. Hsu, S. M. Kakade, and T. Zhang. Tail inequalities for sums of random matrices that depend on the intrinsic dimension. Electronic Communications in Probability, 17(14):1-13, 2012.
  • 20
  • 22
    • A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro. Robust stochastic approximation approach to stochastic programming. SIAM Journal on Optimization, 19(4):1574-1609, 2009.
  • 25
    • B. T. Polyak and A. B. Juditsky. Acceleration of stochastic approximation by averaging. SIAM Journal on Control and Optimization, 30(4):838-855, July 1992. ISSN 0363-0129.
  • 29
    • S. Shalev-Shwartz and T. Zhang. Stochastic dual coordinate ascent methods for regularized loss. Journal of Machine Learning Research (JMLR), 14(1):567-599, February 2013.
  • 33
    • P. Tarres and Y. Yao. Online learning as stochastic approximation of regularization paths: Optimality and almost-sure convergence. IEEE Transactions on Information Theory, 60(9):5716-5735, 2014.
  • 35
    • Y. Yao. On complexity issues of online learning algorithms. IEEE Transactions on Information Theory, 56(12), 2010.
  • 36
    • Y. Ying and M. Pontil. Online gradient descent learning algorithms. Foundations of Computational Mathematics, 8(5):561-596, 2008.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.