1. Agarwal, A., Bartlett, P., Ravikumar, P., and Wainwright, M. Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization. IEEE Transactions on Information Theory, 58(5):3235-3249, 2012.
2. Bach, F. and Moulines, E. Non-asymptotic analysis of stochastic approximation algorithms for machine learning. In NIPS, 2011.
3. Hazan, E. and Kale, S. Beyond the regret minimization barrier: An optimal algorithm for stochastic strongly-convex optimization. In COLT, 2011.
4. Hazan, E., Agarwal, A., and Kale, S. Logarithmic regret algorithms for online convex optimization. Machine Learning, 69(2-3):169-192, 2007.
6. Lacoste-Julien, S., Schmidt, M., and Bach, F. A simpler approach to obtaining an O(1/t) convergence rate for projected stochastic subgradient descent. CoRR, abs/1212.2002, 2012.
7. Ouyang, H. and Gray, A. Stochastic smoothing for nonsmooth minimizations: Accelerating SGD by exploiting structure. In ICML, 2012.
8. Rakhlin, A., Shamir, O., and Sridharan, K. Making gradient descent optimal for strongly convex stochastic optimization. CoRR, abs/1109.5647, 2011.
9. Shalev-Shwartz, S., Shamir, O., Srebro, N., and Sridharan, K. Stochastic convex optimization. In COLT, 2009.
10. Shalev-Shwartz, S., Singer, Y., Srebro, N., and Cotter, A. Pegasos: Primal estimated sub-gradient solver for SVM. Mathematical Programming, 127(1):3-30, 2011.
11. Shamir, O. Is averaging needed for strongly convex stochastic gradient descent? Open problem presented at COLT, 2012.
12. Zhang, T. Solving large scale linear prediction problems using stochastic gradient descent algorithms. In ICML, 2004.
13. Zinkevich, M. Online convex programming and generalized infinitesimal gradient ascent. In ICML, 2003.