2. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)
3. Collins, M., Globerson, A., Koo, T., Carreras, X., Bartlett, P.: Exponentiated gradient algorithms for conditional random fields and max-margin Markov networks. J. Mach. Learn. Res. 9, 1775–1822 (2008)
4. Cotter, A., Shamir, O., Srebro, N., Sridharan, K.: Better mini-batch algorithms via accelerated gradient methods. arXiv preprint arXiv:1106.4574 (2011)
5. Crammer, K., Singer, Y.: On the algorithmic implementation of multiclass kernel-based vector machines. J. Mach. Learn. Res. 2, 265–292 (2001)
6. d’Aspremont, A.: Smooth optimization with approximate gradient. SIAM J. Optim. 19(3), 1171–1183 (2008)
7. Devolder, O., Glineur, F., Nesterov, Y.: First-order methods of smooth convex optimization with inexact oracle. Math. Program. 146(1–2), 37–75 (2014)
8. Duchi, J., Singer, Y.: Efficient online and batch learning using forward backward splitting. J. Mach. Learn. Res. 10, 2899–2934 (2009)
9. Duchi, J., Shalev-Shwartz, S., Singer, Y., Chandra, T.: Efficient projections onto the ℓ1-ball for learning in high dimensions. In: Proceedings of the 25th International Conference on Machine Learning, pp. 272–279. ACM (2008)
10. Duchi, J., Shalev-Shwartz, S., Singer, Y., Tewari, A.: Composite objective mirror descent. In: Proceedings of the 23rd Annual Conference on Learning Theory, pp. 14–26 (2010)
11. Fercoq, O., Richtárik, P.: Accelerated, parallel and proximal coordinate descent. Technical report. arXiv:1312.5799 (2013)
12. Ghadimi, S., Lan, G.: Optimal stochastic approximation algorithms for strongly convex stochastic composite optimization I: a generic algorithmic framework. SIAM J. Optim. 22(4), 1469–1492 (2012)
13. Hu, C., Pan, W., Kwok, J.T.: Accelerated gradient methods for stochastic optimization and online learning. In: Advances in Neural Information Processing Systems, pp. 781–789 (2009)
14. Lacoste-Julien, S., Jaggi, M., Schmidt, M., Pletscher, P.: Stochastic block-coordinate Frank-Wolfe optimization for structural SVMs. arXiv preprint arXiv:1207.4747 (2012)
15. Langford, J., Li, L., Zhang, T.: Sparse online learning via truncated gradient. In: NIPS, pp. 905–912 (2009)
16. Roux, N.L., Schmidt, M., Bach, F.: A stochastic gradient method with an exponential convergence rate for strongly-convex optimization with finite training sets. arXiv preprint arXiv:1202.6258 (2012)
17. Nesterov, Y.: Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optim. 22(2), 341–362 (2012)
18. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)
19. Nesterov, Y.: Gradient methods for minimizing composite functions. Math. Program. 140, 125–161 (2013)
20. Richtárik, P., Takáč, M.: Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Math. Program. 144(1–2), 1–38 (2014)
21. Schmidt, M., Roux, N.L., Bach, F.: Convergence rates of inexact proximal-gradient methods for convex optimization. Technical report. arXiv:1109.2415 (2011)
22. Shalev-Shwartz, S., Tewari, A.: Stochastic methods for ℓ1-regularized loss minimization. J. Mach. Learn. Res. 12, 1865–1892 (2011)
24. Shalev-Shwartz, S., Tewari, A.: Stochastic methods for ℓ1-regularized loss minimization. In: ICML, p. 117 (2009)
26. Shalev-Shwartz, S., Zhang, T.: Stochastic dual coordinate ascent methods for regularized loss minimization. J. Mach. Learn. Res. 14, 567–599 (2013)
27. Shalev-Shwartz, S., Singer, Y., Srebro, N.: Pegasos: Primal Estimated sub-GrAdient SOlver for SVM. In: ICML, pp. 807–814 (2007)
28. Shalev-Shwartz, S., Srebro, N., Zhang, T.: Trading accuracy for sparsity in optimization problems with sparsity constraints. SIAM J. Optim. 20(6), 2807–2832 (2010)
29. Takáč, M., Bijral, A., Richtárik, P., Srebro, N.: Mini-batch primal and dual methods for SVMs. In: ICML (2013)
30. Xiao, L.: Dual averaging method for regularized stochastic learning and online optimization. J. Mach. Learn. Res. 11, 2543–2596 (2010)
31. Zhang, T.: On the dual formulation of regularized linear systems. Mach. Learn. 46, 91–129 (2002)