-
1
-
-
85162387277
-
Distributed delayed stochastic optimization
-
A. Agarwal and J. C. Duchi. Distributed delayed stochastic optimization. In Advances in Neural Information Processing Systems 24, pages 873-881. 2011. URL http://papers. nips.cc/paper/4247-distributed-delayed-stochastic-optimization.pdf.
-
(2011)
Advances in Neural Information Processing Systems 24
, pp. 873-881
-
-
Agarwal, A.1
Duchi, J.C.2
-
2
-
-
84906673146
-
Revisiting asynchronous linear solvers: Provable convergence rate through randomization
-
H. Avron, A. Druinsky, and A. Gupta. Revisiting asynchronous linear solvers: Provable convergence rate through randomization. IPDPS, 2014.
-
(2014)
IPDPS
-
-
Avron, H.1
Druinsky, A.2
Gupta, A.3
-
3
-
-
85014561619
-
A fast iterative shrinkage-thresholding algorithm for linear inverse problems
-
A. Beck and M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sciences, 2(1):183-202, 2009.
-
(2009)
SIAM J. Imaging Sciences
, vol.2
, Issue.1
, pp. 183-202
-
-
Beck, A.1
Teboulle, M.2
-
4
-
-
84892868336
-
On the convergence of block coordinate descent type methods
-
A. Beck and L. Tetruashvili. On the convergence of block coordinate descent type methods. SIAM Journal on Optimization, 23(4):2037-2060, 2013.
-
(2013)
SIAM Journal on Optimization
, vol.23
, Issue.4
, pp. 2037-2060
-
-
Beck, A.1
Tetruashvili, L.2
-
6
-
-
80051762104
-
Distributed optimization and statistical learning via the alternating direction method of multipliers
-
S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1-122, 2011.
-
(2011)
Foundations and Trends in Machine Learning
, vol.3
, Issue.1
, pp. 1-122
-
-
Boyd, S.1
Parikh, N.2
Chu, E.3
Peleato, B.4
Eckstein, J.5
-
9
-
-
85162498265
-
Better mini-batch algorithms via accelerated gradient methods
-
A. Cotter, O. Shamir, N. Srebro, and K. Sridharan. Better mini-batch algorithms via accelerated gradient methods. In Advances in Neural Information Processing Systems 24, pages 1647-1655. 2011. URL http://papers.nips.cc/paper/ 4432-better-mini-batch-algorithms-via-accelerated-gradient-methods.pdf.
-
(2011)
Advances in Neural Information Processing Systems 24
, pp. 1647-1655
-
-
Cotter, A.1
Shamir, O.2
Srebro, N.3
Sridharan, K.4
-
10
-
-
84857708133
-
Dual averaging for distributed optimization: Convergence analysis and network scaling
-
J. C. Duchi, A. Agarwal, and M. J.Wainwright. Dual averaging for distributed optimization: Convergence analysis and network scaling. IEEE Transactions on Automatic Control, 57 (3):592-606, 2012.
-
(2012)
IEEE Transactions on Automatic Control
, vol.57
, Issue.3
, pp. 592-606
-
-
Duchi, J.C.1
Agarwal, A.2
Wainwright, M.J.3
-
12
-
-
84865692740
-
Fast multiple-splitting algorithms for convex optimization
-
D. Goldfarb and S. Ma. Fast multiple-splitting algorithms for convex optimization. SIAM Journal on Optimization, 22(2):533-556, 2012.
-
(2012)
SIAM Journal on Optimization
, vol.22
, Issue.2
, pp. 533-556
-
-
Goldfarb, D.1
Ma, S.2
-
14
-
-
0026678659
-
On the convergence of the coordinate descent method for convex differentiable minimization
-
Z. Q. Luo and P. Tseng. On the convergence of the coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72:7-35, 1992.
-
(1992)
Journal of Optimization Theory and Applications
, vol.72
, pp. 7-35
-
-
Luo, Z.Q.1
Tseng, P.2
-
15
-
-
0001208950
-
Parallel gradient distribution in unconstrained optimization
-
O. L. Mangasarian. Parallel gradient distribution in unconstrained optimization. SIAM Journal on Optimization, 33(1):916-1925, 1995.
-
(1995)
SIAM Journal on Optimization
, vol.33
, Issue.1
, pp. 916-1925
-
-
Mangasarian, O.L.1
-
16
-
-
70450197241
-
Robust stochastic approximation approach to stochastic programming
-
A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro. Robust stochastic approximation approach to stochastic programming. SIAM Journal on Optimization, 19:1574-1609, 2009.
-
(2009)
SIAM Journal on Optimization
, vol.19
, pp. 1574-1609
-
-
Nemirovski, A.1
Juditsky, A.2
Lan, G.3
Shapiro, A.4
-
18
-
-
84865692149
-
Efficiency of coordinate descent methods on huge-scale optimization problems
-
Y. Nesterov. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 22(2):341-362, 2012.
-
(2012)
SIAM Journal on Optimization
, vol.22
, Issue.2
, pp. 341-362
-
-
Nesterov, Y.1
-
19
-
-
85162467517
-
Hogwild!: A lock-free approach to parallelizing stochastic gradient descent
-
F. Niu, B. Recht, C. Ré, and S. J. Wright. Hogwild!: A lock-free approach to parallelizing stochastic gradient descent. Advances in Neural Information Processing Systems 24, pages 693-701, 2011.
-
(2011)
Advances in Neural Information Processing Systems
, vol.24
, pp. 693-701
-
-
Niu, F.1
Recht, B.2
Ré, C.3
Wright, S.J.4
-
21
-
-
84897116612
-
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
-
P. Richtárik and M. Takáč. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Mathematrical Programming, 144:1-38, 2012a.
-
(2012)
Mathematrical Programming
, vol.144
, pp. 1-38
-
-
Richtárik, P.1
Takáč, M.2
-
23
-
-
84877770112
-
Feature clustering for accelerating parallel coordinate descent
-
C. Scherrer, A. Tewari, M. Halappanavar, and D. Haglin. Feature clustering for accelerating parallel coordinate descent. Advances in Neural Information Processing Systems 25, pages 28-36, 2012.
-
(2012)
Advances in Neural Information Processing Systems 25
, pp. 28-36
-
-
Scherrer, C.1
Tewari, A.2
Halappanavar, M.3
Haglin, D.4
-
27
-
-
0035533631
-
Convergence of a block coordinate descent method for nondifferentiable minimization
-
P. Tseng. Convergence of a block coordinate descent method for nondifferentiable minimization. Journal of Optimization Theory and Applications, 109:475-494, 2001.
-
(2001)
Journal of Optimization Theory and Applications
, vol.109
, pp. 475-494
-
-
Tseng, P.1
-
28
-
-
46749146509
-
A coordinate gradient descent method for nonsmooth separable minimization
-
June
-
P. Tseng and S. Yun. A coordinate gradient descent method for nonsmooth separable minimization. Mathematical Programming, Series B, 117:387-423, June 2009.
-
(2009)
Mathematical Programming, Series B
, vol.117
, pp. 387-423
-
-
Tseng, P.1
Yun, S.2
-
29
-
-
77956736675
-
A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
-
P. Tseng and S. Yun. A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training. Computational Optimization and Applications, 47(2):179-206, 2010.
-
(2010)
Computational Optimization and Applications
, vol.47
, Issue.2
, pp. 179-206
-
-
Tseng, P.1
Yun, S.2
-
30
-
-
84901632905
-
Iteration complexity of feasible descent methods for convex optimization
-
P.-W. Wang and C.-J. Lin. Iteration complexity of feasible descent methods for convex optimization. Journal of Machine Learning Research, 15:1523-1548, 2014.
-
(2014)
Journal of Machine Learning Research
, vol.15
, pp. 1523-1548
-
-
Wang, P.-W.1
Lin, C.-J.2
-
31
-
-
84861594597
-
Accelerated block-coordinate relaxation for regularized optimization
-
S. J. Wright. Accelerated block-coordinate relaxation for regularized optimization. SIAM Journal on Optimization, 22(1):159-186, 2012.
-
(2012)
SIAM Journal on Optimization
, vol.22
, Issue.1
, pp. 159-186
-
-
Wright, S.J.1
-
32
-
-
84898970556
-
Trading computation for communication: Distributed stochastic dual coordinate ascent
-
T. Yang. Trading computation for communication: Distributed stochastic dual coordinate ascent. Advances in Neural Information Processing Systems 26, pages 629-637, 2013.
-
(2013)
Advances in Neural Information Processing Systems 26
, pp. 629-637
-
-
Yang, T.1
|