Volume 9851 LNAI, Issue , 2016, Pages 795-811

Linear convergence of gradient and proximal-gradient methods under the Polyak-Łojasiewicz condition
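
For context (not part of this record, which contains no formulas): the Polyak-Łojasiewicz (PL) condition named in the title is usually stated as the first inequality below, and under it, together with an L-Lipschitz gradient and step size 1/L, gradient descent satisfies the linear rate in the second line; symbols f^*, \mu, L follow this standard statement.

    \frac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;\ge\; \mu \bigl( f(x) - f^* \bigr) \quad \text{for all } x,
    \qquad
    f(x_k) - f^* \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)^{k} \bigl( f(x_0) - f^* \bigr).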

Author keywords

Boosting; Coordinate descent; Gradient descent; L1 regularization; Stochastic gradient; Support vector machines; Variance reduction

Indexed keywords

ARTIFICIAL INTELLIGENCE; GRADIENT METHODS; LEARNING SYSTEMS; STOCHASTIC SYSTEMS; SUPPORT VECTOR MACHINES;

EID: 84988569311     PISSN: 03029743     EISSN: 16113349     Source Type: Book Series    
DOI: 10.1007/978-3-319-46128-1_50     Document Type: Conference Paper
Times cited : (1221)

References (45)
  • 1. Agarwal, A., Negahban, S.N., Wainwright, M.J.: Fast global convergence rates of gradient methods for high-dimensional statistical recovery. Ann. Statist. 40, 2452–2482 (2012)
  • 2. Anitescu, M.: Degenerate nonlinear programming with a quadratic growth condition. SIAM J. Optim. 10, 1116–1135 (2000)
  • 3. Attouch, H., Bolte, J.: On the convergence of the proximal algorithm for nonsmooth functions involving analytic features. Math. Program. Ser. B 116, 5–16 (2009)
  • 9. Garber, D., Hazan, E.: Faster rates for the Frank-Wolfe method over strongly convex sets. In: Proceedings of the 32nd ICML, pp. 541–549 (2015)
  • 11. Gu, M., Lim, L.-H., Wu, C.J.: ParNes: a rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals. Numer. Algor. 64, 321–347 (2013)
  • 12. Hanson, M.A.: On sufficiency of the Kuhn-Tucker conditions. J. Math. Anal. Appl. 80, 545–550 (1981)
  • 13. Hoffman, A.J.: On approximate solutions of systems of linear inequalities. J. Res. Nat. Bur. Stand. 49, 263–265 (1952)
  • 15. Hush, D., Kelly, P., Scovel, C., Steinwart, I.: QP algorithms with guaranteed accuracy and run time for support vector machines. J. Mach. Learn. Res. 7, 733–769 (2006)
  • 19. Liu, J., Wright, S.J.: Asynchronous stochastic coordinate descent: parallelism and convergence properties. SIAM J. Optim. 25, 351–376 (2015)
  • 21. Łojasiewicz, S.: A topological property of real analytic subsets (in French). In: Les équations aux dérivées partielles, Coll. du CNRS, vol. 117, pp. 87–89 (1963)
  • 22. Luo, Z.-Q., Tseng, P.: Error bounds and convergence analysis of feasible descent methods: a general approach. Ann. Oper. Res. 46, 157–178 (1993)
  • 24. Meir, R., Rätsch, G.: An introduction to boosting and leveraging. In: Mendelson, S., Smola, A.J. (eds.) Advanced Lectures on Machine Learning. LNCS (LNAI), vol. 2600, pp. 118–183. Springer, Heidelberg (2003). doi:10.1007/3-540-36434-X_4
  • 26. Necoara, I., Clipici, D.: Parallel random coordinate descent method for composite minimization: convergence analysis and error bounds. SIAM J. Optim. 26, 197–226 (2016)
  • 27. Nemirovski, A., Juditsky, A., Lan, G., Shapiro, A.: Robust stochastic approximation approach to stochastic programming. SIAM J. Optim. 19, 1574–1609 (2009)
  • 29. Nesterov, Y.: Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optim. 22, 341–362 (2012)
  • 31. Polyak, B.T.: Gradient methods for minimizing functionals (in Russian). Zh. Vychisl. Mat. Mat. Fiz. 3, 643–653 (1963)
  • 35. Shalev-Shwartz, S., Zhang, T.: Stochastic dual coordinate ascent methods for regularized loss minimization. J. Mach. Learn. Res. 14, 567–599 (2013)
  • 39. Richtárik, P., Takáč, M.: Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. Math. Program. Ser. A 144, 1–38 (2014)
  • 40. Tseng, P.: Approximation accuracy, gradient methods, and error bound for structured convex optimization. Math. Program. Ser. B 125, 263–295 (2010)
  • 41. Tseng, P., Yun, S.: Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization. J. Optim. Theory Appl. 140, 513–535 (2009)
  • 42. Wang, P.-W., Lin, C.-J.: Iteration complexity of feasible descent methods for convex optimization. J. Mach. Learn. Res. 15, 1523–1548 (2014)
  • 43. Xiao, L., Zhang, T.: A proximal-gradient homotopy method for the sparse least-squares problem. SIAM J. Optim. 23, 1062–1091 (2013)


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.