



Volume 15, 2014, Pages 1523-1548

Iteration complexity of feasible descent methods for convex optimization

Author keywords

Convergence rate; Convex optimization; Feasible descent methods; Iteration complexity

Indexed keywords

ALGORITHMS; ARTIFICIAL INTELLIGENCE; CONVEX OPTIMIZATION; OPTIMIZATION;

EID: 84901632905     PISSN: 15324435     EISSN: 15337928     Source Type: Journal    
DOI: None     Document Type: Article
Times cited : (129)

References (31)
  • 1
    • Amir Beck and Luba Tetruashvili. On the convergence of block coordinate descent type methods. SIAM Journal on Optimization, 23(4):2037-2060, 2013.
  • 3
    • Kai-Wei Chang, Cho-Jui Hsieh, and Chih-Jen Lin. Coordinate descent method for large-scale L2-loss linear SVM. Journal of Machine Learning Research, 9:1369-1398, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/cdl2.pdf.
  • 5
  • 8
    • Chia-Hua Ho and Chih-Jen Lin. Large-scale linear support vector regression. Journal of Machine Learning Research, 13:3323-3348, 2012. URL http://www.csie.ntu.edu.tw/~cjlin/papers/linear-svr.pdf.
  • 11
    • Wu Li. Sharp Lipschitz constants for basic optimal solutions and basic feasible solutions of linear programs. SIAM Journal on Control and Optimization, 32(1):140-153, 1994.
  • 12
    • Chih-Jen Lin, Ruby C. Weng, and S. Sathiya Keerthi. Trust region Newton method for large-scale logistic regression. Journal of Machine Learning Research, 9:627-650, 2008. URL http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf.
  • 13
    • Zhi-Quan Luo and Paul Tseng. On the convergence of coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72(1):7-35, 1992a.
  • 14
    • Zhi-Quan Luo and Paul Tseng. On the linear convergence of descent methods for convex essentially smooth minimization. SIAM Journal on Control and Optimization, 30(2):408-425, 1992b.
  • 15
    • Zhi-Quan Luo and Paul Tseng. Error bounds and convergence analysis of feasible descent methods: a general approach. Annals of Operations Research, 46:157-178, 1993.
  • 16
    • Olvi L. Mangasarian and Tzong-Huei Shiau. Lipschitz continuity of solutions of linear inequalities, programs and complementarity problems. SIAM Journal on Control and Optimization, 25(3):583-595, 1987.
  • 17
    • Yurii E. Nesterov. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 22(2):341-362, 2012.
  • 18
    • Jong-Shi Pang. A posteriori error bounds for the linearly-constrained variational inequality problem. Mathematics of Operations Research, 12(3):474-484, 1987.
  • 19
    • John C. Platt. Fast training of support vector machines using sequential minimal optimization. In Bernhard Schölkopf, Christopher J. C. Burges, and Alexander J. Smola, editors, Advances in Kernel Methods - Support Vector Learning, Cambridge, MA, 1998. MIT Press.
  • 22
    • Ankan Saha and Ambuj Tewari. On the nonasymptotic convergence of cyclic coordinate descent methods. SIAM Journal on Optimization, 23(1):576-601, 2013.
  • 23
    • Ludwig Seidel. Ueber ein Verfahren, die Gleichungen, auf welche die Methode der kleinsten Quadrate führt, sowie lineäre Gleichungen überhaupt, durch successive Annäherung aufzulösen. Abhandlungen der Bayerischen Akademie der Wissenschaften, Mathematisch-Naturwissenschaftliche Abteilung, 11(3):81-108, 1874.
  • 25
    • Shai Shalev-Shwartz and Tong Zhang. Stochastic dual coordinate ascent methods for regularized loss minimization. Journal of Machine Learning Research, 14:567-599, 2013a.
  • 28
    • Paul Tseng and Sangwoon Yun. Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization. Journal of Optimization Theory and Applications, 140:513-535, 2009.
  • 30
    • Jean-Philippe Vial. Strong and weak convexity of sets and functions. Mathematics of Operations Research, 8(2):231-259, 1983.
  • 31
    • Stephen J. Wright. Accelerated block-coordinate relaxation for regularized optimization. SIAM Journal on Optimization, 22(1):159-186, 2012.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.