



Volume 10, Issue 3, 2018, Pages 305-314

A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization

Author keywords

Chebyshev-type methods; Hessian matrix; Lower bounds; Newton method; Superlinear rate of convergence

EID: 85049776636     PISSN: 2076-7633     EISSN: 2077-6853     Source Type: Journal
DOI: 10.20537/2076-7633-2018-10-3-305-314     Document Type: Article
Times cited: 1506

References (32)
  • 1
    • Bayandina A. S., Gasnikov A. V., Lagunovskaya A. A. Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems in the presence of small non-random noises // Automation and Remote Control. 2018. URL: https://arxiv.org/ftp/arxiv/papers/1701/1701.03821.pdf (in Russian).
  • 3
    • Vorontsova E. A., Gasnikov A. V., Gorbunov E. A. Accelerated random-direction descents and gradient-free methods with non-Euclidean prox-structure // Automation and Remote Control. 2018. URL: https://arxiv.org/pdf/1710.00162.pdf (in Russian).
  • 4
    • Gasnikov A. V. Effective numerical methods for finding equilibrium in large transport networks: doctoral dissertation (Dr. Sci. in physics and mathematics), specialty 05.13.18 — Mathematical modeling, numerical methods, and program complexes. Moscow: MFTI, 2016. 487 p. (in Russian).
  • 13
    • Protasov V. Yu. On algorithms for the approximate calculation of the minimum of a convex function from its values // Mat. Zametki [Math. Notes]. 1996. Vol. 59, No. 1. P. 95-102. (in Russian).
  • 16
    • Arjevani Y., Shamir O., Shiff R. Oracle complexity of second-order methods for smooth convex optimization // e-print, 2017. URL: https://arxiv.org/pdf/1705.07260.pdf
  • 17
    • Baes M. Estimate sequence methods: extensions and approximations // e-print, 2009. URL: http://www.optimization-online.org/DB-FILE/2009/08/2372.pdf
  • 18
    • Baydin A. G., Pearlmutter B. A., Radul A. A., Siskind J. M. Automatic differentiation in machine learning: a survey // e-print, 2015. URL: https://arxiv.org/pdf/1502.05767.pdf
  • 19
    • Bubeck S. Convex optimization: algorithms and complexity // Foundations and Trends in Machine Learning. 2015. Vol. 8, No. 3-4. P. 231-357.
  • 20
    • Carmon Y., Duchi J. C., Hinder O., Sidford A. Accelerated methods for non-convex optimization // e-print, 2017. URL: https://arxiv.org/pdf/1611.00756.pdf
  • 21
    • Dvurechensky P., Gasnikov A., Tiurin A. Randomized similar triangles method: a unifying framework for accelerated randomized optimization methods (coordinate descent, directional search, derivative-free method) // SIAM J. Optim. 2017 (submitted). URL: https://arxiv.org/pdf/1707.08486.pdf
  • 22
    • Ghadimi S., Liu H., Zhang T. Second-order methods with cubic regularization under inexact information // e-print, 2017. URL: https://arxiv.org/pdf/1710.05782.pdf
  • 23
    • Grapiglia G. N., Nesterov Yu. Regularized Newton methods for minimizing functions with Hölder continuous Hessian // SIAM J. Optim. 2017. Vol. 27, No. 1. P. 478-506.
  • 24
    • Lee Y.-T., Sidford A., Wong S. C.-W. A faster cutting plane method and its implications for combinatorial and convex optimization // e-print, 2015. URL: https://arxiv.org/pdf/1508.04874.pdf
  • 25
    • Monteiro R., Svaiter B. An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods // SIAM Journal on Optimization. 2013. Vol. 23, No. 2. P. 1092-1125.
  • 27
    • Nesterov Yu. Accelerating the cubic regularization of Newton's method on convex problems // Math. Prog., Ser. A. 2008. Vol. 112. P. 159-181.
  • 28
    • Nesterov Yu. Implementable tensor methods in unconstrained convex optimization // CORE Discussion Papers 2018005. 2018. URL: https://ideas.repec.org/p/cor/louvco/2018005.html
  • 29
    • Nesterov Yu. Minimizing functions with bounded variation of subgradients // CORE Discussion Papers 2005/79. 2005. 13 p. URL: http://webdoc.sub.gwdg.de/ebook/serien/e/CORE/dp2005-79.pdf
  • 30
    • Nesterov Yu., Polyak B. Cubic regularization of Newton method and its global performance // Math. Program., Ser. A. 2006. Vol. 108. P. 177-205.
  • 31
    • Nesterov Yu., Spokoiny V. Random gradient-free minimization of convex functions // Foundations of Computational Mathematics. 2017. Vol. 17, No. 2. P. 527-566.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.