



Mathematical Programming, Volume 157, Issue 2, 2016, Pages 375-396

An inexact successive quadratic approximation method for L-1 regularized optimization

Author keywords

Inexact proximal Newton; Orthant based quasi Newton; Sparse optimization

Indexed keywords

APPROXIMATION THEORY; ARTIFICIAL INTELLIGENCE; LEARNING SYSTEMS; NEWTON-RAPHSON METHOD;

EID: 84940213672     PISSN: 00255610     EISSN: 14364646     Source Type: Journal    
DOI: 10.1007/s10107-015-0941-y     Document Type: Article
Times cited: 66

References (28)
  • 2  Banerjee, O., El Ghaoui, L., d’Aspremont, A.: Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. J. Mach. Learn. Res. 9, 485–516 (2008)
  • 4  Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)
  • 5  Becker, S.R., Candès, E.J., Grant, M.C.: Templates for convex cone problems with applications to sparse signal recovery. Math. Program. Comput. 3(3), 165–218 (2011)
  • 6  Byrd, R.H., Chin, G.M., Nocedal, J., Oztoprak, F.: A family of second-order methods for convex L1 regularized optimization. Technical report, Optimization Center Report 2012/2, Northwestern University (2012)
  • 7  Byrd, R.H., Chin, G.M., Nocedal, J., Wu, Y.: Sample size selection in optimization methods for machine learning. Math. Program. 134(1), 127–155 (2012)
  • 8  Byrd, R.H., Nocedal, J., Schnabel, R.: Representations of quasi-Newton matrices and their use in limited memory methods. Math. Program. 63(4), 129–156 (1994)
  • 10  Dontchev, A.L., Rockafellar, R.T.: Convergence of inexact Newton methods for generalized equations. Math. Program. 139, 115–137 (2013)
  • 14  Li, L., Toh, K.C.: An inexact interior point method for L1-regularized sparse covariance selection. Math. Program. Comput. 2(3), 291–315 (2010)
  • 15  Le Roux, N., Schmidt, M.W., Bach, F.: Convergence rates of inexact proximal-gradient methods for convex optimization. In: NIPS, pp. 1458–1466 (2011)
  • 16  Milzarek, A., Ulbrich, M.: A semismooth Newton method with multi-dimensional filter globalization for L1-optimization. SIAM J. Optim. 24(1), 298–333 (2014)
  • 20  Patriksson, M.: Cost approximation: a unified framework of descent algorithms for nonlinear programs. SIAM J. Optim. 8(2), 561–582 (1998)
  • 22  Picka, J.D.: Gaussian Markov random fields: theory and applications. Technometrics 48(1), 146–147 (2006)
  • 23  Salzo, S., Villa, S.: Inexact and accelerated proximal point algorithms. J. Convex Anal. 19(4), 1167–1192 (2012)
  • 27  Yuan, G.-X., Chang, K.-W., Hsieh, C.-J., Lin, C.-J.: A comparison of optimization methods and software for large-scale l1-regularized linear classification. J. Mach. Learn. Res. 11(1), 3183–3234 (2010)
  • 28  Yuan, G.-X., Ho, C.-H., Lin, C.-J.: An improved glmnet for l1-regularized logistic regression. J. Mach. Learn. Res. 13(1), 1999–2030 (2012)


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.