Volume 11, 2010, Pages 1145-1200

A quasi-Newton approach to nonsmooth convex optimization problems in machine learning

Author keywords

BFGS; BMRM; Bundle methods; Hinge loss; Multiclass; Multilabel; OCAS; OWL-QN; Risk minimization; Subgradient; Variable metric methods; Wolfe conditions

Indexed keywords

BFGS; BMRM; BUNDLE METHODS; MULTI-CLASS; MULTI-LABEL; RISK MINIMIZATION; SUBGRADIENT; VARIABLE METRIC METHODS; WOLFE CONDITIONS;

EID: 77951160087     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 88

References (37)
  • 1. N. Abe, J. Takeuchi, and M. K. Warmuth. Polynomial learnability of stochastic rules with respect to the KL-divergence and quadratic distance. IEICE Transactions on Information and Systems, 84(3):299-316, 2001.
  • 2. P. K. Agarwal and M. Sharir. Davenport-Schinzel sequences and their geometric applications. In J. Sack and J. Urrutia, editors, Handbook of Computational Geometry, pages 1-47. North-Holland, New York, 2000.
  • 4
  • 6. J. R. Birge, L. Qi, and Z. Wei. A general approach to convergence properties of some methods for nonsmooth convex optimization. Applied Mathematics and Optimization, 38(2):141-158, 1998.
  • 9. K. Crammer and Y. Singer. Ultraconservative online algorithms for multiclass problems. Journal of Machine Learning Research, 3:951-991, January 2003a.
  • 10. K. Crammer and Y. Singer. A family of additive online algorithms for category ranking. Journal of Machine Learning Research, 3:1025-1058, February 2003b.
  • 11. V. Franc and S. Sonnenburg. Optimized cutting plane algorithm for support vector machines. In A. McCallum and S. Roweis, editors, ICML, pages 320-327. Omnipress, 2008.
  • 12. V. Franc and S. Sonnenburg. Optimized cutting plane algorithm for large-scale risk minimization. Journal of Machine Learning Research, 10:2157-2192, 2009.
  • 14. J. Hershberger. Finding the upper envelope of n line segments in O(n log n) time. Information Processing Letters, 33(4):169-174, December 1989.
  • 17. Y.-J. Lee and O. L. Mangasarian. SSVM: A smooth support vector machine for classification. Computational Optimization and Applications, 20(1):5-22, 2001. doi:10.1023/A:1011215321374.
  • 19. A. S. Lewis and M. L. Overton. Nonsmooth optimization via BFGS. Technical report, Optimization Online, 2008a. URL http://www.optimization-online.org/DB-FILE/2008/12/2172.pdf. Submitted to SIAM J. Optimization.
  • 21. D. C. Liu and J. Nocedal. On the limited memory BFGS method for large scale optimization. Mathematical Programming, 45(3):503-528, 1989.
  • 22. L. Lukšan and J. Vlček. Globally convergent variable metric method for convex nonsmooth unconstrained minimization. Journal of Optimization Theory and Applications, 102(3):593-613, 1999.
  • 24. A. Nedić and D. P. Bertsekas. Convergence rate of incremental subgradient algorithms. In S. Uryasev and P. M. Pardalos, editors, Stochastic Optimization: Algorithms and Applications, pages 263-304. Kluwer Academic Publishers, 2000.
  • 25. A. Nemirovski. Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM Journal on Optimization, 15(1):229-251, 2005.
  • 26. Y. Nesterov. Smooth minimization of non-smooth functions. Mathematical Programming, 103(1):127-152, 2005.
  • 28. S. Shalev-Shwartz and Y. Singer. On the equivalence of weak learnability and linear separability: New relaxations and efficient boosting algorithms. In Proceedings of COLT, 2008.
  • 30. B. Taskar, C. Guestrin, and D. Koller. Max-margin Markov networks. In S. Thrun, L. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16, pages 25-32, Cambridge, MA, 2004. MIT Press.
  • 33. M. K. Warmuth, K. A. Glocer, and S. V. N. Vishwanathan. Entropy regularized LPBoost. In Y. Freund, Y. Làszlò Györfi, and G. Turàn, editors, Proc. Intl. Conf. Algorithmic Learning Theory, number 5254 in Lecture Notes in Artificial Intelligence, pages 256-271, Budapest, October 2008. Springer-Verlag.
  • 34. P. Wolfe. Convergence conditions for ascent methods. SIAM Review, 11(2):226-235, 1969.
  • 35. P. Wolfe. A method of conjugate subgradients for minimizing nondifferentiable functions. Mathematical Programming Study, 3:145-173, 1975.
  • 36. J. Yu, S. V. N. Vishwanathan, S. Gunter, and N. N. Schraudolph. A quasi-Newton approach to nonsmooth convex optimization. In A. McCallum and S. Roweis, editors, ICML, pages 1216-1223. Omnipress, 2008.
  • 37. T. Zhang and F. J. Oles. Text categorization based on regularized linear classification methods. Information Retrieval, 4:5-31, 2001.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.