Journal of Machine Learning Research, Volume 12, 2011, Pages 141-202

Training SVMs without offset

Author keywords

Decomposition algorithms; Support vector machines

Indexed keywords

Convergence rates; Decomposition algorithm; Key feature; Number of iterations; Runtimes; Stopping criteria; Training algorithms; Warm start; Working set; Working set selection

EID: 79551679020     PISSN: 1532-4435     EISSN: 1533-7928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 71

References (29)
  • 2
    • P.-H. Chen, R.-E. Fan, and C.-J. Lin. A study on SMO-type decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 17(4):893-908, 2006. DOI: 10.1109/TNN.2006.875973
  • 5
    • R.-E. Fan, P.-H. Chen, and C.-J. Lin. Working set selection using second order information for training support vector machines. Journal of Machine Learning Research, 6:1889-1918, 2005.
  • 8
    • C.-W. Hsu and C.-J. Lin. A simple decomposition method for support vector machines. Machine Learning, 46(1-3):291-314, 2002. DOI: 10.1023/A:1012427100071
  • 9
    • C.-W. Hsu and C.-J. Lin. BSVM. http://www.csie.ntu.edu.tw/~cjlin/bsvm/, 2006.
  • 11
    • D. Hush and C. Scovel. Polynomial-time decomposition algorithms for support vector machines. Machine Learning, 51:51-71, 2003.
  • 12
    • D. Hush, P. Kelly, C. Scovel, and I. Steinwart. QP algorithms with guaranteed accuracy and run time for support vector machines. Journal of Machine Learning Research, 7:733-769, 2006.
  • 13
    • T. Joachims. Making large-scale SVM learning practical. In B. Schölkopf, C. Burges, and A. Smola, editors, Advances in Kernel Methods - Support Vector Learning, chapter 11, pages 169-184. MIT Press, Cambridge, MA, 1999.
  • 14
    • V. Kecman, T.-M. Huang, and M. Vogt. Iterative single data algorithm for training kernel machines from huge data sets: Theory and performance. In L. Wang, editor, Support Vector Machines: Theory and Applications, pages 255-274. Springer Verlag, 2005.
  • 15
    • S. Keerthi, V. Sindhwani, and O. Chapelle. An efficient method for gradient-based adaptation of hyperparameters in SVM models. In Advances in Neural Information Processing Systems 19, pages 673-680. MIT Press, Cambridge, MA, 2007.
  • 16
    • S. S. Keerthi, S. K. Shevade, C. Bhattacharyya, and K. R. K. Murthy. Improvements to Platt's SMO algorithm for SVM classifier design. Neural Computation, 13(3):637-649, 2001. DOI: 10.1162/089976601300014493
  • 17
    • C.-J. Lin. On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks, 12:1288-1298, 2001.
  • 18
    • C.-J. Lin. Asymptotic convergence of an SMO algorithm without any assumptions. IEEE Transactions on Neural Networks, 13:248-250, 2002a.
  • 19
    • C.-J. Lin. A formal analysis of stopping criteria of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 13(5):1045-1052, 2002b.
  • 20
    • N. List and H.-U. Simon. A general convergence theorem for the decomposition method. In Proceedings of the 17th Annual Conference on Learning Theory, Lecture Notes in Computer Science 3120, pages 363-377. Springer, Heidelberg, 2004.
  • 21
    • N. List and H.-U. Simon. General polynomial time decomposition algorithms. In S. Ben-David, J. Case, and A. Maruoka, editors, Proceedings of the 18th Annual Conference on Learning Theory, COLT 2005, pages 308-322. Springer, Heidelberg, 2005.
  • 22
    • N. List and H.-U. Simon. General polynomial time decomposition algorithms. Journal of Machine Learning Research, 8:303-321, 2007.
  • 24
    • Z.-Q. Luo and P. Tseng. On the convergence of the coordinate descent method for convex differentiable minimization. Journal of Optimization Theory and Applications, 72:7-35, 1992.
  • 26
    • I. Steinwart. Sparseness of support vector machines. Journal of Machine Learning Research, 4:1071-1105, 2003.
  • 28
    • I. Steinwart, D. Hush, and C. Scovel. An oracle inequality for clipped regularized risk minimizers. In B. Schölkopf, J. Platt, and T. Hoffman, editors, Advances in Neural Information Processing Systems 19, pages 1321-1328. MIT Press, Cambridge, MA, 2007.
  • 29
    • M. Vogt. SMO algorithms for support vector machines without bias. Technical report, University of Darmstadt, 2002. http://www.rtm.tu-darmstadt.de/ehemalige-mitarbeiter/~vogt/docs/vogt-2002-smowob.pdf


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.