Volume 28, Issue 2, 2013, Pages 256-275

On-line SVM learning via an incremental primal-dual technique

Author keywords

incremental gradient technique; interior point methods; machine learning; nonlinear programming; primal-dual method; quadratic programming; SVM

Indexed keywords

COMPUTATIONAL RESULTS; CPU TIME; DATA CLASSIFICATION; DECOMPOSITION PROPERTY; GENERALIZATION PERFORMANCE; INCREMENTAL GRADIENT TECHNIQUE; INTERIOR-POINT METHOD; NON-LINEAR OPTIMIZATION PROBLEMS; ON-LINE TRAINING ALGORITHM; PATTERN RECOGNITION PROBLEMS; PREDICTION ACCURACY; PRIMAL-DUAL; PRIMAL-DUAL METHODS; SVM; TRAINING DATA SETS; TRAINING METHODS;

EID: 84870862968     PISSN: 1055-6788     EISSN: 1029-4937     Source Type: Journal
DOI: 10.1080/10556788.2011.633705     Document Type: Article
Times cited: 4

References (31)
  • 1
    • Bertsekas, D. P. 1995. Incremental least squares methods and the extended Kalman filter. Tech. Rep., Department of Electrical Engineering and Computer Science, MIT, Cambridge, MA.
  • 2
    • Bertsekas, D. P. 1996. A new class of incremental gradient methods for least squares problems. Tech. Rep., Department of Electrical Engineering and Computer Science, MIT, Cambridge, MA.
  • 3
    • Bertsekas, D. P. 2010. Incremental gradient, subgradient, and proximal methods for convex optimization: A survey. Tech. Rep. LIDS-2848, Laboratory for Information and Decision Systems, MIT, Cambridge, MA.
  • 4
    • Cauwenberghs, G., and Poggio, T. 2001. Incremental and decremental support vector machine learning. In Advances in Neural Information Processing Systems 13, edited by T. K. Leen, T. G. Dietterich, and V. Tresp, 409-415. Cambridge, MA: MIT Press.
  • 7
    • Couellan, N. P., and Trafalis, T. B. 2011. An incremental primal-dual method for nonlinear programming with special structure. Optim. Lett. doi:10.1007/s11590-011-0393-0
  • 8
    • Davidon, W. C. 1976. New least-square algorithms. J. Optim. Theory Appl. 18 (2): 187-197.
  • 11
    • Dietterich, T. G. 1998. Approximate statistical tests for comparing supervised classification learning algorithms. Neural Comput. 10 (7): 1895-1924.
  • 12
    • El-Bakry, A. S., Tapia, R. A., Tsuchiya, T., and Zhang, Y. 1996. On the formulation and theory of the Newton interior point method for nonlinear programming. J. Optim. Theory Appl. 89 (3): 507-541.
  • 13
    • Frank, A., and Asuncion, A. 2010. UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science. Available at http://archive.ics.uci.edu/ml
  • 14
    • Gentile, C. 2001. A new approximate maximal margin classification algorithm. J. Mach. Learn. Res. 2: 213-242.
  • 17
    • Matlab, The MathWorks, Inc., Natick, MA, 1994-2010. Available at http://www.mathworks.com
  • 18
    • Li, Y., and Long, P. M. 2002. The relaxed online maximum margin algorithm. Mach. Learn. 46 (1): 361-387.
  • 19
    • Lee, Y. J., Mangasarian, O. L., and Wolberg, W. H. 2000. Breast cancer survival and chemotherapy: A support vector machine analysis. Data Mining Institute Tech. Rep. 99-10; DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Vol. 55, 1-10. Providence, RI: American Mathematical Society.
  • 20
    • Fung, G. M., and Mangasarian, O. L. 2006. Breast tumor susceptibility to chemotherapy via support vector machines. Comput. Manage. Sci. 3: 103-112.
  • 21
    • Mangasarian, O. L., and Musicant, D. R. 1999. Successive overrelaxation for support vector machines. IEEE Trans. Neural Netw. 10: 1032-1037.
  • 22
    • Nadeau, C., and Bengio, Y. 2003. Inference for the generalization error. Mach. Learn. 52 (3): 239-281.
  • 24
    • Söderström, T., and Stoica, P. 1989. System Identification. Englewood Cliffs, NJ: Prentice-Hall International (UK).
  • 27
    • Trafalis, T. B., Adrianto, I., and Richman, M. B. 2007. Active learning with support vector machines for tornado prediction. In ICCS 2007, Part I, LNCS 4487, edited by Y. Shi, G. D. van Albada, J. Dongarra, and P. M. A. Sloot, 1130-1137. Berlin: Springer.
  • 29
    • Tseng, P. 1995. Incremental gradient(-projection) method with momentum term and adaptive stepsize rule. Tech. Rep., Department of Mathematics, University of Washington, Seattle, WA.
  • 31
    • Yamashita, H. 1992. A globally convergent primal-dual interior point method for constrained optimization. Tech. Rep., Mathematical System Institute, Inc., Shinjuku, Tokyo, Japan.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.