Neural Computation, Volume 19, Issue 4, 2007, Pages 1082-1096

Recursive finite Newton algorithm for support vector regression in the primal

Author keywords

[No Author keywords available]

Indexed keywords


EID: 34247216124     PISSN: 0899-7667     EISSN: 1530-888X     Source Type: Journal
DOI: 10.1162/neco.2007.19.4.1082     Document Type: Article
Times cited: 36
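
The record above gives only the title and bibliographic data, but the title refers to Newton-type optimization of support vector regression in the primal (compare references 2, 10, and 14 below). As a rough orientation only, and not the paper's actual recursive finite Newton algorithm, the sketch below assumes a kernel expansion f = K·beta and the squared eps-insensitive loss, and runs plain generalized-Newton iterations; the function names (primal_svr_newton, rbf_kernel) and all hyperparameter values are illustrative assumptions, and the line search and recursive active-set strategy implied by the title are omitted.

import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Z (illustrative choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def primal_svr_newton(K, y, lam=1.0, eps=0.1, max_iter=50, tol=1e-8):
    """Sketch of Newton iterations on a primal SVR objective
       J(beta) = lam/2 * beta^T K beta + sum_i max(|K_i beta - y_i| - eps, 0)^2
    (squared eps-insensitive loss). NOT the paper's recursive finite Newton
    method; just a plain generalized-Newton loop for illustration."""
    n = K.shape[0]
    beta = np.zeros(n)
    for _ in range(max_iter):
        f = K @ beta                      # current predictions on training points
        r = f - y                         # residuals
        excess = np.abs(r) - eps
        active = excess > 0               # points outside the eps-tube
        # Gradient of J with respect to beta (K is symmetric).
        g = np.zeros(n)
        g[active] = 2.0 * np.sign(r[active]) * excess[active]
        grad = lam * (K @ beta) + K @ g
        # Generalized Hessian: lam*K + 2 * K * diag(active) * K, plus a tiny ridge
        # so the linear solve stays well posed.
        D = np.zeros(n)
        D[active] = 2.0
        H = lam * K + K @ (D[:, None] * K) + 1e-10 * np.eye(n)
        beta_new = beta - np.linalg.solve(H, grad)
        if np.linalg.norm(beta_new - beta) < tol * (1 + np.linalg.norm(beta)):
            beta = beta_new
            break
        beta = beta_new
    return beta

if __name__ == "__main__":
    # Toy 1-D regression problem, purely for demonstration.
    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(-3, 3, size=(60, 1)), axis=0)
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
    K = rbf_kernel(X, X, gamma=0.5)
    beta = primal_svr_newton(K, y, lam=0.5, eps=0.05)
    pred = K @ beta
    print("RMSE on training data:", float(np.sqrt(np.mean((pred - y) ** 2))))

The generalized Hessian used above follows from the squared eps-insensitive loss being piecewise quadratic: its second derivative is 2 outside the tube and 0 inside it, which is the kind of generalized second-order object discussed in reference 6 below.
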

References (20)
  • 1
    • Burges, C. J. C. (1998). A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2), 121-167.
  • 2
    • Chapelle, O. (2006). Training a support vector machine in the primal (MPI Tech. Rep. No. 147). Tübingen: Max Planck Institute for Biological Cybernetics.
  • 4
    • Fan, R. E., Chen, P. H., & Lin, C. J. (2005). Working set selection using second order information for training support vector machines. Journal of Machine Learning Research, 6, 1889-1918.
  • 5
    • Fung, G., & Mangasarian, O. L. (2003). Finite Newton method for Lagrangian support vector machine classification. Neurocomputing, 55(1-2), 39-55.
  • 6
    • Hiriart-Urruty, J. B., Strodiot, J. J., & Nguyen, V. H. (1984). Generalized Hessian matrix and second-order optimality conditions for problems with C^{1,1} data. Applied Mathematics and Optimization, 11, 43-56.
  • 8
    • Joachims, T. (1999). Making large-scale SVM learning practical. In B. Schölkopf, C. Burges, & A. J. Smola (Eds.), Advances in kernel methods - Support vector learning. Cambridge, MA: MIT Press.
  • 10
    • Keerthi, S. S., & DeCoste, D. M. (2005). A modified finite Newton method for fast solution of large scale linear SVMs. Journal of Machine Learning Research, 6, 341-361.
  • 12
    • Kimeldorf, G. S., & Wahba, G. (1970). A correspondence between Bayesian estimation on stochastic processes and smoothing by splines. Annals of Mathematical Statistics, 41, 495-502.
  • 13
    • Madsen, K., & Nielsen, H. B. (1990). Finite algorithms for robust linear regression. BIT, 30(4), 682-699.
  • 14
    • Mangasarian, O. L. (2002). A finite Newton method for classification. Optimization Methods and Software, 17(5), 913-929.
  • 15
    • Platt, J. (1999). Sequential minimal optimization: A fast algorithm for training support vector machines. In B. Schölkopf, C. J. C. Burges, & A. J. Smola (Eds.), Advances in kernel methods - Support vector learning. Cambridge, MA: MIT Press.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.