Volume 26, Issue 1, 2010, Pages 183-196

Variant methods of reduced set selection for reduced support vector machines

Author keywords

Kernel methods; Kernel width estimation; Nyström approximation; Reduced set; Sampling methods; Support vector machines

Indexed keywords

CLUSTER CENTROIDS; DATA POINTS; GAUSSIAN KERNELS; KERNEL METHODS; LARGE DATASETS; MODEL COMPLEXITY; NEW APPROACHES; RANDOM SELECTION; REDUCED SUPPORT VECTOR MACHINES; SAMPLING METHOD; SYSTEMATIC SAMPLING; TUNING TIME; VARIANT METHOD;

EID: 77249099817     ISSN: 1016-2364     EISSN: None     Source Type: Journal    
DOI: None     Document Type: Conference Paper
Times cited : (25)

References (30)
  • 1
    • A. Asuncion and D. J. Newman, UCI Machine Learning Repository, University of California, School of Information and Computer Science, 2007, http://www.ics.uci.edu/~mlearn/MLRepository.html.
  • 2
    • C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, Vol.2, Issue 2, 1998, pp. 121-167.
  • 4
    • O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee, "Choosing multiple parameters for support vector machines," Machine Learning, Vol.46, 2002, pp. 131-159.
  • 5
    • L. J. Chien and Y. J. Lee, "Clustering model selection for reduced support vector machines," in Proceedings of the 5th Intelligent Data Engineering and Automated Learning (IDEAL 2004), LNCS 3177, 2004, pp. 714-719.
  • 8
    • MIT Center for Biological and Computational Learning, CBCL face database (1), 2000, http://cbcl.mit.edu/projects/cbcl/software-datasets/FaceData2.html.
  • 10
    • S. S. Keerthi, O. Chapelle, and D. DeCoste, "Building support vector machines with reduced classifier complexity," Journal of Machine Learning Research, Vol.7, 2006, pp. 1493-1515.
  • 11
    • S. S. Keerthi and C. J. Lin, "Asymptotic behaviors of support vector machines with Gaussian kernel," Neural Computation, Vol.15, 2003, pp. 1667-1689.
  • 12
    • Y. J. Lee, W. F. Hsieh, and C. M. Huang, "ε-SSVR: A smooth support vector machine for ε-insensitive regression," IEEE Transactions on Knowledge and Data Engineering, Vol.17, Issue 5, 2005, pp. 678-685. DOI 10.1109/TKDE.2005.77.
  • 13
    • Y. J. Lee and S. Y. Huang, "Reduced support vector machines: A statistical theory," IEEE Transactions on Neural Networks, Vol.18, Issue 1, 2007, pp. 1-13. DOI 10.1109/TNN.2006.883722.
  • 17
    • O. L. Mangasarian, "Generalized support vector machines," in A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, (eds.), Advances in Large Margin Classifiers, MIT Press, Cambridge, MA, 2000, pp. 135-146.
  • 18
    • O. L. Mangasarian and D. R. Musicant, "Large scale kernel regression via linear programming," Technical Report 99-102, Data Mining Institute, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 1999.
  • 22
    • C. E. Rasmussen and Z. Ghahramani, "Occam's razor," in T. K. Leen, T. G. Dietterich, and V. Tresp, (eds.), Advances in Neural Information Processing Systems 13, MIT Press, Cambridge, MA, 2001, pp. 294-300.
  • 25
    • A. Smola and B. Schölkopf, "A tutorial on support vector regression," Statistics and Computing, Vol.14, 2004, pp. 199-222.
  • 26
    • C. J. Stone, "An asymptotically optimal window selection rule for kernel density estimates," Annals of Statistics, Vol.12, 1984, pp. 1285-1297.
  • 28
    • C. K. I. Williams and M. Seeger, "Using the Nyström method to speed up kernel machines," in T. K. Leen, T. G. Dietterich, and V. Tresp, (eds.), Advances in Neural Information Processing Systems 13, MIT Press, Cambridge, MA, 2001, pp. 682-688.
  • 30
    • M. Wu, B. Schölkopf, and G. Bakir, "A direct method for building sparse kernel learning algorithms," Journal of Machine Learning Research, Vol.7, 2006, pp. 603-624.


* This record was extracted by KISTI through analysis of Elsevier's SCOPUS database.