Volume 11, Issue 3, 2013

Hyper-parameter selection for sparse LS-SVM via minimization of its localized generalization error

Author keywords

hyper-parameter selection; least squares support vector machine (LS-SVM); localized generalization error model (L-GEM); sensitivity measure; sparsity

Indexed keywords

HYPER-PARAMETER; LEAST SQUARES SUPPORT VECTOR MACHINES; LOCALIZED GENERALIZATION ERROR MODELS; SENSITIVITY MEASURES; SPARSITY;

EID: 84878923846     PISSN: 02196913     EISSN: None     Source Type: Journal    
DOI: 10.1142/S0219691313500306     Document Type: Article
Times cited: 16
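
The record gives only the title and keywords, so the sketch below is a minimal, illustrative reconstruction of the setting: a standard LS-SVM classifier in the sense of Suykens and Vandewalle (Ref. 1) whose hyper-parameters (regularization gamma, RBF width sigma) are chosen by a plain hold-out grid search. This is not the paper's L-GEM-based sparse selection method; the function names (rbf_kernel, train_lssvm, predict_lssvm) and the grid values are assumptions made for illustration only.

    # Minimal LS-SVM classifier sketch (Suykens & Vandewalle, 1999) with an
    # illustrative hold-out grid search over (gamma, sigma).
    # NOT the paper's L-GEM-based method; all names and grids are hypothetical.
    import numpy as np

    def rbf_kernel(A, B, sigma):
        """Gaussian RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
        d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
        return np.exp(-d2 / (2.0 * sigma**2))

    def train_lssvm(X, y, gamma, sigma):
        """Solve the LS-SVM KKT linear system for the dual weights alpha and bias b."""
        n = X.shape[0]
        Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], np.ones(n)))
        sol = np.linalg.solve(A, rhs)
        return sol[1:], sol[0]  # alpha, b

    def predict_lssvm(X_train, y_train, alpha, b, sigma, X_test):
        """Sign of the LS-SVM decision function on new points."""
        K = rbf_kernel(X_test, X_train, sigma)
        return np.sign(K @ (alpha * y_train) + b)

    if __name__ == "__main__":
        # Toy data and an illustrative hold-out grid search (not L-GEM).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2))
        y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
        X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]
        best = None
        for gamma in (0.1, 1.0, 10.0):
            for sigma in (0.5, 1.0, 2.0):
                alpha, b = train_lssvm(X_tr, y_tr, gamma, sigma)
                acc = np.mean(predict_lssvm(X_tr, y_tr, alpha, b, sigma, X_va) == y_va)
                if best is None or acc > best[0]:
                    best = (acc, gamma, sigma)
        print("best validation accuracy %.3f at gamma=%s, sigma=%s" % best)

According to the title and keywords, the paper instead selects the hyper-parameters by minimizing a localized generalization error (L-GEM) criterion built on a sensitivity measure, and does so for a sparse LS-SVM; the KKT linear-system solve above is only the shared baseline that such a method would tune.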

References (22)
  • 1. J. A. K. Suykens and J. Vandewalle, Least squares support vector machine classifiers, Neural Process. Lett. 9(3) (1999) 293-300.
  • 3. W. Mao, G. Yan, L. Dong and D. Hu, Model selection for least squares support vector regressions based on small-world strategy, Exper. Syst. Appl. 38(4) (2011) 3227-3237.
  • 4. G. C. Cawley, Leave-one-out cross-validation based model selection criteria for weighted LS-SVMs, Intl. Joint Conf. Neural Networ. (2006) 1661-1668.
  • 5. G. C. Cawley and N. L. C. Talbot, Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers, Pattern Recogn. 36(11) (2003) 2585-2592.
  • 7. D. S. Yeung, W. W. Y. Ng, D. Wang, E. C. C. Tsang and X. Z. Wang, Localized generalization error model and its application to architecture selection for radial basis functions neural network, IEEE T. Neural Networ. 18(5) (2007) 1294-1305.
  • 10. S. Gao, Q. Ye and N. Ye, 1-Norm least squares twin support vector machines, Neurocomputing 74 (2011) 3590-3597.
  • 11. J. Gao, P. W. Kwan and D. Shi, Sparse kernel learning with LASSO and Bayesian inference algorithm, Neural Networks 23(2) (2010) 257-264.
  • 12. X. Zeng and X. Chen, SMO-based pruning methods for sparse least squares support vector machines, IEEE T. Neural Networ. 16(6) (2005) 1541-1546.
  • 13. J. Liu, J. Li, W. Xu and Y. Shi, A weighted Lq adaptive least squares support vector machine classifiers - Robust and sparse approximation, Exper. Syst. Appl. 38(3) (2011) 2253-2259.
  • 14. S. Chen, X. Hong and C. J. Harris, An orthogonal forward regression technique for sparse kernel density estimation, Neurocomputing 71(4) (2008) 931-943.
  • 15. B. P. R. Carvalho and A. P. Braga, IP-LSSVM: A two-step sparse classifier, Pattern Recogn. Lett. 30(16) (2009) 1507-1515.
  • 16. L. Wei, Z. Chen and J. Li, Evolution strategies based adaptive Lp LS-SVM, Inform. Sciences 181 (2011) 3000-3016.
  • 17. G. S. Santos, L. G. J. Luvizotto, V. C. Mariani and L. S. Coelho, Least squares support vector machines with tuning based on chaotic differential evolution approach applied to the identification of a thermal process, Exper. Syst. Appl. 39 (2012) 4805-4812.
  • 18. X. C. Guo, J. H. Yang, C. G. Wu, C. Y. Wang and Y. C. Liang, Novel LS-SVMs hyperparameter selection based on particle swarm optimization, Neurocomputing 71 (2008) 3211-3215.
  • 19. G. C. Cawley and N. L. C. Talbot, Preventing over-fitting during model selection via Bayesian regularisation of the hyper-parameters, J. Mach. Learn. Res. 8 (2007) 841-861.
  • 20. T. Pahikkala, H. Suominen and J. Boberg, Efficient cross-validation for kernelized least-squares regression with sparse basis expansions, Mach. Learn. 87 (2012) 381-407.
  • 21. G. C. Cawley and N. L. C. Talbot, Preventing over-fitting during model selection via Bayesian regularisation of the hyper-parameters, J. Mach. Learn. Res. 8 (2007) 841-861.
  • 22. W. W. Y. Ng, D. S. Yeung, M. Firth, E. C. C. Tsang and X. Z. Wang, Feature selection using localized generalization error for supervised classification problems using RBFNN, Pattern Recogn. 41(12) (2008) 3706-3719.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.