Volume 112, 2013, Pages 64-78

Theoretical and empirical study on the potential inadequacy of mutual information for feature selection in classification

Author keywords

Classification; Feature selection; Hellman Raviv and Fano bounds; Mutual information; Probability of misclassification

Indexed keywords

CLASSIFICATION ACCURACY; EMPIRICAL STUDIES; FEATURE SELECTIONS; HELLMAN-RAVIV AND FANO BOUNDS; MISCLASSIFICATION PROBABILITY; MUTUAL INFORMATIONS; PERFORMANCE CRITERION; PROBABILITY OF MISCLASSIFICATION

EID: 84877634882     PISSN: 0925-2312     EISSN: 1872-8286     Source Type: Journal
DOI: 10.1016/j.neucom.2012.12.051     Document Type: Article
Times cited: 28
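
The record's keywords center on mutual-information (MI) based feature selection and its relation to the probability of misclassification. As a point of reference only, the sketch below illustrates the generic univariate MI ranking criterion that work of this kind evaluates: score each feature by its estimated MI with the class label, keep the top-k, and measure downstream classification accuracy. The dataset, the value of k, and the logistic-regression classifier are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch of univariate mutual-information feature ranking.
# Assumptions (not from the paper): scikit-learn's breast-cancer data,
# k = 10 selected features, logistic regression as the downstream model.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Estimate I(X_i; Y) for each feature with a k-NN based MI estimator.
mi_scores = mutual_info_classif(X, y, n_neighbors=3, random_state=0)

# Keep the k features with the highest estimated mutual information.
k = 10
top_k = np.argsort(mi_scores)[::-1][:k]

# Evaluate the reduced feature set; the article's theme is that a high MI
# score does not by itself guarantee a low probability of misclassification.
acc = cross_val_score(LogisticRegression(max_iter=5000), X[:, top_k], y, cv=5).mean()
print(f"5-fold accuracy with {k} MI-selected features: {acc:.3f}")
```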

References (18)
  • 3
    • Guyon I., Elisseeff A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3:1157-1182.
  • 4
    • Battiti R. Using mutual information for selecting features in supervised neural net learning. IEEE Trans. Neural Networks 1994, 5:537-550.
  • 6
    • Peng H., Long F., Ding C. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27:1226-1238.
  • 7
    • Fleuret F. Fast binary feature selection with conditional mutual information. J. Mach. Learn. Res. 2004, 5:1531-1555.
  • 8
    • Rossi F., Lendasse A., Francois D., Wertz V., Verleysen M. Mutual information for the selection of relevant variables in spectrometric nonlinear modelling. Chemom. Intell. Lab. Syst. 2006, 80:215-226.
  • 9
  • 10
    • Shannon C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27:379-423, 623-656.
  • 13
    • Hellman M.E., Raviv J. Probability of error, equivocation and the Chernoff bound. IEEE Trans. Inf. Theory 1970, 16:368-372.
  • 14
    • Brown G. An information theoretic perspective on multiple classifier systems. In: Proceedings of MCS 2009, Springer-Verlag, Berlin, Heidelberg, 2009, pp. 344-353.
  • 15
    • Ozertem U., Erdogmus D., Jenssen R. Spectral feature projections that maximize Shannon mutual information with class labels. Pattern Recognition 2006, 39:1241-1252.
  • 16
    • Francois D., Rossi F., Wertz V., Verleysen M. Resampling methods for parameter-free and robust feature selection with mutual information. Neurocomputing 2007, 70(7-9):1276-1288.
  • 17
    • D.N.A. Asuncion, UCI machine learning repository, 2007. URL: http://www.ics.uci.edu/~mlearn/MLRepository.html.
  • 18
    • Kozachenko L.F., Leonenko N. Sample estimate of the entropy of a random vector. Probl. Inf. Transm. 1987, 23:95-101.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.