Volume 33, Issue 2, 2011, Pages 210-218

Conditional mutual information-based feature selection analyzing for synergy and redundancy

Author keywords

Classification; Conditional mutual information; Feature selection; Interaction; Redundancy

Indexed keywords

CLASSIFICATION; CLASSIFICATION TASKS; CONDITIONAL MUTUAL INFORMATION; DISCRIMINATIVE FEATURES; FEATURE REDUNDANCY; FEATURE SELECTION; FEATURE SELECTION ALGORITHM; INTERACTION; INTERACTION INFORMATION; MUTUAL INFORMATIONS; REDUNDANT FEATURES;
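The author and indexed keywords above center on conditional mutual information as a criterion for selecting discriminative, non-redundant features. As a purely illustrative aid (not the specific criterion proposed in the indexed article), the sketch below shows a generic greedy forward search in Python that scores each candidate feature by its empirical conditional mutual information with the class label given the most recently selected feature; the helper names cond_mutual_info, mutual_info, and select_features, and the toy data, are hypothetical.

```python
# Illustrative sketch only: a generic greedy forward search scored by empirical
# conditional mutual information I(X_f; Y | X_last).  Helper names are
# hypothetical; this is NOT the specific criterion of the indexed article.
from collections import Counter
from math import log2

def cond_mutual_info(x, y, z):
    """Empirical I(X; Y | Z) in bits for equal-length discrete sequences."""
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz = Counter(zip(x, z))
    pyz = Counter(zip(y, z))
    pz = Counter(z)
    cmi = 0.0
    for (xv, yv, zv), c in pxyz.items():
        # c * n_z / (n_xz * n_yz) equals p(x,y,z) p(z) / (p(x,z) p(y,z))
        cmi += (c / n) * log2(c * pz[zv] / (pxz[(xv, zv)] * pyz[(yv, zv)]))
    return cmi

def mutual_info(x, y):
    """Empirical I(X; Y): conditioning on a constant reduces CMI to MI."""
    return cond_mutual_info(x, y, [0] * len(x))

def select_features(features, labels, k):
    """Greedily pick k column indices from `features` (list of discrete columns)."""
    remaining = set(range(len(features)))
    selected = []
    while remaining and len(selected) < k:
        if not selected:
            score = lambda f: mutual_info(features[f], labels)
        else:
            last = features[selected[-1]]
            score = lambda f: cond_mutual_info(features[f], labels, last)
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    # Toy data: y = f0 AND f1, and f2 is an exact copy of f0 (redundant).
    f0 = [0, 0, 1, 1, 0, 1, 1, 1]
    f1 = [0, 1, 0, 1, 1, 0, 1, 1]
    f2 = list(f0)
    y = [a & b for a, b in zip(f0, f1)]
    # Expected output: [0, 1] -- the redundant copy f2 scores zero CMI given f0.
    print(select_features([f0, f1, f2], y, 2))
```

In the toy run, the feature that merely duplicates an already selected feature contributes zero conditional mutual information and is skipped, while the complementary feature is kept, which is the redundancy/interaction behavior the keywords refer to.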

EID: 79954439038     PISSN: 1225-6463     EISSN: None     Source Type: Journal
DOI: 10.4218/etrij.11.0110.0237     Document Type: Article
Times cited: 72

References (19)
  • 2. M. Dash and H. Liu, "Feature Selection for Classification," Intelligent Data Analysis, vol. 1, 1997, pp. 131-156.
  • 3. E. Amaldi and V. Kann, "On the Approximability of Minimizing Nonzero Variables or Unsatisfied Relations in Linear Systems," Theoretical Computer Science, vol. 209, no. 1-2, 1998, pp. 237-260.
  • 4. R. Kohavi and G. H. John, "Wrappers for Feature Subset Selection," Artificial Intelligence, vol. 97, no. 1-2, 1997, pp. 273-324.
  • 5. R. Battiti, "Using Mutual Information for Selecting Features in Supervised Neural Net Learning," IEEE Trans. Neural Netw., vol. 5, no. 4, 1994, pp. 537-550.
  • 6. N. Kwak and C.-H. Choi, "Input Feature Selection for Classification Problems," IEEE Trans. Neural Netw., vol. 13, no. 1, 2002, pp. 143-159. doi: 10.1109/72.977291.
  • 7. J. J. Huang et al., "Feature Selection for Classificatory Analysis Based on Information-Theoretic Criteria," Acta Automatica Sinica, vol. 34, no. 3, 2008, pp. 383-392.
  • 8. H. Peng, F. Long, and C. Ding, "Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy," IEEE Trans. Pattern Anal. Mach. Intell., vol. 27, no. 8, 2005, pp. 1226-1238. doi: 10.1109/TPAMI.2005.159.
  • 9. P. A. Estevez et al., "Normalized Mutual Information Feature Selection," IEEE Trans. Neural Netw., vol. 20, no. 2, 2009, pp. 189-201.
  • 10. J. Novovicova, "Conditional Mutual Information Based Feature Selection for Classification Task," Progress in Pattern Recognition, Image Analysis and Applications, LNCS, vol. 4756, Springer, 2007, pp. 417-426.
  • 14. U. M. Fayyad and K. B. Irani, "Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning," Proc. 13th Int. Joint Conf. Artificial Intelligence, 1993, pp. 1022-1027.
  • 15. W. J. McGill, "Multivariate Information Transmission," Psychometrika, vol. 19, no. 2, 1954, pp. 97-116.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.