Entropy, Volume 12, Issue 10, 2010, Pages 2144-2170

Increasing and decreasing returns and losses in mutual information feature subset selection

Author keywords

Bayesian networks; Bit parity; Conditional entropy; Conditional mutual information; Decreasing losses; Decreasing returns; Feature subset selection; Increasing losses; Increasing returns

EID: 77958167275     PISSN: None     EISSN: 1099-4300     Source Type: Journal
DOI: 10.3390/e12102144     Document Type: Article
Times cited: 7

References (35)
  • 2. Van Dijck, G.; Van Vaerenbergh, J.; Van Hulle, M.M. Posterior probability profiles for the automated assessment of the recovery of patients with stroke from activity of daily living tasks. Artif. Intell. Med. 2009, 46, 233-249.
  • 4. Lewis II, P.M. The characteristic selection problem in recognition systems. IEEE Trans. Inf. Theory 1962, 8, 171-178.
  • 5. Guyon, I.; Elisseeff, A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3, 1157-1182.
  • 6. Wang, G.; Lochovsky, F.H.; Yang, Q. Feature selection with conditional mutual information maximin in text categorization. In Proceedings of the 13th ACM International Conference on Information and Knowledge Management (CIKM'04); Evans, D.A.; Gravano, L.; Herzog, O.; Zhai, C.; Ronthaler, M., Eds.; ACM Press: New York, NY, USA, 2004; pp. 342-349.
  • 8. Huang, D.; Chow, T.W.S.; Wa, E.W.M.; Li, J. Efficient selection of discriminative genes from microarray gene expression data for cancer diagnosis. IEEE Trans. Circuits Syst. I-Regul. Pap. 2005, 52, 1909-1918.
  • 9. Kamentsky, L.A.; Liu, C.N. Computer-automated design of multifont print recognition logic. IBM J. Res. Dev. 1963, 7, 2-13.
  • 10. Liu, C.N. A programmed algorithm for designing multifont character recognition logics. IEEE Trans. Electron. 1964, EC-13, 586-593.
  • 11. Battiti, R. Using mutual information for selecting features in supervised neural net learning. IEEE Trans. Neural Netw. 1994, 5, 537-550.
  • 13. Liu, D.; Chang, T.; Zhang, Y. A constructive algorithm for feedforward neural networks with incremental training. IEEE Trans. Circuits Syst. I-Regul. Pap. 2002, 49, 1876-1879.
  • 14. McGill, W.J. Multivariate information transmission. IEEE Trans. Inf. Theory 1954, 4, 93-111.
  • 15. Matsuda, H. Physical nature of higher-order mutual information: Intrinsic correlations and frustration. Phys. Rev. E 2000, 62, 3096-3102.
  • 16. Shiono, S.; Yamada, S.; Nakashima, M.; Matsumoto, K. Information theoretic analysis of connection structure from spike trains. In Advances in Neural Information Processing Systems 5; Hanson, S.J.; Cowan, J.D.; Giles, C.L., Eds.; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 1993; pp. 515-522.
  • 21. Van Dijck, G.; Van Hulle, M.M. Speeding up feature subset selection through mutual information relevance filtering. In Knowledge Discovery in Databases: PKDD 2007; Kok, J.; Koronacki, J.; Lopez de Mantaras, R.; Matwin, S.; Mladenic, D.; Skowron, A., Eds.; Lecture Notes in Computer Science, Vol. 4702; Springer: Berlin/Heidelberg, Germany, 2007; pp. 277-287.
  • 23. Kwak, N.; Choi, C.H. Input feature selection for classification problems. IEEE Trans. Neural Netw. 2002, 13, 143-159.
  • 24. Peng, H.; Long, F.; Ding, C. Feature selection based on mutual information: Criteria of max-dependency, max-relevance and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226-1238.
  • 26. Fleuret, F. Fast binary feature selection with conditional mutual information. J. Mach. Learn. Res. 2004, 5, 1531-1555.
  • 27. Kwak, N.; Choi, C.H. Input feature selection by mutual information based on Parzen window. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 1667-1671.
  • 28. Bonev, B.; Escolano, F.; Cazorla, M. Feature selection, mutual information, and the classification of high-dimensional patterns: Applications to image classification and microarray data analysis. Pattern Anal. Appl. 2008, 11, 309-319.
  • 29. François, D.; Rossi, F.; Wertz, V.; Verleysen, M. Resampling methods for parameter-free and robust feature selection with mutual information. Neurocomputing 2007, 70, 1276-1288.
  • 30. Hellman, M.E.; Raviv, J. Probability of error, equivocation, and the Chernoff bound. IEEE Trans. Inf. Theory 1970, IT-16, 368-372.
  • 31. Kovalevsky, V.A. The problem of character recognition from the point of view of mathematical statistics. In Character Readers and Pattern Recognition; Kovalevsky, V.A., Ed.; Spartan: New York, NY, USA, 1968.
  • 33. Feder, M.; Merhav, N. Relations between entropy and error probability. IEEE Trans. Inf. Theory 1994, 40, 259-266.
  • 34. Golić, J.D. Comment on "Relations between entropy and error probability". IEEE Trans. Inf. Theory 1999, 45, 372-372.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.