Pattern Recognition Letters, Volume 34, Issue 14, 2013, Pages 1630-1635

Feature interaction maximisation

Author keywords

Classification; Dimensionality reduction; Feature selection; Interaction information; Mutual information; Subset feature selection
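
The author keywords above centre on interaction information and mutual information for subset feature selection. As a rough illustration only, and not the algorithm proposed in this article, the sketch below estimates the three-way interaction information between two discrete features X, Y and a class label C under the common convention I(X;Y;C) = I(X,Y;C) - I(X;C) - I(Y;C), where a positive value indicates synergy (complementary features) and a negative value indicates redundancy. The function name interaction_information and the choice of scikit-learn's mutual_info_score are assumptions made for this sketch.

    # Minimal sketch (assumption: a generic illustration, not this article's method)
    # of estimating interaction information for discrete data.
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def interaction_information(x, y, c):
        """I(X;Y;C) = I(X,Y;C) - I(X;C) - I(Y;C), in nats.

        Positive -> the two features are synergistic (complementary) about C;
        negative -> they are redundant.
        """
        x, y, c = (np.asarray(a) for a in (x, y, c))
        xy = x * (int(y.max()) + 1) + y   # encode the joint variable (X, Y) as one discrete variable
        return (mutual_info_score(xy, c)
                - mutual_info_score(x, c)
                - mutual_info_score(y, c))

    # Toy check: c = x XOR y. Each feature alone carries no information about c,
    # but together they determine it, so the interaction information is positive.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 10000)
    y = rng.integers(0, 2, 10000)
    c = x ^ y
    print(interaction_information(x, y, c))   # roughly ln(2) = 0.69 nats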

Indexed keywords

FEATURE EXTRACTION; NEAREST NEIGHBOR SEARCH; REDUNDANCY;

EID: 84885659095    PISSN: 0167-8655    EISSN: None    Source Type: Journal
DOI: 10.1016/j.patrec.2013.04.002    Document Type: Article
Times cited: 32

References (22)
  • 1. Battiti, R., 1994. Using mutual information for selecting features in supervised neural net learning. IEEE Trans. Neural Networks 5, 537-550.
  • 2. Brown, G., Pocock, A., Zhao, M., Lujan, M., 2012. Conditional likelihood maximisation: a unifying framework for information theoretic feature selection. J. Mach. Learn. Res. 13, 27-66.
  • 3. Cheng, H., Qin, Z., Feng, C., Wang, Y., Li, F., 2011. Conditional mutual information-based feature selection analysing for synergy and redundancy. ETRI J. 33, 210-218.
  • 8. Frank, A., Asuncion, A., 2010. UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.
  • 10. Guyon, I., Elisseeff, A., 2003. An introduction to variable and feature selection. J. Mach. Learn. Res. 3, 1157-1182.
  • 11. Jakulin, A., 2003. Attribute Interactions in Machine Learning. Master's thesis, Computer and Information Science, University of Ljubljana; Jakulin, A., 2005. Machine Learning Based on Attribute Interactions. PhD thesis, Computer and Information Science, University of Ljubljana.
  • 12. Karegowda, A., Jayaram, A.M., Manjunath, A., 2010. Feature subset selection problem using wrapper approach in supervised learning. Int. J. Comput. Appl. 1, 13-17.
  • 13. Kira, K., Rendell, L., 1992. A practical approach to feature selection. In: ML92: Proceedings of the Ninth International Workshop on Machine Learning, pp. 249-256.
  • 14. Kwak, N., Choi, C.-H., 2002. Input feature selection for classification problems. IEEE Trans. Neural Networks 13, 143-159.
  • 15. Lee, J., Kim, D., 2013. Feature selection for multi-label classification using multivariate mutual information. Pattern Recogn. Lett. 34, 349-357.
  • 16. Lin, T., Li, H., Tsai, K.-C., 2004. Implementing the Fisher's discriminant ratio in a k-means clustering algorithm for feature selection and data set trimming. J. Chem. Inf. Comput. Sci. 44, 76-87.
  • 18. Liu, H., Sun, J., Liu, L., Zhang, H., 2009. Feature selection with dynamic mutual information. Pattern Recogn. 42, 1330-1339.
  • 20. Meyer, P., Schretter, C., Bontempi, G., 2008. Information-theoretic feature selection in microarray data using variable complementarity. IEEE J. Sel. Top. Signal Process. 2, 261-274.
  • 21. Peng, H., Long, F., Ding, C., 2005. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27, 1226-1238.
  • 22. Rodgers, J., Nicewander, W., 1988. Thirteen ways to look at the correlation coefficient. Am. Stat. 42, 59-66.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.