Volume , Issue , 2006, Pages 5034-5039

Estimating mutual information using Gaussian mixture model for feature ranking and selection

Author keywords

[No Author keywords available]

Indexed keywords

COMPUTATIONAL EFFICIENCY; FEATURE EXTRACTION; PROBABILITY DENSITY FUNCTION;

EID: 40649100303     PISSN: 10987576     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 8

References (21)
  • 4. R. Everson, S. Roberts, "Independent Component Analysis: A Flexible Nonlinearity and Decorrelating Manifold Approach," Neural Computation, vol. 11, no. 8, pp. 1957-1983, 2003.
  • 5. A. Hyvärinen, E. Oja, P. Hoyer, J. Hurri, "Image Feature Extraction by Sparse Coding and Independent Component Analysis," Proceedings of ICPR'98, pp. 1268-1273, 1998.
  • 6. T. Lan, D. Erdogmus, A. Adami, M. Pavel, "Feature Selection by Independent Component Analysis and Mutual Information Maximization in EEG Signal Classification," Proceedings of IJCNN'05, Montreal, Canada, pp. 3011-3016, Aug. 2005.
  • 9. M.E. Hellman, J. Raviv, "Probability of Error, Equivocation and the Chernoff Bound," IEEE Transactions on Information Theory, vol. 16, pp. 368-372, 1970.
  • 10. R. Battiti, "Using Mutual Information for Selecting Features in Supervised Neural Net Training," IEEE Trans. Neural Networks, vol. 5, no. 4, pp. 537-550, July 1994.
  • 11. K. Kira, L. Rendell, "The Feature Selection Problem: Traditional Methods and a New Algorithm," Proceedings of the Tenth National Conference on Artificial Intelligence (AAAI-92), pp. 129-134, Menlo Park, CA, USA, 1992. AAAI Press.
  • 14. T. Lan, D. Erdogmus, A. Adami, M. Pavel, S. Mathan, "Salient EEG Channel Selection in Brain Computer Interfaces by Mutual Information Maximization," Proceedings of EMBC'05, Shanghai, China, Sept. 2005.
  • 16. D. Erdogmus, J.C. Principe, "Lower and Upper Bounds for Misclassification Probability Based on Renyi's Information," Journal of VLSI Signal Processing Systems, vol. 37, no. 2-3, pp. 305-317, 2004.
  • 17. K. Torkkola, "Feature Extraction by Non-Parametric Mutual Information Maximization," Journal of Machine Learning Research, vol. 3, pp. 1415-1438, 2003.
  • 18. R. Battiti, "Using Mutual Information for Selecting Features in Supervised Neural Networks Learning," IEEE Trans. Neural Networks, vol. 5, no. 4, pp. 537-550, 1994.
  • 20. http://www.ics.uci.edu/mlearn/MLRepository.html


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.