Volume 2, Issue 3, 2008, Pages 261-274

Information-theoretic feature selection in microarray data using variable complementarity

Author keywords

Information theoretic feature selection; Variable complementarity; Variable interaction

Indexed keywords

FEATURE EXTRACTION; INFORMATION THEORY; MICROARRAYS; WAVE FILTERS

EID: 47049102021     PISSN: 19324553     EISSN: None     Source Type: Journal    
DOI: 10.1109/JSTSP.2008.923858     Document Type: Article
Times cited: 260
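
Illustrative sketch (not from the indexed article): the record above carries only bibliographic metadata, so the Python snippet below is a hypothetical illustration of the topic named in the title and author keywords. It sketches a greedy forward filter that scores each candidate feature by its mutual information with the class plus an interaction-information term, which is negative for redundant feature pairs and positive for complementary (synergistic) ones, in the general spirit of the mutual-information criteria cited in the reference list (Battiti; Fleuret; Peng, Long, and Ding; Meyer and Bontempi). All function names and the toy data are invented for illustration.

# Illustrative sketch only -- not the code or exact criterion of the indexed article.
import numpy as np
from collections import Counter

def entropy(*columns):
    """Joint Shannon entropy (in bits) of one or more discrete columns."""
    joint = list(zip(*columns))
    counts = np.array(list(Counter(joint).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(x, y)

def interaction_info(x, z, y):
    """Interaction information I(X;Z;Y) = I(X,Z;Y) - I(X;Y) - I(Z;Y).
    Negative for redundant pairs, positive for complementary (synergistic) pairs."""
    xz = list(zip(x, z))
    return mutual_info(xz, y) - mutual_info(x, y) - mutual_info(z, y)

def select_features(X, y, k):
    """Greedy forward selection: each step picks the feature maximizing its
    relevance I(X_i;Y) plus the best interaction-information term it forms
    with any already-selected feature."""
    selected, remaining = [], set(range(X.shape[1]))
    while remaining and len(selected) < k:
        def score(i):
            s = mutual_info(X[:, i], y)
            if selected:
                s += max(interaction_info(X[:, i], X[:, j], y) for j in selected)
            return s
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 5))
    X[:, 2] = X[:, 0]                 # feature 2 is a redundant copy of feature 0
    y = 2 * X[:, 0] + X[:, 1]         # the class depends jointly on features 0 and 1
    # The negative interaction term for redundant pairs keeps the criterion from
    # selecting both copies: the result pairs one of {0, 2} with feature 1.
    print(select_features(X, y, 2))

This toy run illustrates the redundancy side of the interaction-information term; on synergistic (e.g. XOR-like) targets the same term becomes positive, which is the complementarity effect the article's title refers to.
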

References (47)
  • 2. Y. Asahiro, K. Iwama, H. Tamaki, and T. Tokuyama, "Greedily finding a dense subgraph," J. Algorithms, vol. 34, no. 1, pp. 203-221, 2000.
  • 3. R. Battiti, "Using mutual information for selecting features in supervised neural net learning," IEEE Trans. Neural Netw., 1994.
  • 4. D. A. Bell and H. Wang, "A formalism for relevance and its application in feature subset selection," Mach. Learn., vol. 41, no. 2, pp. 175-195, 2000.
  • 5. Y. Benjamini and Y. Hochberg, "Controlling the false discovery rate: A practical and powerful approach to multiple testing," J. Royal Statist. Soc., vol. B57, pp. 289-300, 1995.
  • 6. A. Billionnet and F. Calmels, "Linear programming for the 0-1 quadratic knapsack problem," Eur. J. Oper. Res., vol. 92, pp. 310-325, 1996.
  • 7. A. Blum and P. Langley, "Selection of relevant features and examples in machine learning," Artif. Intell., vol. 97, pp. 245-271, 1997.
  • 8. A. Blum and R. L. Rivest, "Training a 3-node neural network is NP-complete," in Machine Learning: From Theory to Applications, vol. 661, Lecture Notes in Computer Science, 1993.
  • 11. M. Dash and H. Liu, "Feature selection for classification," Intell. Data Anal., 1997.
  • 12. S. Davies and S. Russell, "NP-completeness of searches for smallest possible feature sets," in Proc. AAAI Fall Symp. Relevance, 1994.
  • 14. C. Ding and H. Peng, "Minimum redundancy feature selection from microarray gene expression data," J. Bioinform. Comput. Biol., vol. 3, no. 2, pp. 185-205, 2005.
  • 15. J. Dougherty, R. Kohavi, and M. Sahami, "Supervised and unsupervised discretization of continuous features," in Int. Conf. Machine Learning, 1995, pp. 194-202.
  • 17. F. Fleuret, "Fast binary feature selection with conditional mutual information," J. Mach. Learn. Res., vol. 5, pp. 1531-1555, 2004.
  • 18. I. Guyon and A. Elisseeff, "An introduction to variable and feature selection," J. Mach. Learn. Res., vol. 3, pp. 1157-1182, 2003.
  • 19. A. Jain and D. Zongker, "Feature selection: Evaluation, application, and small sample performance," IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, 1997.
  • 21. R. Kohavi and G. H. John, "Wrappers for feature subset selection," Artif. Intell., vol. 97, no. 1-2, pp. 273-324, 1997.
  • 22. I. Kojadinovic, "Relevance measures for subset variable selection in regression problems based on k-additive mutual information," Comput. Statist. Data Anal., vol. 49, 2005.
  • 24. I. Kononenko, "Estimating attributes: Analysis and extensions of RELIEF," in Eur. Conf. Machine Learning, 1994, pp. 171-182.
  • 26. W. J. McGill, "Multivariate information transmission," Psychometrika, vol. 19, 1954.
  • 27. P. Merz and B. Freisleben, "Greedy and local search heuristics for unconstrained binary quadratic programming," J. Heuristics, vol. 8, no. 2, 2002.
  • 28. P. E. Meyer and G. Bontempi, "On the use of variable complementarity for feature selection in cancer classification," in Applications of Evolutionary Computing: EvoWorkshops, F. Rothlauf et al., Eds., vol. 3907, Lecture Notes in Computer Science, 2006, pp. 91-102.
  • 31. L. Paninski, "Estimation of entropy and mutual information," Neural Comput., vol. 15, no. 6, pp. 1191-1253, 2003.
  • 33. H. Peng, F. Long, and C. Ding, "Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy," IEEE Trans. Pattern Anal. Mach. Intell., vol. 27, no. 8, pp. 1226-1238, 2005.
  • 34. D. Pisinger, "Upper bounds and exact algorithms for dispersion problems," Comput. OR, vol. 33, pp. 1380-1398, 2006.
  • 39. G. D. Tourassi, E. D. Frederick, M. K. Markey, and C. E. Floyd, Jr., "Application of the mutual information criterion for feature selection in computer-aided diagnosis," Med. Phys., vol. 28, no. 12, pp. 2394-2402, 2001.
  • 41. I. Tsamardinos and C. Aliferis, "Towards principled feature selection: Relevancy, filters, and wrappers," in Artif. Intell. Statist., 2003.
  • 44. W. Wienholt and B. Sendhoff, "How to determine the redundancy of noisy chaotic time series," Int. J. Bifurc. Chaos, vol. 6, no. 1, pp. 101-117, 1996.
  • 46. L. Yu and H. Liu, "Efficient feature selection via analysis of relevance and redundancy," J. Mach. Learn. Res., vol. 5, pp. 1205-1224, 2004.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.