Volume 38, Issue 7, 2011, Pages 8170-8177

Empirical study of feature selection methods based on individual feature evaluation for classification problems

Author keywords

Classification problems; Data reduction; Feature estimation; Feature evaluation; Feature selection

Indexed keywords

AUTOMATIC FEATURE SELECTION; CLASSIFICATION PROBLEMS; EMPIRICAL STUDIES; FEATURE ESTIMATION; FEATURE EVALUATION; FEATURE SELECTION; FEATURE SELECTION METHODS; LEARNING PROCESS; MODULARIZATIONS; RUNNING TIME; SELECTION CRITERIA; UNDERSTANDABILITY; WIDE SPECTRUM;

EID: 79952437887     PISSN: 0957-4174     EISSN: None     Source Type: Journal
DOI: 10.1016/j.eswa.2010.12.160     Document Type: Article
Times cited: 90

References (33)
  • 1
    • Amjady, N., & Daraeepour, A. (2009). Design of input vector for day-ahead price forecasting of electricity markets. Expert Systems with Applications, 36(10), 12281-12294.
  • 6
    • Dash, M., & Liu, H. (1997). Feature selection for classification. Intelligent Data Analysis, 1(1-4), 131-156.
  • 7
    • Demsar, J. (2006). Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research, 7, 1-30.
  • 16
    • Hornik, K., Stinchcombe, M., & White, H. (1989). Multilayer feedforward networks are universal approximators. Neural Networks, 2(5), 359-366. DOI: 10.1016/0893-6080(89)90020-8
  • 17
    • Jain, A., & Zongker, D. (1997). Feature selection: Evaluation, application, and small sample performance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(2), 153-158.
  • 18
    • Polat, K., & Güneş, S. (2009). A new feature selection method on classification of medical datasets: Kernel F-score feature selection. Expert Systems with Applications, 36(7), 10367-10373.
  • 20
    • Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97(1-2), 273-324. PII: S000437029700043X
  • 21
    • Kononenko, I. (1994). Estimating attributes: Analysis and extensions of RELIEF. In European Conference on Machine Learning (pp. 171-182).
  • 22
    • Langley, P. (1994). Selection of relevant features in machine learning. In Proceedings of the AAAI Fall Symposium on Relevance (pp. 1-5). New Orleans, LA, USA: AAAI Press.
  • 23
    • Liu, H., & Yu, L. (2005). Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering, 17(3).
  • 28
    • Robnik-Sikonja, M., & Kononenko, I. (2003). Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning, 53, 23-69.
  • 29
    • Thangavel, K., & Pethalakshmi, A. (2008). Dimensionality reduction based on rough set theory: A review. Applied Soft Computing, 9(1), 1-12.
  • 32
  • 33
    • Zhang, H. (2004). The optimality of naive Bayes. In FLAIRS Conference.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.