Volume , Issue , 1994, Pages 121-129

Irrelevant Features and the Subset Selection Problem

Author keywords

[No Author keywords available]

Indexed keywords

SET THEORY;

EID: 85099325734     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1016/B978-1-55860-335-6.50023-4     Document Type: Conference Paper
Times cited: 1797

References (44)
  • 2. Ben-Bassat, M. 1982. Use of distance measures, information measures and error bounds in feature evaluation. In Krishnaiah, P. R., and Kanal, L. N., eds., Handbook of Statistics, volume 2. North-Holland Publishing Company. 773-791.
  • 3. Blum, A. L., and Rivest, R. L. 1992. Training a 3-node neural network is NP-complete. Neural Networks 5:117-127.
  • 9. Cohen, W. W. 1993. Efficient pruning methods for separate-and-conquer rule learning systems. In 13th International Joint Conference on Artificial Intelligence, 988-994. Morgan Kaufmann.
  • 14. Holte, R. C. 1993. Very simple classification rules perform well on most commonly used datasets. Machine Learning 11:63-90.
  • 15. Kira, K., and Rendell, L. A. 1992a. The feature selection problem: Traditional methods and a new algorithm. In Tenth National Conference on Artificial Intelligence, 129-134. MIT Press.
  • 20. Littlestone, N. 1988. Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning 2:285-318.
  • 26. Moret, B. M. E. 1982. Decision trees and diagrams. ACM Computing Surveys 14(4):593-623.
  • 27. Mucciardi, A. N., and Gose, E. E. 1971. A comparison of seven techniques for choosing subsets of pattern recognition properties. IEEE Transactions on Computers C-20(9):1023-1031.
  • 29. Narendra, M. P., and Fukunaga, K. 1977. A branch and bound algorithm for feature subset selection. IEEE Transactions on Computers C-26(9):917-922.
  • 31. Pagallo, G., and Haussler, D. 1990. Boolean feature discovery in empirical learning. Machine Learning 5:71-99.
  • 32. Quinlan, J. R. 1986. Induction of decision trees. Machine Learning 1:81-106. Reprinted in Shavlik and Dietterich (eds.), Readings in Machine Learning.
  • 34. Rissanen, J. 1986. Stochastic complexity and modeling. Annals of Statistics 14:1080-1100.
  • 37. Schlimmer, J. C. 1993. Efficiently inducing determinations: A complete and systematic search algorithm that uses optimal pruning. In Proceedings of the Tenth International Conference on Machine Learning, 284-290. Morgan Kaufmann.
  • 39. Skalak, D. B. 1994. Prototype and feature selection by sampling and random mutation hill climbing algorithms. In Cohen, W. W., and Hirsh, H., eds., Machine Learning: Proceedings of the Eleventh International Conference. Morgan Kaufmann.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.