Volume 10, 2009, Pages 1341-1366

Feature selection with ensembles, artificial variables, and redundancy elimination

Author keywords

Importance; Masking; Resampling; Residuals; Trees


EID: 68949154557    PISSN: 1532-4435    EISSN: 1533-7928    Source Type: Journal
DOI: None    Document Type: Article
Times cited: 266
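
The title and author keywords describe scoring feature importance with a tree ensemble, comparing each real feature against randomly permuted "artificial" contrast variables, and then eliminating redundant survivors. Below is a minimal sketch of that artificial-contrast step, not the authors' exact algorithm: the ensemble choice (scikit-learn's RandomForestClassifier), the replicate count, and the 95th-percentile threshold are illustrative assumptions, and the paper's masking-based redundancy elimination is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def select_with_artificial_contrasts(X, y, n_replicates=20, quantile=0.95, seed=0):
    """Hypothetical helper: keep features whose ensemble importance exceeds
    the importance achieved by permuted, target-independent contrast variables."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    real_imp = np.zeros((n_replicates, n_features))
    null_max = np.zeros(n_replicates)
    for r in range(n_replicates):
        # Artificial variables: each column permuted independently, so they
        # keep each feature's marginal distribution but lose any relation to
        # the target y -- their importances form a noise baseline.
        X_art = rng.permuted(X, axis=0)
        X_aug = np.hstack([X, X_art])
        forest = RandomForestClassifier(n_estimators=200, random_state=seed + r)
        forest.fit(X_aug, y)
        imp = forest.feature_importances_
        real_imp[r] = imp[:n_features]
        null_max[r] = imp[n_features:].max()  # best-scoring artificial variable
    # A real feature survives if its median importance over replicates beats
    # the chosen quantile of the artificial variables' per-replicate maxima.
    threshold = np.quantile(null_max, quantile)
    return np.where(np.median(real_imp, axis=0) > threshold)[0]
```

Resampling over several replicates (one of the author keywords) turns the comparison into a statistical test rather than a single noisy ranking; the paper's full method additionally handles masking among correlated features, which this sketch does not attempt.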

References (43)
  • 1
    • H. Almuallim and T. G. Dietterich. Learning Boolean concepts in the presence of many irrelevant features. Artificial Intelligence, 69(1-2):279-305, 1994.
  • 2
    • Y. Amit and D. Geman. Shape quantization and recognition with randomized trees. Neural Computation, 9(7):1545-1588, 1997.
  • 3
    • E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning, 36(1-2):525-536, 1999.
  • 4
    • A. Berrado and G. C. Runger. Using metarules to organize and group discovered association rules. Data Mining and Knowledge Discovery, 14(3):409-431, 2007.
  • 6
    • B. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In D. Haussler, editor, 5th Annual ACM Workshop on COLT, pages 144-152, Pittsburgh, PA, 1992. ACM Press.
  • 8
    • L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996.
  • 9
    • L. Breiman. Arcing classifiers. The Annals of Statistics, 26(3):801-849, 1998.
  • 10
    • L. Breiman. Random forests. Machine Learning, 45(1):5-32, 2001.
  • 12
    • S. Cost and S. Salzberg. A weighted nearest neighbor algorithm for learning with symbolic features. Machine Learning, 10(1):57-78, 1993.
  • 13
    • P. Diaconis and B. Efron. Computer intensive methods in statistics. Scientific American, 248:116-131, 1983.
  • 14
    • T. G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 2000a.
  • 15
    • T. G. Dietterich. Ensemble methods in machine learning. In First International Workshop on Multiple Classifier Systems 2000, Cagliari, Italy, volume 1857 of Lecture Notes in Computer Science, pages 1-15. Springer, 2000b.
  • 18
    • J. Friedman. Greedy function approximation: A gradient boosting machine. Technical report, Dept. of Statistics, Stanford University, 1999.
  • 19
    • J. Friedman, T. Hastie, and R. Tibshirani. Additive logistic regression: A statistical view of boosting. Annals of Statistics, 28(2):337-407, 2000.
  • 20
    • J. H. Friedman, M. Jacobson, and W. Stuetzle. Projection pursuit regression. Journal of the American Statistical Association, 76:817-823, 1981.
  • 21
    • I. Guyon and A. Elisseeff. An introduction to variable and feature selection. Journal of Machine Learning Research, 3:1157-1182, March 2003.
  • 22
    • I. Guyon, J. Weston, S. Barnhill, and V. Vapnik. Gene selection for cancer classification using support vector machines. Machine Learning, 46(1-3):389-422, 2002.
  • 25
    • T. K. Ho. The random subspace method for constructing decision forests. IEEE Trans. on Pattern Analysis and Machine Intelligence, 20(8):832-844, 1998.
  • 26
    • K. Kira and L. A. Rendell. A practical approach to feature selection. In ML92: Proceedings of the Ninth International Workshop on Machine Learning, pages 249-256, San Francisco, CA, USA, 1992. Morgan Kaufmann Publishers Inc. ISBN 1-55860-247-X.
  • 29
    • H. Liu and L. Yu. Toward integrating feature selection algorithms for classification and clustering. IEEE Trans. Knowledge and Data Eng., 17(4):491-502, 2005.
  • 30
    • S. Mukherjee, P. Niyogi, T. Poggio, and R. Rifkin. Learning theory: Stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization. Advances in Computational Mathematics, 25:161-193, 2006.
  • 31
    • B. Parmanto, P. Munro, and H. Doyle. Improving committee diagnosis with resampling techniques. In D. S. Touretzky, M. C. Mozer, and M. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 882-888. MIT Press, Cambridge, MA, 1996.
  • 33
    • T. Poggio, R. Rifkin, S. Mukherjee, and P. Niyogi. General conditions for predictivity in learning theory. Nature, 428:419-422, 2004.
  • 34
    • M. Robnik-Sikonja and I. Kononenko. Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning, 53:23-69, 2003.
  • 35
    • C. Stanfill and D. Waltz. Toward memory-based reasoning. Communications of the ACM, 29:1213-1228, December 1986.
  • 38
    • E. Tuv. Ensemble learning and feature selection. In I. Guyon, S. Gunn, M. Nikravesh, and L. Zadeh, editors, Feature Extraction, Foundations and Applications. Springer, 2006.
  • 40
    • G. Valentini and T. Dietterich. Low bias bagged support vector machines. In ICML 2003, pages 752-759, 2003.
  • 41
    • G. Valentini and F. Masulli. Ensembles of learning machines. In M. Marinaro and R. Tagliaferri, editors, Neural Nets WIRN Vietri-02, Lecture Notes in Computer Science. Springer-Verlag, 2002.
  • 43
    • L. Yu and H. Liu. Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, 5:1205-1224, 2004.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.