Volume 2, 2007, Pages 1043-1047

Using random forests for handwritten digit recognition

Author keywords

[No Author keywords available]

Indexed keywords

CLASSIFICATION (OF INFORMATION); CLASSIFIERS; COMPUTER NETWORKS; DATABASE SYSTEMS; FEATURE EXTRACTION; LEARNING SYSTEMS; PATTERN RECOGNITION;

EID: 51149110955    PISSN: 1520-5363    EISSN: None    Source Type: Conference Proceeding
DOI: 10.1109/ICDAR.2007.4377074    Document Type: Conference Paper
Times cited: 71

References (17)
  • 2. L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996.
  • 3. L. Breiman. Random forests. Machine Learning, 45(1):5-32, 2001.
  • 5. W. Buntine and T. Niblett. A further comparison of splitting rules for decision-tree induction. Machine Learning, 8:75-85, 1992.
  • 7. T. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40:139-157, 1999.
  • 8. Y. Freund and R. Schapire. Experiments with a new boosting algorithm. ICML, 1996.
  • 10. T. Ho. The random subspace method for constructing decision forests. IEEE Trans. on PAMI, 20(8):832-844, 1998.
  • 12. Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278-2324, 1998.
  • 13. W. Liu and A. White. The importance of attribute selection measures in decision tree induction. Machine Learning, 15(1):25-41, 1994.
  • 15. M. Robnik-Sikonja. Improving random forests. ECML, LNAI 3210, Springer, Berlin, pages 359-370, 2004.
  • 16. J. Rodriguez, L. Kuncheva, and C. Alonso. Rotation forest: A new classifier ensemble method. IEEE Trans. on PAMI, 28(10), 2006.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.