Volume 2534, 2002, Pages 340-347

Combining multiple k-nearest neighbor classifiers for text classification by reducts

Author keywords

[No Author keywords available]

Indexed keywords

CLASSIFICATION (OF INFORMATION); DECISION TREES; MOTION COMPENSATION; NEAREST NEIGHBOR SEARCH;

EID: 84871052245     PISSN: 03029743     EISSN: 16113349     Source Type: Book Series    
DOI: 10.1007/3-540-36182-0_34     Document Type: Conference Paper
Times cited: 27
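
The record contains no abstract, but the title and indexed keywords name the technique: an ensemble of k-nearest-neighbor classifiers, each built on a reduced feature subset (a reduct), whose outputs are combined for text classification. Below is a minimal illustrative sketch of that general idea only, not the paper's algorithm: the toy data, the random feature subsets, and the majority-vote rule are assumptions standing in for the rough-set reducts and combination scheme the paper actually uses.

# Minimal sketch (not the authors' exact method): an ensemble of k-NN
# classifiers, each trained on a different feature subset, combined by
# majority vote. The random subsets below are purely illustrative; the
# paper derives its subsets as rough-set reducts.
from collections import Counter

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy document-term-style data: 100 samples, 20 features, 2 classes.
X = rng.random((100, 20))
y = (X[:, :5].sum(axis=1) > 2.5).astype(int)

# Hypothetical "reducts": random feature subsets used only for illustration.
subsets = [rng.choice(20, size=8, replace=False) for _ in range(5)]

# One k-NN classifier per feature subset.
classifiers = [
    KNeighborsClassifier(n_neighbors=3).fit(X[:, s], y) for s in subsets
]

def predict_ensemble(samples):
    """Majority vote over the per-subset k-NN predictions."""
    votes = np.array([
        clf.predict(samples[:, s]) for clf, s in zip(classifiers, subsets)
    ])
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])

print("training accuracy:", (predict_ensemble(X) == y).mean())
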

References (8)
  • 1
    T. Joachims, "Text Classification with Support Vector Machines: Learning with Many Relevant Features", ECML-98, 10th European Conference on Machine Learning, 1998, pp. 170-178.
  • 4
    Y. Yang, "An Evaluation of Statistical Approaches to Text Classification", Journal of Information Retrieval, vol. 1, 1999, pp. 69-90.
  • 6
    S.D. Bay, "Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets", Intelligent Data Analysis, vol. 3, no. 3, 1999, pp. 191-209.
  • 7
    Itqon, S. Kaneko & S. Igarashi, "Combining Multiple k-Nearest Neighbor Classifiers Using Feature Combinations", Journal IECI, vol. 2, no. 3, 2000, pp. 23-319.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.