Volume 1822, 2000, Pages 90-101

Identifying and eliminating irrelevant instances using information theory

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; ARTIFICIAL INTELLIGENCE; COMPUTATION THEORY; INFORMATION THEORY; TOPOLOGY

EID: 76549111507     PISSN: 0302-9743     EISSN: 1611-3349     Source Type: Journal
DOI: 10.1007/3-540-45486-1_8     Document Type: Article
Times cited: 1

References (23)
  • 1
    • D. Aha, D. Kibler, and M. Albert. Instance-based learning algorithms. Machine Learning, 6:37-66, 1991.
  • 3
    • C. Brodley. Addressing the selective superiority problem: Automatic algorithm/model class selection. In Tenth International Machine Learning Conference (Amherst), pages 17-24, 1993.
  • 5
    • T. Cover and P. Hart. Nearest neighbor pattern classification. IEEE Trans. Inform. Theory, IT-13:21-27, 1967.
  • 8
    • G. Gates. The reduced nearest neighbor rule. IEEE Trans. Inform. Theory, pages 431-433, 1972.
  • 9
    • P. Hart. The condensed nearest neighbor rule. IEEE Trans. Inform. Theory, pages 515-516, 1968.
  • 16
    • D. Skalak. Prototype and feature selection by sampling and random mutation hill climbing algorithms. In 11th International Conference on Machine Learning, pages 293-301, 1994.
  • 17
    • D. Skalak. Prototype selection for composite nearest neighbor classifiers. Technical Report UM-CS-1996-089, 1996.
  • 18
    • R. Sproull. Refinements to nearest-neighbor searching in k-dimensional trees. Algorithmica, 6:579-589, 1991.
  • 20
    • D. Wettschereck and T. Dietterich. An experimental comparison of the nearest neighbor and nearest hyperrectangle algorithms. Machine Learning, 19:5-28, 1995.
  • 22
    • D. Wilson and T. Martinez. Reduction techniques for exemplar-based learning algorithms. Machine Learning, 1998.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.