Expert Systems with Applications, Volume 36, Issue 7, 2009, Pages 10570-10582

Boosting k-nearest neighbor classifier by means of input space projection

Author keywords

Boosting; k Nearest neighbors; Subspace methods

Indexed keywords

BOOSTING; BOOSTING METHODS; CLASS LABELS; CLASSIFICATION ALGORITHMS; COMPLEX METHODS; ENSEMBLE METHODS; GENERALIZATION CAPABILITIES; GENERALIZATION ERRORS; INPUT SELECTIONS; INPUT SPACES; K-NEAREST NEIGHBOR CLASSIFIERS; K-NEAREST NEIGHBORS; K-NEAREST NEIGHBORS CLASSIFIERS; K-NN CLASSIFIERS; NOISE TOLERANCES; RANDOM SUBSPACE METHODS; RE SAMPLINGS; STANDARD METHODS; SUBSPACE METHODS; TEST SETS; UCI MACHINE LEARNING REPOSITORIES

EID: 67349220171     PISSN: 09574174     EISSN: None     Source Type: Journal    
DOI: 10.1016/j.eswa.2009.02.065     Document Type: Article
Times cited: 87

References (42)
  • 1 Altinçay, H. (2007). Ensembling evidential k-nearest neighbor classifiers through multi-modal perturbation. Applied Soft Computing, 7, 1072-1083.
  • 2 Amores, J., Sebe, N., & Radeva, P. (2006). Boosting the distance estimation: Application to the k-nearest neighbor classifier. Pattern Recognition Letters, 27, 201-209.
  • 4 Bao, Y., Ishii, N., & Du, X. (2004). Combining multiple k-nearest neighbor classifiers using different distance functions. In Proceedings of the fifth international conference on intelligent data engineering and automated learning, Lecture Notes in Computer Science (Vol. 3177, pp. 634-641). Exeter, UK: Springer.
  • 5 Bauer, E., & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36(1-2), 105-142.
  • 6 Bay, S.D. (1999). Nearest neighbor classification from multiple feature subsets. Intelligent Data Analysis, 3(3), 191-209.
  • 7 Breiman, L. (1996). Bagging predictors. Machine Learning, 24(2), 123-140.
  • 8 Breiman, L. (1996). Bias, variance, and arcing classifiers. Technical Report 460, Department of Statistics, University of California, Berkeley, CA.
  • 9 Breiman, L. (1996). Stacked regressions. Machine Learning, 24(1), 49-64.
  • 10
  • 11 Demšar, J. (2006). Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research, 7, 1-30.
  • 12 Dietterich, T.G. (1998). Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation, 10(7), 1895-1923.
  • 13 Dietterich, T.G. (2000). An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40, 139-157.
  • 17 François, J., Grandvalet, Y., Denœux, T., & Roger, J.M. (2003). Resample and combine: An approach to improving uncertainty representation in evidential pattern classification. Information Fusion, 4, 75-85.
  • 19 Friedman, M. (1940). A comparison of alternative tests of significance for the problem of m rankings. Annals of Mathematical Statistics, 11, 86-92.
  • 31 Kong, E.B., & Dietterich, T.G. (1995). Error-correcting output coding corrects bias and variance. In A. Prieditis & J.F. Lemmer (Eds.), Machine Learning: Proceedings of the Twelfth International Conference (pp. 275-283). Elsevier Science Publishers.
  • 33 Lerner, B., Guterman, H., Aladjem, M., & Dinstein, I. (1999). A comparative study of neural networks based feature extraction paradigms. Pattern Recognition Letters, 20(1), 7-14.
  • 36 Todorovski, L., & Dzeroski, S. (2003). Combining classifiers with meta decision trees. Machine Learning, 50, 223-249.
  • 38 Viswanath, P., Murty, M.N., & Bhatnagar, S. (2004). Fusion of multiple approximate nearest neighbor classifiers for fast and efficient classification. Information Fusion, 5, 239-250.
  • 39 Webb, G.I. (2000). MultiBoosting: A technique for combining boosting and wagging. Machine Learning, 40(2), 159-196.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.