



Lecture Notes in Computer Science, Volume 2130, 2001, Pages 49-56

Bagging can stabilize without reducing variance

Author keywords

[No Author keywords available]

Indexed keywords

NEURAL NETWORKS; PERTURBATION TECHNIQUES

EID: 84958949175     PISSN: 0302-9743     EISSN: 1611-3349     Source Type: Book Series
DOI: 10.1007/3-540-44668-0_8     Document Type: Conference Paper
Times cited: 9
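For context on the procedure named in the title: bagging (bootstrap aggregating, introduced by Breiman in reference 1 below) trains a base predictor on bootstrap resamples of the training set and averages the resulting predictions. The following is a minimal Python sketch of that procedure, not code from the paper; the `fit` callable is a hypothetical placeholder for any base learner.

    import numpy as np

    def bagged_predict(fit, X_train, y_train, X_test, n_bags=50, seed=0):
        """Average predictions of `fit` trained on bootstrap resamples.

        Assumed (hypothetical) interface: fit(X, y) returns a callable
        predict(X). X_train, y_train, X_test are NumPy arrays.
        """
        rng = np.random.default_rng(seed)
        n = len(X_train)
        preds = []
        for _ in range(n_bags):
            # Draw n training indices with replacement (a bootstrap resample).
            idx = rng.integers(0, n, size=n)
            model = fit(X_train[idx], y_train[idx])
            preds.append(model(X_test))
        # Aggregate by averaging (majority vote would be the classification analogue).
        return np.mean(preds, axis=0)

Any unstable base learner, such as a decision tree or a small neural network, could stand in for `fit`; the paper's point concerns how this averaging stabilizes such learners.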

References (12)
  • 1. L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996.
  • 2. L. Breiman. Prediction games and arcing algorithms. Technical Report 504, Statistics Department, University of California at Berkeley, 1997.
  • 3. P. Bühlmann and B. Yu. Explaining bagging. Technical Report 92, Seminar für Statistik, ETH, Zürich, 2000.
  • 4. A. N. Burgess. Estimating equivalent kernels for neural networks: A data perturbation approach. In M. C. Mozer, M. I. Jordan, and T. Petsche, editors, Advances in Neural Information Processing Systems 9, pages 382-388. MIT Press, 1997.
  • 5. T. G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting and randomization. Machine Learning, 40(2):1-19, 2000.
  • 6. J. H. Friedman. On bias, variance, 0/1 loss, and the curse of dimensionality. Data Mining and Knowledge Discovery, 1(1):55-77, 1997.
  • 8. Y. Grandvalet. Bagging down-weights leverage points. In S.-I. Amari, C. Lee Giles, M. Gori, and V. Piuri, editors, IJCNN, volume 4, pages 505-510. IEEE, 2000.
  • 11. R. Schapire, Y. Freund, P. Bartlett, and W. S. Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics, 26(5):1651-1686, 1998.
  • 12. R. J. Tibshirani and K. Knight. The covariance inflation criterion for adaptive model selection. Journal of the Royal Statistical Society, B, 61(3):529-546, 1999.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.