Volume 32, Issue 1, 2004, Pages 56-134

Statistical behavior and consistency of classification methods based on convex risk minimization

Author keywords

Boosting; Classification; Consistency; Kernel methods; Large margin methods

EID: 4644257995     PISSN: 0090-5364     EISSN: None     Source Type: Journal
DOI: 10.1214/aos/1079120130     Document Type: Article
Times cited: 730

References (18)
  • 1. Bregman, L. M. (1967). The relaxation method of finding a common point of convex sets and its application to the solution of problems in convex programming. U.S.S.R. Computational Mathematics and Mathematical Physics 7 200-217.
  • 2. Breiman, L. (1998). Arcing classifiers (with discussion). Ann. Statist. 26 801-849.
  • 3. Breiman, L. (1999). Prediction games and arcing algorithms. Neural Computation 11 1493-1517.
  • 4. Breiman, L. (2000). Some infinity theory for predictor ensembles. Technical Report 577, Dept. Statistics, Univ. California, Berkeley.
  • 6. Freund, Y. and Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. System Sci. 55 119-139.
  • 7. Friedman, J., Hastie, T. and Tibshirani, R. (2000). Additive logistic regression: A statistical view of boosting (with discussion). Ann. Statist. 28 337-407.
  • 8. Leshno, M., Lin, Ya. V., Pinkus, A. and Schocken, S. (1993). Multilayer feedforward networks with a non-polynomial activation function can approximate any function. Neural Networks 6 861-867.
  • 9. Lugosi, G. and Vayatis, N. (2004). On the Bayes-risk consistency of regularized boosting methods. Ann. Statist. 32 30-55.
  • 13. Schapire, R. E., Freund, Y., Bartlett, P. and Lee, W. S. (1998). Boosting the margin: A new explanation for the effectiveness of voting methods. Ann. Statist. 26 1651-1686.
  • 14. Schapire, R. E. and Singer, Y. (1999). Improved boosting algorithms using confidence-rated predictions. Machine Learning 37 297-336.
  • 15. Steinwart, I. (2002). Support vector machines are universally consistent. J. Complexity 18 768-791.
  • 18. Zhang, T. (2001). A leave-one-out cross validation bound for kernel methods with applications in learning. In Proc. 14th Annual Conference on Computational Learning Theory 427-443. Springer, New York.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.