Volume 2, 2008, Pages 588-596

On the dangers of cross-validation. An experimental evaluation

Author keywords

[No Author keywords available]

Indexed keywords

ESTIMATION; LEARNING ALGORITHMS; RISK PERCEPTION

EID: 52649131369     PISSN: None     EISSN: None     Source Type: Conference Proceeding
DOI: 10.1137/1.9781611972788.54     Document Type: Conference Paper
Times cited: 183
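
For context on the record's subject (cross-validation as an error-estimation procedure for learning algorithms), the following is a minimal NumPy sketch of k-fold cross-validation with a toy nearest-centroid learner. The data, fold count, and learner are illustrative assumptions only and are not taken from the indexed paper.

import numpy as np

def kfold_cv_error(X, y, fit, predict, k=5, seed=0):
    """Estimate the misclassification error of a learner via k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))                 # shuffle once, then split into k folds
    folds = np.array_split(idx, k)
    fold_errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])           # train on the other k-1 folds
        fold_errors.append(np.mean(predict(model, X[test]) != y[test]))
    return float(np.mean(fold_errors))            # average held-out error over folds

# Toy learner used only for illustration: classify by nearest class centroid.
def fit_centroids(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, X):
    classes = np.array(list(model))
    dists = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return classes[dists.argmin(axis=0)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))                # synthetic data: 200 samples, 10 features
    y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
    print("5-fold CV error estimate:", kfold_cv_error(X, y, fit_centroids, predict_centroids))

A widely noted caveat, in the spirit of the paper's title, is that a cross-validation score that has been minimized over many candidate models or feature subsets is an optimistically biased estimate of the selected model's true error; holding out an untouched outer validation loop (nested cross-validation) is the standard remedy.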

References (28)
  • 1
    • Breiman, L., & Spector, P. (1992). Submodel selection and evaluation in regression: The x-random case. International Statistical Review, 60, 291-319.
  • 2
    • Cawley, G. C., & Talbot, N. L. C. (2003). Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers. citeseer.ist.psu.edu/cawley03efficient.html.
  • 5
    • Fung, G., & Mangasarian, O. L. (2001). Proximal support vector machine classifiers. Proceedings KDD-2001: Knowledge Discovery and Data Mining, August 26-29, 2001, San Francisco, CA (pp. 77-86). New York: Association for Computing Machinery. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/01-02.ps.
  • 6
    • Golub, G. H., & Van Loan, C. F. (1996). Matrix computations. Baltimore: The Johns Hopkins University Press.
  • 9
    • Kearns, M. J., & Ron, D. (1997). Algorithmic stability and sanity-check bounds for leave-one-out cross-validation. Computational Learning Theory (pp. 152-162).
  • 10
    • Kohavi, R. (1995a). A study of cross-validation and bootstrap for accuracy estimation and model selection. IJCAI (pp. 1137-1145).
  • 11
    • Kohavi, R. (1995b). A study of cross-validation and bootstrap for accuracy estimation and model selection. International Joint Conference on Artificial Intelligence (IJCAI) (pp. 1137-1145).
  • 12
    • Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97, 273-324.
  • 17
    • Ng, A. Y. (1998). On feature selection: Learning with exponentially many irrelevant features as training examples. Proc. 15th International Conf. on Machine Learning (pp. 404-412). San Francisco, CA: Morgan Kaufmann.
  • 19
    • Reunanen, J. (2003). Overfitting in making comparisons between variable selection methods. Journal of Machine Learning Research, 3, 1371-1382.
  • 22
    • Stone, M. (1977). Asymptotics for and against cross-validation. Biometrika, 64, 29-35.
  • 23
    • Suykens, J., & Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9, 293-300.
  • 27
    • Xu, J., Zhang, X., & Li, Y. (2001). Kernel MSE algorithm: A unified framework for KFD, LS-SVM and KRR. Proc. of IJCNN-01 (pp. 1486-1491).
  • 28
    • Zhang, T. (2003). Leave-one-out bounds for kernel methods. Neural Computation, 15, 1397-1437.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.