Volume 2, Issue 4, 2002, Pages 413-428

Best Choices for Regularization Parameters in Learning Theory: On the Bias-Variance Problem

Author keywords

[No Author keywords available]

Indexed keywords

BEST CHOICE; LEARNING THEORY; REGULARIZATION PARAMETERS

EID: 0036436325     PISSN: 1615-3375     EISSN: 1615-3383     Source Type: Journal
DOI: 10.1007/s102080010030     Document Type: Article
Times cited: 244

References (15)
  • 1. A. R. Barron, Approximation and estimation bounds for artificial neural networks, Machine Learning 14 (1994), 115-133.
  • 4. F. Cucker and S. Smale, On the mathematical foundations of learning, Bull. Amer. Math. Soc. 39 (2002), 1-49.
  • 5. T. Evgeniou, M. Pontil, and T. Poggio, Regularization networks and support vector machines, Adv. in Comput. Math. 13 (2000), 1-50.
  • 6. G. Golub, M. Heath, and G. Wahba, Generalized cross-validation as a method for choosing a good ridge parameter, Technometrics 21 (1979), 215-223.
  • 8. O. V. Lepskii, Asymptotically minimax adaptive estimation. I: Upper bounds, optimally adaptive estimates, Theory Probab. Appl. 36 (1991), 682-697.
  • 9. O. V. Lepskii, Asymptotically minimax adaptive estimation. II: Schemes without optimal adaption, adaptive estimators, Theory Probab. Appl. 37 (1992), 433-448.
  • 11. P. Niyogi and F. Girosi, On the relationship between generalization error, hypothesis complexity and sample complexity for radial basis functions, Neural Comput. 8 (1996), 819-842.
  • 12. P. Niyogi and F. Girosi, Generalization bounds for function approximation from scattered noisy data, Adv. in Comput. Math. 50 (1999), 51-80.
  • 15. D.-X. Zhou, The covering numbers in learning theory, J. Complexity (2001), to appear.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.