Volume 2005, 2005, Pages 69-

Optimal size of a feedforward neural network: How much does it matter?

Author keywords

Hidden neurons; Learning; Neural networks; Occam's Razor

Indexed keywords

COMPUTER NETWORKS; COMPUTER SIMULATION; ERROR ANALYSIS;

EID: 33845330227     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/ICAS-ICNS.2005.72     Document Type: Conference Paper
Times cited : (14)

References (10)
  • 1
    • Y. Abu-Mostafa. The Vapnik-Chervonenkis dimension: Information versus complexity in learning. Neural Computation, 1(3):312-317, 1989.
    • Abu-Mostafa, Y.
  • 3
    • H. Akaike. Information theory and an extension of the maximum likelihood principle. In B.N. Petrov and F. Csaki, eds., Proceedings of the 2nd International Symposium on Information Theory, pages 267-281, 1973.
    • Akaike, H.
  • 4
    • P. Bartlett. Vapnik-Chervonenkis dimension bounds for two- and three-layer networks. Neural Computation, 5(3):371-373, 1993.
    • Bartlett, P.
  • 6
    • S. Lawrence, C. Giles, and A. Tsoi. What size neural network gives optimal generalization? Convergence properties of backpropagation. University of Maryland Technical Report UMIACS-TR-96-22, 1996.
    • Lawrence, S.; Giles, C.; Tsoi, A.
  • 8
    • J. Moody. The effective number of parameters: An analysis of generalization and regularization in nonlinear learning systems. In J. Moody, S.J. Hanson, and R.P. Lippmann, eds., Advances in Neural Information Processing Systems, 4:847-854, 1992.
    • Moody, J.

* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.