Volume 3512, 2005, Pages 1-8

Role of function complexity and network size in the generalization ability of feedforward networks

Author keywords

[No Author keywords available]

Indexed keywords

BACKPROPAGATION; BOOLEAN ALGEBRA

EID: 25144436369     PISSN: 0302-9743     EISSN: None     Source Type: Conference Proceeding
DOI: 10.1007/11494669_1     Document Type: Conference Paper
Times cited: 13

References (14)
  • 2. Baum, E.B. & Haussler, D. (1989). What size net gives valid generalization? Neural Computation, vol. 1, pp. 151-160.
  • 3. Lawrence, S., Giles, C.L., & Tsoi, A.C. (1996). What size neural network gives optimal generalization? Convergence properties of backpropagation. Technical Report UMIACS-TR-96-22 and CS-TR-3617, Institute for Advanced Computer Studies, Univ. of Maryland.
  • 4. Caruana, R., Lawrence, S., & Giles, C.L. (2001). Overfitting in neural networks: Backpropagation, conjugate gradient, and early stopping. In Leen, T.K., Dietterich, T.G., & Tresp, V. (eds.), Advances in Neural Information Processing Systems, vol. 13, MIT Press, pp. 402-408.
  • 5. Krogh, A. & Hertz, J.A. (1992). A simple weight decay can improve generalization. In Moody, J.E., Hanson, S.J., & Lippmann, R.P. (eds.), Advances in Neural Information Processing Systems, vol. 4, Morgan Kaufmann, San Mateo, CA, pp. 950-957.
  • 6. Prechelt, L. (1998). Automatic early stopping using cross validation: Quantifying the criteria. Neural Networks, vol. 11, pp. 761-767.
  • 7. Setiono, R. (2001). Feedforward neural network construction using cross-validation. Neural Computation, vol. 13, pp. 2865-2877.
  • 8. Bartlett, P.L. (1997). For valid generalization the size of the weights is more important than the size of the network. In Mozer, M.C., Jordan, M.I., & Petsche, T. (eds.), Advances in Neural Information Processing Systems, vol. 9, MIT Press, pp. 134-140.
  • 12. Franco, L. & Cannas, S.A. (2004). Non glassy ground-state in a long-range anti-ferromagnetic frustrated model in the hypercubic cell. Physica A, vol. 332, pp. 337-348.
  • 13. Franco, L. & Cannas, S.A. (2000). Generalization and selection of examples in feedforward neural networks. Neural Computation, vol. 12, no. 10, pp. 2405-2426.
  • 14. Franco, L. & Cannas, S.A. (2001). Generalization properties of modular networks: Implementing the parity function. IEEE Transactions on Neural Networks, vol. 12, pp. 1306-1313.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS DB.