Volume 16, Issue 2, 2002, Pages 84-87

Step acceleration based training algorithm for feedforward neural networks

Author keywords

[No Author keywords available]

Indexed keywords

COMPUTATION THEORY; COMPUTER SIMULATION; CONVERGENCE OF NUMERICAL METHODS; EXTRAPOLATION; FEEDFORWARD NEURAL NETWORKS; FUNCTIONS; ITERATIVE METHODS; NEURAL NETWORKS; PATTERN RECOGNITION;

EID: 33751581175     PISSN: 1051-4651     EISSN: None     Source Type: Conference Proceeding
DOI: None     Document Type: Conference Paper
Times cited: 4

References (14)
  • 2
    • R.A. Jacobs, "Increased rates of convergence through learning rate adaptation", Neural Networks, 1, 295-307 (1988).
  • 3
  • 4
    • S.H. Oh, "Improving the error backpropagation algorithm with a modified error function", IEEE Trans. on Neural Networks, 8, 799-803 (1997).
  • 5
    • A.V. Oyen and B. Nienhuis, "Improving the convergence of the back-propagation algorithm", Neural Networks, 5, 465-471 (1992).
  • 6
    • C. Bishop, "Exact calculation of the Hessian matrix for the multilayer perceptron", Neural Computation, 4, 494-501 (1992).
  • 7
    • W.L. Buntine and A.S. Weigend, "Computing second derivatives in feed forward networks: A review", IEEE Trans. on Neural Networks, 5 (1994).
  • 8
    • R. Parisi, E.D. Di Claudio, G. Orlandi, and B.D. Rao, "A generalized learning paradigm exploiting the structure of feed forward neural networks", IEEE Trans. on Neural Networks, 7, 465-471 (1996).
  • 9
    • G.J. Wang and C.C. Chen, "A fast multilayer neural network training algorithm based on the layer-by-layer optimizing procedures", IEEE Trans. on Neural Networks, 7, 768-775 (1996).
  • 10
    • S.V. Kamarthi and S. Pittner, "Accelerating neural network training using weight extrapolations", Neural Networks, 12, 1285-1299 (1999).
  • 12
    • M. Pfister and R. Rojas, "Speeding-up backpropagation - a comparison of orthogonal techniques", Proceedings of the International Joint Conference on Neural Networks, vol. 1, Nagoya, Japan: Japanese Neural Network Society, 517-523 (1993).
  • 14


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.