Volume 7, Issue 1, 1998, Pages 15-25

A Layer-by-Layer Least Squares based Recurrent Networks Training Algorithm: Stalling and Escape

Author keywords

Convergence stalling; Fast convergence speed; Layer by Layer Least Squares algorithm; Recurrent networks

Indexed keywords

CONVERGENCE OF NUMERICAL METHODS; LEAST SQUARES APPROXIMATIONS; MATHEMATICAL MODELS; NEURAL NETWORKS; PROBABILITY

EID: 0031989111     PISSN: 13704621     EISSN: None     Source Type: Journal    
DOI: 10.1023/A:1009672319869     Document Type: Article
Times cited : (6)

References (10)
  • 2
    • R.J. Williams and D. Zipser, "A learning algorithm for continually running fully recurrent neural networks", Neural Computation, Vol. 1, No. 2, pp. 270-280, 1989.
  • 3
    • R.A. Jacobs, "Increased rates of convergence through learning rate adaptation", Neural Networks, Vol. 1, pp. 295-307, 1988.
  • 4
    • C.M. Kuan, "A recurrent Newton algorithm and its convergence properties", IEEE Trans. on Neural Networks, Vol. 6, No. 3, pp. 779-783, 1995.
  • 5
    • R.J. Williams, "Training Recurrent Networks Using the Extended Kalman Filter", International Joint Conference on Neural Networks, Vol. IV, pp. 241-246, Baltimore, 1992.
  • 7
    • G.V. Puskorius and L.A. Feldkamp, "Neurocontrol of Nonlinear Dynamical Systems with Kalman Filter Trained Recurrent Networks", IEEE Trans. on Neural Networks, Vol. 5, No. 2, pp. 279-297, 1994.
  • 8
    • F. Biegler-König and F. Bärmann, "A learning algorithm for multilayered neural networks based on linear least squares problems", Neural Networks, Vol. 6, pp. 127-131, 1993.
  • 9
    • Y.F. Yam and T.W.S. Chow, "Accelerated training algorithm for feedforward neural networks based on least squares method", Neural Processing Letters, Vol. 2, No. 4, pp. 20-25, 1995.
  • 10
    • J.Y.F. Yam and T.W.S. Chow, "Extended Least Squares Based Algorithm for Training Feedforward Networks", IEEE Trans. on Neural Networks, Vol. 8, No. 3, pp. 806-810, May 1997.

* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS DB.