Volume 18, Issue 1, 2003, Pages 37-54

Fast learning algorithms for feedforward neural networks

Author keywords

Conjugate gradient; Error function; Fast algorithm; Feedforward neural networks; Global convergence

Indexed keywords

BACKPROPAGATION; CONVERGENCE OF NUMERICAL METHODS; ERROR ANALYSIS; FOURIER TRANSFORMS; GLOBAL OPTIMIZATION; MULTILAYER NEURAL NETWORKS; RECURRENT NEURAL NETWORKS;

EID: 0037275252     PISSN: 0924-669X     EISSN: None     Source Type: Journal
DOI: 10.1023/A:1020922701312     Document Type: Article
Times cited: 14

References (25)
  • 1. L.F.A. Wessels and E. Barnard, "Avoiding false local minima by proper initialization of connections," IEEE Transactions on Neural Networks, vol. 3, no. 6, pp. 899-905, 1992.
  • 2. Y. Fukuoka, H. Matsuki, H. Minamitani, et al., "A modified back-propagation method to avoid false local minima," Neural Networks, vol. 11, pp. 1059-1072, 1998.
  • 3. G. Thimm and E. Fiesler, "High-order and multilayer perceptrons initialization," IEEE Transactions on Neural Networks, vol. 8, no. 6, pp. 349-359, 1997.
  • 4. F. Stager and M. Agarwal, "Three methods to speed up the training of feedforward and feedback perceptrons," Neural Networks, vol. 10, no. 8, pp. 1435-1444, 1997.
  • 5. B. Verma, "Fast training of multilayer perceptrons," IEEE Transactions on Neural Networks, vol. 8, no. 6, pp. 1314-1320, 1997.
  • 6. A.G. Parlos and B. Fernandez, "An accelerated learning algorithm for multilayer perceptron networks," IEEE Transactions on Neural Networks, vol. 5, no. 3, pp. 493-497, 1994.
  • 7. B.K. Humpert, "Improving back propagation with a new error function," Neural Networks, vol. 7, no. 8, pp. 1191-1192, 1994.
  • 8. A. Van Ooyen and B. Nienhuis, "Improving the convergence of the back-propagation algorithm," Neural Networks, vol. 5, pp. 465-471, 1992.
  • 10. S.-H. Oh, "Improving the error backpropagation algorithm with a modified error function," IEEE Transactions on Neural Networks, vol. 8, no. 7, pp. 799-802, 1997.
  • 11. N.B. Karayiannis, "Accelerating the training of feedforward neural networks using generalized Hebbian rules for initializing the internal representations," IEEE Transactions on Neural Networks, vol. 7, no. 2, pp. 419-426, 1996.
  • 12. Xiao-Hu Yu, Guo-An Chen, and Shixin Cheng, "Dynamic learning rate optimization of the back-propagation algorithm," IEEE Transactions on Neural Networks, vol. 6, no. 3, pp. 669-677, 1995.
  • 13. G.D. Magoulas, M.N. Vrahatis, and G.S. Androulakis, "Effective backpropagation training with variable stepsize," Neural Networks, vol. 10, no. 1, pp. 69-82, 1997.
  • 14. R.A. Jacobs, "Increased rates of convergence through learning rate adaptation," Neural Networks, vol. 1, pp. 295-307, 1988.
  • 15. R. Battiti, "First- and second-order methods for learning: Between steepest descent and Newton's method," Neural Computation, vol. 4, pp. 141-166, 1992.
  • 16. E.M. Johansson, F.U. Dowla, and D.M. Goodman, "Backpropagation learning for multilayer feed-forward neural networks using the conjugate gradient method," International Journal of Neural Systems, vol. 2, no. 4, pp. 291-301, 1992.
  • 17. D.E. Rumelhart, G.E. Hinton, and R.J. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, pp. 533-536, 1986.
  • 18. B. Widrow and M.A. Lehr, "30 years of adaptive neural networks: Perceptron, Madaline, and back-propagation," Proceedings of the IEEE, vol. 78, no. 9, pp. 1415-1441, 1990.
  • 21. R. Fletcher and C.M. Reeves, "Function minimization by conjugate gradients," Computer Journal, vol. 7, pp. 149-154, 1964.
  • 22. Y.H. Dai and Y. Yuan, "Some properties of a new conjugate gradient method," in Advances in Nonlinear Programming, edited by Ya-xiang Yuan, Kluwer Academic Publishers, pp. 251-262, 1998.
  • 23. M.J.D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, pp. 241-254, 1977.
  • 24. M.J.D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," Lecture Notes in Mathematics, vol. 1066, Springer-Verlag, Berlin, pp. 122-144, 1984.
  • 25. Y.H. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, pp. 33-47, 2001.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.