Volume 11, Issue 7, 1999, Pages 1769-1796

Improving the Convergence of the Backpropagation Algorithm Using Learning Rate Adaptation Methods

Author keywords

[No Author keywords available]

Indexed keywords

ADAPTIVE BEHAVIOR; ALGORITHM; ARTICLE; ARTIFICIAL INTELLIGENCE; BIOLOGICAL MODEL;

EID: 0033209687     PISSN: 08997667     EISSN: None     Source Type: Journal    
DOI: 10.1162/089976699300016223     Document Type: Article
Times cited: 122

References (53)
  • 1. Altman, M. (1961). Connection between gradient methods and Newton's method for functionals. Bull. Acad. Polon. Sci. Ser. Sci. Math. Astronom. Phys., 9, 877-880.
  • 2. Armijo, L. (1966). Minimization of functions having Lipschitz continuous first partial derivatives. Pacific Journal of Mathematics, 16, 1-3.
  • 3. Battiti, R. (1989). Accelerated backpropagation learning: Two optimization methods. Complex Systems, 3, 331-342.
  • 4. Battiti, R. (1992). First- and second-order methods for learning: Between steepest descent and Newton's method. Neural Computation, 4, 141-166.
  • 5. Becker, S., & Le Cun, Y. (1988). Improving the convergence of the back-propagation learning with second order methods. In D. S. Touretzky, G. E. Hinton, & T. J. Sejnowski (Eds.), Proceedings of the 1988 Connectionist Models Summer School (pp. 29-37). San Mateo, CA: Morgan Kaufmann.
  • 6. Booth, A. (1949). An application of the method of steepest descent to the solution of systems of nonlinear simultaneous equations. Quart. J. Mech. Appl. Math., 2, 460-468.
  • 7. Cauchy, A. (1847). Méthode générale pour la résolution des systèmes d'équations simultanées. Comp. Rend. Acad. Sci. Paris, 25, 536-538.
  • 8. Chan, L. W., & Fallside, F. (1987). An adaptive training algorithm for back-propagation networks. Computers, Speech and Language, 2, 205-218.
  • 11. Dennis, J. E., & Moré, J. J. (1977). Quasi-Newton methods, motivation and theory. SIAM Review, 19, 46-89.
  • 13. Fahlman, S. E. (1989). Faster-learning variations on back-propagation: An empirical study. In D. S. Touretzky, G. E. Hinton, & T. J. Sejnowski (Eds.), Proceedings of the 1988 Connectionist Models Summer School (pp. 38-51). San Mateo, CA: Morgan Kaufmann.
  • 15. Fakotakis, N., & Sirigos, J. (forthcoming). A high-performance text-independent speaker identification and verification system based on vowel spotting and neural nets. IEEE Trans. Speech and Audio Processing.
  • 17. Goldstein, A. A. (1962). Cauchy's method of minimization. Numerische Mathematik, 4, 146-150.
  • 19. Hirose, Y., Yamashita, K., & Hijiya, S. (1991). Back-propagation algorithm which varies the number of hidden units. Neural Networks, 4, 61-66.
  • 20. Hoehfeld, M., & Fahlman, S. E. (1992). Learning with limited numerical precision using the cascade-correlation algorithm. IEEE Trans. on Neural Networks, 3, 602-611.
  • 22. Jacobs, R. A. (1988). Increased rates of convergence through learning rate adaptation. Neural Networks, 1, 295-307.
  • 24. Kung, S. Y., Diamantaras, K., Mao, W. D., & Taur, J. S. (1991). Generalized perceptron networks with nonlinear discriminant functions. In R. J. Mammone & Y. Y. Zeevi (Eds.), Neural networks theory and applications (pp. 245-279). New York: Academic Press.
  • 25. Le Cun, Y., Simard, P. Y., & Pearlmutter, B. A. (1993). Automatic learning rate maximization by on-line estimation of the Hessian's eigenvectors. In S. J. Hanson, J. D. Cowan, & C. L. Giles (Eds.), Advances in neural information processing systems, 5 (pp. 156-163). San Mateo, CA: Morgan Kaufmann.
  • 26. Lee, Y., Oh, S.-H., & Kim, M. W. (1993). An analysis of premature saturation in backpropagation learning. Neural Networks, 6, 719-728.
  • 27. Lisboa, P. J. G., & Perantonis, S. J. (1991). Complete solution of the local minima in the XOR problem. Network, 2, 119-124.
  • 31. Møller, M. F. (1993). A scaled conjugate gradient algorithm for fast supervised learning. Neural Networks, 6, 525-533.
  • 32. Nocedal, J. (1991). Theory of algorithms for unconstrained optimization. Acta Numerica, 199-242.
  • 34. Parker, D. B. (1987). Optimal algorithms for adaptive networks: Second order back-propagation, second order direct propagation, and second order Hebbian learning. In Proceedings of the IEEE International Conference on Neural Networks, 2, 593-600.
  • 36. Pearlmutter, B. (1992). Gradient descent: Second-order momentum and saturating error. In J. E. Moody, S. J. Hanson, & R. P. Lippmann (Eds.), Advances in neural information processing systems, 4 (pp. 887-894). San Mateo, CA: Morgan Kaufmann.
  • 37. Pfister, M., & Rojas, R. (1993). Speeding-up backpropagation - A comparison of orthogonal techniques. In Proceedings of the Joint Conference on Neural Networks (pp. 517-523). Nagoya, Japan.
  • 40. Rigler, A. K., Irvine, J. M., & Vogl, T. P. (1991). Rescaling of variables in back-propagation learning. Neural Networks, 4, 225-229.
  • 45. Silva, F., & Almeida, L. (1990). Acceleration techniques for the back-propagation algorithm. Lecture Notes in Computer Science, 412, 110-119.
  • 48. Van der Smagt, P. P. (1994). Minimization methods for training feedforward neural networks. Neural Networks, 7, 1-11.
  • 50. Watrous, R. L. (1987). Learning algorithms for connectionist networks: Applied gradient of nonlinear optimization. In Proceedings of the IEEE International Conference on Neural Networks, 2, 619-627.
  • 51. Wessel, L. F., & Barnard, E. (1992). Avoiding false local minima by proper initialization of connections. IEEE Trans. Neural Networks, 3, 899-905.
  • 52. Wolfe, P. (1969). Convergence conditions for ascent methods. SIAM Review, 11, 226-235.
  • 53. Wolfe, P. (1971). Convergence conditions for ascent methods. II: Some corrections. SIAM Review, 13, 185-188.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.