



Volume 7, Issue 6, 1996, Pages 1450-1460

A generalized learning paradigm exploiting the structure of feedforward neural networks

Author keywords

[No Author keywords available]

Indexed keywords


EID: 0000014434     PISSN: 10459227     EISSN: None     Source Type: Journal    
DOI: 10.1109/72.548172     Document Type: Article
Times cited: 76

References (28)
  • 1
    • C. Bishop, "Exact calculation of the Hessian matrix for the multilayer perceptron," Neural Computation, vol. 4, pp. 494-501, 1992.
  • 2
    • W. L. Buntine and A. S. Weigend, "Computing second derivatives in feedforward networks: A review," IEEE Trans. Neural Networks, vol. 5, May 1994.
  • 3
    • S. Saarinen, R. Bramley, and G. Cybenko, "Ill conditioning in neural-network training problems," SIAM J. Sci. Comput., vol. 14, no. 3, pp. 693-714, May 1993.
  • 4
    • R. Battiti, "First- and second-order methods for learning: Between steepest descent and Newton's method," Neural Computation, vol. 4, pp. 141-166, 1992.
  • 10
    • R. A. Jacobs, "Increased rates of convergence through learning rate adaptation," Neural Networks, vol. 1, pp. 295-307, 1988.
  • 11
    • T. P. Vogl, J. K. Mangis, A. K. Rigler, W. T. Zink, and D. L. Alkon, "Accelerating the convergence of the backpropagation method," Biol. Cybern., vol. 59, pp. 257-263, 1988.
  • 12
    • R. Watrous, "Learning algorithms for connectionist networks: Applied gradient methods of nonlinear optimization," Univ. Pennsylvania, Tech. Rep. MS-CIS-87-51.
  • 14
    • P. E. Gill and W. Murray, "Algorithms for the solution of the nonlinear least-squares problem," SIAM J. Numer. Anal., vol. 15, no. 5, pp. 977-992, Oct. 1978.
  • 15
    • J. E. Dennis, Jr., D. M. Gay, and R. E. Welsch, "An adaptive nonlinear least-squares algorithm," ACM Trans. Math. Software, vol. 7, no. 3, pp. 348-368, Sept. 1981.
  • 16
    • M. G. Bello, "Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks," IEEE Trans. Neural Networks, vol. 3, Nov. 1992.
  • 17
    • P. L. Toint, "On large scale nonlinear least squares calculations," SIAM J. Sci. Stat. Comput., vol. 8, no. 3, pp. 416-435, May 1987.
  • 18
    • Y. Le Cun, "Second-order properties of error surfaces: Learning time and generalization," in Advances in Neural Information Processing Systems 3, 1991.
  • 19
    • S. E. Fahlman, "Faster-learning variations on backpropagation: An empirical study," in Proc. 1988 Connectionist Models Summer School, Carnegie Mellon Univ., Pittsburgh, PA.
  • 20
    • S. Kollias and D. Anastassiou, "An adaptive least squares algorithm for the efficient training of artificial neural networks," IEEE Trans. Circuits Syst., vol. 36, Aug. 1989.
  • 21
    • R. S. Scalero and N. Tepedelenlioglu, "A fast new algorithm for training feedforward neural networks," IEEE Trans. Signal Processing, vol. 40, Jan. 1992.
  • 22
    • M. R. Azimi-Sadjadi and R. J. Liou, "Fast learning process of multilayer neural networks using recursive least squares method," IEEE Trans. Signal Processing, vol. 40, Feb. 1992.
  • 23
    • S. Shah, F. Palmieri, and M. Datum, "Optimal filtering algorithms for fast learning in feedforward neural networks," Neural Networks, vol. 5, pp. 779-787, 1992.
  • 25
    • G. Golub and V. Pereyra, "The differentiation of pseudoinverses and nonlinear least squares problems whose variables are separate," SIAM J. Numer. Anal., vol. 10, no. 2, pp. 413-432, 1973.
  • 26
    • W. H. Lawton and E. A. Sylvestre, "Elimination of linear parameters in nonlinear regression," Technometrics, vol. 13, no. 3, pp. 461-467, 1971.
  • 27
    • I. Guttman, V. Pereyra, and H. D. Scolnik, "Least squares estimation for a class of nonlinear models," Technometrics, vol. 15, no. 2, pp. 209-218, 1973.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.