Volume 3, 2003, Pages 2028-2032

An Efficient Learning Algorithm with Second-Order Convergence for Multilayer Neural Networks

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; BACKPROPAGATION; COMPUTER SIMULATION; ITERATIVE METHODS; NUMERICAL ANALYSIS;

EID: 0141682840     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 5

References (11)
  • 1
    • D. E. Rumelhart, G. E. Hinton and R. J. Williams: "Learning representations by back-propagating errors", Nature, vol.323, pp.533-536, Oct., 1986.
  • 3
    • H. Ninomiya and N. Kinoshita: "A New Learning Algorithm without Explicit Error Back-Propagation", Proc. IEEE&INNS/IJCNN'99, July, 1999.
  • 4
    • H. Ninomiya and A. Sasaki: "3-Layer Recurrent Neural Networks and their Supervised Learning Algorithm", Proc. IEEE&INNS/IJCNN'01, July, 2001.
  • 5
    • R. Battiti: "First- and second-order methods for learning: Between steepest descent and Newton's method", Neural Computation, vol.4, no.2, pp.141-166, 1992.
  • 6
    • K. Levenberg: "A method for the solution of certain problems in least squares", Quart. Appl. Math., vol.2, pp.164-168, 1944.
  • 7
    • D. Marquardt: "An algorithm for least squares estimation of nonlinear parameters", SIAM J. Appl. Math., vol.11, pp.431-441, 1963.
  • 8
    • M. T. Hagan and M. Menhaj: "Training feedforward networks with the Marquardt algorithm", IEEE Trans. Neural Networks, vol.5, no.6, pp.989-993, Nov., 1994.
  • 9
    • H. Asai: "Implicit Steepest Descent Method and its Analogy with Charge-Up Method based on Virtual Capacitors in Network Analysis", Proc. IEEE/ISCAS'88, pp.1115-1118, 1988.
  • 10
    • H. Asai: "Equivalent Property between Network Analysis Using Virtual Capacitors and Steepest Descent Method", Trans. IEICE(A), vol.J71-A, no.5, pp.1132-1138, May, 1998 (in Japanese).


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.