



Volume 64, Issue 3, 1998, Pages 359-370

A homotopy method for training neural networks

Author keywords

Gauss Newton method; Homotopy; Multilayer perceptron; Second order learning
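The keywords name the paper's technique: a homotopy (continuation) method for multilayer-perceptron training with second-order (Gauss-Newton) learning. As a rough illustration only — none of the specifics below come from the paper — the sketch deforms a trivially convex objective into the actual training loss while tracking the minimizer as the homotopy parameter t moves from 0 to 1. The two-parameter "network" tanh(a·x + b), the linear homotopy H(w, t) = (1−t)·½‖w−w0‖² + t·L(w), and plain gradient descent for the inner solves are all illustrative assumptions, not the authors' formulation.

```python
import numpy as np

# Toy data generated by a known two-parameter "network" tanh(1.5*x - 0.5),
# so the target loss L(w) has a global minimum of zero at w = (1.5, -0.5).
x = np.linspace(-2.0, 2.0, 50)
y = np.tanh(1.5 * x - 0.5)

def homotopy_grad(w, t, w0):
    """Gradient of H(w, t) = (1-t) * 0.5*||w - w0||^2 + t * L(w),
    where L(w) is the mean squared residual of the tanh model."""
    a, b = w
    z = np.tanh(a * x + b)
    r = z - y                     # residuals of the target loss
    s = 1.0 - z ** 2              # tanh'(a*x + b)
    grad_L = np.array([np.mean(r * s * x), np.mean(r * s)])
    return (1.0 - t) * (w - w0) + t * grad_L

w0 = np.array([0.1, 0.0])         # exact minimizer of the easy t = 0 problem
w = w0.copy()
for t in np.linspace(0.0, 1.0, 21):   # continuation path in t
    for _ in range(500):              # inner descent on H(., t), warm-started
        w = w - 0.1 * homotopy_grad(w, t, w0)

print(w)  # parameters recovered by the continuation run
```

Warm-starting each inner solve from the previous minimizer is what makes continuation cheap: for small t the objective is nearly quadratic and trivial, and by t = 1 the iterate is already inside the right basin of the nonconvex loss. The paper pairs this idea with Gauss-Newton inner solves rather than the gradient steps used here.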

Indexed keywords

COMPUTER SIMULATION; CONVERGENCE OF NUMERICAL METHODS; ERROR CORRECTION; LEARNING SYSTEMS; OPTIMIZATION; PROBLEM SOLVING; REGRESSION ANALYSIS;

EID: 0032002106     PISSN: 01651684     EISSN: None     Source Type: Journal    
DOI: 10.1016/S0165-1684(97)00201-6     Document Type: Article
Times cited: 2

References (20)
  • 2
    • R. Battiti, First and second order methods for learning: between steepest descent and Newton's method, Neural Computation 4 (2) (1992) 141-166.
  • 5
    • G. Cybenko, Approximation by superpositions of a sigmoidal function, Math. Control Signals Systems 2 (1989) 303-314.
  • 6
    • N. de Villiers, D. Glasser, A continuation method for nonlinear regression, SIAM J. Numer. Anal. 18 (6) (December 1981) 1139-1154.
  • 10
    • K. Funahashi, On the approximate realization of continuous mappings by neural networks, Neural Networks 2 (1989) 183-192.
  • 11
    • M.T. Hagan, M.B. Menhaj, Training feedforward networks with the Marquardt algorithm, IEEE Transactions on Neural Networks 5 (6) (November 1994) 989-993.
  • 13
    • K. Hornik, M. Stinchcombe, H. White, Multilayer feedforward networks are universal approximators, Neural Networks 2 (1989) 359-366.
  • 14
    • K. Levenberg, A method for the solution of certain problems in least squares, Quart. Appl. Math. 2 (1944) 164-168.
  • 15
    • D. Marquardt, An algorithm for least-squares estimation of nonlinear parameters, SIAM J. Appl. Math. 11 (1963) 431-441.
  • 16
    • D. Nguyen, B. Widrow, Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights, in: IJCNN 1990, San Diego, 1990, Vol. 3, pp. 21-26.
  • 17
    • A.J. Owens, D.L. Filkin, Efficient training of the back-propagation network by solving a system of stiff ordinary differential equations, in: IJCNN 1989, San Diego, 1989, Vol. 2, pp. 381-386.
  • 18
  • 19
    • W. Schiffmann, M. Joost, R. Werner, Optimization of the backpropagation algorithm for training multilayer perceptrons, Tech. Rep., Institute of Computer Science, University of Koblenz, ftp://128.146.8.52/pub/neuroprose/schiff.bp_speedup.ps.Z, 1993.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.