Volume 70, Issue 1-3, 2006, Pages 525-535

An efficient hidden layer training method for the multilayer perceptron

Author keywords

Adaptive learning factor; Convergence; Hidden layer error function; Hidden weight optimization (HWO); Saturation

Indexed keywords

ALGORITHMS; LEARNING SYSTEMS; LINEAR EQUATIONS; OPTIMIZATION

EID: 33750376626     PISSN: 09252312     EISSN: None     Source Type: Journal    
DOI: 10.1016/j.neucom.2005.11.008     Document Type: Article
Times cited: 35

References (30)
  • 1
    • Barton S.A. A matrix method for optimizing a neural network. Neural Computation 3(3) (1991) 450-459.
  • 2
    • Battiti R. First- and second-order methods for learning: between steepest descent and Newton's method. Neural Computation 4(2) (1992) 141-166.
  • 3
    • Chen H.-H., Manry M.T., and Chandrasekaran H. A neural network training algorithm utilizing multiple sets of linear equations. Neurocomputing 25(1-3) (1999) 55-72.
  • 4
    • LeCun Y. Generalization and network design strategies. Proceedings of Connectionism in Perspective (1989).
  • 5
    • Dawson M.S., Fung A.K., and Manry M.T. Surface parameter retrieval using fast learning neural networks. Remote Sensing Reviews 7(1) (1993) 1-18.
  • 7
    • Fun M.H., and Hagan M.T. Levenberg-Marquardt training for modular networks. The 1996 IEEE International Conference on Neural Networks, vol. 1 (1996) 468-473.
  • 8
    • Hadzer C.M., Hasan R., et al. Improved singular value decomposition by using neural networks. IEEE International Conference on Neural Networks, vol. 1 (1995) 438-442.
  • 9
    • Hagan M.T., and Menhaj M.B. Training feedforward networks with the Marquardt algorithm. IEEE Transactions on Neural Networks 5(6) (1994) 989-993.
  • 12
    • Kim T.H., Manry M.T., and Maldonado F.J. New learning factor and testing methods for conjugate gradient training algorithm. IJCNN'03, International Joint Conference on Neural Networks (2003) 2011-2016.
  • 13
    • Lee H.-M., Chen C.-M., and Huang T.-C. Learning efficiency improvement of back-propagation algorithm by error saturation prevention method. Neurocomputing 41 (2001) 125-143.
  • 14
    • Lee Y., Oh S.-H., and Kim M.W. An analysis of premature saturation in back-propagation learning. Neural Networks 6 (1993) 719-728.
  • 15
    • Magoulas G.D., Vrahatis M.N., and Androulakis G.S. Improving the convergence of the backpropagation algorithm using learning rate adaptation methods. Neural Computation 11 (1999) 1769-1796.
  • 17
    • Manry M.T., et al. Fast training of neural networks for remote sensing. Remote Sensing Reviews 9 (1994) 77-96.
  • 19
    • Oh S.-H. Improving the error back-propagation algorithm with a modified error function. IEEE Transactions on Neural Networks 8(3) (1997) 799-803.
  • 20
    • Oh S.-H., and Lee S.-Y. A new error function at hidden layers for fast training of multilayer perceptrons. IEEE Transactions on Neural Networks 10 (1999) 960-964.
  • 21
    • Oh S.-H., and Lee S.-Y. Optimal learning rates for each pattern and neuron in gradient descent training of multilayer perceptrons. IJCNN'99, International Joint Conference on Neural Networks, vol. 3 (1999) 1635-1638.
  • 22
    • Oh Y., Sarabandi K., and Ulaby F.T. An empirical model and an inversion technique for radar scattering from bare soil surfaces. IEEE Transactions on Geoscience and Remote Sensing 30(2) (1992) 370-381.
  • 23
    • van Ooyen A., and Nienhuis B. Improving the convergence of the back-propagation algorithm. Neural Networks 5 (1992) 465-471.
  • 24
    • Press W.H., et al. Numerical Recipes. Cambridge University Press, New York (1986).
  • 27
    • Scalero R.S., and Tepedelenlioglu N. A fast new algorithm for training feedforward neural networks. IEEE Transactions on Signal Processing 40(1) (1992) 202-210.
  • 28
    • Wang G.-J., and Chen C.-C. A fast multilayer neural-network training algorithm based on the layer-by-layer optimizing procedures. IEEE Transactions on Neural Networks 7(3) (1996) 768-775.
  • 29
    • Werbos P.J. Beyond regression: new tools for prediction and analysis in the behavioral sciences. Ph.D. Thesis, Harvard University, Cambridge, MA (1974).
  • 30
    • Yam J.Y.F., and Chow T.W.S. A weight initialization method for improving training speed in feedforward neural network. Neurocomputing 30 (2000) 219-232.


* This record was extracted and analyzed by KISTI from Elsevier's SCOPUS database.