Volume 25, Issues 1-3, 1999, Pages 115-131

Training multilayer neural networks using fast global learning algorithm - Least-squares and penalized optimization methods

Author keywords

Global learning algorithm; Least squares method; Multilayer neural networks; Penalized optimization

Indexed keywords

CONVERGENCE OF NUMERICAL METHODS; ERROR ANALYSIS; HEURISTIC METHODS; LEARNING ALGORITHMS; LEARNING SYSTEMS; LEAST SQUARES APPROXIMATIONS; OPTIMIZATION; PROBLEM SOLVING;

EID: 0032935466     PISSN: 0925-2312     EISSN: None     Source Type: Journal
DOI: 10.1016/S0925-2312(99)00055-7     Document Type: Article
Times cited: 45

References (32)
  • 3. Azimi-Sadjadi M.R., Liou R. Fast learning process of multilayer neural networks using recursive least squares technique. IEEE Trans. Signal Process. 40(2):1992;446-450.
  • 4. Battiti R. First- and second-order methods for learning: between steepest descent and Newton's method. Neural Comput. 4(2):1992;141-166.
  • 5. Battiti R., Tecchiolli G. Training neural nets with the reactive tabu search. IEEE Trans. Neural Networks. 6(5):1995;1185-1200.
  • 6. Cho S.-Y., Chow T.W.S. An efficient and stable recurrent neural networks training algorithm for time series forecasting. Int. J. Knowledge-Based Intell. Eng. Systems. 1(3):1997;138-148.
  • 7. Chow T.W.S., Cho S.Y. An accelerated recurrent network training algorithm using IIR filter model and recursive least squares method. IEEE Trans. Circuits Systems I: Fund. Theory Appl. 44(11):1997;1082-1086.
  • 12. Goffe W.L., Ferrier G.D., Rogers J. Global optimization of statistical functions with simulated annealing. J. Econometrics. 60:1994;65-99.
  • 13. Gori M., Tesi A. Some examples of local minima during learning with back-propagation. Parallel Architectures and Neural Networks, Vietri sul Mare (IT), May 1990.
  • 14. Gori M., Tesi A. On the problem of local minima in backpropagation. IEEE Trans. Pattern Anal. Mach. Intell. 14(1):1992;76-86.
  • 15. Hagan M.T., Menhaj M.B. Training feedforward networks with the Marquardt algorithm. IEEE Trans. Neural Networks. 5(6):1994;989-993.
  • 17. Houck C.R., Joines J.A., Kay M.G. A genetic algorithm for function optimization: a MATLAB implementation. ACM Trans. Math. Software, submitted for publication.
  • 19. Jacobs R.A. Increased rates of convergence through learning rate adaptation. Neural Networks. 1:1988;295-307.
  • 22. Rao S.S. Optimization Theory and Applications. 2nd ed. Wiley, New York, 1984.
  • 24. Shang Y., Wah B.W. Global optimization for neural network training. IEEE Comput. 29(3):1996;45-54.
  • 25. Tollenaere T. SuperSAB: fast adaptive back propagation with good scaling properties. Neural Networks. 3:561-573.
  • 27. Towsey M., Alpsan D., Sztriha L. Training a neural network with conjugate gradient methods. Proceedings of ICNN'95, Perth, Western Australia, 1995, pp. 373-378.
  • 28. Williams R.J. Training recurrent networks using the extended Kalman filter. International Joint Conference on Neural Networks, Baltimore, 1992, vol. IV, pp. 241-246.
  • 29. Yam Y.F., Chow T.W.S. Extended backpropagation algorithm. Electron. Lett. 29(19):1993;1701-1702.
  • 30. Yam Y.F., Chow T.W.S. Accelerated training algorithm for feedforward neural networks based on linear least squares problems. Neural Process. Lett. 2(4):1995;20-25.
  • 31. Yam Y.F., Chow T.W.S. Extended least squares based algorithm for training feedforward networks. IEEE Trans. Neural Networks. 8(3):1997;806-810.
  • 32. Zheng Q., Zhuang D. Integral global optimization of constrained problems in functional spaces with discontinuous penalty functions. In: Floudas C.A., Pardalos P.M. (eds.), Recent Advances in Global Optimization. Princeton University Press, Princeton, NJ, 1992, pp. 298-320.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.