3. M.R. Azimi-Sadjadi, R. Liou, Fast learning process of multilayer neural networks using recursive least squares technique, IEEE Trans. Signal Process. 40 (2) (1992) 446-450.
4. R. Battiti, First- and second-order methods for learning: between steepest descent and Newton's method, Neural Comput. 4 (2) (1992) 141-166.
5. R. Battiti, G. Tecchiolli, Training neural nets with the reactive tabu search, IEEE Trans. Neural Networks 6 (5) (1995) 1185-1200.
6. S.-Y. Cho, T.W.S. Chow, An efficient and stable recurrent neural networks training algorithm for time series forecasting, Int. J. Knowledge-Based Intell. Eng. Systems 1 (3) (1997) 138-148.
7. T.W.S. Chow, S.-Y. Cho, An accelerated recurrent network training algorithm using IIR filter model and recursive least squares method, IEEE Trans. Circuits Systems I: Fund. Theory Appl. 44 (11) (1997) 1082-1086.
10. S.E. Fahlman, C. Lebiere, The cascade-correlation learning architecture, Technical Report CMU-CS-90-100, School of Computer Science, Carnegie Mellon University, 1990.
12. W.L. Goffe, G.D. Ferrier, J. Rogers, Global optimization of statistical functions with simulated annealing, J. Econometrics 60 (1994) 65-99.
13. M. Gori, A. Tesi, Some examples of local minima during learning with back-propagation, Parallel Architectures and Neural Networks, Vietri sul Mare (IT), May 1990.
15. M.T. Hagan, M.B. Menhaj, Training feedforward networks with the Marquardt algorithm, IEEE Trans. Neural Networks 5 (6) (1994) 989-993.
17. C.R. Houck, J.A. Joines, M.G. Kay, A genetic algorithm for function optimization: a Matlab implementation, ACM Trans. Math. Software, submitted for publication.
18. J. Hwang, S. Lay, M. Maechler, R.D. Martin, J. Schimert, Regression modeling in backpropagation and projection pursuit learning, IEEE Trans. Neural Networks 5 (1994) 1-24.
19. R.A. Jacobs, Increased rates of convergence through learning rate adaptation, Neural Networks 1 (1988) 295-307.
21. J.R. McDonnell, D. Waagen, Determining neural networks connectivity using evolutionary programming, Twenty-Sixth Asilomar Conference on Signals, Systems, and Computers, Monterey, CA, 1992.
22. S.S. Rao, Optimization Theory and Applications, second ed., Wiley, New York, 1984.
23. S. Renals, R. Rohwer, Phoneme classification experiments using radial basis functions, Proceedings of the IEEE International Joint Conference on Neural Networks, Washington, DC, June 1989, vol. I, pp. 461-467.
24. Y. Shang, B.W. Wah, Global optimization for neural network training, IEEE Comput. 29 (3) (1996) 45-54.
25. T. Tollenaere, SuperSAB: fast adaptive back propagation with good scaling properties, Neural Networks 3 (1990) 561-573.
27. M. Towsey, D. Alpsan, L. Sztriha, Training a neural network with conjugate gradient methods, Proceedings of ICNN'95, Perth, Western Australia, 1995, pp. 373-378.
28. R.J. Williams, Training recurrent networks using the extended Kalman filter, International Joint Conference on Neural Networks, Baltimore, 1992, vol. IV, pp. 241-246.
29. Y.F. Yam, T.W.S. Chow, Extended backpropagation algorithm, Electron. Lett. 29 (19) (1993) 1701-1702.
30. Y.F. Yam, T.W.S. Chow, Accelerated training algorithm for feedforward neural networks based on linear least squares problems, Neural Process. Lett. 2 (4) (1995) 20-25.
31. Y.F. Yam, T.W.S. Chow, Extended least squares based algorithm for training feedforward networks, IEEE Trans. Neural Networks 8 (3) (1997) 806-810.
32. Q. Zheng, D. Zhuang, Integral global optimization of constrained problems in functional spaces with discontinuous penalty functions, in: C.A. Floudas, P.M. Pardalos (Eds.), Recent Advances in Global Optimization, Princeton University Press, Princeton, NJ, 1992, pp. 298-320.