[1] E. B. Baum, "On the capabilities of multilayer perceptrons," J. Complexity, vol. 4, pp. 193-215, 1988.
[3] G. Cybenko, "Continuous valued neural networks with two hidden layers are sufficient," Math. Contr. Signals Syst., vol. 2, pp. 303-314, 1989.
[4] H. C. Elman, "Iterative methods for large sparse nonsymmetric systems of linear equations," Ph.D. dissertation, Comput. Sci. Dep., Yale Univ., New Haven, CT, 1982.
[6] M. Frean, "The upstart algorithm: A method for constructing and training feedforward neural networks," Neural Comput., vol. 2, pp. 198-209, 1990.
[7] R. Hecht-Nielsen, "Counterpropagation networks," Appl. Optics, vol. 26, no. 23, pp. 4979-4984, 1987.
[10] K. M. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Neural Networks, vol. 2, pp. 359-366, 1989.
[11] L. K. Jones, "Constructive approximations for neural networks by sigmoidal functions," Proc. IEEE, vol. 78, no. 10, pp. 1586-1589, 1990.
[13] A. H. Kramer and A. Sangiovanni-Vincentelli, "Efficient parallel learning algorithms for neural networks," in Advances in Neural Information Processing Systems 1, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1988, pp. 40-48.
[14] M. Marchand, M. Golea, and P. Ruján, "A convergence theorem for sequential learning in two-layer perceptrons," Europhys. Lett., vol. 11, pp. 487-492, 1990.
[15] M. Mézard and J.-P. Nadal, "Learning in feedforward neural networks: The tiling algorithm," J. Phys. A, vol. 22, pp. 2191-2204, 1989.
[16] J. O. Moody, "A new method for constructing and training multilayer neural networks," Master's thesis, Dep. Electrical Engineering, Univ. Notre Dame, IN, 1993.
[19] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning internal representations by error propagation," in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1: Foundations, D. E. Rumelhart and J. L. McClelland, Eds. Cambridge, MA: MIT Press, 1986, pp. 318-362.
[20] M. A. Sartori and P. J. Antsaklis, "Neural network training via quadratic optimization," in Proc. ISCAS, San Diego, CA, May 10-13, 1992, and Tech. Rep. 90-05-01, Dep. Electrical Comput. Eng., Univ. Notre Dame, revised Apr. 1991.
[21] M. A. Sartori and P. J. Antsaklis, "A simple method to derive bounds on the size and to train multilayer neural networks," IEEE Trans. Neural Networks, vol. 2, no. 4, pp. 467-471, July 1991.
[22] R. Scalettar and A. Zee, "Emergence of grandmother memory in feed forward networks: Learning with noise and forgetfulness," in Connectionist Models and Their Implications: Readings from Cognitive Science, D. Waltz and J. A. Feldman, Eds. Norwood, NJ: Ablex, 1988, pp. 309-332.
[23] G. Tesauro and B. Janssens, "Scaling relations in backpropagation learning," Complex Syst., vol. 2, pp. 39-44, 1988.