[1] R. Hecht-Nielsen, “Theory of backpropagation neural network,” in Proc. IJCNN′89, Washington, DC, vol. I, June 1989, pp. 593–605.
[2] K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Networks, vol. 2, pp. 359–366, 1989.
[3] G. Cybenko, “Continuous valued neural networks with two hidden layers are sufficient,” Mathematics of Control, Signals, and Systems, vol. 2, pp. 303–314, 1989.
[4] E.K. Blum and L.K. Li, “Approximation theory and feedforward networks,” Neural Networks, vol. 4, pp. 511–515, 1991.
[5] E.B. Baum, “On the capabilities of multilayer perceptrons,” J. Complexity, vol. 4, pp. 193–215, 1988.
[6] S.C. Huang and Y.F. Huang, “Bounds on number of hidden neurons in multilayer perceptrons,” IEEE Trans. Neural Networks, vol. 2, pp. 47–55, Jan. 1991.
[7] M.A. Sartori and P.J. Antsaklis, “A simple method to derive bounds on the size and to train multilayer neural networks,” IEEE Trans. Neural Networks, vol. 2, pp. 467–471, July 1991.
[8] L.G. Allred and G.E. Kelly, “Supervised learning technique for backpropagation networks,” in Proc. IJCNN′90, San Diego, CA, June 1990, pp. 702–709.
[9] X.-H. Yu, “Training algorithms for backpropagation neural network with optimal descent factor,” IEE Electron. Lett., vol. 27, pp. 1698–1700, Sept. 1990.
[10] S. Chen, C.F. Cowan, S.A. Billings, and P.M. Grant, “Parallel recursive prediction error algorithm for training layered neural networks,” Int. J. Control, vol. 51, pp. 1215–1228, 1990.
[11] E.D. Sontag and H.J. Sussman, “Backpropagation separates when perceptrons do,” in Proc. IJCNN′89, Washington, DC, vol. II, June 1989, pp. 639–642.
[12] T. Poston, C.-N. Lee, Y. Choie, and Y. Kwon, “Local minima and back propagation,” in Proc. IJCNN′91, Seattle, WA, vol. II, July 1991, pp. 173–176.
[13] J. Jordan and G. Clement, “Using the symmetries of a multilayered network to reduce the weight space,” in Proc. IJCNN′91, Seattle, WA, vol. II, July 1991, pp. 391–396.
[14] X.-H. Yu and S.X. Cheng, “Adaptive implementation of minimum error rate channel equalization via backpropagation,” in Proc. ICASSP′92, San Francisco, CA, paper 57.7, Mar. 1992.
[15] X.-H. Yu, “On the nonexistence of local minima of backpropagation error surfaces,” in Proc. IJCNN′91, Singapore, Nov. 1991.
[19] J.C. Platt and A.H. Barr, “Constrained differential optimization,” in Neural Information Processing Systems, D.Z. Anderson, Ed. New York: American Institute of Physics, 1988.
[20] S. Zhang and X. Zhu, “Nonlinear neural nets for constrained optimization,” submitted to IEEE Trans. Neural Networks.
[21] J.J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” Proc. Nat. Acad. Sci. U.S., vol. 81, pp. 3088–3092, 1984.

Second-Order Neural Nets for Constrained Optimization

Shengwei Zhang, Xianing Zhu, and Li-He Zou

Abstract—Finding a global minimum of an arbitrary function is never easy, but certain statistical systems, such as simulated annealing, can help us to achieve this goal. For deterministic systems a useful strategy in dealing with optimization is to find local minima, or points satisfying the necessary conditions for optimality. In this letter analog neural nets for constrained optimization are proposed as an analogue of Newton's algorithm in numerical analysis. The neural model is globally stable and can converge to the constrained stationary points. Nonlinear neurons are introduced into the net, which makes it possible to solve optimization problems where the variables take discrete values, i.e., combinatorial optimization.
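The abstract above does not spell out the dynamics, so the sketch below is only a rough illustration of the general idea of analog dynamics settling at a constrained stationary point, not the letter's second-order method. It simulates first-order Lagrangian (Arrow–Hurwicz) gradient flow on a small equality-constrained quadratic problem; the problem data, step size, and iteration count are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative example only: minimize f(x) = 0.5*||x - c||^2
# subject to a.x = b, via first-order Lagrangian (Arrow-Hurwicz)
# dynamics, a simpler stand-in for the analog-net idea described
# in the abstract above (not the letter's second-order method).

c = np.array([3.0, 1.0])      # assumed problem data (target point)
a = np.array([1.0, 1.0])      # constraint normal: a . x = b
b = 1.0

def grad_f(x):
    return x - c

x = np.zeros(2)               # "neuron" states
lam = 0.0                     # Lagrange-multiplier state
dt = 0.01                     # Euler step for the continuous dynamics

for _ in range(20000):
    # dx/dt = -(grad f(x) + lam*a),  dlam/dt = a.x - b
    x = x + dt * -(grad_f(x) + lam * a)
    lam = lam + dt * (a @ x - b)

# At equilibrium the state satisfies the stationarity conditions
# grad f(x) + lam*a = 0 and a.x = b, i.e., a constrained stationary
# point (here x -> [1.5, -0.5], lam -> 1.5).
print("x =", x, " a.x =", a @ x, " lam =", lam)
```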