[1] A. Barron and R. Barron, "Statistical learning networks: A unifying view," in Proc. 20th Symp. Interface, 1988, pp. 192-203.
[2] E. B. Baum and D. Haussler, "What sized net gives valid generalization?" Neur. Comput., vol. 1, no. 1, pp. 151-160, Spring 1989.
[3] M. G. Bello, "Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks," IEEE Trans. Neural Networks, vol. 3, no. 6, Nov. 1992.
[4] L. M. Belue, J. M. Steppe, and K. W. Bauer, "Multivariate statistical techniques for determining neural network architecture and data requirements," in AISB96 Wrkshp. Tutorial Programme, Brighton, U.K., April 1996.
[5] L. M. Belue and K. W. Bauer, "Methods of determining input features for multilayer perceptrons," Neur. Comput., vol. 7, no. 2, 1995.
[6] T. M. Cover, "Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition," IEEE Trans. Elec. Comp., vol. EC-14, pp. 326-334, June 1965.
[8] M. de la Maza, "SPLITnet: Dynamically adjusting the number of hidden units in a neural network," in Artificial Neural Networks, T. Kohonen, K. Mäkisara, O. Simula, and J. Kangas, Eds. Amsterdam: North Holland, 1991, pp. 647-651.
[9] D. B. Fogel, "An information criterion for optimal neural network selection," IEEE Trans. Neural Networks, vol. 2, no. 5, Sept. 1991.
[11] J. W. Gutierrez and R. Grondin, "Estimating hidden unit number for two-layer perceptrons," in Proc. IEEE Int. Joint Conf. Neural Networks, Washington, DC, 1989, pp. 677-681.
[12] R. Hecht-Nielsen, "Theory of the backpropagation neural network," in Proc. International Joint Conference on Neural Networks, vol. I, New York: IEEE Press, June 1989, pp. 593-611.
[13] Y. Hirose, K. Yamashita, and S. Hijiya, "Back-propagation algorithm which varies the number of hidden units," Neural Networks, vol. 4, pp. 61-66, 1991.
[14] K. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Dep. Economics, Univ. California, San Diego, manuscript, June 1988.
[15] S. Huang and Y. Huang, "Bounds on the number of hidden neurons in multilayer perceptrons," IEEE Trans. Neural Networks, vol. 2, no. 1, pp. 47-55, Jan. 1991.
[16] C. M. Kocur, S. K. Rogers, L. R. Myers, T. Burns, J. W. Hoffmeister, K. W. Bauer, and J. M. Steppe, "Neural network wavelet feature selection for breast cancer diagnosis," IEEE Trans. Biomed. Eng., to appear.
[17] S. Y. Kung and J. N. Hwang, "An algebraic analysis for optimal hidden units size and learning rates in back-propagation learning," in Proc. Int. Conf. Neural Networks, vol. 1, San Diego, CA, July 24-27, 1988, pp. 363-370.
[18] Y. Le Cun et al., "Optimal brain damage," in Neur. Infor. Proc. Syst. II, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 598-605.
[19] N. Mantel, "Why stepdown procedures in variable selection," Technometrics, vol. 12, no. 2, pp. 621-625, Aug. 1970.
[20] M. C. Mozer and P. Smolensky, "Skeletonization: A technique for trimming the fat from a network via relevance assessment," in Neur. Infor. Proc. Syst. I, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann.
[22] K. L. Priddy et al., "Bayesian selection of important features for feedforward neural networks," Neurocomputing, vol. 5, no. 2-3, 1993.
[23] N. J. Redding et al., "Higher order separability and minimal hidden-unit fan-in," in Artificial Neural Networks, T. Kohonen, K. Mäkisara, O. Simula, and J. Kangas, Eds. Amsterdam: North Holland, 1991, pp. 25-30.
[25] M. C. Roggemann et al., "An approach to multiple sensor target detection," in Proc. SPIE Conf. Sens. Fus., vol. 1100, 1989.
[26] ——, "Multiple sensor tactical target detection in FLIR and range images," in Proc. Tri-Serv. Data Fus. Symp., May 1989.
[27] D. W. Ruck et al., "Feature selection using a multilayer perceptron," J. Neur. Net. Comp., vol. 2, no. 2, pp. 40-48, Fall 1990.
[28] M. A. Sartori and P. J. Antsaklis, "A simple method to derive bounds on the size and to train multilayer neural networks," IEEE Trans. Neural Networks, vol. 2, pp. 467-471, July 1991.
[30] E. D. Sontag, "Feedforward nets for interpolation and classification," J. Comp. Syst. Sci., vol. 45, pp. 20-48, 1992.
[32] J. M. Steppe and K. W. Bauer, "Improved feature screening in feedforward neural networks," Neurocomputing, to appear.
[33] V. N. Vapnik and A. Y. Chervonenkis, "On the convergence of relative frequencies of events to their probabilities," Theory Probab. Its Appl., vol. 16, no. 2, pp. 264-280, 1971.
[35] ——, "An additional hidden unit test for neglected nonlinearity in multilayer feedforward networks," in Proc. IEEE/INNS Int. Joint Conf. Neur. Net., vol. II, Washington, DC, June 18-22, 1989, pp. 451-455.
[36] ——, "Learning in artificial neural networks: A statistical perspective," Neur. Comput., vol. 1, pp. 425-464, 1989.