Volume 7, Issue 4, 1996, Pages 1007-1014

Integrated feature and architecture selection

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; BACKPROPAGATION; COMPUTER ARCHITECTURE; HEURISTIC METHODS; MATHEMATICAL MODELS; MATRIX ALGEBRA; NONLINEAR EQUATIONS; OBJECT RECOGNITION; REGRESSION ANALYSIS; STATISTICAL METHODS; VECTORS;

EID: 0030190724     PISSN: 1045-9227     EISSN: None     Source Type: Journal
DOI: 10.1109/72.508942     Document Type: Review
Times cited: 54

References (36)
  1. A. Barron and R. Baron, "Statistical learning networks: A unifying view," in Proc. 20th Symp. Interface, 1988, pp. 192-203.
  2. E. B. Baum and D. Haussler, "What sized net gives valid generalization?" Neur. Comput., vol. 1, no. 1, pp. 151-160, Spring 1989.
  3. M. G. Bello, "Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks," IEEE Trans. Neural Networks, vol. 3, no. 6, Nov. 1992.
  4. L. M. Belue, J. M. Steppe, and K. W. Bauer, "Multivariate statistical techniques for determining neural network architecture and data requirements," in AISB96 Wrkshp. Tutorial Programme, Brighton, U.K., April 1996.
  5. L. M. Belue and K. W. Bauer, "Methods of determining input features for multilayer perceptrons," Neur. Comput., vol. 7, no. 2, 1995.
  6. T. M. Cover, "Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition," IEEE Trans. Elec. Comp., vol. EC-14, pp. 326-334, June 1965.
  8. M. de la Maza, "SPLITnet dynamically adjusting the number of hidden units in a neural network," in Artificial Neural Networks, T. Kohonen, K. Mäkisara, O. Simula, and J. Kangas, Eds. Amsterdam: North Holland, 1991, pp. 647-651.
  9. D. B. Fogel, "An information criterion for optimal neural network selection," IEEE Trans. Neural Networks, vol. 2, no. 5, Sept. 1991.
  13. Y. Hirose, K. Yamashita, and S. Hijiya, "Back-propagation algorithm which varies the number of hidden units," Neural Networks, vol. 4, pp. 61-66, 1991.
  15. S. Huang and Y. Huang, "Bounds on the number of hidden neurons in multilayer perceptrons," IEEE Trans. Neural Networks, vol. 2, no. 1, pp. 47-55, Jan. 1991.
  17. S. Y. Kung and J. N. Hwang, "An algebraic analysis for optimal hidden units size and learning rates in back-propagation learning," in Int. Conf. Neural Networks, vol. 1, pp. 363-370, San Diego, CA, July 24-27, 1988.
  18. Y. Le Cun et al., "Optimal brain damage," in Neur. Infor. Proc. Syst. II, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 598-605.
  19. N. Mantel, "Why stepdown procedures in variable selection," Technometrics, vol. 12, no. 2, pp. 621-625, Aug. 1970.
  20. M. C. Mozer and P. Smolensky, "Skeletonization: A technique for trimming the fat from a network via relevance assessment," in Advanced Neural Information Systems I, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufman.
  22. K. L. Priddy et al., "Bayesian selection of important features for feedforward neural networks," Neurocomputing, vol. 5, nos. 2-3, 1993.
  23. N. J. Redding et al., "Higher order separability and minimal hidden-unit fan-in," in Artificial Neural Networks, T. Kohonen, K. Mäkisara, O. Simula, and J. Kangas, Eds. Amsterdam: North Holland, 1991, pp. 25-30.
  25. M. C. Roggemann et al., "An approach to multiple sensor target detection," in Proc. SPIE Conf. Sens. Fus., vol. 1100, 1989.
  26. M. C. Roggemann et al., "Multiple sensor tactical target detection in FLIR and range images," in Proc. Tri-Serv. Data Fus. Symp., May 1989.
  27. D. W. Ruck et al., "Feature selection using a multilayer perceptron," J. Neur. Net. Comp., vol. 2, no. 2, pp. 40-48, Fall 1990.
  28. M. A. Sartori and P. J. Antsaklis, "A simple method to derive bounds on the size and to train multilayer neural networks," IEEE Trans. Neural Networks, vol. 2, pp. 407-471, July 1991.
  30. E. D. Sontag, "Feedforward nets for interpolation and classification," J. Comp. Syst. Sci., vol. 45, pp. 20-48, 1992.
  32. J. M. Steppe and K. W. Bauer, "Improved feature screening in feedforward neural networks," Neurocomputing, to appear.
  33. V. N. Vapnik and A. Y. Chervonenkis, "On the convergence of relative frequencies of events to their probabilities," Theory Probab. Its Appl., vol. 16, no. 2, pp. 264-280, 1971.
  35. _, "An additional hidden unit test for neglected nonlinearity in multilayer feedforward networks," in IEEE INNS Int. J. Conf. Neur. Net. II, Washington, DC, June 18-22, 1989, pp. 451-455.
  36. _, "Learning in artificial neural networks: A statistical perspective," Neur. Comput., vol. 1, pp. 425-464, 1989.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.