Neurocomputing, Volume 12, Issues 2-3, 1996, Pages 223-248

Variable selection with neural networks

Author keywords

Dimensionality reduction; Neural network pruning; Regularization; Variable selection

Indexed keywords

CONSTRAINT THEORY; FUNCTIONS; LEARNING SYSTEMS; SELECTION; STATISTICAL METHODS

EID: 0030608136     PISSN: 0925-2312     EISSN: None     Source Type: Journal
DOI: 10.1016/0925-2312(95)00121-2     Document Type: Article
Times cited: 73

References (37)
  • [1] H. Akaike, A new look at the statistical model identification, IEEE Trans. Auto. Control 19 (1974) 716-723.
  • [2] R. Battiti, Using mutual information for selecting features in supervised neural net learning, IEEE Trans. NN 5 (4) (1994).
  • [3] C. Bishop, Exact calculation of the Hessian matrix for the multilayer perceptron, Neural Comp. 4 (1992) 494-501.
  • [4] M. de Bollivier, P. Gallinari and S. Thiria, Cooperation of neural nets and task decomposition, in: Proc. IJCNN'91, Seattle, vol. II (1991) 573-576.
  • [8] Y. Chauvin, A back-propagation algorithm with optimal use of hidden units, in: D. Touretzky, ed., Neural Information Processing Systems (NIPS'88), vol. 1 (Morgan Kaufmann, 1989) 519-526.
  • [9] Y. Chauvin, Dynamic behavior of constrained back-propagation networks, in: D. Touretzky, ed., Neural Information Processing Systems, Denver 1989, vol. 2 (Morgan Kaufmann, 1990) 643-649.
  • [12] J.S. Denker and Y. Le Cun, Transforming neural net output levels to probability distributions, in: R. Lippmann, J. Moody and D. Touretzky, eds., Advances in Neural Information Processing Systems, vol. 3 (Morgan Kaufmann, 1991) 853-859.
  • [14] W. Finnoff, F. Hergert and H.G. Zimmermann, Improving model selection by nonconvergent methods, Neural Networks 6 (6) (1993) 771-783.
  • [16] J.D.F. Habbema and J. Hermans, Selection of variables in discriminant analysis by F-statistic and error rate, Technometrics 19 (4) (1977).
  • [17] S.J. Hanson and L.Y. Pratt, Comparing biases for minimal network construction with back-propagation, in: D.S. Touretzky, ed., Neural Information Processing Systems (NIPS'89), vol. 1 (Morgan Kaufmann, 1989) 177-185.
  • [18] B. Hassibi and D.G. Stork, Second order derivatives for network pruning: Optimal brain surgeon, in: S.J. Hanson, J.D. Cowan and C.L. Giles, eds., Neural Information Processing Systems (NIPS'92), vol. 5 (Morgan Kaufmann, 1993) 164-171.
  • [20] E.D. Karnin, A simple procedure for pruning back-propagation trained neural networks, IEEE Trans. NN 1 (2) (1990) 239-242.
  • [23] D.J.C. MacKay, A practical Bayesian framework for backpropagation networks, Neural Computation 4 (1992) 448-472.
  • [24] D.J.C. MacKay, Bayesian interpolation, Neural Computation 4 (1992) 415-447.
  • [25] D.J.C. MacKay, Bayesian non-linear modeling for the energy prediction competition, Tech. Rep., University of Cambridge, 1993.
  • [27] T.J. Mitchell and J.J. Beauchamp, Bayesian variable selection in linear regression, JASA 83 (1988) 1023-1036.
  • [29] M.C. Mozer and P. Smolensky, Skeletonization: a technique for trimming the fat from a network via relevance assessment, NIPS 1 (1989) 107-115.
  • [30] S.J. Nowlan and G.E. Hinton, Simplifying neural networks by soft weight-sharing, Neural Computation 4 (1992) 473-493.
  • [31] F. Girosi, M. Jones and T. Poggio, Regularization theory and neural networks architectures, Neural Computation 7 (2) (1995) 219-269.
  • [32] D.E. Rumelhart, G.E. Hinton and R.J. Williams, Learning internal representations by error propagation, in: D.E. Rumelhart and J.L. McClelland, eds., Parallel Distributed Processing, vol. 1 (MIT Press, 1986) 318-362.
  • [34] G. Schwarz, Estimating the dimension of a model, Annals of Statistics 6 (2) (1978) 461-464.
  • [36] M.L. Thompson, Selection of variables in multiple regression: Part 1 & 2, Int. Stat. Rev. 46 (1978) 1-19 and 129-146.
  • [37] A.S. Weigend, D.E. Rumelhart and B.A. Huberman, Generalization by weight elimination with application to forecasting, in: R.P. Lippmann, J.E. Moody and D.S. Touretzky, eds., Neural Information Processing Systems (NIPS'90), vol. 3 (Morgan Kaufmann, 1991) 875-882.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.