Volume 8, Issue 3, 1997, Pages 519-531

An iterative pruning algorithm for feedforward neural networks

Author keywords

Feedforward neural networks; Generalization; Hidden neurons; Iterative methods; Least squares methods; Network pruning; Pattern recognition; Structure simplification

Indexed keywords

ALGORITHMS; ITERATIVE METHODS; LEAST SQUARES APPROXIMATIONS; PATTERN RECOGNITION; PROBLEM SOLVING

EID: 0031142667     PISSN: 1045-9227     EISSN: None     Source Type: Journal
DOI: 10.1109/72.572092     Document Type: Article
Times cited: 276
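
The title and keywords describe pruning hidden neurons from a trained feedforward network using an iterative, least-squares-based weight adjustment. Below is a minimal, hypothetical NumPy sketch of that general idea, not the paper's exact algorithm: one hidden unit is removed and a least-squares problem is solved so that the surviving hidden-to-output weights approximately reproduce the removed unit's contribution over the training set. The function name, array shapes, and the plain lstsq solve are illustrative assumptions.

    import numpy as np

    def prune_hidden_unit(H, W, j):
        # H: (patterns, hidden) matrix of hidden-unit activations on the training set
        # W: (hidden, outputs) hidden-to-output weight matrix
        # j: index of the hidden unit to remove
        keep = [k for k in range(H.shape[1]) if k != j]
        H_keep = H[:, keep]                      # activations of the surviving units
        target = np.outer(H[:, j], W[j])         # removed unit's contribution to the output net input
        # Least-squares correction dW so that H_keep @ (W[keep] + dW) ~= H @ W
        dW, *_ = np.linalg.lstsq(H_keep, target, rcond=None)
        return W[keep] + dW

    # Illustrative use with random data (shapes are assumptions, not taken from the paper)
    rng = np.random.default_rng(0)
    H = np.tanh(rng.standard_normal((100, 8)))   # 100 training patterns, 8 hidden units
    W = rng.standard_normal((8, 3))              # 3 output units
    W_pruned = prune_hidden_unit(H, W, j=2)      # remove hidden unit 2
    print(W_pruned.shape)                        # -> (7, 3)

Repeating such a step, with retraining or validation checks between removals, gives the kind of iterative structure-simplification procedure the keywords suggest.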

References (55)
1. D. R. Hush and B. G. Horne, "Progress in supervised neural networks," IEEE Signal Processing Mag., vol. 10, pp. 8-39, 1993.
2. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning internal representations by error propagation," in Parallel Distributed Processing, Vol. 1: Foundations, D. E. Rumelhart and J. L. McClelland, Eds. Cambridge, MA: MIT Press, 1986, pp. 318-362.
3. S. Y. Kung and J. N. Hwang, "An algebraic projection analysis for optimal hidden units size and learning rates in backpropagation learning," in Proc. IEEE Int. Conf. Neural Networks, San Diego, CA, vol. 1, 1988, pp. 363-370.
4. D. C. Plaut and G. E. Hinton, "Learning sets of filters using backpropagation," Comput. Speech Language, vol. 2, pp. 35-61, 1987.
5. D. J. Burr, "Experiments on neural net recognition of spoken and written text," IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-36, pp. 1162-1168, 1988.
6. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by backpropagating errors," Nature, vol. 323, pp. 533-536, 1986.
7. X.-H. Yu, "Can backpropagation error surface not have local minima?" IEEE Trans. Neural Networks, vol. 3, pp. 1019-1021, 1992.
9. M. D. Emmerson and R. I. Damper, "Determining and improving the fault tolerance of multilayer perceptrons in a pattern-recognition application," IEEE Trans. Neural Networks, vol. 4, pp. 788-793, 1993.
10. E. B. Baum and D. Haussler, "What size net gives valid generalization?" Neural Computa., vol. 1, pp. 151-160, 1989.
12. Y. Le Cun, "Generalization and network design strategies," in Connectionism in Perspective, R. Pfeifer, Z. Schreter, F. Fogelman-Soulie, and L. Steels, Eds. Amsterdam: Elsevier, 1989, pp. 143-155.
13. Y. Chauvin, "Generalization performance of overtrained backpropagation networks," in Neural Networks - Proc. EURASIP Wkshp. 1990, L. B. Almeida and C. J. Wellekens, Eds. Berlin: Springer-Verlag, 1990, pp. 46-55.
14. G. G. Towell, M. K. Craven, and J. W. Shavlik, "Constructive induction in knowledge-based neural networks," in Proc. 8th Int. Wkshp. Machine Learning, L. A. Birnbaum and G. C. Collins, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 213-217.
17. S. E. Fahlman and C. Lebiere, "The cascade-correlation learning architecture," in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 524-532.
19. T. Ash, "Dynamic node creation in backpropagation networks," Connection Sci., vol. 1, no. 4, pp. 365-375, 1989.
20. M. Mézard and J.-P. Nadal, "Learning in feedforward layered networks: The Tiling algorithm," J. Phys. A, vol. 22, pp. 2191-2204, 1989.
21. R. Reed, "Pruning algorithms - A review," IEEE Trans. Neural Networks, vol. 4, pp. 740-747, 1993.
22. S. C. Huang and Y. F. Huang, "Bounds on the number of hidden neurons in multilayer perceptrons," IEEE Trans. Neural Networks, vol. 2, pp. 47-55, 1991.
23. G. E. Hinton, "Connectionist learning procedures," Artificial Intell., vol. 40, no. 1, pp. 143-150, 1989.
24. Y. Chauvin, "A backpropagation algorithm with optimal use of hidden units," in Advances in Neural Information Processing Systems 1, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1989, pp. 519-526.
25. A. S. Weigend, D. E. Rumelhart, and B. A. Huberman, "Generalization by weight-elimination with application to forecasting," in Advances in Neural Information Processing Systems 3, R. P. Lippmann, J. E. Moody, and D. S. Touretzky, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 875-882.
26. J. K. Kruschke, "Creating local and distributed bottlenecks in hidden layers of backpropagation networks," in Proc. 1988 Connectionist Models Summer School, D. S. Touretzky, G. E. Hinton, and T. J. Sejnowski, Eds. San Mateo, CA: Morgan Kaufmann, 1988, pp. 120-126.
29. _, "Creating artificial neural networks that generalize," Neural Networks, vol. 4, pp. 67-79, 1991.
31. B. Hassibi and D. G. Stork, "Second-order derivatives for network pruning: Optimal brain surgeon," in Advances in Neural Information Processing Systems 5, S. J. Hanson, J. D. Cowan, and C. L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 164-171.
32. M. C. Mozer and P. Smolensky, "Using relevance to reduce network size automatically," Connection Sci., vol. 1, no. 1, pp. 3-16, 1989.
33. E. D. Karnin, "A simple procedure for pruning backpropagation trained neural networks," IEEE Trans. Neural Networks, vol. 1, pp. 239-242, 1990.
34. A. Burkitt, "Optimization of the architecture of feedforward neural networks with hidden layers by unit elimination," Complex Syst., vol. 5, pp. 371-380, 1991.
35. F. L. Chung and T. Lee, "A node pruning algorithm for backpropagation networks," Int. J. Neural Syst., vol. 3, no. 3, pp. 301-314, 1992.
37. S. Y. Kung and Y. H. Hu, "A Frobenius approximation reduction method (FARM) for determining optimal number of hidden units," in Proc. Int. J. Conf. Neural Networks, Seattle, WA, vol. 2, 1991, pp. 163-168.
38. Q. Xue, Y. H. Hu, and W. J. Tompkins, "Analyses of the hidden units of backpropagation model by singular value decomposition (SVD)," in Proc. Int. J. Conf. Neural Networks, Washington, D.C., vol. 1, 1990, pp. 739-742.
39. Y. H. Hu, Q. Xue, and W. J. Tompkins, "Structural simplification of a feedforward multilayer perceptron artificial neural network," in Proc. Int. Conf. Acoust., Speech, Signal Processing, Toronto, Canada, 1991, pp. 1061-1064.
40. M. A. Sartori and P. J. Antsaklis, "A simple method to derive bounds on the size and to train multilayer neural networks," IEEE Trans. Neural Networks, vol. 2, pp. 467-471, 1991.
42. A. Björck, "Methods for sparse linear least-squares problems," in Sparse Matrix Computations, J. R. Bunch and D. J. Rose, Eds. New York: Academic, 1976, pp. 177-199.
43. I. S. Duff, "A survey of sparse matrix research," Proc. IEEE, vol. 65, no. 4, pp. 500-535, 1977.
44. A. Björck and T. Elfving, "Accelerated projection methods for computing pseudoinverse solutions of systems of linear equations," BIT, vol. 19, pp. 145-163, 1979.
46.
48. A. Sperduti and A. Starita, "Speed up learning and network optimization with extended backpropagation," Neural Networks, vol. 6, pp. 365-383, 1993.
49. L. Niles, H. Silverman, J. Tajchman, and M. Bush, "How limited training data can allow a neural network to outperform an optimal statistical classifier," in Proc. Int. Conf. Acoust., Speech, Signal Processing, Glasgow, Scotland, vol. 1, 1989, pp. 17-20.
50. P. Burrascano, "Learning vector quantization for the probabilistic neural network," IEEE Trans. Neural Networks, vol. 2, pp. 458-461, 1991.
51. M. J. J. Holt, "Comparison of generalization in multilayer perceptrons with the log-likelihood and least-squares cost functions," in Proc. 11th Int. Conf. Pattern Recognition, The Hague, The Netherlands, vol. 2, 1992, pp. 17-20.
52. W. T. Lee and M. F. Tenorio, "On an asymptotically optimal adaptive classifier design criterion," IEEE Trans. Pattern Anal. Machine Intell., vol. 15, pp. 312-318, 1993.
53. N. Morgan and H. Bourlard, "Generalization and parameter estimation in feedforward nets: Some experiments," in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 630-637.
54. A. S. Weigend, B. A. Huberman, and D. E. Rumelhart, "Predicting the future: A connectionist approach," Int. J. Neural Syst., vol. 1, no. 3, pp. 193-209, 1990.
55. W. Finnoff, F. Hergert, and H. G. Zimmermann, "Improving model selection by nonconvergent methods," Neural Networks, vol. 6, pp. 771-783, 1993.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.