IEEE Transactions on Neural Networks, Volume 4, Issue 5, 1993, Pages 740–747

Pruning Algorithms—A Survey

Author keywords

[No Author keywords available]

Indexed keywords

ASSOCIATIVE PROCESSING; COMPUTATIONAL COMPLEXITY; CONVERGENCE OF NUMERICAL METHODS; CORRELATION METHODS; DATA STRUCTURES; ERROR ANALYSIS; LEARNING SYSTEMS; NEURAL NETWORKS; SYSTEMS ANALYSIS; TREES (MATHEMATICS);

EID: 0027662338     PISSN: 10459227     EISSN: 19410093     Source Type: Journal    
DOI: 10.1109/72.248452     Document Type: Article
Times cited: 1344

References (37)
  • 1
    • E. B. Baum, “When are k-nearest neighbor and back propagation accurate for feasible sized sets of examples?,” in Neural Networks, Proc. EURASIP Workshop, L. B. Almeida and C. J. Wellekens, Eds., Feb. 1990, pp. 2–25.
  • 2
    • E. B. Baum and D. Haussler, “What size net gives valid generalization?,” Neural Computation, vol. 1, pp. 151–160, 1989.
  • 3
    • A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth, “Learnability and the Vapnik-Chervonenkis dimension,” J. Ass. Comput. Mach., vol. 36, no. 4, pp. 929–965, 1989.
  • 4
    • Y. Chauvin, “A back-propagation algorithm with optimal use of hidden units,” in Advances in Neural Information Processing (1), D. S. Touretzky, Ed. (Denver 1988), 1989, pp. 519–526.
  • 5
    • Y. Chauvin, “Dynamic behavior of constrained back-propagation networks,” in Advances in Neural Information Processing (2), D. S. Touretzky, Ed. (Denver 1989), 1990, pp. 642–649.
  • 6
    • Y. Chauvin, “Generalization performance of overtrained backpropagation networks,” in Neural Networks, Proc. EURASIP Workshop, L. B. Almeida and C. J. Wellekens, Eds., Feb. 1990, pp. 46–55.
  • 9
    • S. J. Hanson and L. Y. Pratt, “Comparing biases for minimal network construction with back-propagation,” in Advances in Neural Information Processing (1), D. S. Touretzky, Ed. (Denver 1988), 1989, pp. 177–185.
  • 10
    • F. Hergert, W. Finnoff, and H. G. Zimmermann, “A comparison of weight elimination methods for reducing complexity in neural networks,” in Proc. Int. Joint Conf. Neural Networks, San Diego, CA, vol. III, 1992, pp. 980–987.
  • 12
    • C. Ji, R. R. Snapp, and D. Psaltis, “Generalizing smoothness constraints from discrete samples,” Neural Computation, vol. 2, no. 2, pp. 188–197, 1990.
  • 13
    • E. D. Karnin, “A simple procedure for pruning back-propagation trained neural networks,” IEEE Trans. Neural Networks, vol. 1, no. 2, pp. 239–242, 1990.
  • 14
    • A. Krogh and J. A. Hertz, “A simple weight decay can improve generalization,” in Advances in Neural Information Processing (4), J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds., 1992, pp. 951–957.
  • 16
    • J. K. Kruschke, “Improving generalization in back-propagation networks with distributed bottlenecks,” in Proc. Int. Joint Conf. Neural Networks, vol. I (Washington, DC), 1989, pp. 443–447.
  • 17
    • S. Y. Kung and Y. H. Hu, “A Frobenius approximation reduction method (FARM) for determining optimal number of hidden units,” in Proc. Int. Joint Conf. Neural Networks, vol. II (Seattle), 1991, pp. 163–168.
  • 18
    • E. Levin, N. Tishby, and S. A. Solla, “A statistical approach to learning and generalization in layered neural networks,” Proc. IEEE, vol. 78, no. 10, pp. 1568–1574, Oct. 1990.
  • 19
    • M. C. Mozer and P. Smolensky, “Skeletonization: A technique for trimming the fat from a network via relevance assessment,” in Advances in Neural Information Processing (1), D. S. Touretzky, Ed. (Denver 1988), 1989, pp. 107–115.
  • 20
    • S. J. Nowlan and G. E. Hinton, “Simplifying neural networks by soft weight-sharing,” Neural Computation, vol. 4, no. 4, pp. 473–493, 1992.
  • 21
    • O. M. Omidvar and C. L. Wilson, “Optimization of neural network topology and information content using Boltzmann methods,” in Proc. Int. Joint Conf. Neural Networks (Baltimore), vol. IV, 1992, pp. 594–599.
  • 23
    • S. Ramachandran and L. Y. Pratt, “Information measure based skeletonisation,” in Advances in Neural Information Processing (4), J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds., 1992, pp. 1080–1087.
  • 24
    • A. Sankar and R. J. Mammone, “Optimal pruning of neural tree networks for improved generalization,” in Proc. Int. Joint Conf. Neural Networks, vol. II (Seattle), 1991, pp. 219–224.
  • 26
    • B. E. Segee and M. J. Carter, “Fault tolerance of pruned multilayer networks,” in Proc. Int. Joint Conf. Neural Networks, vol. II (Seattle), 1991, pp. 447–452.
  • 27
    • J. Sietsma and R. J. F. Dow, “Creating artificial neural networks that generalize,” Neural Networks, vol. 4, no. 1, pp. 67–69, 1991.
  • 29
    • W. E. Simon and J. R. Carter, “Learning to identify letters with REM equations,” in Proc. Int. Joint Conf. Neural Networks, vol. I (Washington, DC), 1990, pp. 727–730.
  • 30
    • W. E. Simon and J. R. Carter, “Removing and adding network connections with recursive error minimization (REM) equations,” in Applications of Artificial Neural Networks, S. K. Rogers, Ed., 1990, pp. 600–606.
  • 31
    • N. Tishby, E. Levin, and S. A. Solla, “Consistent inference of probabilities in layered networks: Predictions and generalization,” in Proc. Int. Joint Conf. Neural Networks, 1989, p. 403.
  • 32
    • L. G. Valiant, “A theory of the learnable,” Commun. Ass. Comput. Mach., vol. 27, no. 11, pp. 1134–1142, 1984.
  • 33
    • J. H. Wang, T. F. Krile, and J. F. Walkup, “Reduction of interconnection weights in higher order associative memory networks,” in Proc. Int. Joint Conf. Neural Networks, vol. II (Seattle), 1991, pp. 177–182.
  • 34
    • A. S. Weigend, D. E. Rumelhart, and B. A. Huberman, “Back-propagation, weight-elimination and time series prediction,” in Proc. 1990 Connectionist Models Summer School, D. Touretzky, J. Elman, T. Sejnowski, and G. Hinton, Eds., 1990, pp. 105–116.
  • 35
    • A. S. Weigend, D. E. Rumelhart, and B. A. Huberman, “Generalization by weight-elimination applied to currency exchange rate prediction,” in Proc. Int. Joint Conf. Neural Networks, vol. I (Seattle), 1991, pp. 837–841.
  • 36
    • A. S. Weigend, D. E. Rumelhart, and B. A. Huberman, “Generalization by weight-elimination with application to forecasting,” in Advances in Neural Information Processing (3), R. Lippmann, J. Moody, and D. Touretzky, Eds., 1991, pp. 875–882.
  • 37
    • D. Whitley and C. Bogart, “The evolution of connectivity: Pruning neural networks using genetic algorithms,” in Proc. Int. Joint Conf. Neural Networks, vol. I (Washington, DC), 1990, p. 134.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.