Volume 34, Issue 3, 2011, Pages 241-258

A novel pruning algorithm for optimizing feedforward neural network of classification problems

Author keywords

Backpropagation training algorithm; Classification; Data mining; Input and hidden neurons pruning; Multilayer feedforward neural network; Significant measure

Indexed keywords

BACKPROPAGATION TRAINING ALGORITHM; HIDDEN NEURONS; INPUT AND HIDDEN NEURONS PRUNING; MULTILAYER FEEDFORWARD NEURAL NETWORK; NETWORK PRUNING; NEURAL NETWORK MODEL; PRUNING ALGORITHMS; REAL DATA SETS; SIGMOIDAL ACTIVATION; SIGNIFICANT MEASURE;

EID: 84855534224     PISSN: 1370-4621     EISSN: 1573-773X     Source Type: Journal
DOI: 10.1007/s11063-011-9196-7     Document Type: Article
Times cited: 48

References (43)
  • 1
    • Reitermanova Z (2008) Feedforward neural networks-architecture optimization and knowledge extraction. In: WDS'08 proceedings of contributed papers, Part I, pp 159-164
  • 2
    • Castellano G, Fanelli AM, Pelillo M (1997) An iterative pruning algorithm for feedforward neural networks. IEEE Trans Neural Netw 8(3):519-530
  • 3
    • Ahmmed S, Abdullah-Al-Mamun K, Islam M (2007) A novel algorithm for designing three layered artificial neural networks. Int J Soft Comput 2(3):450-458
  • 4
    • Henrique M, Lima L, Seborg E (2000) Model structure determination in neural network models. Chem Eng Sci 55:5457-5469
  • 5
    • Ponnapalli PVS, Ho KC, Thomson M (1999) A formal selection and pruning algorithm for feedforward artificial neural network optimization. IEEE Trans Neural Netw 10(4):964-968
  • 6
    • Chauvin Y (1990) Generalization performance of overtrained backpropagation networks. In: Almeida LB, Wellekens CJ (Eds) Proceedings of the neural networks EURASIP workshop, pp 46-55
  • 7
    • Choi B, Lee JH, Kim DH (2008) Solving local minima problem with large number of hidden nodes on two layered feedforward artificial neural networks. Neurocomputing 71:3640-3643
  • 8
    • Rumelhart DE, Hinton GE, Williams RJ (1986) Learning representations by back-propagating errors. Nature 323:533-536
  • 9
    • Yu XH (1992) Can backpropagation error surface not have local minima. IEEE Trans Neural Netw 3:1019-1021
  • 10
    • Reed R (1993) Pruning algorithms-a survey. IEEE Trans Neural Netw 4(5):740-747
  • 12
    • Emmerson MD, Damper RI (1993) Determining and improving the fault tolerance of multilayer perceptrons in a pattern-recognition application. IEEE Trans Neural Netw 4:788-793
  • 13
    • Aran O, Yildiz OT, Alpaydin E (2009) An incremental framework based on cross validation for estimating the architecture of MLP. Int J Pattern Recognit Artif Intell 23(2):159-190
  • 14
    • Zhang J, Morris A (1997) A sequential learning approach for single hidden layer neural networks. Neural Netw 11:65-80
  • 15
    • Moody JO, Antsaklis PJ (1996) The dependence identification neural network construction algorithm. IEEE Trans Neural Netw 7(1):3-15
  • 16
    • Setiono R, Kwong Hui LC (1995) Use of a quasi-Newton method in a feedforward neural network construction algorithm. IEEE Trans Neural Netw 6(1):273-277
  • 17
    • Engelbrecht AP (2001) A new pruning heuristic based on variance analysis of sensitivity information. IEEE Trans Neural Netw 12(6):1386-1399
  • 18
    • Setiono R (1997) A penalty-function approach for pruning feedforward neural networks. Neural Comput 9(1):185-204
  • 20
    • Huang SC, Huang YF (1991) Bounds on the number of hidden neurons in multilayer perceptrons. IEEE Trans Neural Netw 2:47-55
  • 21
    • Paetz J (2004) Reducing the number of neurons in Radial Basis Function networks with dynamic decay adjustment. Neurocomputing 62:79-91
  • 24
    • Hassibi B, Stork DG, Wolf GJ (1993) Optimal brain surgeon and general network pruning. In: Proceedings of IEEE ICNN'93, vol 1, pp 293-299
  • 27
    • Karnin ED (1990) A simple procedure for pruning back-propagation trained neural networks. IEEE Trans Neural Netw 1(2):239-242
  • 28
    • Huynh TQ, Setiono R (2005) Effective neural network pruning using cross-validation. In: Proceedings of the IEEE international joint conference on neural networks (IJCNN 2005), vol 2, pp 972-977
  • 29
    • Wan W, Mabu S, Shimada K, Hirasawa K, Hu J (2009) Enhancing the generalization ability of neural networks through controlling the hidden layers. Appl Soft Comput 9:404-414
  • 30
    • Hagiwara M (1994) A simple and effective method for removal of hidden units and weights. Neurocomputing 6(2):207-218
  • 31
    • Sietsma J, Dow RJF (1988) Neural net pruning: why and how. In: Proceedings of the IEEE international conference on neural networks, vol 1, San Diego, CA, pp 325-333
  • 32
    • Hassibi B, Stork DG (1993) Second order derivatives for network pruning: optimal brain surgeon. In: Lee Giles C, Hanson SJ, Cowan JD (Eds) Advances in neural information processing systems, vol 5, pp 164-171
  • 35
    • Xing HJ, Hu BG (2009) Two-phase construction of multilayer perceptrons using information theory. IEEE Trans Neural Netw 20(4):715-721
  • 36
    • Whitley D, Bogart C (1990) The evolution of connectivity: pruning neural networks using genetic algorithms. In: International joint conference on neural networks, vol 1, pp 134-137
  • 38
    • Zeng X, Yeung DS (2006) Hidden neuron pruning of multilayer perceptrons using a quantified sensitivity measure. Neurocomputing 69(7-9):825-837
  • 39
    • Lauret P, Fock E, Mara TA (2006) A node pruning algorithm based on a Fourier amplitude sensitivity test method. IEEE Trans Neural Netw 17(2):273-293
  • 40
    • Xu J, Ho DWC (2006) A new training and pruning algorithm based on node dependence and Jacobian rank deficiency. Neurocomputing 70:544-558
  • 41
    • Chung FL, Lee T (1992) A node pruning algorithm for backpropagation networks. Int J Neural Syst 3(3):301-314
  • 42
    • Kruschke JK (1988) Creating local and distributed bottlenecks in hidden layers of backpropagation networks. In: Touretzky DS, Hinton GE, Sejnowski TJ (Eds) Proceedings of the 1988 Connectionist Models Summer School, Morgan Kaufmann, San Mateo, CA, pp 120-126


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.