Volume 15, Issue 4, 2009, Pages 573-589

The combined statistical stepwise and iterative neural network pruning algorithm

Author keywords

Feed forward neural network (NN); Iterative Pruning algorithm (IP); Pruning algorithm; SSM Iterative Pruning algorithm (SSIP); Statistical Stepwise Method (SSM)

EID: 69249117844     PISSN: 1079-8587     EISSN: 2326-005X     Source Type: Journal
DOI: 10.1080/10798587.2009.10643050     Document Type: Article
Times cited: 15

References (35)
  • 3 A. P. Engelbrecht. “A New Pruning Heuristic Based on Variance Analysis of Sensitivity Information.” IEEE Transactions on Neural Networks, vol. 12, no. 6, pp. 1386-1399, 2001.
  • 4 G. Castellano and A. M. Fanelli. “An Iterative Pruning Algorithm for Feed-forward Neural Networks.” IEEE Transactions on Neural Networks, vol. 8, no. 3, 1997.
  • 5 F. Fnaiech, M. Sayadi, and M. Najim. “Texture Characterization Based on Two Dimensional Lattice Coefficients.” IEEE ICASSP'98, Seattle, Washington, USA, and IEEE ICASSP'99, Phoenix, Arizona, USA.
  • 7 A. Dumitras and F. Kossentini. “Feed-forward Neural Network Design with Tridiagonal Symmetry Constraints.” IEEE Transactions on Signal Processing, vol. 48, no. 5.
  • 9 B. Girard, M. Cottrell, and Y. Girard. “Réseaux de Neurones et Séries Temporelles.” XXIV Journées de Statistique, ASU, Bruxelles, 1992.
  • 11 C. C. Teng and B. W. Wah. “Automated Learning for Reducing the Configuration of a Feedforward Neural Network.” IEEE Transactions on Neural Networks, vol. 7, no. 5.
  • 12 D. C. Psichogios and L. H. Ungar. “SVD-NET: An Algorithm that Automatically Selects Network Structures.” IEEE Transactions on Neural Networks, vol. 5, no. 3.
  • 13 E. D. Karnin. “A Simple Procedure for Pruning Back-Propagation Trained Neural Networks.” IEEE Transactions on Neural Networks, vol. 1, no. 2.
  • 14 F. Fnaiech, S. Abid, and M. Najim. “A Fast Feed-Forward Training Algorithm Using a Modified Form of the Standard Back-Propagation Algorithm.” IEEE Transactions on Neural Networks, 2001.
  • 16 G. Liu and K. G. Ramakrishnan. “A*Prune: An Algorithm for Finding K Shortest Paths Subject to Multiple Constraints.” IEEE INFOCOM, 2001.
  • 17 H. Sakai, Y. Iiguni, and H. Tokumaru. “Real-Time Learning Algorithm for a Multilayered Neural Network Based on the Extended Kalman Filter.” IEEE Transactions on Signal Processing, vol. 40.
  • 19 J. Hu, W. Wan, K. Hirasawa, and J. Murata. “Relation between Weight Initialization of Neural Networks and Pruning Algorithms: Case Study on Mackey-Glass Time Series.” IEEE, 2001.
  • 22 L. Fletcher, A. P. Engelbrecht, and I. Cloete. “Variance Analysis of Sensitivity Information for Pruning Feed-forward Neural Networks.” IEEE Int. Joint Conf. Neural Networks, Washington, DC, 1999.
  • 23 L. Prechelt. “Connection Pruning with Static and Adaptive Pruning Schedules.” Neurocomputing, vol. 16, no. 1, 1997.
  • 24 M. Hagiwara. “Removal of Hidden Units and Weights for Back-Propagation Networks.” Proc. 1993 Int. Joint Conf. Neural Networks, vol. 1.
  • 25 M. Jones, F. Girosi, and T. Poggio. “Regularization Theory and Neural Network Architectures.” Neural Computation, vol. 7.
  • 32 S. Yasui. “Convergence Suppression and Divergence Facilitation: Minimum and Joint Use of Hidden Units by Multiple Outputs.” Neural Networks, vol. 10, no. 2.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.