Volume , Issue , 2009, Pages 3212-3219

Improving gradient-based learning algorithms for large scale feedforward networks

Author keywords

Backpropagation; Gradient based learning; Large scale networks; Opposite transfer functions; Opposition based computing

Indexed keywords

BACKPROPAGATION; GRADIENT-BASED LEARNING; LARGE SCALE NETWORKS; OPPOSITE TRANSFER FUNCTIONS; OPPOSITION-BASED COMPUTING;

EID: 70449455565     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/IJCNN.2009.5178798     Document Type: Conference Paper
Times cited : (16)

References (22)
  • 5
    • N. Lagaros and M. Papadrakakis. Learning improvement of neural networks used in structural optimization. Advances in Engineering Software, 35(1):9-25, 2004.
  • 6
    • S. Rahnamayan, H. R. Tizhoosh, and S. Salama. A Novel Population Initialization Method for Accelerating Evolutionary Algorithms. (to appear) Computers and Mathematics with Applications, 2006.
  • 8
    • S. Rahnamayan, H. R. Tizhoosh, and S. Salama. Opposition-based Differential Evolution Algorithms for Optimization of Noisy Problems. In IEEE Congress on Evolutionary Computation, pages 6756-6763, 2006.
  • 14
    • A. Thome and M. Tenorio. Dynamic adaptation of the error surface for the acceleration of the training of neural networks. In IEEE World Congress on Computational Intelligence, pages 447-452, 1992.
  • 18
    • P. van der Smagt and G. Hirzinger. Solving the ill-conditioning in neural network learning. In Neural Networks: Tricks of the Trade, pages 193-206. Springer-Verlag, 1998.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.