Volume 22, Issue 10, 2010, Pages 2655-2677

Convergence analysis of three classes of split-complex gradient algorithms for complex-valued recurrent neural networks

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHM; ARTIFICIAL INTELLIGENCE; ARTIFICIAL NEURAL NETWORK; COMPUTER SIMULATION; LETTER; NONLINEAR SYSTEM; STANDARD;

EID: 78149331700     PISSN: 08997667     EISSN: 1530888X     Source Type: Journal    
DOI: 10.1162/NECO_a_00021     Document Type: Letter
Times cited: 27

References (27)
  • 1
    • Amin, Md. F., & Murase, K. (2009). Single-layered complex-valued neural network for real-valued classification problems. Neurocomputing, 72(4-6), 945-955.
  • 2
    • Coelho, P. H. G. (2001). A complex EKF-RTRL neural network. In Proc. Int. Joint Conf. Neural Networks (pp. 120-125). San Francisco: Morgan Kaufmann.
  • 3
    • Goh, S. L., & Mandic, D. P. (2004). A complex-valued RTRL algorithm for recurrent neural networks. Neural Computation, 16(12), 2699-2713.
  • 4
    • Goh, S. L., & Mandic, D. P. (2007). Stochastic gradient-adaptive complex-valued nonlinear neural adaptive filters with a gradient-adaptive step size. IEEE Trans. Neural Networks, 18(5), 1511-1516.
  • 5
    • Gori, M., & Maggini, M. (1996). Optimal convergence of on-line backpropagation. IEEE Trans. Neural Networks, 7(1), 251-254.
  • 6
    • Hanna, A. I., & Mandic, D. P. (2003). A fully adaptive normalized nonlinear gradient descent algorithm for complex-valued nonlinear adaptive filters. IEEE Trans. Signal Processing, 51(10), 2540-2549.
  • 9
    • Leung, H., & Haykin, S. (1991). The complex backpropagation algorithm. IEEE Trans. Signal Processing, 39(9), 2101-2104.
  • 10
    • Mandic, D. P. (2004). A generalized normalized gradient descent algorithm. IEEE Signal Processing Letters, 11(2), 115-118.
  • 13
    • Mandic, D. P., Hanna, A. I., & Razaz, M. (2001). A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size. IEEE Signal Processing Letters, 8(11), 295-297.
  • 15
    • Widrow, B., McCool, J., & Ball, M. (1975). The complex LMS algorithm. Proc. IEEE, 63, 712-720.
  • 16
    • Williams, R. J., & Zipser, D. A. (1989). A learning algorithm for continually running fully recurrent neural networks. Neural Computation, 1(2), 270-280.
  • 17
    • Wu, W., Feng, G. R., Li, Z. X., & Xu, Y. S. (2005). Deterministic convergence of an online gradient method for BP neural networks. IEEE Trans. Neural Networks, 16(3), 533-540.
  • 18
    • Wu, W., Shao, H. M., & Qu, D. (2005). Strong convergence for gradient methods for BP networks training. In Proc. Intl. Conf. Neural Networks and Brains (pp. 332-334). San Francisco: Morgan Kaufmann.
  • 19
    • Wu, W., Xu, D. P., & Li, Z. X. (2008). Convergence of gradient method for Elman networks. Appl. Math. Mech., 29(9), 1231-1238.
  • 20
    • Wu, W., Zhang, N. M., Li, Z. X., Li, L., & Liu, Y. (2008). Convergence of gradient method with momentum for back-propagation neural networks. J. Comput. Math., 26(4), 613-623.
  • 21
    • Xiong, Y., Wu, W., Kang, X., & Zhang, C. (2007). Training pi-sigma network by online gradient algorithm with penalty for small weight update. Neural Computation, 19(12), 3356-3368.
  • 22
    • Xu, D. P., Li, Z. X., & Wu, W. (2010). Convergence of gradient method for a fully recurrent neural network. Soft Computing, 14(3), 245-250.
  • 23
    • Yang, S. S., Siu, S., & Ho, C. L. (2008). Analysis of the initial values in split-complex backpropagation algorithm. IEEE Trans. Neural Networks, 19(9), 1564-1573.
  • 24
    • Zhang, C., Wu, W., Chen, X. H., & Xiong, Y. (2008). Convergence of BP algorithm for product unit neural networks with exponential weights. Neurocomputing, 72(1-3), 513-520.
  • 25
    • Zhang, H. S., Wu, W., Liu, F., & Yao, M. C. (2009). Boundedness and convergence of online gradient method with penalty for feedforward neural networks. IEEE Trans. Neural Networks, 20(6), 1050-1054.
  • 26
    • Zhang, H. S., Xu, D. P., & Wang, Z. P. (2010). Convergence of an online split-complex gradient algorithm for complex-valued neural networks. Discrete Dyn. Nat. Soc. DOI: 10.1155/2010/829692.
  • 27
    • Zhang, H. S., Zhang, C., & Wu, W. (2009). Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks. Discrete Dyn. Nat. Soc. DOI: 10.1155/2009/329173.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.