Volume 12, Issue 8, 2000, Pages 1901-1927

A signal-flow-graph approach to on-line gradient calculation

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHM; ARTICLE; ARTIFICIAL NEURAL NETWORK; FEEDBACK SYSTEM; INFORMATION PROCESSING; NONLINEAR SYSTEM;

EID: 0034241899     PISSN: 08997667     EISSN: None     Source Type: Journal    
DOI: 10.1162/089976600300015196     Document Type: Article
Times cited: 7

References (32)
  • 1
    • Almeida, L. B. (1987). A learning rule for asynchronous perceptrons with feedback in combinatorial environment. In Proc. Int. Conf. on Neural Networks (Vol. 2, pp. 609-618).
  • 2
    • Back, A. D., & Tsoi, A. C. (1991). FIR and IIR synapses, a new neural network architecture for time series modelling. Neural Computation, 3, 375-385.
  • 3
    • Beaufays, F., & Wan, E. (1994). Relating real-time backpropagation and backpropagation-through-time: An application of flow graph interreciprocity. Neural Computation, 6, 296-306.
  • 4
    • Campolucci, P. (1998). A circuit theory approach to recurrent neural network architectures and learning methods. Doctoral dissertation in English, University of Bologna, Italy. PDF available online at http://nnsp.eealab.unian.it/campolucci_P or by request from campoluc@tiscalinet.it.
  • 10
    • Gherrity, M. (1989). A learning algorithm for analog, fully recurrent neural networks. In Proc. Int. Joint Conference on Neural Networks (Vol. 1, pp. 643-644).
  • 12
    • Horne, B. G., & Giles, C. L. (1995). An experimental comparison of recurrent neural networks. In G. Tesauro, D. Touretzky, & T. Leen (Eds.), Advances in Neural Information Processing Systems, 7. Cambridge, MA: MIT Press.
  • 13
    • Lee, A. Y. (1974). Signal flow graphs - Computer-aided system analysis and sensitivity calculations. IEEE Transactions on Circuits and Systems, CAS-21, 209-216.
  • 15
    • Mason, S. J. (1953). Feedback theory - Some properties of signal-flow graphs. Proc. Institute of Radio Engineers, 41, 1144-1156.
  • 16
    • Mason, S. J. (1956). Feedback theory - Further properties of signal-flow graphs. Proc. Institute of Radio Engineers, 44, 920-926.
  • 17
    • Narendra, K. S., & Parthasarathy, K. (1991). Gradient methods for the optimization of dynamical systems containing neural networks. IEEE Trans. on Neural Networks, 2, 252-262.
  • 18
    • Nerrand, O., Roussel-Ragot, P., Personnaz, L., Dreyfus, G., & Marcos, S. (1993). Neural networks and nonlinear adaptive filtering: Unifying concepts and new algorithms. Neural Computation, 5, 165-199.
  • 20
    • Osowski, S. (1994). Signal flow graphs and neural networks. Biological Cybernetics, 70, 387-395.
  • 21
    • Pearlmutter, B. A. (1995). Gradient calculations for dynamic recurrent neural networks: A survey. IEEE Trans. on Neural Networks, 6, 1212-1228.
  • 23
    • Srinivasan, B., Prasad, U. R., & Rao, N. J. (1994). Backpropagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models. IEEE Trans. on Neural Networks, 5, 213-228.
  • 24
    • Tellegen, B. D. H. (1952). A general network theorem, with applications. Philips Res. Rep., 7, 259-269.
  • 25
    • Tsoi, A. C., & Back, A. D. (1994). Locally recurrent globally feedforward networks: A critical review of architectures. IEEE Transactions on Neural Networks, 5, 229-239.
  • 26
    • Uncini, A., Vecci, L., Campolucci, P., & Piazza, F. (1999). Complex-valued neural networks with adaptive spline activation function for digital radio links nonlinear equalization. IEEE Transactions on Signal Processing, 47, 505-514.
  • 27
    • Wan, E. A., & Beaufays, F. (1996). Diagrammatic derivation of gradient algorithms for neural networks. Neural Computation, 8, 182-201.
  • 28
    • Wan, E. A., & Beaufays, F. (1998). Diagrammatic methods for deriving and relating temporal neural networks algorithms. In M. Gori & C. L. Giles (Eds.), Adaptive processing of sequences and data structures. Berlin: Springer-Verlag.
  • 29
    • Werbos, P. J. (1990). Backpropagation through time: What it does and how to do it. Proc. of IEEE, 78, 1550-1560.
  • 30
    • Williams, R. J., & Peng, J. (1990). An efficient gradient-based algorithm for on-line training of recurrent network trajectories. Neural Computation, 2, 490-501.
  • 31
    • Williams, R. J., & Zipser, D. (1989). A learning algorithm for continually running fully recurrent neural networks. Neural Computation, 1, 270-280.
  • 32
    • Williams, R. J., & Zipser, D. (1994). Gradient-based learning algorithms for recurrent networks and their computational complexity. In Y. Chauvin & D. E. Rumelhart (Eds.), Backpropagation: Theory, architectures and applications (pp. 433-486). Hillsdale, NJ: Erlbaum.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.