1. C. L. Giles and C. W. Omlin, "Extraction, insertion, and refinement of symbolic rules in dynamically driven recurrent neural networks," Connection Sci., special issue on architectures for integrating symbolic and neural processes, vol. 5, nos. 3/4, p. 307, 1993.
2. R. L. Watrous and G. M. Kuhn, "Induction of finite-state languages using second-order recurrent networks," Neural Computa., vol. 4, pp. 406-414, May 1992.
3. H. Siegelmann and E. Sontag, "On the computational power of neural nets," in Proc. 5th ACM Wkshp. Computa. Learning Theory, New York, 1992, pp. 440-449.
4. P. Frasconi, M. Gori, M. Maggini, and G. Soda, "Representation of finite-state automata in recurrent radial basis function networks," Machine Learning, vol. 23, pp. 5-32, 1996.
5. J. F. Kolen, "Recurrent networks: State machines or iterated function systems?," in Proc. 1993 Connectionist Models Summer School, M. C. Mozer, P. Smolensky, D. S. Touretzky, J. L. Elman, and A. S. Weigend, Eds. Hillsdale, NJ: Lawrence Erlbaum, 1994, pp. 203-210.
6. M. Casey, "The dynamics of discrete-time computation, with the application to recurrent neural networks and finite-state machine extraction," Neural Computa., vol. 8, no. 6, pp. 1135-1178, 1996.
7. C. B. Miller and C. L. Giles, "Experimental comparison of the effect of order in recurrent neural networks," Int. J. Pattern Recognition Artificial Intell., special issue on applications of neural networks to pattern recognition, vol. 7, no. 4, pp. 849-872, 1993.
8. C. W. Omlin and C. L. Giles, "Rule revision with recurrent neural networks," IEEE Trans. Knowledge Data Eng., vol. 8, no. 1, p. 183, 1996.
9. R. J. Williams and J. Peng, "An efficient gradient-based algorithm for on-line training of recurrent network trajectories," Neural Computa., vol. 2, no. 4, pp. 490-501, 1990.
10. R. C. Carrasco and M. L. Forcada, "Second-order recurrent neural networks can learn regular grammars from noisy strings," in From Natural to Artificial Neural Computa.: Proc. IWANN'95 (June 7-9, 1995), J. Mira and F. Sandoval, Eds., vol. 930 of Lecture Notes in Computer Science. New York: Springer-Verlag, 1995, pp. 605-610.
11. M. Tomita, "Dynamic construction of finite-state automata from examples using hill-climbing," in Proc. 4th Annu. Cognitive Sci. Conf., Ann Arbor, MI, 1982, pp. 105-108.