[1] J. L. McClelland, D. E. Rumelhart, and G. E. Hinton, “The appeal of parallel distributed processing,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations, D. E. Rumelhart, J. L. McClelland, and the PDP Research Group, Eds. Cambridge, MA: MIT Press, 1986.
[2] D. Marr and T. Poggio, “Cooperative computation of stereo disparity,” Sci., vol. 194, pp. 283-287, 1976.
[3] D. Marr, G. Palm, and T. Poggio, “Analysis of a cooperative stereo algorithm,” Biol. Cybern., vol. 28, 1978.
[4] G. E. Hinton, “Relaxation and its role in vision,” Ph.D. dissertation, Univ. Edinburgh, 1977.
[5] L. S. Davis and A. Rosenfeld, “Cooperating processes for low-level vision: A survey,” Artificial Intell., vol. 17, 1981.
[8] D. L. Waltz and J. B. Pollack, “Massively parallel parsing: A strongly interactive model of natural language interpretation,” Cognitive Sci., vol. 9, pp. 51-74, 1985.
[9] N. Qian and T. J. Sejnowski, “Learning to solve random-dot stereograms of dense and transparent surfaces with recurrent backpropagation,” in Proc. 1988 Connectionist Models Summer School, D. S. Touretzky, G. E. Hinton, and T. J. Sejnowski, Eds., 1989, pp. 435-443.
[10] O. Nerrand, P. Roussel-Ragot, L. Personnaz, G. Dreyfus, and S. Marcos, “Neural networks and nonlinear adaptive filtering: Unifying concepts and new algorithms,” Neural Computa., vol. 5, no. 2, pp. 165-197, 1993.
[12] R. L. Watrous, B. Ladendorf, and G. M. Kuhn, “Complete gradient optimization of a recurrent network applied to BDG discrimination,” J. Acoustical Soc. Amer., vol. 87, no. 3, pp. 1301-1309, Mar. 1990.
[14] D. Albesano, R. Gemello, and F. Mana, “Word recognition with recurrent network automata,” in Proc. IJCNN '92 (Int. Joint Conf. Neural Networks), Baltimore, MD, 1992, pp. 308-313.
[15] S. R. Lockery, Y. Fang, and T. J. Sejnowski, “A dynamic neural network model of sensorimotor transformations in the leech,” Neural Computa., vol. 2, no. 3, pp. 274-282, 1990.
[16] K. Doya, M. E. T. Boyle, and A. I. Selverston, “Mapping between neural and physical activities of the lobster gastric mill system,” in Advances in Neural Information Processing Systems 5, S. J. Hanson, J. D. Cowan, and C. L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 913-920.
[17] K. Doya, A. I. Selverston, and P. F. Rowat, “A Hodgkin-Huxley type neuron model that learns slow nonspike oscillation,” in Advances in Neural Information Processing Systems 6, J. D. Cowan, G. Tesauro, and J. Alspector, Eds. San Mateo, CA: Morgan Kaufmann, 1994.
[18] Y. Chauvin and D. E. Rumelhart, Eds., Backpropagation: Theory, Architectures and Applications. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, in press.
[19] R. P. Gorman and T. J. Sejnowski, “Analysis of hidden units in a layered network trained to classify sonar targets,” Neural Networks, vol. 1, no. 1, pp. 75-89, 1988.
[20] K. Lang and G. Hinton, “The development of the time-delay neural network architecture for speech recognition,” Dep. Comput. Sci., Carnegie Mellon Univ., Tech. Rep. CMU-CS-88-152, Nov. 1988.
[21] A. Waibel, T. Hanazawa, G. Hinton, K. Shikano, and K. Lang, “Phoneme recognition using time-delay networks,” IEEE Trans. Acoustics, Speech, Signal Process., vol. 37, no. 3, pp. 328-339, 1989.
[22] K. J. Lang, G. E. Hinton, and A. Waibel, “A time-delay neural network architecture for isolated word recognition,” Neural Networks, vol. 3, pp. 23-43, 1990.
[23] K. J. Lang and G. E. Hinton, “Dimensionality reduction and prior knowledge in e-set recognition,” in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 178-185.
[24] D. W. Tank and J. J. Hopfield, “Concentrating information in time: Analog neural networks with applications to speech recognition problems,” in Proc. IEEE 1st Int. Conf. Neural Networks, San Diego, CA, June 21-24, 1987, pp. 455-468.
[25] S. P. Day and M. R. Davenport, “Continuous-time temporal backpropagation with adaptable time delays,” IEEE Trans. Neural Networks, vol. 4, no. 2, pp. 348-354, 1993.
[27] Applications of Artificial Neural Networks, no. 1294 in SPIE Proceedings Series, Orlando, FL, Apr. 18-20, 1990.
[28] A. S. Weigend, D. E. Rumelhart, and B. A. Huberman, “Generalization by weight-elimination with application to forecasting,” in Advances in Neural Information Processing Systems 3, R. P. Lippmann, J. E. Moody, and D. S. Touretzky, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 875-882.
[30] A. D. Back and A. C. Tsoi, “FIR and IIR synapses, a new neural network architecture for time series modeling,” Neural Computa., vol. 3, no. 3, pp. 337-350, 1991.
[32] D. Hush and B. Horne, “Progress in supervised neural networks,” IEEE Signal Process. Mag., vol. 10, no. 1, pp. 8-39, 1993.
[33] B. de Vries and J. Principe, “The gamma model—A new neural network for temporal processing,” Neural Networks, vol. 5, no. 4, pp. 565-576, 1992.
[34] T. Maxwell, C. L. Giles, Y. C. Lee, and H. H. Chen, “Nonlinear dynamics of artificial neural systems,” in Proc. Snowbird Conf. Neural Networks Computing, Amer. Institute Physics, no. 151, 1986.
[35] Y. Le Cun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. D. Jackel, “Backpropagation applied to handwritten zip code recognition,” Neural Computa., vol. 1, no. 4, pp. 541-551, 1989.
[38] K. S. Narendra and K. Parthasarathy, “Identification and control of dynamical systems using neural networks,” IEEE Trans. Neural Networks, vol. 1, pp. 4-27, Mar. 1990.
[40] H. T. Siegelmann and E. D. Sontag, “Turing computability with neural networks,” Appl. Math. Lett., vol. 4, no. 6, pp. 77-80, 1991.
[43] A. Cleeremans, D. Servan-Schreiber, and J. McClelland, “Finite state automata and simple recurrent networks,” Neural Computa., vol. 1, no. 3, pp. 372-381, 1989.
[44] R. L. Watrous and G. M. Kuhn, “Induction of finite-state automata using second-order recurrent networks,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 309-316.
[45] C. L. Giles, C. B. Miller, D. Chen, G. Z. Sun, H. H. Chen, and Y. C. Lee, “Extracting and learning an unknown grammar with recurrent neural networks,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 317-324.
[46] M. C. Mozer and S. Das, “A connectionist symbol manipulator that discovers the structure of context-free languages,” in Advances in Neural Information Processing Systems 5, S. J. Hanson, J. D. Cowan, and C. L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 863-870.
[47] S. Das, C. L. Giles, and G.-Z. Sun, “Using prior knowledge in a NNPDA to learn context-free languages,” in Advances in Neural Information Processing Systems 5, S. J. Hanson, J. D. Cowan, and C. L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 65-72.
[48] J. F. Kolen, “Fool's gold: Extracting finite state machines from recurrent network dynamics,” in Advances in Neural Information Processing Systems 6, J. D. Cowan, G. Tesauro, and J. Alspector, Eds. San Mateo, CA: Morgan Kaufmann, 1994, in press.
[49] S. Das and M. C. Mozer, “A unified gradient-descent/clustering architecture for finite state machine induction,” in Advances in Neural Information Processing Systems 6, J. D. Cowan, G. Tesauro, and J. Alspector, Eds. San Mateo, CA: Morgan Kaufmann, 1994, in press.
[50] D. Angluin, “Learning regular sets from queries and counter examples,” Inform. Computa., vol. 75, pp. 87-106, 1987.
[51] K. J. Lang, “Random DFA's can be approximately learned from sparse uniform examples,” in Proc. 5th Annu. ACM Workshop Computational Learning Theory, Pittsburgh, PA, July 1992, pp. 45-52.
[52] J. Hochreiter, “Untersuchungen zu dynamischen neuronalen Netzen,” Diplomarbeit, Institut für Informatik, Technische Universität München, 1991.
[53] M. C. Mozer, “Induction of multiscale temporal structure,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 275-282.
[54] J. H. Schmidhuber, “Learning complex, extended sequences using the principle of history compression,” Neural Computa., vol. 4, no. 2, pp. 234-242, 1992.
[55] J. H. Schmidhuber, “Learning unambiguous reduced sequence descriptions,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 291-298.
[57] T. J. Sejnowski, P. K. Kienker, and G. Hinton, “Learning symmetry groups with hidden units: Beyond the perceptron,” Physica D, vol. 22, pp. 260-275, 1986.
[58] D. H. Ackley, G. E. Hinton, and T. J. Sejnowski, “A learning algorithm for Boltzmann Machines,” Cognitive Sci., vol. 9, pp. 147-169, 1985.
[59] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning internal representations by error propagation,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations, D. E. Rumelhart, J. L. McClelland, and the PDP Research Group, Eds. Cambridge, MA: MIT Press, 1986.
[60] P. J. Werbos, “Beyond regression: New tools for prediction and analysis in the behavioral sciences,” Ph.D. dissertation, Harvard Univ., 1974.
[63] D. S. Touretzky and D. A. Pomerleau, “What's hidden in the hidden layers?,” BYTE, pp. 227-233, Aug. 1989.
[64] B. Widrow and M. Hoff, “Adaptive switching circuits,” in Proc. Western Electron. Show Conv., Institute of Radio Engineers (now IEEE), 1960, vol. 4, pp. 96-104.
[65] Y. Le Cun, I. Kanter, and S. A. Solla, “Second-order properties of error surfaces: Learning time and generalization,” in Advances in Neural Information Processing Systems 3, R. P. Lippmann, J. E. Moody, and D. S. Touretzky, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 918-924.
[66] B. Baird, “A learning rule for CAM storage of continuous periodic sequences,” in Proc. IJCNN '90 II (Int. Joint Conf. Neural Networks), San Diego, CA, June 1990, pp. 493-498.
[67] B. Baird and F. Eeckman, “CAM storage of analog patterns and continuous sequences with 3N^2 weights,” in Advances in Neural Information Processing Systems 3, R. P. Lippmann, J. E. Moody, and D. S. Touretzky, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 91-97.
[68] C. A. Skarda and W. J. Freeman, “How brains make chaos to make sense of the world,” Behavioral Brain Sci., vol. 10, Nov. 1987.
[69] W. J. Freeman, “Simulation of chaotic EEG patterns with a dynamic model of the olfactory system,” Biol. Cybern., vol. 56, p. 139, 1987.
[70] J. P. Crutchfield and B. S. McNamara, “Equations of motion from a data series,” Complex Syst., vol. 1, pp. 417-452, 1987.
[71] A. Lapedes and R. Farber, “Nonlinear signal processing using neural networks: Prediction and system modeling,” Theoretical Division, Los Alamos Nat. Lab., Los Alamos, NM, Tech. Rep., 1987.
[72] F. Pineda, “Generalization of backpropagation to recurrent neural networks,” Physical Rev. Lett., vol. 59, no. 19, pp. 2229-2232, 1987.
[73] L. B. Almeida, “A learning rule for asynchronous perceptrons with feedback in a combinatorial environment,” in Proc. IEEE 1st Int. Conf. Neural Networks, San Diego, CA, June 21-24, 1987, pp. 609-618.
[74] G. E. Hinton, “Deterministic Boltzmann learning performs steepest descent in weight-space,” Neural Computa., vol. 1, no. 1, pp. 143-150, 1989.
[75] P. Baldi and F. Pineda, “Contrastive learning and neural oscillations,” Neural Computa., vol. 3, no. 4, pp. 526-545, 1991.
[77] M. Gori, Y. Bengio, and R. de Mori, “BPS: A learning algorithm for capturing the dynamic nature of speech,” in Proc. IJCNN '89 (Int. Joint Conf. Neural Networks), Washington, DC, June 18-22, 1989, pp. 417-423.
[78] G. Kuhn, “A first look at phonetic discrimination using connectionist models with recurrent links,” Institute Defense Anal., Princeton, NJ, SCIMP Working Paper 82018, Apr. 1987.
[79] M. C. Mozer, “A focused backpropagation algorithm for temporal pattern recognition,” Complex Syst., vol. 3, no. 4, pp. 349-381, Aug. 1989.
[80] T. Uchiyama, K. Shimohara, and Y. Tokunaga, “A modified leaky integrator network for temporal pattern recognition,” in Proc. IJCNN '89 (Int. Joint Conf. Neural Networks), Washington, DC, June 18-22, 1989, pp. 469-475.
[81] B. A. Pearlmutter, “Two new learning procedures for recurrent networks,” Neural Network Rev., vol. 3, no. 3, pp. 99-101, 1990.
[82] W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling, Numerical Recipes in C. Cambridge, UK: Cambridge Univ. Press, 1988.
[83] M. A. Cohen and S. Grossberg, “Stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Trans. Syst., Man, Cybern., vol. 13, pp. 815-826, 1983.
[86] S. Renals and R. Rohwer, “A study of network dynamics,” J. Statistical Physics, vol. 58, pp. 825-848, June 1990.
[87] R. B. Allen and J. Alspector, “Learning of stable states in stochastic asymmetric networks,” Bell Commun. Res., Morristown, NJ, Tech. Rep. TM-ARH-015240, Nov. 1989.
[88] C. C. Galland and G. E. Hinton, “Deterministic Boltzmann learning in networks with asymmetric connectivity,” Dep. Comput. Sci., Univ. Toronto, Tech. Rep., 1989.
[89] S. J. Nowlan, “Gain variation in recurrent error propagation networks,” Complex Syst., vol. 2, no. 3, pp. 305-320, June 1988.
[90] M. B. Ottaway, P. Y. Simard, and D. H. Ballard, “Fixed point analysis for recurrent neural networks,” in Advances in Neural Information Processing Systems 1, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1989.
[91] C. Peterson and J. R. Anderson, “A mean field theory learning algorithm for neural nets,” Complex Syst., vol. 1, 1987.
[93] G. E. Hinton and K. J. Lang, “Shape recognition and illusory conjunctions,” in Proc. 9th Int. Joint Conf. Artificial Intell., Los Angeles, CA, Aug. 1985, vol. 1, pp. 252-259.
[95] P. J. Werbos, “Backpropagation through time: What it does and how to do it,” Proc. IEEE, vol. 78, pp. 1550-1560, 1990.
[96] B. Pearlmutter, “Learning state space trajectories in recurrent neural networks,” Neural Computa., vol. 1, no. 2, pp. 263-269, 1989.
[97] P. J. Werbos, “Generalization of backpropagation with application to a recurrent gas market model,” Neural Networks, vol. 1, pp. 339-356, 1988.
[98] A. E. Bryson, Jr., “A steepest ascent method for solving optimum programming problems,” J. Appl. Mechanics, vol. 29, no. 2, p. 247, 1962.
[100] P. J. Werbos, “Applications of advances in nonlinear sensitivity analysis,” in Proc. 10th IFIP Conf. Syst. Modeling Optimization, New York, Aug. 31-Sep. 4, 1981.
[101] A. J. Robinson and F. Fallside, “Static and dynamic error propagation networks with application to speech coding,” in Proc. Int. Joint Conf. Neural Networks, Washington, DC, June 18-22, 1989, pp. 632-641.
[102] M. Gherrity, “A learning algorithm for analog, fully recurrent neural networks,” in Proc. IJCNN '89 (Int. Joint Conf. Neural Networks), Washington, DC, June 18-22, 1989, pp. 643-644.
[103] R. J. Williams and D. Zipser, “A learning algorithm for continually running fully recurrent neural networks,” Neural Computa., vol. 1, no. 2, pp. 270-280, 1989.
[104] K. S. Narendra and K. Parthasarathy, “Gradient methods for the optimization of dynamical systems containing neural networks,” IEEE Trans. Neural Networks, vol. 2, no. 2, Mar. 1991.
[105] D. H. Jacobson, “New second-order and first-order algorithm for determining optimal control: A differential dynamic programming approach,” J. Optimization Theory Applicat., vol. 2, 1968.
[108] D. Zipser, “Subgrouping reduces complexity and speeds up learning in recurrent networks,” in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 638-641.
[109] R. J. Williams and J. Peng, “An efficient gradient-based algorithm for on-line training of recurrent network trajectories,” Neural Computa., vol. 2, no. 4, 1990.
[110] R. J. Williams and D. Zipser, “Gradient-based learning algorithms for recurrent networks and their computational complexity,” in Backpropagation: Theory, Architectures and Applications, Y. Chauvin and D. E. Rumelhart, Eds. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, in press.
[111] J. H. Schmidhuber, “A fixed size storage O(n^3) time complexity learning algorithm for fully recurrent continually running networks,” Neural Computa., vol. 4, no. 2, pp. 243-248, 1992.
[112] G.-Z. Sun, H.-H. Chen, and Y.-C. Lee, “Green's function method for fast on-line learning algorithm of recurrent neural networks,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 333-340.
[113] U. Bodenhausen, “Learning internal representations of pattern sequences in a neural network with adaptive time-delays,” in Proc. IJCNN '90 II (Int. Joint Conf. Neural Networks), San Diego, CA, June 1990.
[114] D. W. Tank and J. J. Hopfield, “Neural computation by time compression,” Proc. National Academy Sci., vol. 84, pp. 1896-1900, 1987.
[115] R. L. Watrous, “Speech recognition using connectionist networks,” Ph.D. dissertation, Univ. Pennsylvania, Oct. 1988.
[116] P. Y. Simard, J. P. Rayzs, and B. Victorri, “Shaping the state space landscape in recurrent networks,” in Advances in Neural Information Processing Systems 3, R. P. Lippmann, J. E. Moody, and D. S. Touretzky, Eds. San Mateo, CA: Morgan Kaufmann, 1991, pp. 105-112.
[117] J. L. Elman, “Finding structure in time,” Cognitive Sci., vol. 14, pp. 179-211, 1990.
[118] J. L. Elman, “Finding structure in time,” Center Res. Language, Univ. Calif., San Diego, Tech. Rep., 1988.
[120] T. Grossman, R. Meir, and E. Domany, “Learning by choice of internal representations,” Complex Syst., vol. 2, pp. 555-575, 1989.
[122] R. J. Williams and D. Zipser, “A learning algorithm for continually running fully recurrent neural networks,” Univ. Calif., San Diego, La Jolla, CA, ICS Tech. Rep., Nov. 1988.
[123] M. I. Jordan, “Attractor dynamics and parallelism in a connectionist sequential machine,” in Proc. 9th Annu. Conf. Cognitive Sci. Soc., 1986, pp. 531-546.
[124] M. B. Matthews, “Neural network nonlinear adaptive filtering using the extended Kalman filter algorithm,” in Proc. Int. Neural Networks Conf., Paris, July 1990, vol. 1, pp. 115-119.
[125] R. J. Williams, “Training recurrent networks using the extended Kalman filter,” in Proc. IJCNN '92 (Int. Joint Conf. Neural Networks), Baltimore, MD, 1992, pp. 241-250.
[126] R. E. Kalman, “A new approach to linear filtering and prediction problems,” Trans. ASME J. Basic Eng., vol. 82, no. 1, pp. 35-45, Mar. 1960.
[127] R. K. Mehra, “On the identification of variances and adaptive Kalman filtering,” IEEE Trans. Automat. Contr., vol. AC-15, no. 2, pp. 175-184, Apr. 1970.
[129] S. J. Nowlan and G. E. Hinton, “Adaptive soft weight tying using Gaussian mixtures,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 993-1000.
[131] P. F. Rowat and A. I. Selverston, “Learning algorithms for oscillatory networks with gap junctions and membrane currents,” Network: Computation Neural Syst., vol. 2, no. 1, pp. 17-42, Feb. 1991.
[133] R. A. Jacobs, “Increased rates of convergence through learning rate adaptation,” Neural Networks, vol. 1, no. 4, pp. 295-307, 1988.
[134] Y. Fang and T. J. Sejnowski, “Faster learning for dynamic recurrent backpropagation,” Neural Computa., vol. 2, no. 3, pp. 270-273, 1990.
[136] R. S. Sutton, “Adapting bias by gradient descent: An incremental version of delta-bar-delta,” in Proc. Nat. Conf. Artificial Intell. (AAAI-92), 1992.
[137] M. A. Gluck, P. T. Glauthier, and R. S. Sutton, “Adaptation of cue-specific learning rates in network models of human category learning,” in Proc. 14th Annu. Conf. Cognitive Sci. Soc., 1992.
[139] B. Widrow, J. M. McCool, M. G. Larimore, and C. R. Johnson, Jr., “Stationary and nonstationary learning characteristics of the LMS adaptive filter,” Proc. IEEE, vol. 64, pp. 1151-1162, 1976.
[142] D. B. Parker, “Optimal algorithms for adaptive networks: Second-order backpropagation, second-order direct propagation and second-order Hebbian learning,” in Proc. IEEE 1st Int. Conf. Neural Networks, San Diego, CA, June 21-24, 1987, pp. 593-600.
[143] R. Watrous, “Learning algorithms for connectionist networks: Applied gradient methods of nonlinear optimization,” in Proc. IEEE 1st Int. Conf. Neural Networks, San Diego, CA, June 21-24, 1987, pp. 619-627.
[145] G. Tesauro, Y. He, and S. Ahmad, “Asymptotic convergence of backpropagation,” Neural Computa., vol. 1, no. 3, pp. 382-391, 1989.
[146] M. A. Tuğay and Y. Tanik, “Properties of the momentum LMS algorithm,” Signal Process., vol. 18, no. 2, pp. 117-127, Oct. 1989.
[147] B. A. Pearlmutter, “Gradient descent: Second-order momentum and saturating error,” in Advances in Neural Information Processing Systems 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1992, pp. 887-894.
[148] J. J. Hopfield and D. W. Tank, “‘Neural’ computation of decisions in optimization problems,” Biol. Cybern., vol. 52, pp. 141-152, 1985.
[149] S. Kirkpatrick, C. D. Gelatt, Jr., and M. P. Vecchi, “Optimization by simulated annealing,” Sci., vol. 220, pp. 671-680, 1983.