[1] Per Anderson, Gary N. Gross, Terje Lømo, and Ola Sveen. Participation of inhibitory and excitatory interneurones in the control of hippocampal cortical output. In Mary A.B. Brazier, editor, The Interneuron, volume 11. University of California Press, Los Angeles, 1969.
[3] Costas Stefanis. Interneuronal mechanisms in the cortex. In Mary A.B. Brazier, editor, The Interneuron, volume 11. University of California Press, Los Angeles, 1969.
[4] Stephen Grossberg. Contour enhancement, short-term memory, and constancies in reverberating neural networks. Studies in Applied Mathematics, 52:213-257, 1973.
[5] Michael McCloskey and Neal J. Cohen. Catastrophic interference in connectionist networks: The sequential learning problem. The Psychology of Learning and Motivation, 24:109-164, 1989.
[6] Gail A. Carpenter and Stephen Grossberg. The ART of adaptive pattern recognition by a self-organizing neural network. Computer, 21(3):77-88, 1988.
[7] Mark B. Ring. Continual Learning in Reinforcement Environments. PhD thesis, Department of Computer Sciences, The University of Texas at Austin, Austin, Texas 78712, August 1994.
[8] Samuel A. Ellias and Stephen Grossberg. Pattern formation, contrast control, and oscillations in the short term memory of shunting on-center off-surround networks. Biological Cybernetics, 1975.
[9] Bard Ermentrout. Complex dynamics in winner-take-all neural nets with slow inhibition. Neural Networks, 5(1):415-431, 1992.
[10] Christoph von der Malsburg. Self-organization of orientation sensitive cells in the striate cortex. Kybernetik, 14(2):85-100, December 1973.
[11] Teuvo Kohonen. Self-organized formation of topologically correct feature maps. Biological Cybernetics, 43(1):59-69, 1982.
[13] Dale K. Lee, Laurent Itti, Christof Koch, and Jochen Braun. Attention activates winner-take-all competition among visual filters. Nature Neuroscience, 2(4):375-381, April 1999.
[14] Matthias Oster and Shih-Chii Liu. Spiking inputs to a winner-take-all network. In Proceedings of NIPS, volume 18, 2006.
[16] Giacomo Indiveri. Modeling selective attention using a neuromorphic analog VLSI device. Neural Computation, 12(12):2857-2880, 2000.
[17] Wolfgang Maass. Neural computation with winner-take-all as the only nonlinear operation. In Proceedings of NIPS, volume 12, 1999.
[18] Wolfgang Maass. On the computational power of winner-take-all. Neural Computation, 12:2519-2535, 2000.
[19] Ian J. Goodfellow, David Warde-Farley, Mehdi Mirza, Aaron Courville, and Yoshua Bengio. Maxout networks. In Proceedings of the ICML, 2013.
[20] Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, and Ruslan R. Salakhutdinov. Improving neural networks by preventing co-adaptation of feature detectors, 2012. arXiv:1207.0580.
[21] Juergen Schmidhuber. A local learning algorithm for dynamic feedforward and recurrent networks. Connection Science, 1(4):403-412, 1989.
[23] Maximilian Riesenhuber and Tomaso Poggio. Hierarchical models of object recognition in cortex. Nature Neuroscience, 2(11), 1999.
[24] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton. ImageNet classification with deep convolutional neural networks. In Proceedings of NIPS, pages 1-9, 2012.
[26] Vinod Nair and Geoffrey E. Hinton. Rectified linear units improve restricted Boltzmann machines. In Proceedings of the ICML, 2010.
[27] Xavier Glorot, Antoine Bordes, and Yoshua Bengio. Deep sparse rectifier neural networks. In Proceedings of AISTATS, volume 15, pages 315-323, 2011.
[28] George E. Dahl, Tara N. Sainath, and Geoffrey E. Hinton. Improving deep neural networks for LVCSR using rectified linear units and dropout. In Proceedings of ICASSP, 2013.
[29] Andrew L. Maas, Awni Y. Hannun, and Andrew Y. Ng. Rectifier nonlinearities improve neural network acoustic models. In Proceedings of the ICML, 2013.
[34] Marc'Aurelio Ranzato, Christopher Poultney, Sumit Chopra, and Yann LeCun. Efficient learning of sparse representations with an energy-based model. In Proceedings of NIPS, 2007.
[35] Matthew D. Zeiler and Rob Fergus. Stochastic pooling for regularization of deep convolutional neural networks. In Proceedings of the ICLR, 2013.
[36] Kevin Jarrett, Koray Kavukcuoglu, Marc'Aurelio Ranzato, and Yann LeCun. What is the best multi-stage architecture for object recognition? In Proceedings of the ICCV, pages 2146-2153, 2009.
[37] John Blitzer, Mark Dredze, and Fernando Pereira. Biographies, Bollywood, boom-boxes and blenders: Domain adaptation for sentiment classification. In Annual Meeting of the ACL, 2007.
[38] Xavier Glorot, Antoine Bordes, and Yoshua Bengio. Domain adaptation for large-scale sentiment classification: A deep learning approach. In Proceedings of the ICML, 2011.