[1] L. B. Almeida, T. Langlois, J. D. Amaral, and A. Plakhov, "Parameter adaptation in stochastic optimization," in On-Line Learning in Neural Networks, D. Saad, Ed. Cambridge, U.K.: Cambridge Univ. Press, 1999, ch. 6, pp. 111-134.
[2] J. J. Atick, "Could information theory provide an ecological theory of sensory processing?," Network, vol. 3, pp. 213-251, 1992.
[3] H. B. Barlow, "Possible principles underlying the transformation of sensory messages," in Sensory Communication, W. A. Rosenbluth, Ed. Cambridge, MA: MIT Press, 1961.
[4] H. B. Barlow, "Unsupervised learning," Neural Computat., vol. 1, no. 3, pp. 295-311, 1989.
[5] S. Becker and G. E. Hinton, "A self-organizing neural network that discovers surfaces in random-dot stereograms," Nature, vol. 355, pp. 161-163, 1992.
[6] J. Beirlant, E. J. Dudewicz, L. Györfi, and E. C. van der Meulen, "Non-parametric entropy estimation: An overview," Int. J. Math. Statist. Sci., vol. 6, no. 1, pp. 17-39, 1997.
[7] A. J. Bell and T. J. Sejnowski, "An information-maximization approach to blind separation and blind deconvolution," Neural Computat., vol. 7, no. 6, pp. 1129-1159, 1995.
[8] J. A. Bilmes, "A Gentle Tutorial of the EM Algorithm and Its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models," International Computer Science Institute, Univ. California, Berkeley, CA, Tech. Rep. TR-97-021, 1997.
[9] J. S. Bridle, "Training stochastic model recognition algorithms as networks can lead to maximum mutual information estimation of parameters," in Advances in Neural Information Processing Systems, vol. 2, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990, pp. 211-217.
[10] P. Comon, "Independent component analysis, a new concept?," Signal Process., vol. 36, no. 3, pp. 287-314, 1994.
[11] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm (with discussion)," J. Roy. Statist. Soc. Ser. B, vol. 39, pp. 1-38, 1977.
[13] D. Erdogmus, K. E. Hild II, and J. C. Principe, "Online entropy manipulation: Stochastic information gradient," IEEE Signal Processing Lett., vol. 10, pp. 242-245, Aug. 2003.
[14] D. Erdogmus and J. C. Principe, "Generalized information potential criterion for adaptive system training," IEEE Trans. Neural Networks, vol. 13, pp. 1035-1044, Sept. 2002.
[15] J. H. Friedman and J. W. Tukey, "A projection pursuit algorithm for exploratory data analysis," IEEE Trans. Comput., vol. 23, pp. 881-889, 1974.
[16] P. J. Huber, "Projection pursuit," Ann. Statist., vol. 13, no. 2, pp. 435-475, 1985.
[19] J. Kivinen and M. K. Warmuth, "Additive versus exponentiated gradient updates for linear prediction," in Proc. 27th Annu. ACM Symp. Theory Computing, New York, May 1995, pp. 209-218.
[20] R. Linsker, "Self-organization in a perceptual network," Computer, pp. 105-117, Mar. 1988.
[21] M. Sato and S. Ishii, "On-line EM algorithm for the normalized Gaussian network," Neural Computat., vol. 12, no. 2, pp. 407-432, 2000.
[22] L. C. Parra, "Symplectic nonlinear component analysis," in Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 1996, pp. 437-443.
[23] E. Parzen, "On the estimation of a probability density function and mode," Ann. Math. Statist., vol. 33, pp. 1065-1076, 1962.
[24] J. C. Principe, D. Xu, and J. W. Fisher III, "Information-theoretic learning," in Unsupervised Adaptive Filtering: Blind Source Separation, vol. 1, S. Haykin, Ed. New York: Wiley, 2000, ch. 7, pp. 265-319.
[25] A. Renyi, "On measures of entropy and information," in Proc. 4th Berkeley Symp. Mathematical Statistics and Probability, Berkeley, CA, 1961, pp. 547-561; reprinted in Selected Papers of Alfred Renyi, Akademia Kiado, Budapest.
[27] J. Schmidhuber, "Learning factorial codes by predictability minimization," Neural Computat., vol. 4, no. 6, pp. 863-879, 1992.
[28] J. Schmidhuber, M. Eldracher, and B. Foltin, "Semilinear predictability minimization produces well-known feature detectors," Neural Computat., vol. 8, no. 4, pp. 773-786, 1996.
[29] N. N. Schraudolph, "Optimization of Entropy With Neural Networks," Ph.D. dissertation, Univ. California, San Diego, CA, 1995.
[30] N. N. Schraudolph, "Local gain adaptation in stochastic gradient descent," in Proc. Int. Conf. Artificial Neural Networks, Edinburgh, Scotland, 1999, pp. 569-574.
[31] N. N. Schraudolph, "Fast curvature matrix-vector products for second-order gradient descent," Neural Computat., vol. 14, no. 7, pp. 1723-1738, 2002.
[32] N. N. Schraudolph, M. Eldracher, and J. Schmidhuber, "Processing images by semi-linear predictability minimization," Network: Computat. Neural Syst., vol. 10, no. 2, pp. 133-169, 1999.
[33] N. N. Schraudolph and T. J. Sejnowski, "Unsupervised discrimination of clustered data via optimization of binary information gain," in Advances in Neural Information Processing Systems, vol. 5, S. J. Hanson, J. D. Cowan, and C. L. Giles, Eds. San Mateo, CA: Morgan Kaufmann, 1993, pp. 499-506.
[34] F. M. Silva and L. B. Almeida, "Speeding up back-propagation," in Advanced Neural Computers, R. Eckmiller, Ed. Amsterdam: Elsevier, 1990, pp. 151-158.
[35] A. F. M. Smith and U. E. Makov, "A quasi-Bayes sequential procedure for mixtures," J. Roy. Statist. Soc. Ser. B, vol. 40, no. 1, pp. 106-112, 1978.
[36] T. Tollenaere, "SuperSAB: Fast adaptive back propagation with good scaling properties," Neural Networks, vol. 3, pp. 561-573, 1990.
[37] K. Torkkola, "Feature extraction by nonparametric mutual information maximization," J. Machine Learning Research, vol. 3, pp. 1415-1438, 2003.
[38] D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo, Eds., Advances in Neural Information Processing Systems, vol. 8. Cambridge, MA: MIT Press, 1996.
[39] M. M. van Hulle, "The formation of topographic maps that maximize the average mutual information of the output responses to noiseless input signals," Neural Computat., vol. 9, no. 3, pp. 595-606, 1997.
[40] M. M. van Hulle, "Kernel-based equiprobabilistic topographic map formation," Neural Computat., vol. 10, no. 7, pp. 1847-1871, 1998.
[42] P. A. Viola, N. N. Schraudolph, and T. J. Sejnowski, "Empirical entropy manipulation for real-world problems," in Advances in Neural Information Processing Systems, vol. 8. Cambridge, MA: MIT Press, 1996, pp. 851-857.
[43] P. A. Viola and W. M. Wells III, "Alignment by maximization of mutual information," in Proc. 5th Int. Conf. Computer Vision, Cambridge, MA, 1995, pp. 16-23.