1. N. Anderson, P. Hall, and D. Titterington. Two-sample test statistics for measuring discrepancies between two multivariate probability density functions using kernel-based density estimates. Journal of Multivariate Analysis, 50(1):41-54, July 1994.
2. F. R. Bach and M. I. Jordan. Kernel independent component analysis. JMLR, 3:1-48, 2004.
4. J. Beirlant, E. Dudewicz, L. Györfi, and E. van der Meulen. Nonparametric entropy estimation: An overview. International Journal of Mathematical and Statistical Sciences, pages 17-39, 1997.
6. G. A. Darbellay and I. Vajda. Estimation of the information by an adaptive partitioning of the observation space. IEEE Transactions on Information Theory, 45(4):1315-1321, May 1999.
7. R. L. Dobrushin. A simplified method for experimental estimate of the entropy of a stationary sequence. Theory of Probability and its Applications, (4):428-430, 1958.
8. F. Fleuret. Fast binary feature selection with conditional mutual information. JMLR, 5:1531-1555, 2004.
9. A. Gretton, K. M. Borgwardt, M. Rasch, B. Schölkopf, and A. Smola. A kernel method for the two-sample problem. In B. Schölkopf, J. Platt, and T. Hofmann, editors, Advances in Neural Information Processing Systems 19. MIT Press, Cambridge, MA, 2007.
10. A. Gretton, K. Fukumizu, C. H. Teo, L. Song, B. Schölkopf, and A. Smola. A kernel statistical test of independence. In J. C. Platt, D. Koller, Y. Singer, and S. Roweis, editors, Advances in Neural Information Processing Systems 20. MIT Press, Cambridge, MA, 2008.
13. I. S. Dhillon, S. Mallela, and R. Kumar. A divisive information-theoretic feature clustering algorithm for text classification. JMLR, 3:1265-1287, March 2003.
14. L. Kleinrock. Queueing Systems, Volume 1: Theory. Wiley, New York, USA, 1975.
16. A. Kraskov, H. Stögbauer, and P. Grassberger. Estimating mutual information. Physical Review E, 69(6):1-16, June 2004.
17. S. Kullback and R. A. Leibler. On information and sufficiency. Annals of Mathematical Statistics, 22(1):79-86, March 1951.
18. N. N. Leonenko, L. Pronzato, and V. Savani. A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 2007. Submitted.
20. X. Nguyen, M. J. Wainwright, and M. I. Jordan. Nonparametric estimation of the likelihood ratio and divergence functionals. In IEEE International Symposium on Information Theory, Nice, France, June 2007.
21. X. Nguyen, M. J. Wainwright, and M. I. Jordan. Estimating divergence functionals and the likelihood ratio by penalized convex risk minimization. In J. C. Platt, D. Koller, Y. Singer, and S. Roweis, editors, Advances in Neural Information Processing Systems 20. MIT Press, Cambridge, MA, 2008.
22. L. Paninski. Estimation of entropy and mutual information. Neural Computation, 15(6):1191-1253, June 2003. DOI 10.1162/089976603321780272.
23. C. E. Shannon. A mathematical theory of communication. Bell System Technical Journal, pages 379-423, 1948.
24. K. Torkkola. Feature extraction by non-parametric mutual information maximization. JMLR, 3:1415-1438, 2003.
25. Q. Wang, S. R. Kulkarni, and S. Verdú. Divergence estimation of continuous distributions based on data-dependent partitions. IEEE Transactions on Information Theory, 51(9):3064-3074, September 2005. DOI 10.1109/TIT.2005.853314.
26. Q. Wang, S. Kulkarni, and S. Verdú. A nearest-neighbor approach to estimating divergence between continuous random vectors. In IEEE International Symposium on Information Theory, Seattle, USA, July 2006.