1. Shannon, C.E. The mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
2. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: New York, NY, USA, 1991.
3. Averbeck, B.B.; Latham, P.E.; Pouget, A. Neural correlations, population coding and computation. Nat. Rev. Neurosci. 2006, 7, 358–366.
4. Goldie, C.M.; Pinch, R.G.E. Communication Theory; London Mathematical Society Student Texts 20; Cambridge University Press: Cambridge, UK, 1991.
5. Sims, C.A. Rational Inattention: Beyond the Linear-Quadratic Case. Am. Econ. Rev. 2006, 96, 158–163.
6. Sherwin, W.E. Entropy and Information Approaches to Genetic Diversity and its Expression: Genomic Geography. Entropy 2010, 12, 1765–1798.
7. Pothos, E.M.; Juola, P. Characterizing linguistic structure with mutual information. Br. J. Psychol. 2007, 98, 291–304.
8. Pires, C.A.; Perdigão, R.A.P. Non-Gaussianity and asymmetry of the winter monthly precipitation estimation from the NAO. Mon. Wea. Rev. 2007, 135, 430–448.
9. Globerson, A.; Tishby, N. The minimum information principle for discriminative learning. In Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence, Banff, Canada, 7–11 July 2004; pp. 193–200.
10. Globerson, A.; Stark, E.; Vaadia, E.; Tishby, N. The minimum information principle and its application to neural code analysis. Proc. Natl. Acad. Sci. USA 2009, 106, 3490–3495.
11. Foster, D.V.; Grassberger, P. Lower bounds on mutual information. Phys. Rev. E 2011, 83, 010101(R).
12. Pires, C.A.; Perdigão, R.A.P. Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties. Entropy 2012, 14, 1103–1126.
14. Khan, S.; Bandyopadhyay, S.; Ganguly, A.R.; Saigal, S.; Erickson, D.J.; Protopopescu, V.; Ostrouchov, G. Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data. Phys. Rev. E 2007, 76, 026209.
15. Paninski, L. Estimation of entropy and mutual information. Neural Comput. 2003, 15, 1191–1254.
16. Panzeri, S.; Treves, A. Analytical estimates of limited sampling biases in different information measures. Netw. Comput. Neural Syst. 1996, 7, 87–107.
17. Victor, J.D. Asymptotic Bias in Information Estimates and the Exponential (Bell) Polynomials. Neural Comput. 2000, 12, 2797–2804.
18. Panzeri, S.; Senatore, R.; Montemurro, M.A.; Petersen, R.S. Correcting for the Sampling Bias Problem in Spike Train Information Measures. J. Neurophysiol. 2007, 98, 1064–1072.
19. Strong, S.P.; Koberle, R.; de Ruyter van Steveninck, R.; Bialek, W. Entropy and information in neural spike trains. Phys. Rev. Lett. 1998, 80, 197–200.
20. Miller, G. Note on the bias of information estimates. In Information Theory in Psychology; Quastler, H., Ed.; Free Press: Glencoe, IL, USA, 1955; pp. 95–100.
22. Bonachela, J.A.; Hinrichsen, H.; Muñoz, M.A. Entropy estimates of small data sets. J. Phys. A 2008, 41, 202001.
23. Nelsen, R.B. An Introduction to Copulas; Springer: New York, NY, USA, 1999; ISBN 0-387-98623-5.
24. Calsaverini, R.S.; Vicente, R. An information-theoretic approach to statistical dependence: Copula information. Europhys. Lett. 2009, 88, 68003.
26. Macke, J.H.; Murray, I.; Latham, P.E. How biased are maximum entropy models? Adv. Neural Inf. Process. Syst. 2011, 24, 2034–2042.
27. Hutter, M.; Zaffalon, M. Distribution of mutual information from complete and incomplete data. Comput. Stat. Data Anal. 2005, 48, 633–657.
28. Jaynes, E.T. On the Rationale of Maximum-entropy Methods. Proc. IEEE 1982, 70, 939–952.
29. Shore, J.E.; Johnson, R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 1980, 26, 26–37.
30. Ebrahimi, N.; Soofi, E.S.; Soyer, R. Information Measures in Perspective. Int. Stat. Rev. 2010, 78, 383–412.
32. Charpentier, A.; Fermanian, J.D. Copulas: From Theory to Application in Finance; Rank, J., Ed.; Risk Publications: London, UK, 2007; Section 2.
33. Tam, S.M. On Covariance in Finite Population Sampling. J. Roy. Stat. Soc. D-Sta. 1985, 34, 429–433.
34. Van der Vaart, A.W. Asymptotic Statistics; Cambridge University Press: New York, NY, USA, 1998; ISBN 978-0-521-49603-2.
35. Rockinger, M.; Jondeau, E. Entropy densities with an application to autoregressive conditional skewness and kurtosis. J. Econometrics 2002, 106, 119–142.
36. Bates, D. Quadratic Forms of Random Variables. STAT 849 lectures. Available online: http://www.stat.wisc.edu/~st849-1/lectures/Ch02.pdf (accessed on 22 February 2013).
37. Goebel, B.; Dawy, Z.; Hagenauer, J.; Mueller, J.C. An approximation to the distribution of finite sample size mutual information estimates. In Proceedings of the IEEE International Conference on Communications (ICC'05), Seoul, Korea, 16–20 May 2005; pp. 1102–1106.
38. Fisher, R.A. On the "probable error" of a coefficient of correlation deduced from a small sample. Metron 1921, 1, 3–32.
39. Zientek, L.R.; Thompson, B. Applying the bootstrap to the multivariate case: Bootstrap component/factor analysis. Behav. Res. Methods 2007, 39, 318–325.
40. Mardia, K.V. Algorithm AS 84: Measures of multivariate skewness and kurtosis. Appl. Stat. 1975, 24, 262–265.
41. Hurrell, J.W.; Kushnir, Y.; Visbeck, M. The North Atlantic Oscillation. Science 2001, 291, 603–605.
42. The NCEP/NCAR Reanalysis Project. Available online: http://www.esrl.noaa.gov/psd/data/reanalysis/reanalysis.shtml (accessed on 22 February 2013).