Volume 15, Issue 3, 2013, Pages 721-752

Minimum mutual information and non-Gaussianity through the maximum entropy method: Estimation from finite samples

Author keywords

Entropy bias; Maximum entropy distributions; Morphism; Mutual information; Mutual information distribution; Non-Gaussianity

Indexed keywords


EID: 84875404998     PISSN: None     EISSN: 1099-4300     Source Type: Journal
DOI: 10.3390/e15030721     Document Type: Article
Times cited: 9

References (42)
  • 1
    • Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379-423.
  • 3
    • Averbeck, B.B.; Latham, P.E.; Pouget, A. Neural correlations, population coding and computation. Nat. Rev. Neurosci. 2006, 7, 358-366.
  • 4
    • Goldie, C.M.; Pinch, R.G.E. Communication Theory; London Mathematical Society Student Texts No. 20; Cambridge University Press: Cambridge, UK, 1991.
  • 5
    • Sims, C.A. Rational Inattention: Beyond the Linear-Quadratic Case. Am. Econ. Rev. 2006, 96, 158-163.
  • 6
    • Sherwin, W.E. Entropy and Information Approaches to Genetic Diversity and its Expression: Genomic Geography. Entropy 2010, 12, 1765-1798.
  • 7
    • Pothos, E.M.; Juola, P. Characterizing linguistic structure with mutual information. Br. J. Psychol. 2007, 98, 291-304.
  • 8
    • Pires, C.A.; Perdigão, R.A.P. Non-Gaussianity and asymmetry of the winter monthly precipitation estimation from the NAO. Mon. Wea. Rev. 2007, 135, 430-448.
  • 10
    • Globerson, A.; Stark, E.; Vaadia, E.; Tishby, N. The minimum information principle and its application to neural code analysis. Proc. Natl. Acad. Sci. USA 2009, 106, 3490-3495.
  • 11
    • Foster, D.V.; Grassberger, P. Lower bounds on mutual information. Phys. Rev. E 2011, 83, 010101(R).
  • 12
    • Pires, C.A.; Perdigão, R.A.P. Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties. Entropy 2012, 14, 1103-1126.
  • 13
    • Walters-Williams, J.; Li, Y. Estimation of mutual information: A survey. Lect. Notes Comput. Sci. 2009, 5589, 389-396.
  • 14
    • Khan, S.; Bandyopadhyay, S.; Ganguly, A.R.; Saigal, S.; Erickson, D.J.; Protopopescu, V.; Ostrouchov, G. Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data. Phys. Rev. E 2007, 76, 026209.
  • 15
    • Paninski, L. Estimation of entropy and mutual information. Neural Comput. 2003, 15, 1191-1254.
  • 16
    • Panzeri, S.; Treves, A. Analytical estimates of limited sampling biases in different information measures. Network Comput. Neural Syst. 1996, 7, 87-107.
  • 17
    • Victor, J.D. Asymptotic Bias in Information Estimates and the Exponential (Bell) Polynomials. Neural Comput. 2000, 12, 2797-2804.
  • 18
    • Panzeri, S.; Senatore, R.; Montemurro, M.A.; Petersen, R.S. Correcting for the Sampling Bias Problem in Spike Train Information Measures. J. Neurophysiol. 2007, 98, 1064-1072.
  • 20
    • Miller, G. Note on the bias of information estimates. In Information Theory in Psychology: Problems and Methods II-B; Quastler, H., Ed.; Free Press: Glencoe, IL, USA, 1955; pp. 95-100.
  • 23
    • Nelsen, R.B. An Introduction to Copulas; Springer: New York, NY, USA, 1999; ISBN 0-387-98623-5.
  • 24
    • Calsaverini, R.S.; Vicente, R. An information-theoretic approach to statistical dependence: Copula information. Europhys. Lett. 2009, 88, 68003.
  • 27
    • Hutter, M.; Zaffalon, M. Distribution of mutual information from complete and incomplete data. Comput. Stat. Data Anal. 2005, 48, 633-657.
  • 28
    • Jaynes, E.T. On the Rationale of Maximum-entropy Methods. Proc. IEEE 1982, 70, 939-952.
  • 29
    • Shore, J.E.; Johnson, R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inform. Theor. 1980, 26, 26-37.
  • 33
    • Tam, S.M. On Covariance in Finite Population Sampling. J. Roy. Stat. Soc. D-Sta. 1985, 34, 429-433.
  • 34
    • Van der Vaart, A.W. Asymptotic Statistics; Cambridge University Press: New York, NY, USA, 1998; ISBN 978-0-521-49603-2.
  • 35
    • Rockinger, M.; Jondeau, E. Entropy densities with an application to autoregressive conditional skewness and kurtosis. J. Econometrics 2002, 106, 119-142.
  • 36
    • Bates, D. Quadratic Forms of Random Variables. STAT 849 Lecture Notes. Available online: http://www.stat.wisc.edu/~st849-1/lectures/Ch02.pdf (accessed on 22 February 2013).
  • 38
    • Fisher, R.A. On the "probable error" of a coefficient of correlation deduced from a small sample. Metron 1921, 1, 3-32.
  • 39
    • Zientek, L.R.; Thompson, B. Applying the bootstrap to the multivariate case: Bootstrap component/factor analysis. Behav. Res. Methods 2007, 39, 318-325.
  • 40
    • Mardia, K.V. Algorithm AS 84: Measures of multivariate skewness and kurtosis. Appl. Stat. 1975, 24, 262-265.
  • 42
    • The NCEP/NCAR Reanalysis Project. Available online: http://www.esrl.noaa.gov/psd/data/reanalysis/reanalysis.shtml (accessed on 22 February 2013).


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.