2010

The confidence interval of entropy estimation through a noisy channel

Author keywords

[No Author keywords available]

Indexed keywords

ALPHABET SIZE; CONFIDENCE INTERVAL; DISCRETE MEMORYLESS CHANNELS; ENTROPY ESTIMATION; LOWER BOUNDS; MEMORYLESS SOURCE; NOISY CHANNEL; NUMBER OF SAMPLES; PRIOR KNOWLEDGE; SOURCE DISTRIBUTION; SOURCE ENTROPY; TRANSITION MATRICES;

EID: 80051932385     PISSN: None     EISSN: None     Source Type: Conference Proceeding
DOI: 10.1109/CIG.2010.5592695     Document Type: Conference Paper
Times cited: 3

References (17)
  • 1
    • L. Paninski, "Estimation of entropy and mutual information," Neural Computation, vol. 15, no. 6, pp. 1191-1253, 2003. DOI: 10.1162/089976603321780272
  • 3
    • G. P. Basarin, "On a statistical estimate for the entropy of a sequence of independent random variables," Theory Probab. Appl., vol. 4, pp. 333-336, 1959.
  • 5
    • M. B. Kennel, J. Shlens, H. D. I. Abarbanel, and E. J. Chichilnisky, "Estimating entropy rates with Bayesian confidence intervals," Neural Computation, vol. 17, no. 7, pp. 1531-1576, 2005. DOI: 10.1162/0899766053723050
  • 6
    • H. Cai, S. R. Kulkarni, and S. Verdú, "Universal entropy estimation via block sorting," IEEE Trans. Inform. Theory, vol. 50, pp. 1551-1561, July 2004.
  • 8
    • S.-W. Ho and R. W. Yeung, "The interplay between entropy and variational distance," in Proc. IEEE Int. Symp. Inform. Theory, Nice, France, Jun. 24-29, 2007.
  • 14
    • T. Moon and T. Weissman, "Universal filtering via hidden Markov modeling," IEEE Trans. Inform. Theory, vol. 54, no. 2, pp. 692-708, Feb. 2008. DOI: 10.1109/TIT.2007.913220
  • 16
    • A. Antos and I. Kontoyiannis, "Convergence properties of functional estimates of discrete distributions," Random Structures and Algorithms, 2002.
  • 17
    • S.-W. Ho and R. W. Yeung, "On information divergence measures and a unified typicality," in Proc. IEEE Int. Symp. Inform. Theory, Seattle, July 9-14, 2006.


* This record was extracted and analyzed by KISTI from Elsevier's SCOPUS database.