



Volume 25, Issue 6, 1997, Pages 2451-2492

Mutual information, metric entropy and cumulative relative entropy risk

Author keywords

Bayes risk; Density estimation; Hellinger distance; Kullback-Leibler distance; Metric entropy; Minimax risk; Mutual information; Relative entropy



EID: 0031326925     PISSN: 0090-5364     EISSN: None     Source Type: Journal
DOI: 10.1214/aos/1030741081     Document Type: Article
Times cited: 124

References (57)
  • 1
    • AMARI, S. (1982). Differential geometry of curved exponential families - curvatures and information loss. Ann. Statist. 10 357-385.
  • 2
    • AMARI, S. and MURATA, N. (1993). Statistical theory of learning curves under entropic loss. Neural Comput. 5 140-153.
  • 3
    • BARRON, A. (1985). The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Ann. Probab. 13 1292-1303.
  • 4
    • BARRON, A. (1987). Are Bayes rules consistent in information? In Open Problems in Communication and Computation (T. M. Cover and B. Gopinath, eds.) 85-91. Springer-Verlag, New York.
  • 7
    • BARRON, A. and COVER, T. (1988). A bound on the financial value of information. IEEE Trans. Inform. Theory 34 1097-1100.
  • 8
    • BARRON, A., GYÖRFI, L. and VAN DER MEULEN, E. (1992). Distribution estimation consistent in total variation and in two types of information divergence. IEEE Trans. Inform. Theory 38 1437-1454.
  • 10
    • BIRGÉ, L. (1983). Approximation dans les espaces métriques et théorie de l'estimation. Z. Wahrsch. Verw. Gebiete 65 181-237.
  • 11
    • BIRGÉ, L. (1986). On estimating a density using Hellinger distance and some other strange facts. Probab. Theory Related Fields 71 271-291.
  • 12
    • BIRGÉ, L. and MASSART, P. (1993). Rates of convergence for minimum contrast estimators. Probab. Theory Related Fields 97 113-150.
  • 13
    • CAMERON, R. H. and MARTIN, W. T. (1944). Transformation of Wiener integrals under translations. Ann. Math. 45 386-396.
  • 15
    • CLARKE, B. and BARRON, A. (1990). Information-theoretic asymptotics of Bayes methods. IEEE Trans. Inform. Theory 36 453-471.
  • 16
    • CLARKE, B. and BARRON, A. (1994). Jeffreys' prior is asymptotically least favorable under entropy risk. J. Statist. Plann. Inference 41 37-60.
  • 17
    • CLEMENTS, G. F. (1963). Entropy of several sets of real-valued functions. Pacific J. Math. 13 1085-1095.
  • 19
    • DAVISSON, L. and LEON-GARCIA, A. (1980). A source matching approach to finding minimax codes. IEEE Trans. Inform. Theory 26 166-174.
  • 21
    • DIACONIS, P. and FREEDMAN, D. (1986). On the consistency of Bayes estimates. Ann. Statist. 14 1-26.
  • 22
    • DUDLEY, R. M. (1984). A course on empirical processes. Lecture Notes in Math. 1097 2-142. Springer, New York.
  • 23
    • EFROIMOVICH, S. Y. (1980). Information contained in a sequence of observations. Problems Inform. Transmission 15 178-189.
  • 26
    • GHOSH, J., GHOSAL, S. and SAMANTA, T. (1994). Stability and convergence of the posterior in non-regular problems. In Statistical Decision Theory and Related Topics V (S. Gupta and J. O. Berger, eds.). Springer, New York.
  • 27
    • GINÉ, E. and ZINN, J. (1984). Some limit theorems for empirical processes. Ann. Probab. 12 929-989.
  • 29
    • HASMINSKII, R. and IBRAGIMOV, I. (1990). On density estimation in the view of Kolmogorov's ideas in approximation theory. Ann. Statist. 18 999-1010.
  • 30
    • HAUSSLER, D. (1997). A general minimax result for relative entropy. IEEE Trans. Inform. Theory 43 1276-1280.
  • 32
    • HAUSSLER, D., KEARNS, M. and SCHAPIRE, R. E. (1994). Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. Machine Learning 14 83-113.
  • 33
    • HAUSSLER, D. and OPPER, M. (1995). General bounds on the mutual information between a parameter and n conditionally independent observations. In Proceedings of the Seventh Annual ACM Workshop on Computational Learning Theory 402-411. ACM Press, New York.
  • 36
    • IZENMAN, A. J. (1991). Recent developments in nonparametric density estimation. J. Amer. Statist. Assoc. 86 205-224.
  • 37
    • KOLMOGOROV, A. N. and TIKHOMIROV, V. M. (1961). ε-entropy and ε-capacity of sets in functional spaces. Amer. Math. Soc. Trans. Ser. 2 17 277-364.
  • 39
    • LE CAM, L. (1955). An extension of Wald's theory of statistical decision functions. Ann. Math. Statist. 26 69-81.
  • 41
    • MEIR, R. and MERHAV, N. (1995). On the stochastic complexity of learning realizable and unrealizable rules. Machine Learning 19 241-261.
  • 42
    • MERHAV, N. and FEDER, M. (1995). A strong version of the redundancy-capacity theorem of universal coding. IEEE Trans. Inform. Theory 41 714-722.
  • 43
    • OPPER, M. and HAUSSLER, D. (1991). Calculation of the learning curve of Bayes optimal classification algorithm for learning a perceptron with noise. In Proceedings of the Fourth Annual Workshop on Computational Learning Theory 75-87. Morgan Kaufmann, San Mateo, CA.
  • 44
    • OPPER, M. and HAUSSLER, D. (1995). Bounds for predictive errors in the statistical mechanics of supervised learning. Phys. Rev. Lett. 75 3772-3775.
  • 47
    • RENYI, A. (1960). On measures of entropy and information. Proc. Fourth Berkeley Symp. Math. Statist. Probab. 1 547-561. Univ. California Press, Berkeley.
  • 48
    • RENYI, A. (1964). On the amount of information concerning an unknown parameter in a sequence of observations. Publ. Math. Inst. Hungar. Acad. Sci. 9 617-625.
  • 49
    • RISSANEN, J. (1986). Stochastic complexity and modeling. Ann. Statist. 14 1080-1100.
  • 51
    • SYMANZIK, K. (1965). Proof and refinements of an inequality of Feynman. J. Math. Phys. 6 1155-1165.
  • 52
    • VAN DE GEER, S. (1993). Hellinger-consistency of certain nonparametric maximum likelihood estimators. Ann. Statist. 21 14-44.
  • 53
    • WONG, W. and SHEN, X. (1995). Probability inequalities for likelihood ratios and convergence rates for sieve MLE's. Ann. Statist. 23 339-362.
  • 54
    • YAMANISHI, K. (1995). A loss bound model for on-line stochastic prediction algorithms. Inform. Comput. 119 39-54.
  • 55
    • YU, B. (1996). Lower bounds on expected redundancy for nonparametric classes. IEEE Trans. Inform. Theory 42 272-275.
  • 57
    • HAUSSLER, D. and OPPER, M. (1997). Metric entropy and minimax risk in classification. In Lecture Notes in Comp. Sci.: Studies in Logic and Comp. Sci. (J. Mycielski, G. Rozenberg and A. Salomaa, eds.) 1261 212-235. Springer-Verlag, New York.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.