2. Aczél, J.; Daróczy, Z. A mixed theory of information I. RAIRO Inform. Theory 1978, 12, 149-155.
3. Aczél, J.; Forte, B.; Ng, C.T. Why Shannon and Hartley entropies are "natural". Adv. Appl. Probab. 1974, 6, 131-146.
4. Ahlswede, R.; Cai, N. An interpretation of identification entropy. IEEE Trans. Inf. Theory 2006, 52, 4198-4207.
5. Ali, S.M.; Silvey, S.D. A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. B 1966, 28, 131-142.
6. Arimoto, S. Information-theoretic considerations on estimation problems. Information and Control 1971, 19, 181-194.
7. Arimoto, S. Information measures and capacity of order α for discrete memoryless channels. In Topics in Information Theory, Colloq. Math. Soc. J. Bolyai 16; Csiszár, I.; Elias, P., Eds.; North Holland: Amsterdam, 1977; pp. 41-52.
8. Ben-Bassat, M. f-entropies, probability of error, and feature selection. Information and Control 1978, 39, 227-242.
9. Bennett, C.; Brassard, G.; Crépeau, C.; Maurer, U. Generalized privacy amplification. IEEE Trans. Inf. Theory 1995, 41, 1915-1923.
10. Bhattacharyya, A. On a measure of divergence between two statistical populations defined by their probability distributions. Bull. Calcutta Math. Soc. 1943, 35, 99-109.
11. Bregman, L.M. The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Comp. Math. and Math. Phys. 1967, 7, 200-217.
12. Campbell, L.L. A coding theorem and Rényi's entropy. Information and Control 1965, 8, 423-429.
13. Chaundy, T.W.; McLeod, J.B. On a functional equation. Edinburgh Math. Notes 1960, 43, 7-8.
14. Csiszár, I. Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. 1963, 8, 85-108.
15. Csiszár, I. Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 1967, 2, 299-318.
16. Csiszár, I. A class of measures of informativity of observation channels. Periodica Math. Hungar. 1972, 2, 191-213.
18. Csiszár, I. Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. Ann. Statist. 1991, 19, 2032-2066.
19. Csiszár, I. Generalized cutoff rates and Rényi information measures. IEEE Trans. Inf. Theory 1995, 41, 26-34.
20. Daróczy, Z. Über die gemeinsame Charakterisierung der zu den nicht vollständigen Verteilungen gehörigen Entropien von Shannon und von Rényi. Z. Wahrscheinlichkeitsth. Verw. Gebiete 1963, 1, 381-388.
21. Daróczy, Z. Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen. Acta Math. Acad. Sci. Hungar. 1964, 15, 203-210.
22. Daróczy, Z. Generalized information functions. Information and Control 1970, 16, 36-51.
23. Daróczy, Z. On the measurable solutions of a functional equation. Acta Math. Acad. Sci. Hungar. 1971, 22, 11-14.
24. Daróczy, Z.; Maksa, Gy. Nonnegative information functions. In Analytic Function Methods in Probability and Statistics, Colloq. Math. Soc. J. Bolyai 21; Gyires, B., Ed.; North Holland: Amsterdam, 1979; pp. 65-76.
25. Diderrich, G. The role of boundedness in characterizing Shannon entropy. Information and Control 1975, 29, 149-161.
27. Faddeev, D.K. On the concept of entropy of a finite probability scheme (in Russian). Uspehi Mat. Nauk 1956, 11, 227-231.
28. Fischer, P. On the inequality ∑ pᵢf(pᵢ) ≥ ∑ pᵢf(qᵢ). Metrika 1972, 18, 199-208.
29. Forte, B. Why Shannon's entropy. In Conv. Inform. Teor., Rome 1973, Symposia Math. 15; Academic Press: New York, 1975; pp. 137-152.
30. Grünwald, P.; Dawid, P. Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory. Ann. Statist. 2004, 32, 1367-1433.
31. Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. Kybernetika 1967, 3, 30-35.
32. Ingarden, R.S.; Urbanik, K. Information without probability. Colloq. Math. 1962, 9, 131-150.
33. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620-630.
34. Jones, L.K.; Byrne, C.L. General entropy criteria for inverse problems, with applications to data compression, pattern classification and cluster analysis. IEEE Trans. Inf. Theory 1990, 36, 23-30.
35. Kampé de Fériet, J.; Forte, B. Information et probabilité. C. R. Acad. Sci. Paris A 1967, 265, 110-114, 142-146, and 350-353.
36. Kannappan, Pl.; Ng, C.T. Measurable solutions of functional equations related to information theory. Proc. Amer. Math. Soc. 1973, 38, 303-310.
37. Kannappan, Pl.; Ng, C.T. A functional equation and its applications in information theory. Ann. Polon. Math. 1974, 30, 105-112.
38. Kolmogorov, A.N. A new invariant for transitive dynamical systems (in Russian). Dokl. Akad. Nauk SSSR 1958, 119, 861-864.
41. Lee, P.M. On the axioms of information theory. Ann. Math. Statist. 1964, 35, 415-418.
42. Lindhard, J.; Nielsen, V. Studies in statistical dynamics. Kong. Danske Vid. Selskab Mat.-fys. Medd. 1971, 38(9), 1-42.
43. Maksa, Gy. On the bounded solutions of a functional equation. Acta Math. Acad. Sci. Hungar. 1981, 37, 445-450.
45. von Neumann, J. Thermodynamik quantenmechanischer Gesamtheiten. Gött. Nachr. 1927, 273-291.
46. Paris, J.; Vencovská, A. A note on the inevitability of maximum entropy. Int'l J. Inexact Reasoning 1990, 4, 183-223.
48. Rényi, A. On measures of entropy and information. In Proc. 4th Berkeley Symp. Math. Statist. Probability, 1960; Univ. Calif. Press: Berkeley, 1961; Vol. 1, pp. 547-561.
49. Rényi, A. On the foundations of information theory. Rev. Inst. Internat. Stat. 1965, 33, 1-14.
50. Sanov, I.N. On the probability of large deviations of random variables (in Russian). Mat. Sbornik 1957, 42, 11-44.
51. Schützenberger, M.P. Contribution aux applications statistiques de la théorie de l'information. Publ. Inst. Statist. Univ. Paris 1954, 3, 3-117.
52. Shannon, C.E. A mathematical theory of communication. Bell System Tech. J. 1948, 27, 379-423 and 623-656.
53. Shore, J.E.; Johnson, R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 1980, 26, 26-37.
55. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Statist. Phys. 1988, 52, 479-487.
56. Tverberg, H. A new derivation of the information function. Math. Scand. 1958, 6, 297-298.
57. Vajda, I. Bounds on the minimal error probability for testing a finite or countable number of hypotheses (in Russian). Probl. Inform. Transmission 1968, 4, 9-17.
60. Zhang, Z.; Yeung, R.W. On characterizations of entropy function via information inequalities. IEEE Trans. Inf. Theory 1998, 44, 1440-1452.