[1] S. Artstein, K. Ball, F. Barthe, and A. Naor, "Solution of Shannon's problem on the monotonicity of entropy," J. Amer. Math. Soc., vol. 17, pp. 975-982, 2004.
[2] A. R. Barron, "Entropy and the central limit theorem," Ann. Probab., vol. 14, pp. 336-342, 1986.
[3] J. Binia, "Divergence and minimum mean-square error in continuous-time additive white Gaussian noise channels," IEEE Trans. Inf. Theory, vol. 52, no. 3, pp. 1160-1163, Mar. 2006.
[4] R. H. Cameron and W. T. Martin, "Transformation of Wiener integrals under translations," Ann. Math., vol. 45, pp. 386-396, 1944.
[5] A. Cohen, N. Merhav, and T. Weissman, "Scanning and sequential decision making for multidimensional data, part II: Noisy data," IEEE Trans. Inf. Theory, vol. IT-54, no. 12, pp. 5609-5631, Dec. 2008.
[8] T. E. Duncan, "Evaluation of likelihood functions," Inf. Contr., vol. 13, pp. 62-74, 1968.
[9] T. E. Duncan, "On the calculation of mutual information," SIAM J. Appl. Math., vol. 19, pp. 215-220, Jul. 1970.
[10] T. E. Duncan, "Mutual information for stochastic signals and Lévy processes," IEEE Trans. Inf. Theory, vol. IT-56, no. 1, pp. 18-24, Jan. 2010.
[11] T. E. Duncan, "Mutual information for stochastic signals and fractional Brownian motion," IEEE Trans. Inf. Theory, vol. IT-54, no. 10, pp. 4432-4438, Oct. 2008.
[12] R. G. Gallager, "Source coding with side information and universal coding," M.I.T., Cambridge, MA, Tech. Rep. LIDS-P-937, 1979.
[13] L. Gammaitoni, P. Hanggi, P. Jung, and F. Marchesoni, "Stochastic resonance," Rev. Mod. Phys., vol. 70, no. 1, pp. 223-287, 1998.
[14] I. V. Girsanov, "On transforming a certain class of stochastic processes by absolutely continuous substitution of measures," Theory Probab. Appl., vol. 5, pp. 285-301, 1960.
[15] D. Guo, "Relative entropy and score function: New information-estimation relationships through arbitrary additive perturbation," presented at the IEEE Int. Symp. Information Theory, Seoul, Korea, Jul. 3, 2009.
[16] D. Guo, S. Shamai, and S. Verdú, "Mutual information and minimum mean-square error in Gaussian channels," IEEE Trans. Inf. Theory, vol. IT-51, no. 4, pp. 1261-1283, Apr. 2005.
[17] D. Guo, S. Shamai, and S. Verdú, "Mutual information and conditional mean estimation in Poisson channels," IEEE Trans. Inf. Theory, vol. 54, no. 5, pp. 1837-1849, May 2008.
[18] T. T. Kadota, M. Zakai, and J. Ziv, "Capacity of a continuous memoryless channel with feedback," IEEE Trans. Inf. Theory, vol. IT-17, pp. 372-378, 1971.
[19] T. T. Kadota, M. Zakai, and J. Ziv, "Mutual information of the white Gaussian channel with and without feedback," IEEE Trans. Inf. Theory, vol. IT-17, pp. 368-371, 1971.
[20] T. Kailath, "The structure of Radon-Nikodym derivatives with respect to Wiener and related measures," Ann. Math. Statist., vol. 42, no. 3, pp. 1054-1067, 1971.
[22] Y. H. Kim, H. H. Permuter, and T. Weissman, "Directed information and causal estimation in continuous time," presented at the IEEE Int. Symp. Information Theory, Seoul, Korea, Jul. 29, 2009.
[23] A. Kolmogorov, "On the Shannon theory of information transmission in the case of continuous signals," IRE Trans. Inf. Theory, vol. 2, no. 4, pp. 102-108, Dec. 1956.
[26] A. Lozano, A. M. Tulino, and S. Verdú, "Optimum power allocation for parallel Gaussian channels with arbitrary input distributions," IEEE Trans. Inf. Theory, vol. 52, no. 7, pp. 3033-3051, Jul. 2006.
[27] S. Orey, "Conditions for the absolute continuity of two diffusions," Trans. Amer. Math. Soc., vol. 193, pp. 413-426, 1974.
[28] M. Madiman and A. R. Barron, "Generalized entropy power inequalities and monotonicity properties of information," IEEE Trans. Inf. Theory, vol. 53, no. 7, pp. 2317-2329, Jul. 2007.
[29] E. Mayer-Wolf and M. Zakai, "Some relations between mutual information and estimation error in Wiener space," Ann. Appl. Probab., vol. 17, no. 3, pp. 1102-1116, Jun. 2007.
[30] N. Merhav and M. Feder, "A strong version of the redundancy-capacity theorem of universal coding," IEEE Trans. Inf. Theory, vol. 41, no. 3, pp. 714-722, May 1995.
[31] D. P. Palomar and S. Verdú, "Gradient of mutual information in linear vector Gaussian channels," IEEE Trans. Inf. Theory, vol. 52, no. 1, pp. 141-154, Jan. 2006.
[32] D. P. Palomar and S. Verdú, "Representation of mutual information via input estimates," IEEE Trans. Inf. Theory, vol. 53, no. 2, pp. 453-470, Feb. 2007.
[33] M. S. Pinsker, Information and Information Stability of Random Variables and Processes. Moscow: Izv. Akad. Nauk, 1960, in Russian.
[34] B. Y. Ryabko, "Encoding a source with unknown but ordered probabilities," Probl. Inf. Transm., pp. 134-139, Oct. 1979.
[35] A. Stam, "Some inequalities satisfied by the quantities of information of Fisher and Shannon," Inf. Control, vol. 2, pp. 101-112, Jun. 1959.
[36] A. M. Tulino and S. Verdú, "Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof," IEEE Trans. Inf. Theory, vol. 52, no. 9, pp. 4295-4297, Sep. 2006.
[37] S. Verdú, "Mismatched estimation and relative entropy," presented at the IEEE Int. Symp. Information Theory, Seoul, Korea, Jul. 29, 2009.
[38] S. Verdú and D. Guo, "A simple proof of the entropy power inequality," IEEE Trans. Inf. Theory, vol. 52, no. 5, pp. 2165-2166, May 2006.
[39] A. D. Wyner, "A definition of conditional mutual information for arbitrary ensembles," Inf. Control, vol. 38, pp. 51-59, 1978.
[40] M. Zakai, "On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel," IEEE Trans. Inf. Theory, vol. 51, no. 9, pp. 3017-3024, Sep. 2005.