[2] L. Gammaitoni, P. Hänggi, P. Jung, and F. Marchesoni, Rev. Mod. Phys. 70, 223 (1998).
[4] E. Simonotto, M. Riani, C. Seife, M. Roberts, J. Twitty, and F. Moss, Phys. Rev. Lett. 78, 1186 (1997).
[7] B.J. Gluckman, P. So, T.I. Netoff, M.L. Spano, and S.J. Schiff, Chaos 8, 588 (1998).
[8] J.K. Douglass, L. Wilkens, E. Pantazelou, and F. Moss, Nature (London) 365, 337 (1993).
[15] J.J. Collins, C.C. Chow, A.C. Capela, and T.T. Imhoff, Phys. Rev. E 54, 5575 (1996).
[16] C. Heneghan, C.C. Chow, J.J. Collins, T.T. Imhoff, S.B. Lowen, and M.C. Teich, Phys. Rev. E 54, R2228 (1996).
[23] P. Reimann and P. Hänggi, in Lecture Notes in Physics, edited by L. Schimansky-Geier and T. Pöschel (Springer, Berlin, 1997), Vol. 484, pp. 127–139.
[25] Single-Channel Recording, 2nd ed., edited by B. Sakmann and E. Neher (Plenum, New York, 1995).
[28] S. Marom, H. Salman, V. Lyakhov, and E. Braun, J. Membr. Biol. 154, 267 (1996).
[31] The problem of extracting these conductance fluctuations from the current recordings in the presence of a time-dependent (e.g., periodic) driving is explained in D. Petracchi, J. Stat. Phys. 70, 393 (1993).
[32] N. G. Van Kampen, Stochastic Processes in Physics and Chemistry, 2nd, enlarged and extended ed. (North-Holland, Amsterdam, 1992).
[33] R. L. Stratonovich, Topics in the Theory of Random Noise (Gordon and Breach, New York, 1963), Vol. I.
[34] It is remarkable that the permutation invariance of $S$ with respect to the set of probabilities $\{p_i\}$ and the property of additivity, i.e., $S(AB) = S(A) + S(B)$ when the probabilities factorize in the composed state space, characterize the Shannon entropy $S = -\sum_i p_i \ln p_i$ almost uniquely: any functional satisfying these requirements is a linear combination of the Shannon entropy and the Hartley entropy $S_0 = \ln N_+$, with $N_+$ being the number of $p_i$ that are different from zero. The additional requirements of (i) $S$ being a continuous function of the $p_i$, and (ii) the grouping property $S(p_1 q_1, p_1 q_2, p_2, \ldots, p_n) = S(p_1, \ldots, p_n) + p_1 S(q_1, q_2)$, with $q_1 + q_2 = 1$ [A. Feinstein, Foundations of Information Theory (McGraw-Hill, New York, 1958)], then determine the Shannon entropy uniquely.
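The additivity property invoked in this characterization can be checked numerically. A minimal Python sketch (the function names and example distributions are illustrative, not from the source):

```python
import math

def shannon_entropy(p):
    """Shannon entropy S = -sum_i p_i ln p_i (natural log), skipping zero entries."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Two independent subsystems A and B.
pA = [0.5, 0.5]
pB = [0.25, 0.75]

# Joint distribution when the probabilities factorize: p_ij = pA_i * pB_j.
pAB = [a * b for a in pA for b in pB]

# Additivity: S(AB) = S(A) + S(B) for a product distribution.
assert abs(shannon_entropy(pAB) - (shannon_entropy(pA) + shannon_entropy(pB))) < 1e-12

# Hartley entropy ln N_+, with N_+ the number of nonzero probabilities.
hartley = math.log(sum(1 for x in pAB if x > 0))  # here ln 4
```

Any linear combination of `shannon_entropy` and `hartley` passes the same additivity check, which is why the continuity and grouping requirements are needed to single out the Shannon form.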
[36] F. Rieke, D. Warland, R. de Ruyter van Steveninck, and W. Bialek, Spikes: Exploring the Neural Code (MIT Press, Cambridge, MA, 1997).
[37] S.P. Strong, R. Koberle, R.R. de Ruyter van Steveninck, and W. Bialek, Phys. Rev. Lett. 80, 197 (1998).
[38] The many facets of entropy are beautifully outlined in A. Wehrl, Rep. Math. Phys. 30, 119 (1991).
[40] The informational capacity of an information channel is defined as the maximal rate of mutual information obtained over all possible statistical distributions of input signals with a fixed rms amplitude [15].
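The maximization over input distributions in this capacity definition can be illustrated with a discrete toy model; here a binary symmetric channel stands in for the continuous fixed-rms setting, and all names are illustrative assumptions rather than anything from the source:

```python
import math

def mutual_information(p_in, channel):
    """I(X;Y) = sum_{x,y} p(x) p(y|x) log2[ p(y|x) / p(y) ], in bits.

    channel[x][y] is the conditional probability p(y|x)."""
    p_out = [sum(p_in[x] * channel[x][y] for x in range(len(p_in)))
             for y in range(len(channel[0]))]
    info = 0.0
    for x, px in enumerate(p_in):
        for y, pyx in enumerate(channel[x]):
            if px > 0 and pyx > 0:
                info += px * pyx * math.log2(pyx / p_out[y])
    return info

# Binary symmetric channel with crossover probability eps.
eps = 0.1
bsc = [[1 - eps, eps], [eps, 1 - eps]]

# Scan input distributions; the capacity is the maximal mutual information.
best = max(mutual_information([p, 1 - p], bsc)
           for p in (i / 1000 for i in range(1001)))

# Known closed form for the BSC: C = 1 - H2(eps), attained at the uniform input.
H2 = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
capacity = 1 - H2
```

For this symmetric channel the scan peaks at the uniform input, reproducing the closed-form capacity; in general the maximizing input distribution must be searched for, which is exactly what the definition above prescribes.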