Volume 61, Issue 4, 2000, Pages 4272-4280

Stochastic resonance in ion channels characterized by information theory

Author keywords

[No Author keywords available]

Indexed keywords

ION CHANNEL; POTASSIUM CHANNEL

EID: 0034172761     PISSN: 1063651X     EISSN: None     Source Type: Journal    
DOI: 10.1103/PhysRevE.61.4272     Document Type: Article
Times cited: 127

References (41)
  • 23
    • P. Reimann and P. Hänggi, in Lecture Notes in Physics, edited by L. Schimansky-Geier and T. Pöschel (Springer, Berlin, 1997), Vol. 484, pp. 127–139.
  • 25
    • Single-Channel Recording, 2nd ed., edited by B. Sakmann and E. Neher (Plenum, New York, 1995).
  • 31
    • The problem of extracting these conductance fluctuations from the current recordings in the presence of a time-dependent (e.g., periodic) driving is explained in D. Petracchi, J. Stat. Phys. 70, 393 (1993).
  • 32
    • N. G. Van Kampen, Stochastic Processes in Physics and Chemistry, 2nd, enlarged and extended ed. (North-Holland, Amsterdam, 1992).
  • 33
    • R. L. Stratonovich, Topics in the Theory of Random Noise (Gordon and Breach, New York, 1963), Vol. I.
  • 34
    • It is remarkable that the permutation invariance of the entropy functional $H[\{p_n\}]$ with respect to the set of probabilities $\{p_n\}$ and the property of additivity, i.e., $H[XY] = H[X] + H[Y]$ when the probabilities factorize in the composed state space, characterize the Shannon entropy $H = -\sum_n p_n \log p_n$ almost uniquely: any functional satisfying these requirements is a linear combination of the Shannon entropy and the Hartley entropy $H_0 = \log N_0$, with $N_0$ being the number of $p_n$ that are different from zero. The additional requirements of (i) $H$ being a continuous function of $p$, $0 \le p \le 1$, and (ii) (Formula presented) = (Formula presented)(Formula presented), with (Formula presented) [A. Feinstein, Foundations of Information Theory (McGraw-Hill, New York, 1958)], then determine the Shannon entropy uniquely. (A numerical sketch of these entropy properties follows the reference list.)
  • 38
    • The many facets of entropy are beautifully outlined in A. Wehrl, Rep. Math. Phys. 30, 119 (1991).
  • 40
    • The informational capacity of an information channel is defined as the maximal rate of mutual information obtained over all possible statistical distributions of the input signals with a fixed rms amplitude [15]. (A numerical sketch of this definition follows the reference list.)
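
The entropy properties quoted in note [34] can be checked numerically. The following Python sketch is not part of the original record; the function names and the example distributions are illustrative. It computes the Shannon and Hartley entropies and verifies that both are additive when the joint probabilities factorize, which is why permutation symmetry and additivity alone only pin the entropy down to a linear combination of the two.

import math

def shannon_entropy(p):
    # Shannon entropy H = -sum_n p_n * log(p_n); zero-probability states contribute nothing.
    return -sum(x * math.log(x) for x in p if x > 0)

def hartley_entropy(p):
    # Hartley entropy H0 = log(N0), with N0 the number of states of nonzero probability.
    return math.log(sum(1 for x in p if x > 0))

px = [0.5, 0.25, 0.25]                   # marginal distribution of X (illustrative)
py = [0.7, 0.3]                          # marginal distribution of Y (illustrative)
pxy = [a * b for a in px for b in py]    # factorizing joint distribution of (X, Y)

# Both entropies satisfy H[XY] = H[X] + H[Y] for factorizing probabilities.
print(shannon_entropy(pxy), shannon_entropy(px) + shannon_entropy(py))
print(hartley_entropy(pxy), hartley_entropy(px) + hartley_entropy(py))

Continuity in the probabilities is what singles out the Shannon term: the Hartley entropy jumps whenever a probability passes through zero.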
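
The capacity definition in note [40] can likewise be illustrated with a minimal sketch, assuming a fixed binary channel whose transition probabilities (e0, e1) are hypothetical stand-ins for a noisy two-state system; only the input statistics are varied while the input amplitude is held fixed, and the mutual information is computed per channel use rather than per unit time.

import math

def h2(p):
    # Binary Shannon entropy in bits.
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(q, e0, e1):
    # I(X;Y) = H(Y) - H(Y|X) for a binary channel with
    # P(Y=1 | X=0) = e0, P(Y=0 | X=1) = e1, and input bias P(X=1) = q.
    py1 = (1 - q) * e0 + q * (1 - e1)
    return h2(py1) - ((1 - q) * h2(e0) + q * h2(e1))

# Illustrative transition probabilities; the capacity is the maximum of I(X;Y)
# over all input distributions of the fixed channel.
e0, e1 = 0.1, 0.2
capacity = max(mutual_info(q / 1000.0, e0, e1) for q in range(1001))
print("capacity (bits per use):", round(capacity, 4))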


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.