IEEE Transactions on Neural Networks, Volume 13, Issue 5, 2002, Pages 1035-1044

Generalized information potential criterion for adaptive system training

Author keywords

Minimum error entropy; Parzen windowing; Renyi's entropy; Supervised training
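As background for these keywords (a sketch of the standard definitions they point to, not text from the indexed record): Renyi's order-alpha entropy and the quadratic information potential that the cited paper generalizes are, in LaTeX notation,

    H_{\alpha}(E) = \frac{1}{1-\alpha} \log \int p^{\alpha}(e)\,de, \qquad
    H_{2}(E) = -\log V(E), \quad V(E) = \int p^{2}(e)\,de.

With a Parzen-window estimate \hat{p}(e) = \frac{1}{N}\sum_{i=1}^{N} \kappa_{\sigma}(e - e_{i}) built from the training errors e_{i} and a Gaussian kernel \kappa_{\sigma}, the quadratic information potential admits the pairwise sample estimator

    \hat{V}(E) = \frac{1}{N^{2}} \sum_{j=1}^{N} \sum_{i=1}^{N} \kappa_{\sigma\sqrt{2}}(e_{j} - e_{i}),

so minimizing Renyi's quadratic error entropy during supervised training is equivalent to maximizing \hat{V} over the errors.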

Indexed keywords

INFORMATION CONTENT

EID: 0036737108     PISSN: 1045-9227     EISSN: None     Source Type: Journal
DOI: 10.1109/TNN.2002.1031936     Document Type: Article
Times cited: 209

References (31)
  • 2. R. Linsker, "Toward an organizing principle for a layered perceptual network," in Neural Information Processing Systems, D. Anderson, Ed. New York: Amer. Inst. Phys., 1988, pp. 485-494.
  • 4. R. V. Hartley, "Transmission of information," Bell Syst. Tech. J., vol. 7, 1928.
  • 5. C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 379-423, 623-656, 1948.
  • 8. P. Comon, "Independent component analysis, a new concept?," Signal Processing, vol. 36, pp. 287-314, 1994.
  • 9. H. Yang and S. Amari, "Adaptive online learning algorithms for blind separation: Maximum entropy and minimum mutual information," Neural Comput., vol. 9, pp. 1457-1482, 1997.
  • 10. E. Parzen, "On estimation of a probability density function and mode," in Time Series Analysis Papers. San Francisco, CA: Holden-Day, 1967.
  • 11. J. C. Principe, D. Xu, and J. Fisher, "Information theoretic learning," in Unsupervised Adaptive Filtering, vol. I, S. Haykin, Ed. New York: Wiley, 2000, pp. 265-319.
  • 12. K. E. Hild II, D. Erdogmus, and J. C. Principe, "Blind source separation using Renyi's mutual information," IEEE Signal Processing Lett., vol. 8, pp. 174-176, June 2001.
  • 13. D. Xu, J. C. Principe, J. Fisher, and H. Wu, "A novel measure for independent component analysis (ICA)," in Proc. ICASSP'98, vol. II, Seattle, WA, 1998, pp. 1161-1164.
  • 14. K. Torkkola and W. M. Campbell, "Mutual information in learning feature transformations," presented at the Proc. Int. Conf. Machine Learning, Stanford, CA, 2000.
  • 16. C. Diks, J. Houwelingen, F. Takens, and J. deGoede, "Detecting differences between delay vector distributions," Phys. Rev. E, vol. 53, pp. 2169-2176, 1996.
  • 17. P. Grassberger and I. Procaccia, "Characterization of strange attractors," Phys. Rev. Lett., vol. 50, no. 5, pp. 346-349, 1983.
  • 18. D. Erdogmus and J. C. Principe, "Comparison of entropy and mean square error criteria in adaptive system training using higher order statistics," presented at the Proc. Independent Components Analysis (ICA), Helsinki, Finland, 2000.
  • 19. D. Erdogmus and J. C. Principe, "An entropy minimization algorithm for short-term prediction of chaotic time series," IEEE Trans. Signal Processing, vol. 50, pp. 1780-1786, July 2002.
  • 25. D. Rumelhart, G. Hinton, and R. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, pp. 533-536, 1986.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.