Volume 134, Issue 2, 2005, Pages 332-349

Model selection criteria based on Kullback information measures for nonlinear regression

Author keywords

AIC; Akaike information criterion; I divergence; J divergence; Kullback Leibler information; Nonlinear regression
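As background for these keywords, a minimal sketch of the standard definitions (drawn from the Akaike 1973/1974, Hurvich-Tsai 1989, and Cavanaugh 1999 works listed in the references below, not from the article's own nonlinear-regression derivations): AIC and its small-sample correction AICc target the directed Kullback-Leibler I-divergence, while KIC targets Kullback's symmetric J-divergence.

```latex
% Background sketch only; notation here is assumed, not taken from the article.
% L(\hat\theta) = maximized likelihood, k = number of estimated parameters,
% n = sample size.  For a candidate density g and the true density f:
%   I(f,g) = E_f[ \log f(X) - \log g(X) ]   (directed Kullback-Leibler divergence)
%   J(f,g) = I(f,g) + I(g,f)                (Kullback's symmetric J-divergence)
\[
\mathrm{AIC}  = -2\log L(\hat\theta) + 2k, \qquad
\mathrm{AICc} = -2\log L(\hat\theta) + \frac{2kn}{n - k - 1}, \qquad
\mathrm{KIC}  = -2\log L(\hat\theta) + 3k .
\]
```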

EID: 21644466361     PISSN: 0378-3758     EISSN: None     Source Type: Journal
DOI: 10.1016/j.jspi.2004.05.002     Document Type: Article
Times cited: 20

References (15)
  • 1. Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In: B.N. Petrov, F. Csáki (Eds.), Second International Symposium on Information Theory. Akadémia Kiadó, Budapest, Hungary, pp. 267-281.
  • 2. Akaike, H. (1974). A new look at the statistical model identification. IEEE Trans. Automat. Control AC-19, 716-723.
  • 3. Cavanaugh, J.E. (1997). Unifying the derivations of the Akaike and corrected Akaike information criteria. Statist. Probab. Lett. 33, 201-208.
  • 4. Cavanaugh, J.E. (1999). A large-sample model selection criterion based on Kullback's symmetric divergence. Statist. Probab. Lett. 42, 333-343.
  • 5. Cavanaugh, J.E. (2004). Criteria for linear model selection based on Kullback's symmetric divergence. Austral. New Zealand J. Statist. 46, 257-274.
  • 7. Hurvich, C.M., Tsai, C.-L. (1989). Regression and time series model selection in small samples. Biometrika 76, 297-307.
  • 8. Hurvich, C.M., Shumway, R.H., Tsai, C.-L. (1990). Improved estimators of Kullback-Leibler information for autoregressive model selection in small samples. Biometrika 77, 709-719.
  • 15. Sugiura, N. (1978). Further analysis of the data by Akaike's information criterion and the finite corrections. Comm. Statist. A7, 13-26.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.