2000, Pages 265-271

Bayesian averaging is well-temperated

Author keywords

[No Author keywords available]

Indexed keywords

MAXIMUM LIKELIHOOD ESTIMATION; OPTIMIZATION; SAMPLING;

EID: 34047115472     PISSN: 10495258     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 7

References (14)
  • 3  L. Breiman, "Using adaptive bagging to debias regressions," Technical Report 547, Statistics Dept., U.C. Berkeley, 1999.
  • 4  R.T. Clemen, "Combining forecasts: A review and annotated bibliography," Journal of Forecasting, vol. 5, p. 559, 1989.
  • 6  L.K. Hansen, "Stochastic linear learning: Exact test and training error averages," Neural Networks, vol. 6, pp. 393-396, 1993.
  • 7  D. Haussler and M. Opper, "Mutual information, metric entropy, and cumulative relative entropy risk," Annals of Statistics, vol. 25, pp. 2451-2492, 1997.
  • 8  T. Heskes, "Bias/variance decomposition for likelihood-based estimators," Neural Computation, vol. 10, pp. 1425-1433, 1998.
  • 10  J. Moody, "Note on generalization, regularization, and architecture selection in nonlinear learning systems," in B.H. Juang, S.Y. Kung & C.A. Kamm (eds.), Proceedings of the First IEEE Workshop on Neural Networks for Signal Processing, Piscataway, New Jersey: IEEE, pp. 1-10, 1991.
  • 11  N. Murata, S. Yoshizawa & S. Amari, "Network information criterion - Determining the number of hidden units for an artificial neural network model," IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 865-872, 1994.
  • 13  H. White, "Consequences and detection of misspecified nonlinear regression models," Journal of the American Statistical Association, vol. 76, no. 374, pp. 419-433, 1981.
  • 14  D.J.C. MacKay, "Bayesian interpolation," Neural Computation, vol. 4, pp. 415-447, 1992.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS DB.