



Volume 37, Issue 1, 2009, Pages 119-139

New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation

Author keywords

Asymptotic normality; Bayes optimal rule; Consistency; Local polynomial regression; Loss function; Prediction error

EID: 74049108277     PISSN: 0319-5724     EISSN: None     Source Type: Journal
DOI: 10.1002/cjs.10005     Document Type: Article
Times cited : (22)

References (25)
  • 2
    • K. S. Azoury & M. K. Warmuth (2001). Relative loss bounds for on-line density estimation with the exponential family of distributions. Machine Learning, 43, 211-246.
  • 3
    • A. Banerjee, X. Guo & H. Wang (2005). On the optimality of conditional expectation as a Bregman predictor. IEEE Transactions on Information Theory, 51, 2664-2669.
  • 4
    • L. M. Brègman (1967). A relaxation method of finding a common point of convex sets and its application to the solution of problems in convex programming. U.S.S.R. Computational Mathematics and Mathematical Physics, 7, 620-631.
  • 5
    • L. Breiman (1998). Arcing classifiers (with discussion). Annals of Statistics, 26, 801-824.
  • 7
    • B. Efron (1986). How biased is the apparent error rate of a prediction rule? Journal of the American Statistical Association, 81, 461-470.
  • 8
    • B. Efron (2004). The estimation of prediction error: Covariance penalties and cross-validation (with discussion). Journal of the American Statistical Association, 99, 619-642.
  • 10
    • Y. Freund & R. E. Schapire (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55, 119-139.
  • 11
    • J. Friedman (1997). On bias, variance, 0/1-loss, and the curse-of-dimensionality. Data Mining and Knowledge Discovery, 1, 55-77.
  • 12
    • J. Friedman, T. Hastie & R. Tibshirani (2000). Additive logistic regression: A statistical view of boosting (with discussion). Annals of Statistics, 28, 337-407.
  • 13
    • P. D. Grünwald & A. P. Dawid (2004). Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory. Annals of Statistics, 32, 1367-1433.
  • 16
    • J. Kivinen & M. K. Warmuth (1999). Boosting as entropy projection. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, Santa Cruz, CA. ACM Press, New York, NY, pp. 134-144.
  • 18
    • J. Lafferty (1999). Additive models, boosting, and inference for generalized divergences. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, Santa Cruz, CA. ACM Press, New York, NY, pp. 125-133.
  • 24
    • R. W. M. Wedderburn (1974). Quasi-likelihood functions, generalized linear models, and the Gauss-Newton method. Biometrika, 61, 439-447.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.