2011, Pages 1245-1250

Fast AdaBoost training using weighted novelty selection

Author keywords

[No Author keywords available]

Indexed keywords

ADABOOST LEARNING; CLASSIFICATION TASKS; DATA POINTS; DISCRIMINATIVE MODELS; LEARNING PROCESS; LOSS OF ACCURACY; NUMBER OF DATUM; SAMPLING STRATEGIES; TRAINING DATA; TRAINING TIME;

EID: 80054722112     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/IJCNN.2011.6033366     Document Type: Conference Paper
Times cited: 20
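
The indexed keywords summarize the paper's aim: reduce AdaBoost training time by letting each boosting round learn from a reduced set of data points, with little loss of accuracy. The minimal Python sketch below illustrates that general idea only; it is not the paper's algorithm. Plain weight-proportional random subsampling stands in for weighted novelty selection, and every name and parameter here (train_stump, adaboost_train, subsample, n_rounds) is a hypothetical choice for illustration.

# Illustrative sketch only: discrete AdaBoost over decision stumps, with an
# optional per-round subsampling step as a generic stand-in for training on
# fewer data points. The paper's weighted novelty selection is NOT implemented.
import numpy as np

def train_stump(X, y, w):
    """Pick the threshold stump (feature, threshold, polarity) with the lowest weighted error."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best[:3]

def stump_predict(X, j, thr, pol):
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_train(X, y, n_rounds=20, subsample=None, seed=0):
    """Discrete AdaBoost for labels in {-1, +1}.

    If subsample is set (0 < subsample <= 1), each round fits its stump on a
    weight-proportional random fraction of the data instead of the full set.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        if subsample is None:
            idx = np.arange(n)
        else:
            k = max(1, int(subsample * n))
            idx = rng.choice(n, size=k, replace=False, p=w / w.sum())
        j, thr, pol = train_stump(X[idx], y[idx], w[idx])
        pred = stump_predict(X, j, thr, pol)
        err = max(np.sum(w[pred != y]), 1e-12)   # weighted error on the full set
        if err >= 0.5:                            # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / err)   # classifier weight
        w *= np.exp(-alpha * y * pred)            # re-weight: boost misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * stump_predict(X, j, thr, pol) for alpha, j, thr, pol in ensemble)
    return np.sign(score)
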

References (20)
  • 1. M. Kearns and L. G. Valiant, "Cryptographic limitations on learning boolean formulae and finite automata," Journal of the Association for Computing Machinery, vol. 41, no. 1, pp. 67-95, 1994.
  • 2. Z. Tu, "Probabilistic boosting-tree: Learning discriminative models for classification, recognition, and clustering," Proceedings of ICCV, vol. 2, pp. 1589-1596, 2005.
  • 5. R. E. Schapire, "The strength of weak learnability," Machine Learning, vol. 5, no. 2, pp. 197-227, 1990.
  • 6. Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," European Conference on Computational Learning Theory, pp. 23-37, 1995.
  • 8. J. Friedman, T. Hastie, and R. Tibshirani, "Additive logistic regression: A statistical view of boosting," Annals of Statistics, vol. 28, no. 2, pp. 337-407, 2000.
  • 12. H. C. Wang and L. M. Zhang, "A novel fast training algorithm for AdaBoost," Journal of Fudan University, vol. 1, 2004.
  • 13. J. Platt, "Resource-allocating network for function interpolation," Neural Computation, vol. 3, no. 2, pp. 213-225, 1991.
  • 14. K. Zhang and J. T. Kwok, "Density-weighted Nyström method for computing large kernel eigensystems," Neural Computation, vol. 21, no. 1, pp. 121-146, 2009.
  • 16. K. Zhang and J. T. Kwok, "Simplifying mixture models through function approximation," IEEE Transactions on Neural Networks, vol. 21, no. 4, pp. 644-658, Apr. 2010.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.