Volume 2, 2003, Pages 752-759

Low Bias Bagged Support Vector Machines

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMS; APPROXIMATION THEORY; DECOMPOSITION; ERROR ANALYSIS; LEARNING SYSTEMS; OPTIMIZATION; PARAMETER ESTIMATION; POLYNOMIALS; PROBLEM SOLVING;

EID: 1942452226     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 70
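
The indexed paper concerns ensembles of support vector machines built with bagging and bias-variance analysis (see the references below). As an orientation aid only, here is a minimal sketch of a plain bagged-SVM ensemble in Python with scikit-learn; it is not the low-bias (Lobag) procedure proposed in the paper, and the synthetic dataset and every parameter value (number of estimators, C, gamma) are assumptions chosen purely for illustration.

```python
# Minimal sketch of a plain bagged-SVM ensemble (NOT the low-bias Lobag
# procedure proposed in the indexed paper). The synthetic dataset and all
# hyperparameter values below are assumptions chosen for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data (assumed, for demonstration only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each SVM is trained on a bootstrap replicate of the training set,
# and the ensemble predicts by majority vote over the base classifiers.
bagged_svm = BaggingClassifier(
    SVC(kernel="rbf", C=1.0, gamma="scale"),  # base learner (assumed settings)
    n_estimators=25,
    bootstrap=True,
    random_state=0,
)
bagged_svm.fit(X_train, y_train)
print("test accuracy:", bagged_svm.score(X_test, y_test))
```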

References (19)
  • 1. Andersen, T., Rimer, M., & Martinez, T. R. (2001). Optimal artificial neural network architecture selection for voting. Proc. of IJCNN'01 (pp. 790-795). IEEE.
  • 2. Bauer, E., & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36, 105-139.
  • 3. Breiman, L. (1996). Bagging predictors. Machine Learning, 24, 123-140.
  • 5. Breiman, L. (2001). Random Forests. Machine Learning, 45, 5-32.
  • 6. Buciu, I., Kotropoulos, C., & Pitas, I. (2001). Combining Support Vector Machines for Accurate Face Detection. Proc. of ICIP'01 (pp. 1054-1057).
  • 8. Collobert, R., Bengio, S., & Bengio, Y. (2002). A Parallel Mixture of SVMs for Very Large Scale Problems. Neural Computation, 14, 1105-1114.
  • 10. Dietterich, T. G. (1998). Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation, 10, 1895-1924.
  • 11. Dietterich, T. G. (2000). An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40, 139-158.
  • 12. Domingos, P. (2000). A Unified Bias-Variance Decomposition and its Applications. Proc. of the 17th ICML (pp. 231-238).
  • 13. Freund, Y., & Schapire, R. (1996). Experiments with a new boosting algorithm. Proc. of the 13th ICML (pp. 148-156). Morgan Kaufmann.
  • 14. Friedman, J. (1997). On bias, variance, 0/1 loss and the curse of dimensionality. Data Mining and Knowledge Discovery, 1, 55-77.
  • 15. James, G. (2003). Variance and bias for general loss function. Machine Learning (in press).
  • 16. Kim, H., Pang, S., Je, H., Kim, D., & Bang, S. (2002). Pattern Classification Using Support Vector Machine Ensemble. Proc. of ICPR'02 (pp. 20160-20163). IEEE.
  • 18. Valentini, G., & Dietterich, T. (2002). Bias-variance analysis and ensembles of SVM. MCS2002, Cagliari, Italy (pp. 222-231). Springer-Verlag.
  • 19. Valentini, G., & Masulli, F. (2002). NEURObjects: An object-oriented library for neural network development. Neurocomputing, 48, 623-646.


* This record was extracted and analyzed by KISTI from Elsevier's SCOPUS database.