1. E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36(1-2):105-139, 1999.
2. K. Bowyer, N. Chawla, T. E. Moore, Jr., L. Hall, and W. Kegelmeyer. A parallel decision tree builder for mining very large visualization datasets. In IEEE Systems, Man, and Cybernetics Conference, pages 1888-1893, 2000.
4. L. Breiman. Bagging predictors. Machine Learning, 24:123-140, 1996.
5. L. Breiman. Random forests. Machine Learning, 45(1):5-32, 2001.
6. L. Breiman, J. Friedman, R. Olshen, and P. Stone. Classification and Regression Trees. Wadsworth International Group, Belmont, CA, 1984.
7. N. Chawla, T. Moore, L. Hall, K. Bowyer, W. Kegelmeyer, and C. Springer. Distributed learning with bagging-like performance. Pattern Recognition Letters, 24:455-471, 2003.
8. M. Collins, R. Schapire, and Y. Singer. Logistic regression, AdaBoost and Bregman distances. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, pages 158-169, 2000.
9. T. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 2000.
11. G. Hulten and P. Domingos. Learning from infinite data in finite time. In Advances in Neural Information Processing Systems 14, pages 673-680. MIT Press, Cambridge, MA, 2002.
14. R. Schapire. The strength of weak learnability. Machine Learning, 5(2):197-227, 1990.