1. BAUER, E. and KOHAVI, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning 36 105-139.
2. BREIMAN, L. (1996). Bagging predictors. Machine Learning 24 123-140.
3. BREIMAN, L. (1997). Arcing the edge. Technical Report 486, Dept. Statistics, Univ. California, Berkeley. Available at www.stat.berkeley.edu.
4. BREIMAN, L. (1998). Arcing classifiers (with discussion). Ann. Statist. 26 801-849.
5. BREIMAN, L. (1999). Prediction games and arcing algorithms. Neural Computation 11 1493-1517.
6. BREIMAN, L. (2000). Some infinite theory for predictor ensembles. Technical Report 577, Dept. Statistics, Univ. California, Berkeley.
8. DIETTERICH, T. (2000). An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting and randomization. Machine Learning 40 139-157.
9. DRUCKER, H. and CORTES, C. (1996). Boosting decision trees. In Advances in Neural Information Processing Systems 8 479-485. MIT Press, Cambridge, MA.
13. FRIEDMAN, J., HASTIE, T. and TIBSHIRANI, R. (2000). Additive logistic regression: A statistical view of boosting (with discussion). Ann. Statist. 28 337-407.
14. JIANG, W. (2004). Process consistency for AdaBoost. Ann. Statist. 32 13-29.
15. LUGOSI, G. and VAYATIS, N. (2004). On the Bayes-risk consistency of regularized boosting methods. Ann. Statist. 32 30-55.
16. MANNOR, S., MEIR, R. and ZHANG, T. (2002). The consistency of greedy algorithms for classification. In Proc. 15th Annual Conference on Computational Learning Theory. Lecture Notes in Comp. Sci. 2375 319-333. Springer, New York.
17. SCHAPIRE, R., FREUND, Y., BARTLETT, P. and LEE, W. (1998). Boosting the margin: A new explanation for the effectiveness of voting methods. Ann. Statist. 26 1651-1686.
18. SCHAPIRE, R. and SINGER, Y. (1999). Improved boosting algorithms using confidence-rated predictions. Machine Learning 37 297-336.
20. ZHANG, T. and YU, B. (2003). Boosting with early stopping: Convergence and consistency. Technical Report 635, Dept. Statistics, Univ. California, Berkeley. Available from www.stat.berkeley.edu/~binyu/publications.html.