[1] Y. Freund and R. E. Schapire, "A Decision-theoretic Generalization of Online Learning and an Application to Boosting," Journal of Computer and System Sciences, vol. 55, pp. 119–139, August 1997.

[2] R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, "Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods," The Annals of Statistics, vol. 26, no. 5, pp. 1651–1686, 1998.

[5] J. Quinlan, "Bagging, Boosting, and C4.5," in Thirteenth National Conference on Artificial Intelligence, (Cambridge), pp. 163–175, AAAI Press/MIT Press, 1996.

[6] T. G. Dietterich, "An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization," Machine Learning, vol. 40, no. 2, pp. 139–157, 2000.

[7] A. J. Grove and D. Schuurmans, "Boosting in the limit: Maximizing the margin of learned ensembles," in AAAI/IAAI, pp. 692–699, 1998.

[8] G. Rätsch, T. Onoda, and K. Müller, "Soft margins for AdaBoost," Machine Learning, vol. 42, pp. 287–320, 2001.

[9] V. Vapnik, The Nature of Statistical Learning Theory. Statistics for Engineering and Information Science, Springer-Verlag, 2000.

[10] W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery, Numerical Recipes in C – The Art of Scientific Computing. Cambridge University Press, second ed., 1992.

[11] W. Härdle, Applied Nonparametric Regression, vol. 19 of Econometric Society Monographs. Cambridge University Press, 1990.

[13] D. D. Lewis and J. Catlett, "Heterogeneous Uncertainty Sampling for Supervised Learning," in Eleventh International Conference on Machine Learning (Cohen and Hirsh, eds.), (San Francisco), pp. 148–156, Morgan Kaufmann, 1994.

[14] Y. Raviv and N. Intrator, "Variance Reduction via Noise and Bias Constraints," in Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems (A. Sharkey, ed.), (London), pp. 163–175, Springer-Verlag, 1999.