[2] L. Breiman, "Bagging predictors", Machine Learning, vol. 24, no. 2, pp. 123-140, 1996.
[3] Y. Freund and R. E. Schapire, "Experiments with a new boosting algorithm", in ICML, vol. 96, 1996, pp. 148-156.
[4] R. E. Banfield, L. O. Hall, K. W. Bowyer, D. Bhadoria, W. P. Kegelmeyer, and S. Eschrich, "A comparison of ensemble creation techniques", in Multiple Classifier Systems. Springer, 2004, pp. 223-232.
[5] E. Bauer and R. Kohavi, "An empirical comparison of voting classification algorithms: Bagging, boosting, and variants", Machine Learning, vol. 36, no. 1-2, pp. 105-139, 1999.
[6] T. G. Dietterich, "Ensemble methods in machine learning", in Multiple Classifier Systems. Springer, 2000, pp. 1-15.
[7] T. K. Ho, "The random subspace method for constructing decision forests", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 832-844, 1998.
[8] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of online learning and an application to boosting", in Computational Learning Theory. Springer, 1995, pp. 23-37.
[9] R. E. Schapire and Y. Freund, "Boosting the margin: A new explanation for the effectiveness of voting methods", The Annals of Statistics, vol. 26, pp. 322-330, 1998.
[10] T. K. Ho, "A data complexity analysis of comparative advantages of decision forest constructors", Pattern Analysis & Applications, vol. 5, no. 2, pp. 102-112, 2002.
[11] L. Breiman, "Random forests", Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.
[12] H. Jiang, Y. Deng, H.-S. Chen, L. Tao, Q. Sha, J. Chen, C.-J. Tsai, and S. Zhang, "Joint analysis of two microarray gene-expression data sets to select lung adenocarcinoma marker genes", BMC Bioinformatics, vol. 5, no. 1, p. 81, 2004.
[13] K.-Q. Shen, C.-J. Ong, X.-P. Li, Z. Hui, and E. P. Wilder-Smith, "A feature selection method for multilevel mental fatigue EEG classification", IEEE Transactions on Biomedical Engineering, vol. 54, no. 7, pp. 1231-1237, 2007.
[14] V. Svetnik, A. Liaw, C. Tong, J. C. Culberson, R. P. Sheridan, and B. P. Feuston, "Random forest: A classification and regression tool for compound classification and QSAR modeling", Journal of Chemical Information and Computer Sciences, vol. 43, no. 6, pp. 1947-1958, 2003.
[15] M. Pal, "Random forest classifier for remote sensing classification", International Journal of Remote Sensing, vol. 26, no. 1, pp. 217-222, 2005.
[17] B. Yao, A. Khosla, and L. Fei-Fei, "Combining randomization and discrimination for fine-grained image categorization", in 2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2011, pp. 1577-1584.
[19] G. Biau, L. Devroye, and G. Lugosi, "Consistency of random forests and other averaging classifiers", The Journal of Machine Learning Research, vol. 9, pp. 2015-2033, 2008.
[20] Y. Lin and Y. Jeon, "Random forests and adaptive nearest neighbors", Journal of the American Statistical Association, vol. 101, no. 474, pp. 578-590, 2006.
[21] B. H. Menze, W. Petrich, and F. A. Hamprecht, "Multivariate feature selection and hierarchical classification for infrared spectroscopy: Serum-based detection of bovine spongiform encephalopathy", Analytical and Bioanalytical Chemistry, vol. 387, no. 5, pp. 1801-1807, 2007.
[22] Y. Saeys, T. Abeel, and Y. Van de Peer, "Robust feature selection using ensemble feature selection techniques", in Machine Learning and Knowledge Discovery in Databases. Springer, 2008, pp. 313-325.
[23] B. H. Menze, B. M. Kelm, R. Masuch, U. Himmelreich, P. Bachert, W. Petrich, and F. A. Hamprecht, "A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data", BMC Bioinformatics, vol. 10, no. 1, p. 213, 2009.
[24] E. Tuv, A. Borisov, G. Runger, and K. Torkkola, "Feature selection with ensembles, artificial variables, and redundancy elimination", The Journal of Machine Learning Research, vol. 10, pp. 1341-1366, 2009.
[25] X. Su, C.-L. Tsai, H. Wang, D. M. Nickerson, and B. Li, "Subgroup analysis via recursive partitioning", The Journal of Machine Learning Research, vol. 10, pp. 141-158, 2009.
[26] G. Valentini and T. G. Dietterich, "Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods", The Journal of Machine Learning Research, vol. 5, pp. 725-775, 2004.
[27] C.-X. Zhang and J.-S. Zhang, "RotBoost: A technique for combining rotation forest and AdaBoost", Pattern Recognition Letters, vol. 29, no. 10, pp. 1524-1536, 2008.
[28] P. Geurts, D. Ernst, and L. Wehenkel, "Extremely randomized trees", Machine Learning, vol. 63, no. 1, pp. 3-42, 2006.
[29] S. Geman, E. Bienenstock, and R. Doursat, "Neural networks and the bias/variance dilemma", Neural Computation, vol. 4, no. 1, pp. 1-58, 1992.
[30] E. B. Kong and T. G. Dietterich, "Error-correcting output coding corrects bias and variance", in ICML, 1995, pp. 313-321.
[31] R. Kohavi and D. H. Wolpert, "Bias plus variance decomposition for zero-one loss functions", in ICML, 1996, pp. 275-283.
[32] J. H. Friedman, "On bias, variance, 0/1-loss, and the curse-of-dimensionality", Data Mining and Knowledge Discovery, vol. 1, no. 1, pp. 55-77, 1997.
[33] L. Breiman, "Arcing classifier (with discussion and a rejoinder by the author)", The Annals of Statistics, vol. 26, no. 3, pp. 801-849, 1998.
[34] G. M. James, "Variance and bias for general loss functions", Machine Learning, vol. 51, no. 2, pp. 115-135, 2003.