1. H. Almuallim and T. G. Dietterich. Learning boolean concepts in the presence of many irrelevant features. Artificial Intelligence, 69(1-2):279-305, 1994.
2. Y. Amit and D. Geman. Shape quantization and recognition with randomized trees. Neural Computation, 9(7):1545-1588, 1997.
3. E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning, 36(1-2):105-139, 1999.
4. A. Berrado and G. C. Runger. Using metarules to organize and group discovered association rules. Data Mining and Knowledge Discovery, 14(3):409-431, 2007.
5. A. Borisov, V. Eruhimov, and E. Tuv. Tree-based ensembles with dynamic soft feature selection. In I. Guyon, S. Gunn, M. Nikravesh, and L. Zadeh, editors, Feature Extraction: Foundations and Applications, Studies in Fuzziness and Soft Computing. Springer, 2006.
6. B. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In D. Haussler, editor, 5th Annual ACM Workshop on COLT, pages 144-152, Pittsburgh, PA, 1992. ACM Press.
8. L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996.
9. L. Breiman. Arcing classifiers. The Annals of Statistics, 26(3):801-849, 1998.
10. L. Breiman. Random forests. Machine Learning, 45(1):5-32, 2001.
11. L. Breiman, J. Friedman, R. Olshen, and C. Stone. Classification and Regression Trees. Wadsworth, Belmont, CA, 1984.
12. S. Cost and S. Salzberg. A weighted nearest neighbor algorithm for learning with symbolic features. Machine Learning, 10(1):57-78, 1993.
13. P. Diaconis and B. Efron. Computer intensive methods in statistics. Scientific American, 248:116-131, 1983.
14. T. G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 2000a.
15. T. G. Dietterich. Ensemble methods in machine learning. In First International Workshop on Multiple Classifier Systems 2000, Cagliari, Italy, volume 1857 of Lecture Notes in Computer Science, pages 1-15. Springer, 2000b.
16. B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani. Least angle regression. Annals of Statistics, 32:407-499, 2004.
18. J. Friedman. Greedy function approximation: A gradient boosting machine. Technical report, Dept. of Statistics, Stanford University, 1999.
19. J. Friedman, T. Hastie, and R. Tibshirani. Additive logistic regression: A statistical view of boosting. Annals of Statistics, 28(2):337-407, 2000.
20. J. H. Friedman, M. Jacobson, and W. Stuetzle. Projection pursuit regression. Journal of the American Statistical Association, 76:817-823, 1981.
22. I. Guyon, J. Weston, S. Barnhill, and V. Vapnik. Gene selection for cancer classification using support vector machines. Machine Learning, 46(1-3):389-422, 2002.
25. T. K. Ho. The random subspace method for constructing decision forests. IEEE Trans. on Pattern Analysis and Machine Intelligence, 20(8):832-844, 1998.
26. K. Kira and L. A. Rendell. A practical approach to feature selection. In ML92: Proceedings of the Ninth International Workshop on Machine Learning, pages 249-256, San Francisco, CA, USA, 1992. Morgan Kaufmann Publishers Inc. ISBN 1-5586-247-X.
27. D. Koller and M. Sahami. Toward optimal feature selection. In Proceedings of ICML-96, 13th International Conference on Machine Learning, pages 284-292, Bari, Italy, 1996. URL citeseer.nj.nec.com/koller96toward.html.
29. H. Liu and L. Yu. Toward integrating feature selection algorithms for classification and clustering. IEEE Trans. Knowledge and Data Eng., 17(4):491-502, 2005.
30. S. Mukherjee, P. Niyogi, T. Poggio, and R. Rifkin. Learning theory: Stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization. Advances in Computational Mathematics, 25:161-193, 2006.
31. B. Parmanto, P. Munro, and H. Doyle. Improving committee diagnosis with resampling techniques. In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo, editors, Advances in Neural Information Processing Systems 8, pages 882-888. MIT Press, Cambridge, MA, 1996.
32. T. Poggio, R. Rifkin, S. Mukherjee, and A. Rakhlin. Bagging regularizes. CBCL Paper 214/AI Memo 2002-003, MIT, Cambridge, MA, 2002.
33. T. Poggio, R. Rifkin, S. Mukherjee, and P. Niyogi. General conditions for predictivity in learning theory. Nature, 428:419-422, 2004.
34. M. Robnik-Sikonja and I. Kononenko. Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning, 53:23-69, 2003.
35. C. Stanfill and D. Waltz. Toward memory-based reasoning. Communications of the ACM, 29:1213-1228, December 1986.
37. H. Stoppiglia, G. Dreyfus, R. Dubois, and Y. Oussar. Ranking a random feature for variable and feature selection. Journal of Machine Learning Research, 3:1399-1414, March 2003.
38. E. Tuv. Ensemble learning and feature selection. In I. Guyon, S. Gunn, M. Nikravesh, and L. Zadeh, editors, Feature Extraction: Foundations and Applications. Springer, 2006.
40. G. Valentini and T. Dietterich. Low bias bagged support vector machines. In ICML 2003, pages 752-759, 2003.
41. G. Valentini and F. Masulli. Ensembles of learning machines. In M. Marinaro and R. Tagliaferri, editors, Neural Nets WIRN Vietri-02, Lecture Notes in Computer Science. Springer-Verlag, 2002.
42. J. W. Wisnowski, J. R. Simpson, D. C. Montgomery, and G. C. Runger. Resampling methods for variable selection in robust regression. Computational Statistics and Data Analysis, 43(3):341-355, 2003.
43. L. Yu and H. Liu. Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, 5:1205-1224, 2004.