Breiman, L. (1996). Bagging predictors. Machine Learning, 24(2), 123-140.
Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.
Chapelle, O., Vapnik, V., Bousquet, O., & Mukherjee, S. (2002). Choosing multiple parameters for support vector machines. Machine Learning, 46, 131-159.
Chu, W., Keerthi, S. S., & Ong, C. J. (2003). Bayesian trigonometric support vector classifier. Neural Computation, 15(9), 2227-2254.
Chu, W., Keerthi, S. S., & Ong, C. J. (2004). Bayesian support vector regression using a unified loss function. IEEE Transactions on Neural Networks, 15(1), 29-44.
Cortes, C., & Vapnik, V. N. (1995). Support vector networks. Machine Learning, 20(3), 273-297.
Duan, K. B., & Keerthi, S. S. (2005). Which is the best multiclass SVM method? An empirical study. Lecture Notes in Computer Science, 3541, 278-285.
Guyon, I., Weston, J., Barnhill, S., & Vapnik, V. N. (2002). Gene selection for cancer classification using support vector machines. Machine Learning, 46(1-3), 389-422.
Guyon, I., Gunn, S., Nikravesh, M., & Zadeh, L. A. (2006a). Feature extraction, foundations and applications. Berlin: Springer.
Guyon, I., Gunn, S., Hur, A. B., & Dror, G. (2006b). Feature selection benchmark for classification problems. In I. Guyon, S. Gunn, M. Nikravesh, & L. A. Zadeh (Eds.), Feature extraction, foundations and applications. Berlin: Springer.
Günter, S., & Bunke, H. (2004). Feature selection algorithms for the generation of multiple classifier systems and their application to handwritten word recognition. Pattern Recognition Letters, 25(11), 1323-1336.
Keerthi, S. S. (2002). Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms. IEEE Transactions on Neural Networks, 13(5), 1225-1229.
Kohavi, R., & John, G. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97, 273-324.
Lal, T. N., Chapelle, O., Weston, J., & Elisseeff, A. (2006). Embedded methods. In I. Guyon, S. Gunn, M. Nikravesh, & L. A. Zadeh (Eds.), Feature extraction, foundations and applications. Berlin: Springer.
Lee, J. H., & Lin, C. J. (2000). Automatic model selection for support vector machines (Technical Report). Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan. http://www.csie.ntu.edu.tw/~cjlin/papers/modelselect.ps.gz.
Lin, H. T., Lin, C. J., & Weng, R. C. (2003). A note on Platt's probabilistic outputs for support vector machines (Technical Report). Department of Computer Science, National Taiwan University. http://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.ps.
Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. In Y.-H. Hu, J. Larsen, E. Wilson, & S. Douglas (Eds.), Neural networks for signal processing IX (pp. 41-48). New York: IEEE.
Neumann, J., Schnörr, C., & Steidl, G. (2005). Combined SVM-based feature selection and classification. Machine Learning, 61(1-3), 129-150.
Newman, D. J., Hettich, S., Blake, C. L., & Merz, C. J. (1998). UCI repository of machine learning databases. Irvine, CA: University of California, Department of Information and Computer Science. http://www.ics.uci.edu/~mlearn/MLRepository.html.
Page, E. S. (1967). A note on generating random permutations. Applied Statistics, 16(3), 273-274.
Platt, J. (2000). Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. In A. Smola, P. Bartlett, B. Schölkopf, & D. Schuurmans (Eds.), Advances in large margin classifiers. Cambridge: MIT Press.
Rätsch, G., Onoda, T., & Müller, K.-R. (2001). Soft margins for AdaBoost. Machine Learning, 42(3), 287-320.
Saon, G., & Padmanabhan, M. (2001). Minimum Bayes error feature selection for continuous speech recognition. In T. Leen, T. Dietterich, & V. Tresp (Eds.), Advances in neural information processing systems (Vol. 13, pp. 800-806).
Vapnik, V. N., & Chapelle, O. (2000). Bounds on error expectation for support vector machines. Neural Computation, 12(9), 2013-2036.
Weston, J., Mukherjee, S., Chapelle, O., Pontil, M., Poggio, T., & Vapnik, V. N. (2001). Feature selection for SVMs. In T. Leen, T. Dietterich, & V. Tresp (Eds.), Advances in neural information processing systems (Vol. 13, pp. 668-674).
Williams, C. K. I., & Rasmussen, C. E. (1996). Gaussian processes for regression. In D.S. Touretzky, M.C. Mozer, & M.E. Hasselmo (Eds.), Advances in neural information processing systems (Vol. 8, pp. 598-604).