1. Breiman, L., & Spector, P. (1992). Submodel selection and evaluation in regression: The x-random case. International Statistical Review, 60, 291-319.
2. Cawley, G. C., & Talbot, N. L. C. (2003). Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers. citeseer.ist.psu.edu/cawley03efficient.html.
4. Fung, G., Dundar, M., Krishnapuram, B., & Rao, R. B. (2006). Multiple instance learning for computer aided diagnosis. NIPS 2006: Advances in Neural Information Processing Systems.
5. Fung, G., & Mangasarian, O. L. (2001). Proximal support vector machine classifiers. Proceedings KDD-2001: Knowledge Discovery and Data Mining, August 26-29, 2001, San Francisco, CA (pp. 77-86). New York: Association for Computing Machinery. ftp://ftp.cs.wisc.edu/pub/dmi/tech-reports/01-02.ps.
9. Kearns, M. J., & Ron, D. (1997). Algorithmic stability and sanity-check bounds for leave-one-out cross-validation. Computational Learning Theory (pp. 152-162).
10. Kohavi, R. (1995a). A study of cross-validation and bootstrap for accuracy estimation and model selection. IJCAI (pp. 1137-1145).
11. Kohavi, R. (1995b). A study of cross-validation and bootstrap for accuracy estimation and model selection. International Joint Conference on Artificial Intelligence (IJCAI) (pp. 1137-1145).
12. Kohavi, R., & John, G. H. (1997). Wrappers for feature subset selection. Artificial Intelligence, 97, 273-324.
14. Mika, S., Rätsch, G., Weston, J., Schölkopf, B., & Müller, K.-R. (1999). Fisher discriminant analysis with kernels. Neural Networks for Signal Processing IX (pp. 41-48). IEEE.
17. Ng, A. Y. (1998). On feature selection: Learning with exponentially many irrelevant features as training examples. Proc. 15th International Conf. on Machine Learning (pp. 404-412). San Francisco, CA: Morgan Kaufmann.
19. Reunanen, J. (2003). Overfitting in making comparisons between variable selection methods. Journal of Machine Learning Research, 3, 1371-1382.
21. Saunders, C., Gammerman, A., & Vovk, V. (1998). Ridge regression learning algorithm in dual variables. Proc. 15th International Conf. on Machine Learning (pp. 515-521). San Francisco, CA: Morgan Kaufmann.
22. Stone, M. (1977). Asymptotics for and against cross-validation. Biometrika, 64, 29-35.
23. Suykens, J., & Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9, 293-300.
27. Xu, J., Zhang, X., & Li, Y. (2001). Kernel MSE algorithm: A unified framework for KFD, LS-SVM and KRR. Proc. of IJCNN-01 (pp. 1486-1491).
28. Zhang, T. (2003). Leave-one-out bounds for kernel methods. Neural Computation, 15, 1397-1437.