2. Donoho, D. For most large underdetermined systems of linear equations the minimal 1-norm solution is also the sparsest solution. Technical report, Statistics Department, Stanford University, 2004.
3. Fukumizu, K., Bach, F., and Jordan, M. Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces. Journal of Machine Learning Research, 5:73-99, 2004.
5. Gretton, A., Bousquet, O., Smola, A., and Schölkopf, B. Measuring statistical dependence with Hilbert-Schmidt norms. In Int'l Conf. on Algorithmic Learning Theory, pp. 63-77, 2005.
7. Guyon, I., Weston, J., Barnhill, S., and Vapnik, V. Gene selection for cancer classification using support vector machines. Machine Learning, 46(1-3):389-422, 2002.
8. He, X., Cai, D., and Niyogi, P. Laplacian score for feature selection. In Advances in Neural Information Processing Systems, pp. 507-514, 2006.
11. Kohavi, R. and John, G. H. Wrappers for feature subset selection. Artificial Intelligence, 97(1-2):273-324, 1997.
12. Masaeli, M., Yan, Y., Cui, Y., Fung, G., and Dy, J. G. Convex principal feature selection. In SIAM Int'l Conf. on Data Mining, pp. 619-628, 2010.
13. Mika, S., Rätsch, G., Weston, J., Schölkopf, B., and Müller, K. R. Fisher discriminant analysis with kernels. In Neural Networks for Signal Processing IX, pp. 41-48, 1999.
14. Moghaddam, B., Weiss, Y., and Avidan, S. Generalized spectral bounds for sparse LDA. In Int'l Conf. on Machine Learning, pp. 641-648, 2006.
16. Peng, H., Long, F., and Ding, C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. on Pattern Analysis and Machine Intelligence, pp. 1226-1238, 2005.
17. Song, L., Smola, A. J., Gretton, A., Borgwardt, K. M., and Bedo, J. Supervised feature selection via dependence estimation. In Int'l Conf. on Machine Learning, pp. 823-830, 2007.
18. Tibshirani, R. Regression shrinkage and selection via the lasso. J. Royal Stat. Soc. B, 58(1):267-288, 1996.
20. Yu, L. and Liu, H. Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, 5:1205-1224, 2004.
21. Yuan, M. and Lin, Y. Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. B, 68(1):49-67, 2006.
22. Zhao, Z. and Liu, H. Spectral feature selection for supervised and unsupervised learning. In Int'l Conf. on Machine Learning, pp. 1151-1157, 2007. DOI 10.1145/1273496.1273641.
23. Zhu, J., Rosset, S., Hastie, T., and Tibshirani, R. 1-norm support vector machines. In Advances in Neural Info. Proc. Systems, 2003.
24. Zou, H., Hastie, T., and Tibshirani, R. Sparse principal component analysis. Journal of Computational and Graphical Statistics, 15(2):262-286, 2006.