1. Aronszajn, N. (1950). Theory of reproducing kernels. Trans. Amer. Math. Soc., 68, 337-404.
2. Bartlett, P. L., Jordan, M. I., & McAuliffe, J. D. (2006). Convexity, classification, and risk bounds. J. Amer. Statist. Assoc., 101(473), 138-156.
3. Bauer, F., Pereverzev, S., & Rosasco, L. (2007). On regularization algorithms in learning theory. Journal of Complexity, 23(1), 52-72.
6. Bühlmann, P., & Yu, B. (2003). Boosting with the L2-loss: Regression and classification. Journal of the American Statistical Association, 98, 324-340.
7. Caponnetto, A. (2006). Optimal rates for regularization operators in learning theory (Tech. Rep. CBCL Paper 264/CSAIL-TR 2006-062). Cambridge, MA: MIT. Available online at http://cbcl.mit.edu/projects/cbcl/publications/ps/MIT-CSAIL-TR-2006-062.pdf.
8. Caponnetto, A., & De Vito, E. (2007). Optimal rates for regularized least-squares algorithm. Found. Comput. Math., 7(3), 331-368.
9. Chapelle, O., Weston, J., & Scholkopf, B. (2003). Cluster kernels for semisupervised learning. In S. Becker, S. Thrun, & K. Obermayer (Eds.), Neural information processing systems, 15 (pp. 585-592). Cambridge, MA: MIT Press.
10. De Vito, E., Caponnetto, A., & Rosasco, L. (2005). Model selection for regularized least-squares algorithm in learning theory. Found. Comput. Math., 5(1), 59-85.
11. De Vito, E., Rosasco, L., & Caponnetto, A. (2006). Discretization error analysis for Tikhonov regularization. Anal. Appl., 4(1), 81-99.
12. De Vito, E., Rosasco, L., Caponnetto, A., De Giovannini, U., & Odone, F. (2005). Learning from examples as an inverse problem. Journal of Machine Learning Research, 6, 883-904.
13. De Vito, E., Rosasco, L., & Verri, A. (2005). Spectral methods for regularization in learning theory (Tech. Rep.). DISI, Università degli Studi di Genova, Italy.
15. Evgeniou, T., Pontil, M., & Poggio, T. (2000). Regularization networks and support vector machines. Adv. Comp. Math., 13, 1-50.
16. Girosi, F., Jones, M., & Poggio, T. (1995). Regularization theory and neural networks architectures. Neural Computation, 7(2), 219-269.
17. Golub, G. H., & Van Loan, C. F. (1996). Matrix computations (3rd ed.). Baltimore, MD: Johns Hopkins University Press.
18. Hastie, T., Rosset, S., Tibshirani, R., & Zhu, J. (2004). The entire regularization path for the support vector machine. JMLR, 5, 1391-1415.
21. Micchelli, C. A., Xu, Y., & Zhang, H. (2006). Universal kernels. JMLR, 7, 2651-2667.
22. Ong, C., & Canu, S. (2004). Regularization by early stopping (Tech. Rep.). Computer Sciences Laboratory and RSISE, Australian National University. Available online at http://asi.insa-rouen.fr/scanu/.
23. Ong, C., Mary, X., Canu, S., & Smola, A. (2004). Learning with nonpositive kernels. In Proceedings of the 21st International Conference on Machine Learning. New York: ACM Press. Available online at http://www.aicml.cs.ualberta.ca/_banff04/icml/pages/papers/392.pdf.
24. Poggio, T., & Girosi, F. (1992). A theory of networks for approximation and learning. In C. Lau (Ed.), Foundation of neural networks (pp. 91-106). Piscataway, NJ: IEEE Press.
25. Poggio, T., Rifkin, R., Mukherjee, S., & Niyogi, P. (2004). General conditions for predictivity in learning theory. Nature, 428, 419-422.
26. Pontil, M., & Verri, A. (1998). Properties of support vector machines. Neural Computation, 10, 977-996.
27. Rakhlin, A., Mukherjee, S., & Poggio, T. (2005). Stability results in learning theory. Analysis and Applications, 3, 397-419.
29. Smale, S., & Zhou, D.-X. (2004). Shannon sampling and function reconstruction from point values. Bull. Amer. Math. Soc. (N.S.), 41(3), 279-305.
30. Smale, S., & Zhou, D.-X. (2005). Shannon sampling. II. Connections to learning theory. Appl. Comput. Harmon. Anal., 19(3), 285-302.
31. Smale, S., & Zhou, D.-X. (2007). Learning theory estimates via integral operators and their approximations. Constructive Approximation, 26(2), 153-172.
37. Wu, Q., Ying, Y., & Zhou, D.-X. (2006). Learning rates of least-square regularized regression. Found. Comput. Math., 6(2), 171-192.
38. Yao, Y., Rosasco, L., & Caponnetto, A. (2007). On early stopping in gradient descent learning. Constructive Approximation, 26, 289-315.
39. Zhang, T., & Ando, R. (2006). Analysis of spectral kernel design based semi-supervised learning. In Y. Weiss, B. Scholkopf, & J. Platt (Eds.), Advances in neural information processing systems, 18 (pp. 1601-1608). Cambridge, MA: MIT Press.
40. Zhu, X., Kandola, J., Ghahramani, Z., & Lafferty, J. (2005). Nonparametric transforms of graph kernels for semi-supervised learning. In L. K. Saul, Y. Weiss, & L. Bottou (Eds.), Neural information processing systems, 17. Cambridge, MA: MIT Press.