1. Aronszajn, N. (1950). Theory of reproducing kernels. Trans. Amer. Math. Soc., 68, 337-404.
2. Bartlett, P. L. (1998). The sample complexity of pattern classification with neural networks: The size of the weights is more important than the size of the network. IEEE Trans. Inform. Theory, 44, 525-536.
4. Caponnetto, A., & De Vito, E. (2007). Optimal rates for the regularized least-squares algorithm. Found. Comput. Math., 7, 331-368.
5. Cucker, F., & Smale, S. (2001). On the mathematical foundations of learning. Bull. Amer. Math. Soc., 39, 1-49.
6. Cucker, F., & Smale, S. (2002). Best choices for regularization parameters in learning theory: On the bias-variance problem. Found. Comput. Math., 2, 413-428.
7. Daubechies, I., Defrise, M., & De Mol, C. (2004). An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Comm. Pure Appl. Math., 57, 1413-1457.
8. Evgeniou, T., Pontil, M., & Poggio, T. (2000). Regularization networks and support vector machines. Adv. Comput. Math., 13, 1-50.
9. Girosi, F., Jones, M., & Poggio, T. (1995). Regularization theory and neural network architectures. Neural Comput., 7, 219-269.
10. Györfi, L., Kohler, M., Krzyzak, A., & Walk, H. (2002). A distribution-free theory of nonparametric regression. Berlin: Springer.
11. Kecman, V., & Hadzic, I. (2000). Support vector selection by linear programming. In Proceedings of the International Joint Conference on Neural Networks (Vol. 5, pp. 193-198). Washington, DC: IEEE Computer Society Press.
12. Niyogi, P., & Girosi, F. (1996). On the relationship between generalization error, hypothesis complexity, and sample complexity for radial basis functions. Neural Comput., 8, 819-842.
16. Smale, S., & Zhou, D. X. (2003). Estimating the approximation error in learning theory. Anal. Appl., 1, 17-41.
17. Smale, S., & Zhou, D. X. (2005). Shannon sampling II: Connections to learning theory. Appl. Comput. Harmonic Anal., 19, 285-302.
18. Smale, S., & Zhou, D. X. (2007). Learning theory estimates via integral operators and their applications. Constr. Approx., 26, 153-172.
22. Wu, Q., Ying, Y., & Zhou, D. X. (2006). Learning rates of least-square regularized regression. Found. Comput. Math., 6, 171-192.
23. Wu, Q., & Zhou, D. X. (2005). SVM soft margin classifiers: Linear programming versus quadratic programming. Neural Comput., 17, 1160-1187.
24. Wu, Q., & Zhou, D. X. (2008). Learning with sample dependent hypothesis spaces. Comput. Math. Appl., 56, 2896-2907.
26. Zhou, D. X. (2003). Capacity of reproducing kernel spaces in learning theory. IEEE Trans. Inform. Theory, 49, 1743-1752.