[3] A. Barron, L. Birgé, and P. Massart, "Risk bounds for model selection via penalization," Probab. Theory Relat. Fields, vol. 113, pp. 301-413, 1999.
[4] Y. Baraud, "Model selection for regression on a random design," ESAIM Probab. Statist., vol. 6, pp. 127-146, 2002.
[5] P. Massart, Concentration Inequalities and Model Selection, ser. Lecture Notes in Mathematics. Berlin, Germany: Springer-Verlag, 2007, vol. 1896.
[6] P. Massart, "Some applications of concentration inequalities to statistics," Annales de la Faculté des Sciences de Toulouse, vol. IX, pp. 245-303, 2000.
[7] P. Bartlett, O. Bousquet, and S. Mendelson, "Local Rademacher complexities," Ann. Statist., vol. 33, no. 4, pp. 1497-1537, 2005.
[8] G. Blanchard, P. Massart, R. Vert, and L. Zwald, "Kernel projection machine: A new tool for pattern recognition," in Advances in Neural Information Processing Systems 17, L. K. Saul, Y. Weiss, and L. Bottou, Eds. Cambridge, MA: MIT Press, 2005, pp. 1649-1656.
[9] Y. Lin, "Support vector machines and the Bayes rule in classification," Data Mining Knowl. Disc., vol. 6, pp. 259-275, 2002.
[10] T. Zhang, "Statistical behavior and consistency of classification methods based on convex risk minimization," Ann. Statist., vol. 32, no. 1, pp. 56-85, 2004.
[11] L. Györfi, M. Kohler, A. Krzyżak, and H. Walk, A Distribution-Free Theory of Nonparametric Regression, ser. Statistics. New York: Springer-Verlag, 2002.
[12] P. L. Bartlett, "The sample complexity of pattern classification with neural networks: The size of the weights is more important than the size of the network," IEEE Trans. Inf. Theory, vol. 44, no. 2, pp. 525-536, Mar. 1998.
[13] D.-R. Chen, Q. Wu, Y. Ying, and D.-X. Zhou, "Support vector machine soft margin classifiers: Error analysis," J. Mach. Learn. Res., vol. 5, pp. 1143-1175, 2004.
[14] P. Massart and E. Nédélec, "Risk bounds for statistical learning," Ann. Statist., vol. 34, no. 5, pp. 2326-2366, 2006.
[15] E. Mammen and A. B. Tsybakov, "Asymptotical minimax recovery of sets with smooth boundaries," Ann. Statist., vol. 23, no. 2, pp. 502-524, 1995.
[16] A. Tsybakov, "Optimal aggregation of classifiers in statistical learning," Ann. Statist., vol. 32, no. 1, pp. 135-166, 2004.
[17] A. Tsybakov and S. van de Geer, "Square root penalty: Adaptation to the margin in classification and in edge estimation," Ann. Statist., vol. 33, no. 3, pp. 1203-1224, 2005.
[19] J.-Y. Audibert, "A randomized online learning algorithm for better variance control," in Proc. 19th Conf. Comput. Learn. Theory, ser. Lecture Notes in Computer Science, G. Lugosi and H. U. Simon, Eds. Berlin, Germany: Springer-Verlag, 2006, vol. 4005, pp. 392-407.
[20] C. Scott and R. Nowak, "Minimax optimal classification with dyadic decision trees," IEEE Trans. Inf. Theory, vol. 52, no. 4, pp. 1335-1353, Apr. 2006.
[21] I. Steinwart and C. Scovel, "Fast rates for support vector machines using Gaussian kernels," Ann. Statist., vol. 35, pp. 575-607, 2007.
[22] E. De Vito, L. Rosasco, A. Caponnetto, U. De Giovannini, and F. Odone, "Learning from examples as an inverse problem," J. Mach. Learn. Res., vol. 6, pp. 883-904, 2005.
[23] G. Blanchard, O. Bousquet, and P. Massart, "Statistical performance of support vector machines," Ann. Statist., to be published.
[24] T. Evgeniou, M. Pontil, and T. Poggio, "Regularization networks and support vector machines," in Advances in Large Margin Classifiers, A. J. Smola, P. L. Bartlett, B. Schölkopf, and D. Schuurmans, Eds. Cambridge, MA: MIT Press, 2000, pp. 171-203.
[25] A. J. Smola and B. Schölkopf, "On a kernel-based method for pattern recognition, regression, approximation and operator inversion," Algorithmica, vol. 22, pp. 211-231, 1998.
[27] B. Schölkopf, A. J. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Comput., vol. 10, pp. 1299-1319, 1998.
[28] V. Koltchinskii, "Asymptotics of spectral projections of some random matrices approximating integral operators," Progr. Probab., vol. 43, pp. 191-227, 1998.
[29] J. Dauxois, A. Pousse, and Y. Romain, "Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference," J. Multivariate Anal., vol. 12, pp. 136-154, 1982.
[30] L. Zwald and G. Blanchard, "On the convergence of eigenspaces in kernel principal component analysis," in Advances in Neural Information Processing Systems 18, Y. Weiss, L. Bottou, and J. Platt, Eds. Cambridge, MA: MIT Press, 2006, pp. 1649-1656.
[32] N. Bissantz, T. Hohage, A. Munk, and F. Ruymgaart, "Convergence rates of general regularization methods for statistical inverse problems and applications," SIAM J. Numer. Anal., vol. 45, pp. 2610-2636, 2007.
[33] V. Vapnik, Estimation of Dependences Based on Empirical Data, ser. Statistics. New York: Springer-Verlag, 1982.
[34] O. Bousquet, "Concentration inequalities and empirical processes theory applied to the analysis of learning algorithms," Ph.D. dissertation, Dept. Appl. Math., École Polytechnique, Palaiseau, France, 2002.
[37] H. Zhu, C. Williams, R. Rohwer, and M. Morciniec, "Gaussian regression and optimal finite dimensional linear models," in Neural Networks and Machine Learning, C. M. Bishop, Ed. New York: Springer-Verlag, 1998, pp. 167-184.
[38] C. K. I. Williams and M. Seeger, "The effect of the input density distribution on kernel-based classifiers," in Proc. 17th Int. Conf. Mach. Learn., P. Langley, Ed., San Francisco, CA, 2000, pp. 1159-1166.