IEEE Transactions on Information Theory, Volume 54, Issue 9, 2008, Pages 4169-4182

Finite-dimensional projection for classification and statistical learning

Author keywords

Classification; Dimension reduction; Kernel principal component analysis (KPCA); Regularization; Statistical learning; Support vector machine (SVM)

Indexed keywords

RISK ANALYSIS;

EID: 51349154782     PISSN: 0018-9448     EISSN: None     Source Type: Journal
DOI: 10.1109/TIT.2008.926312     Document Type: Article
Times cited: 18

References (38)
  • 3. A. Barron, L. Birgé, and P. Massart, "Risk bounds for model selection via penalization," Probab. Theory Relat. Fields, vol. 113, pp. 301-413, 1999.
  • 4. Y. Baraud, "Model selection for regression on a random design," ESAIM Probab. Statist., vol. 6, pp. 127-146, 2002.
  • 5. P. Massart, Concentration Inequalities and Model Selection, ser. Lecture Notes in Mathematics, vol. 1896. Berlin, Germany: Springer-Verlag, 2007.
  • 6. P. Massart, "Some applications of concentration inequalities to statistics," Annales de la Faculté des Sciences de Toulouse, vol. IX, pp. 245-303, 2000.
  • 7. P. Bartlett, O. Bousquet, and S. Mendelson, "Local Rademacher complexities," Ann. Statist., vol. 33, no. 4, pp. 1497-1537, 2005.
  • 8. G. Blanchard, P. Massart, R. Vert, and L. Zwald, "Kernel projection machine: A new tool for pattern recognition," in Advances in Neural Information Processing Systems 17, L. K. Saul, Y. Weiss, and L. Bottou, Eds. Cambridge, MA: MIT Press, 2005, pp. 1649-1656.
  • 9. Y. Lin, "Support vector machines and the Bayes rule in classification," Data Mining Knowl. Disc., vol. 6, pp. 259-275, 2002.
  • 10. T. Zhang, "Statistical behavior and consistency of classification methods based on convex risk minimization," Ann. Statist., vol. 32, no. 1, pp. 56-85, 2004.
  • 11. L. Györfi, M. Kohler, A. Krzyżak, and H. Walk, A Distribution-Free Theory of Nonparametric Regression, ser. Statistics. New York: Springer-Verlag, 2002.
  • 12. P. L. Bartlett, "The sample complexity of pattern classification with neural networks: The size of the weights is more important than the size of the network," IEEE Trans. Inf. Theory, vol. 44, no. 2, pp. 525-536, Mar. 1998.
  • 13. D.-R. Chen, Q. Wu, Y. Ying, and D.-X. Zhou, "Support vector machine soft margin classifiers: Error analysis," J. Mach. Learn. Res., vol. 5, pp. 1143-1175, 2004.
  • 14. P. Massart and E. Nédélec, "Risk bounds for statistical learning," Ann. Statist., vol. 34, no. 5, pp. 2326-2366, 2006.
  • 15. E. Mammen and A. B. Tsybakov, "Asymptotical minimax recovery of sets with smooth boundaries," Ann. Statist., vol. 23, no. 2, pp. 502-524, 1995.
  • 16. A. Tsybakov, "Optimal aggregation of classifiers in statistical learning," Ann. Statist., vol. 32, no. 1, pp. 135-166, 2004.
  • 17. A. Tsybakov and S. van de Geer, "Square root penalty: Adaptation to the margin in classification and in edge estimation," Ann. Statist., vol. 33, no. 3, pp. 1203-1224, 2005.
  • 19. J.-Y. Audibert, "A randomized online learning algorithm for better variance control," in Proc. 19th Conf. Comput. Learn. Theory, ser. Lecture Notes in Computer Science, vol. 4005, G. Lugosi and H. U. Simon, Eds. Berlin, Germany: Springer-Verlag, 2006, pp. 392-407.
  • 20. C. Scott and R. Nowak, "Minimax optimal classification with dyadic decision trees," IEEE Trans. Inf. Theory, vol. 52, no. 4, pp. 1335-1353, Apr. 2006.
  • 21. I. Steinwart and C. Scovel, "Fast rates for support vector machines using Gaussian kernels," Ann. Statist., vol. 35, pp. 575-607, 2007.
  • 23. G. Blanchard, O. Bousquet, and P. Massart, "Statistical performance of support vector machines," Ann. Statist., to be published.
  • 24. T. Evgeniou, M. Pontil, and T. Poggio, "Regularization networks and support vector machines," in Advances in Large Margin Classifiers, A. J. Smola, P. L. Bartlett, B. Schölkopf, and D. Schuurmans, Eds. Cambridge, MA: MIT Press, 2000, pp. 171-203.
  • 25. A. J. Smola and B. Schölkopf, "On a kernel-based method for pattern recognition, regression, approximation and operator inversion," Algorithmica, vol. 22, pp. 211-231, 1998.
  • 27. B. Schölkopf, A. J. Smola, and K.-R. Müller, "Nonlinear component analysis as a kernel eigenvalue problem," Neural Comput., vol. 10, pp. 1299-1319, 1998.
  • 28. V. Koltchinskii, "Asymptotics of spectral projections of some random matrices approximating integral operators," Progr. Probab., vol. 43, pp. 191-227, 1998.
  • 29. J. Dauxois, A. Pousse, and Y. Romain, "Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference," J. Multivariate Anal., vol. 12, pp. 136-154, 1982.
  • 30. L. Zwald and G. Blanchard, "On the convergence of eigenspaces in kernel principal component analysis," in Advances in Neural Information Processing Systems 18, Y. Weiss, L. Bottou, and J. Platt, Eds. Cambridge, MA: MIT Press, 2006, pp. 1649-1656.
  • 32. N. Bissantz, T. Hohage, A. Munk, and F. Ruymgaart, "Convergence rates of general regularization methods for statistical inverse problems and applications," SIAM J. Numer. Anal., vol. 45, pp. 2610-2636, 2007.
  • 33. V. Vapnik, Estimation of Dependences Based on Empirical Data, ser. Statistics. New York: Springer-Verlag, 1982.
  • 34. O. Bousquet, "Concentration inequalities and empirical processes theory applied to the analysis of learning algorithms," Ph.D. dissertation, Dept. Appl. Math., École Polytechnique, Palaiseau, France, 2002.
  • 37. H. Zhu, C. Williams, R. Rohwer, and M. Morciniec, "Gaussian regression and optimal finite dimensional linear models," in Neural Networks and Machine Learning, C. M. Bishop, Ed. New York: Springer-Verlag, 1998, pp. 167-184.
  • 38. C. K. I. Williams and M. Seeger, "The effect of the input density distribution on kernel-based classifiers," in Proc. 17th Int. Conf. Mach. Learn., P. Langley, Ed. San Francisco, CA, 2000, pp. 1159-1166.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.