Volume 22, Issue 3, 2010, Pages 793-829

Regularization techniques and suboptimal solutions to optimization problems in learning from data

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHM; ARTICLE; ARTIFICIAL INTELLIGENCE; COMPARATIVE STUDY; STATISTICAL MODEL; TIME;

EID: 77952978333     PISSN: 0899-7667     EISSN: 1530-888X     Source Type: Journal    
DOI: 10.1162/neco.2009.05-08-786     Document Type: Article
Times cited : (36)

References (59)
  • 1
    • Aronszajn, N. (1950). Theory of reproducing kernels. Trans. of AMS, 68, 337-404.
  • 3
    • Bartlett, P. L., & Mendelson, S. (2002). Rademacher and Gaussian complexities: Risk bounds and structural results. J. Machine Learning Research, 3, 463-482.
  • 7
    • Cucker, F., & Smale, S. (2001). On the mathematical foundations of learning. Bulletin of AMS, 39, 1-49.
  • 8
    • Cucker, F., & Smale, S. (2002). Best choices for regularization parameters in learning theory: On the bias-variance problem. Foundations of Computational Mathematics, 2, 413-428.
  • 11
    • De Mol, C., De Vito, E., & Rosasco, L. (2009). Elastic-net regularization in learning theory. J. Complexity, 25, 201-230.
  • 14
    • Drineas, P., & Mahoney, M. W. (2005). On the Nyström method for approximating a Gram matrix for improved kernel-based learning. J. Machine Learning Research, 6, 2153-2175.
  • 20
    • Girosi, F. (1994). Regularization theory, radial basis functions and networks. In V. Cherkassky, J. H. Friedman, & H. Wechsler (Eds.), From statistics to neural networks: Theory and pattern recognition applications (pp. 166-187). Berlin: Springer.
  • 21
    • Girosi, F. (1998). An equivalence between sparse approximation and support vector machines. Neural Computation, 10, 1455-1480.
  • 22
    • Gribonval, R., & Vandergheynst, P. (2006). On the exponential convergence of matching pursuits in quasi-incoherent dictionaries. IEEE Trans. on Information Theory, 52, 255-261.
  • 26
    • Hussain, Z., & Shawe-Taylor, J. (2009). Theory of matching pursuit. In D. Koller, D. Schuurmans, Y. Bengio, & L. Bottou (Eds.), Advances in neural information processing systems, 21. Cambridge, MA: MIT Press.
  • 29
    • Kůrková, V. (2004). Learning from data as an inverse problem. In J. Antoch (Ed.), Proc. Int. Conf. on Computational Statistics (pp. 1377-1384). Heidelberg: Physica/Springer.
  • 30
    • Kůrková, V. (2005). Neural network learning as an inverse problem. Logic Journal of IGPL, 13, 551-559.
  • 31
    • Kůrková, V., & Neruda, R. (1994). Uniqueness of functional representations by Gaussian basis function networks. In Proc. ICANN'94 (pp. 471-474). New York: Springer.
  • 32
    • Kůrková, V., & Sanguineti, M. (2001). Bounds on rates of variable-basis and neural-network approximation. IEEE Trans. on Information Theory, 47, 2659-2665.
  • 33
    • Kůrková, V., & Sanguineti, M. (2002). Comparison of worst case errors in linear and neural network approximation. IEEE Trans. on Information Theory, 48, 264-275.
  • 34
    • Kůrková, V., & Sanguineti, M. (2005a). Error estimates for approximate optimization by the extended Ritz method. SIAM J. Optimization, 15, 461-487.
  • 35
    • Kůrková, V., & Sanguineti, M. (2005b). Learning with generalization capability by kernel methods of bounded complexity. J. Complexity, 21, 350-367.
  • 37
    • Makovoz, Y. (1996). Random approximants and neural networks. J. Approximation Theory, 85, 98-109.
  • 38
    • Mendelson, S. (2003). A few notes on statistical learning theory. In S. Mendelson & A. Smola (Eds.), Lecture notes in computer science (pp. 1-40). New York: Springer.
  • 39
    • Miller, K. (1970). Least squares methods for ill-posed problems with a prescribed bound. SIAM J. Mathematical Analysis, 1, 52-74.
  • 40
    • Mukherjee, S., Rifkin, R., & Poggio, T. (2002). Regression and classification with regularization. In D. Denison, M. H. Hansen, C. C. Holmes, B. Mallick, & B. Yu (Eds.), Lecture notes in statistics: Nonlinear estimation and classification (Proc. MSRI Workshop) (pp. 107-124). New York: Springer.
  • 48
    • Smola, A. J., & Schölkopf, B. (2000). Sparse greedy matrix approximation for machine learning. In Proc. 17th Int. Conf. on Machine Learning (pp. 911-918). San Francisco: Morgan Kaufmann.
  • 49
    • Tibshirani, R. (1996). Regression shrinkage and selection via the LASSO. J. Royal Statistical Society, Series B, 58, 267-288.
  • 51
    • Tropp, J. A. (2004). Greed is good: Algorithmic results for sparse approximation. IEEE Trans. on Information Theory, 50, 2231-2242.
  • 53
    • Vasin, V. V. (1970). Relationship of several variational methods for the approximate solution of ill-posed problems. Mathematical Notes, 7, 161-165.
  • 54
    • Vincent, P., & Bengio, Y. (2002). Kernel matching pursuit. Machine Learning, 48, 165-187.
  • 56
    • Zhang, T. (2002a). Approximation bounds for some sparse kernel regression algorithms. Neural Computation, 14, 3013-3042.
  • 57
    • Zhang, T. (2002b). A general greedy approximation algorithm with applications. In T. G. Dietterich, S. Becker, & Z. Ghahramani (Eds.), Advances in neural information processing systems, 14. Cambridge, MA: MIT Press.
  • 58
    • Zhang, T. (2003). Sequential greedy approximation for certain convex optimization problems. IEEE Trans. on Information Theory, 49, 682-691.
  • 59
    • Zou, H., & Hastie, T. (2005). Regularization and variable selection via the elastic net. J. Royal Statistical Society, Series B, 67, 301-320.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.