Volume, Issue, 2009, Pages 977-984

A least squares formulation for a class of generalized eigenvalue problems in machine learning

Author keywords

[No Author keywords available]

Indexed keywords

CANONICAL CORRELATION ANALYSIS; CLASSICAL TECHNIQUES; CONJUGATE GRADIENT; EQUIVALENCE RELATIONSHIP; GENERALIZATION ABILITY; GENERALIZED EIGENVALUE PROBLEMS; LARGE-SCALE PROBLEM; LEAST SQUARES PROBLEMS; LEAST-SQUARES FORMULATION; LINEAR DISCRIMINANT ANALYSIS; MACHINE LEARNING ALGORITHMS; MACHINE-LEARNING; PARTIAL LEAST SQUARES; REGULARIZATION TECHNIQUE; SCALABLE IMPLEMENTATION; SPECTRAL LEARNING;

EID: 71149101160     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 31

References (24)
  • 2
    • Belkin, M., Niyogi, P., & Sindhwani, V. (2006). Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 7, 2399-2434.
  • 5
    • Donoho, D. (2006). For most large underdetermined systems of linear equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution. Communications on Pure and Applied Mathematics, 59, 907-934.
  • 10
    • Hale, E., Yin, W., & Zhang, Y. (2008). Fixed-point continuation for ℓ1-minimization: Methodology and convergence. SIAM Journal on Optimization, 19, 1107-1130.
  • 12
    • Hotelling, H. (1936). Relations between two sets of variables. Biometrika, 28, 321-377.
  • 15
    • Paige, C. C., & Saunders, M. A. (1982). LSQR: An algorithm for sparse linear equations and sparse least squares. ACM Transactions on Mathematical Software, 8, 43-71.
  • 16
    • Rosipal, R., & Krämer, N. (2006). Overview and recent advances in partial least squares. Subspace, Latent Structure and Feature Selection Techniques, Lecture Notes in Computer Science (pp. 34-51).
  • 20
    • Sun, L., Ji, S., & Ye, J. (2008b). A least squares formulation for canonical correlation analysis. International Conference on Machine Learning (pp. 1024-1031).
  • 22
    • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58, 267-288.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.