Volume 14, 2013, Pages 3129-3152

Large-scale SVD and manifold learning

Author keywords

Large scale matrix factorization; Low rank approximation; Manifold learning

Indexed keywords

Computational challenges; Laplacian Eigenmaps; Low rank approximations; Low-dimensional manifolds; Manifold learning; Matrix factorizations; Nonlinear dimensionality reduction; Singular value decomposition techniques

EID: 84887492519     PISSN: 15324435     EISSN: 15337928     Source Type: Journal    
DOI: None     Document Type: Article
Times cited : (60)

References (41)
  • 2
    M. Balasubramanian and E. L. Schwartz. The Isomap algorithm and topological stability. Science, 295, 2002.
  • 3
    M. Belkin and P. Niyogi. Laplacian Eigenmaps and spectral techniques for embedding and clustering. In Neural Information Processing Systems, 2001.
  • 9
    I. Dhillon and B. Parlett. Multiple representations to compute orthogonal eigenvectors of symmetric tridiagonal matrices. Linear Algebra and its Applications, 387:1-28, 2004.
  • 11
    P. Drineas and M. W. Mahoney. On the Nyström method for approximating a Gram matrix for improved kernel-based learning. Journal of Machine Learning Research, 6:2153-2175, 2005.
  • 12
    P. Drineas, R. Kannan, and M. W. Mahoney. Fast Monte Carlo algorithms for matrices II: Computing a low-rank approximation to a matrix. SIAM Journal on Computing, 36(1), 2006.
  • 13
    S. Fine and K. Scheinberg. Efficient SVM training using low-rank kernel representations. Journal of Machine Learning Research, 2:243-264, 2002.
  • 16
    G. Golub and C. Van Loan. Matrix Computations. Johns Hopkins University Press, Baltimore, 2nd edition, 1983.
  • 20
    T. Joachims. Making large-scale support vector machine learning practical. In Neural Information Processing Systems, 1999.
  • 21
    S. Kumar and H. Rowley. People Hopper. http://googleresearch.blogspot.com/2010/03/hopping-on-face-manifold-via-people.html, 2010.
  • 29
    E. Nyström. Über die praktische Auflösung von linearen Integralgleichungen mit Anwendungen auf Randwertaufgaben der Potentialtheorie. Commentationes Physico-Mathematicae, 4(15):1-52, 1928.
  • 30
    J. Platt. Fast training of Support Vector Machines using sequential minimal optimization. In Neural Information Processing Systems, 1999.
  • 32
    S. Roweis and L. Saul. Nonlinear dimensionality reduction by Locally Linear Embedding. Science, 290(5500), 2000.
  • 36
    A. Talwalkar. Matrix Approximation for Large-scale Learning. Ph.D. thesis, Computer Science Department, Courant Institute, New York University, New York, NY, 2010.
  • 39
    J. Tenenbaum, V. de Silva, and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500), 2000.
  • 40
    K. Weinberger and L. Saul. An introduction to nonlinear dimensionality reduction by maximum variance unfolding. In AAAI Conference on Artificial Intelligence, 2006.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.