



Volume 11, 2010, Pages 2287-2322

Spectral regularization algorithms for learning large incomplete matrices

Author keywords

Collaborative filtering; Large scale convex optimization; Netflix prize; Nuclear norm; Spectral regularization

Indexed keywords

COLLABORATIVE FILTERING; LARGE-SCALE CONVEX OPTIMIZATION; NETFLIX PRIZE; NUCLEAR NORM; SPECTRAL REGULARIZATION;

EID: 77956944781     PISSN: 15324435     EISSN: 15337928     Source Type: Journal
DOI: None     Document Type: Article
Times cited: 1208

References (39)
  • 1. J. Abernethy, F. Bach, T. Evgeniou, and J.-P. Vert. A new approach to collaborative filtering: operator estimation with spectral regularization. Journal of Machine Learning Research, 10:803-826, 2009.
  • 3. A. Argyriou, T. Evgeniou, and M. Pontil. Convex multi-task feature learning. Machine Learning, 73(3):243-272, 2008.
  • 4. F. Bach. Consistency of trace norm minimization. Journal of Machine Learning Research, 9:1019-1048, 2008.
  • 7. S. Burer and R. D. C. Monteiro. Local minima and convergence in low-rank semidefinite programming. Mathematical Programming, 103(3):427-631, 2005.
  • 10. E. J. Candes and T. Tao. The power of convex relaxation: near-optimal matrix completion. IEEE Transactions on Information Theory, 56(5):2053-2080, 2009.
  • 24. Z. Liu and L. Vandenberghe. Interior-point method for nuclear norm approximation with application to system identification. SIAM Journal on Matrix Analysis and Applications, 31(3):1235-1256, 2009.
  • 25. S. Ma, D. Goldfarb, and L. Chen. Fixed point and Bregman iterative methods for matrix rank minimization. Mathematical Programming Series A, forthcoming.
  • 26. R. Mazumder, J. Friedman, and T. Hastie. SparseNet: coordinate descent with non-convex penalties. Technical report, Stanford University, 2009.
  • 28. Y. Nesterov. Gradient methods for minimizing composite objective function. Technical Report 76, Center for Operations Research and Econometrics (CORE), Catholic University of Louvain, 2007.
  • 32. ACM SIGKDD and Netflix. Soft modelling by latent variables: the nonlinear iterative partial least squares (NIPALS) approach. In Proceedings of KDD Cup and Workshop, 2007. Available at http://www.cs.uic.edu/~liub/KDD-cup-2007/proceedings.html.
  • 34. N. Srebro, N. Alon, and T. Jaakkola. Generalization error bounds for collaborative prediction with low-rank matrices. In Advances in Neural Information Processing Systems 17, pages 5-27. MIT Press, 2005.
  • 37. R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58:267-288, 1996.
  • 39. C. H. Zhang. Nearly unbiased variable selection under minimax concave penalty. Annals of Statistics, 38(2):894-942, 2010.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.