Volume , Issue PART 3, 2013, Pages 1820-1828

Optimization with first-order surrogate functions

Author keywords

[No Author keywords available]

Indexed keywords

LEARNING SYSTEMS; OPTIMIZATION;

EID: 84897534825     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 63

References (36)
  • 1. Beck, A. and Teboulle, M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci., 2(1):183-202, 2009.
  • 3. Böhning, D. and Lindsay, B.G. Monotonicity of quadratic-approximation algorithms. Ann. I. Stat. Math., 40(4):641-663, 1988.
  • 5. Bottou, L. Large-scale machine learning with stochastic gradient descent. In Proc. COMPSTAT, 2010.
  • 8. Collins, M., Schapire, R.E., and Singer, Y. Logistic regression, AdaBoost and Bregman distances. Mach. Learn., 48(1-3):253-285, 2002. DOI: 10.1023/A:1013912006537.
  • 9. Daubechies, I., Defrise, M., and De Mol, C. An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Commun. Pur. Appl. Math., 57(11):1413-1457, 2004. DOI: 10.1002/cpa.20042.
  • 12. Gasso, G., Rakotomamonjy, A., and Canu, S. Recovering sparse signals with non-convex penalties and DC programming. IEEE T. Signal Process., 57(12):4686-4698, 2009.
  • 15
  • 16. Juditsky, A. and Nemirovski, A. First order methods for nonsmooth convex large-scale optimization, I: General purpose methods. In Optimization for Machine Learning. MIT Press, 2011.
  • 19. Lange, K., Hunter, D.R., and Yang, I. Optimization transfer using surrogate objective functions. J. Comput. Graph. Stat., 9(1):1-20, 2000.
  • 20. Le Roux, N., Schmidt, M., and Bach, F. A stochastic gradient method with an exponential convergence rate for finite training sets. In Adv. NIPS, 2012.
  • 21. Lee, D.D. and Seung, H.S. Algorithms for non-negative matrix factorization. In Adv. NIPS, 2001.
  • 22. Mairal, J., Bach, F., Ponce, J., and Sapiro, G. Online learning for matrix factorization and sparse coding. J. Mach. Learn. Res., 11:19-60, 2010.
  • 23. Neal, R.M. and Hinton, G.E. A view of the EM algorithm that justifies incremental, sparse, and other variants. In Learning in Graphical Models, 89:355-368, 1998.
  • 26. Nesterov, Y. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optimiz., 22(2):341-362, 2012.
  • 27. Nesterov, Y. and Polyak, B.T. Cubic regularization of Newton method and its global performance. Math. Program., 108(1):177-205, 2006.
  • 28. Richtárik, P. and Takáč, M. Iteration complexity of randomized block coordinate descent methods for minimizing a composite function. Math. Program., 2012.
  • 29. Seeger, M.W. and Wipf, D.P. Variational Bayesian inference techniques. IEEE Signal Proc. Mag., 27(6):81-91, 2010.
  • 32. Tseng, P. and Yun, S. A coordinate gradient descent method for nonsmooth separable minimization. Math. Program., 117:387-423, 2009.
  • 33. Wainwright, M.J. and Jordan, M.I. Graphical models, exponential families, and variational inference. Found. Trends Mach. Learn., 1(1-2):1-305, 2008.
  • 34. Wright, S., Nowak, R., and Figueiredo, M. Sparse reconstruction by separable approximation. IEEE T. Signal Process., 57(7):2479-2493, 2009.
  • 35. Zhang, T. Sequential greedy approximation for certain convex optimization problems. IEEE T. Inform. Theory, 49(3):682-691, 2003.
  • 36. Zhang, X., Yu, Y., and Schuurmans, D. Accelerated training for matrix-norm regularization: a boosting approach. In Adv. NIPS, 2012.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.