Volume 25, Issue 2, 2015, Pages 829-855

Incremental majorization-minimization optimization with application to large-scale machine learning

Author keywords

Convex optimization; Majorization minimization; Nonconvex optimization

Indexed keywords

ALGORITHMS; ARTIFICIAL INTELLIGENCE; CONVEX OPTIMIZATION; FUNCTIONS; ITERATIVE METHODS; LEARNING SYSTEMS; PROBLEM SOLVING; SIGNAL PROCESSING;

EID: 84940372324     PISSN: 10526234     EISSN: None     Source Type: Journal
DOI: 10.1137/140957639     Document Type: Article
Times cited: 284

References (57)
  • 1. S. Ahn, J. A. Fessler, D. Blatt, and A. O. Hero, Convergent incremental optimization transfer algorithms: Application to tomography, IEEE Trans. Med. Imaging, 25 (2006), pp. 283-296.
  • 3. A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., 2 (2009), pp. 183-202.
  • 4. A. Beck and L. Tetruashvili, On the convergence of block coordinate descent type methods, SIAM J. Optim., 23 (2013), pp. 2037-2060.
  • 6. D. Blatt, A. O. Hero, and H. Gauchman, A convergent incremental gradient method with a constant step size, SIAM J. Optim., 18 (2007), pp. 29-51.
  • 7. D. Böhning and B. G. Lindsay, Monotonicity of quadratic-approximation algorithms, Ann. Inst. Statist. Math., 40 (1988), pp. 641-663.
  • 9. L. Bottou, Online algorithms and stochastic approximations, in Online Learning and Neural Networks, D. Saad, ed., Cambridge University Press, Cambridge, UK, 1998.
  • 11. E. J. Candès, M. Wakin, and S. P. Boyd, Enhancing sparsity by reweighted ℓ1 minimization, J. Fourier Anal. Appl., 14 (2008), pp. 877-905.
  • 13. M. Collins, R. E. Schapire, and Y. Singer, Logistic regression, AdaBoost and Bregman distances, Mach. Learn., 48 (2002), pp. 253-285.
  • 15. P. L. Combettes and V. R. Wajs, Signal recovery by proximal forward-backward splitting, Multiscale Model. Simul., 4 (2005), pp. 1168-1200.
  • 16. I. Daubechies, M. Defrise, and C. De Mol, An iterative thresholding algorithm for linear inverse problems with a sparsity constraint, Comm. Pure Appl. Math., 57 (2004), pp. 1413-1457.
  • 18. A. J. Defazio, T. S. Caetano, and J. Domke, Finito: A faster, permutable incremental gradient method for big data problems, in Proceedings of ICML, 2014.
  • 20. A. P. Dempster, N. M. Laird, and D. B. Rubin, Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Statist. Soc. Ser. B, 39 (1977), pp. 1-38.
  • 21. J. Duchi, E. Hazan, and Y. Singer, Adaptive subgradient methods for online learning and stochastic optimization, J. Mach. Learn. Res., 12 (2011), pp. 2121-2159.
  • 22. J. Duchi and Y. Singer, Efficient online and batch learning using forward backward splitting, J. Mach. Learn. Res., 10 (2009), pp. 2899-2934.
  • 23. H. Erdogan and J. A. Fessler, Ordered subsets algorithms for transmission tomography, Phys. Med. Biol., 44 (1999), pp. 2835-2851.
  • 25. M. Fashing and C. Tomasi, Mean shift is a bound optimization, IEEE Trans. Pattern Anal., 27 (2005), pp. 471-474.
  • 26. G. Gasso, A. Rakotomamonjy, and S. Canu, Recovering sparse signals with non-convex penalties and DC programming, IEEE Trans. Signal Process., 57 (2009), pp. 4686-4698.
  • 27. S. Ghadimi and G. Lan, Optimal stochastic approximation algorithms for strongly convex stochastic composite optimization I: A generic algorithmic framework, SIAM J. Optim., 22 (2012), pp. 1469-1492.
  • 28. E. T. Hale, W. Yin, and Y. Zhang, Fixed-point continuation for ℓ1-minimization: Methodology and convergence, SIAM J. Optim., 19 (2008), pp. 1107-1130.
  • 29. E. Hazan and S. Kale, Beyond the regret minimization barrier: An optimal algorithm for stochastic strongly-convex optimization, in Proceedings of COLT, 2011.
  • 32. A. Juditsky and A. Nemirovski, First order methods for nonsmooth convex large-scale optimization, in Optimization for Machine Learning, MIT Press, Cambridge, MA, 2011.
  • 34. G. Lan, An optimal method for stochastic composite optimization, Math. Program., 133 (2012), pp. 365-397.
  • 35. K. Lange, D. R. Hunter, and I. Yang, Optimization transfer using surrogate objective functions, J. Comput. Graph. Statist., 9 (2000), pp. 1-20.
  • 38. J. Mairal, Optimization with first-order surrogate functions, in Proceedings of ICML, 2013.
  • 40. J. Mairal, F. Bach, J. Ponce, and G. Sapiro, Online learning for matrix factorization and sparse coding, J. Mach. Learn. Res., 11 (2010), pp. 19-60.
  • 41. J. J. Moreau, Fonctions convexes duales et points proximaux dans un espace hilbertien, C. R. Acad. Sci. Paris Sér. A Math., 255 (1962), pp. 2897-2899.
  • 42. R. M. Neal and G. E. Hinton, A view of the EM algorithm that justifies incremental, sparse, and other variants, in Learning in Graphical Models, Kluwer, Dordrecht, the Netherlands, 1998, pp. 355-368.
  • 43. A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro, Robust stochastic approximation approach to stochastic programming, SIAM J. Optim., 19 (2009), pp. 1574-1609.
  • 45. Y. Nesterov, Gradient methods for minimizing composite objective functions, Math. Program., 140 (2012), pp. 125-161.
  • 47. M. Razaviyayn, M. Hong, and Z.-Q. Luo, A unified convergence analysis of block successive minimization methods for nonsmooth optimization, SIAM J. Optim., 23 (2013), pp. 1126-1153.
  • 53. M. J. Wainwright and M. I. Jordan, Graphical models, exponential families, and variational inference, Found. Trends Mach. Learn., 1 (2008), pp. 1-305.
  • 55. L. Xiao, Dual averaging methods for regularized stochastic learning and online optimization, J. Mach. Learn. Res., 11 (2010), pp. 2543-2596.
  • 56. M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, J. Roy. Statist. Soc. Ser. B, 68 (2006), pp. 49-67.
  • 57. L. W. Zhong and J. T. Kwok, Fast stochastic alternating direction method of multipliers, in Proceedings of ICML, 2014.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.