Volume , Issue , 2011, Pages

Algorithms for hyper-parameter optimization

Author keywords

[No Author keywords available]

Indexed keywords

ALGORITHMIC APPROACH; COMPUTER CLUSTERS; DEEP BELIEF NETWORKS; FEATURE LEARNING; HYPER-PARAMETER; HYPER-PARAMETER OPTIMIZATION; IMAGE CLASSIFICATION; NEURAL NETWORKS; RANDOM SEARCH; STATE OF THE ART

EID: 85162384813     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 4437

References (23)
  • 1
    • H. Larochelle, D. Erhan, A. Courville, J. Bergstra, and Y. Bengio. An empirical evaluation of deep architectures on problems with many factors of variation. In ICML, pages 473-480, 2007.
  • 2
    • G. E. Hinton, S. Osindero, and Y. Teh. A fast learning algorithm for deep belief nets. Neural Computation, 18:1527-1554, 2006.
  • 3
    • P. Vincent, H. Larochelle, I. Lajoie, Y. Bengio, and P. A. Manzagol. Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. Journal of Machine Learning Research, 11:3371-3408, 2010.
  • 4
    • Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278-2324, November 1998.
  • 5
    • N. Pinto, D. Doukhan, J. J. DiCarlo, and D. D. Cox. A high-throughput screening approach to discovering good forms of biologically inspired visual representation. PLoS Computational Biology, 5(11):e1000579, November 2009.
  • 9
    • F. Hutter, H. Hoos, and K. Leyton-Brown. Sequential model-based optimization for general algorithm configuration. In LION-5, 2011. Extended version available as UBC Tech Report TR-2010-10.
  • 10
    • D. R. Jones. A taxonomy of global optimization methods based on response surfaces. Journal of Global Optimization, 21:345-383, 2001.
  • 11
    • J. Villemonteix, E. Vazquez, and E. Walter. An informational approach to the global optimization of expensive-to-evaluate functions. Journal of Global Optimization, 2006.
  • 12
    • N. Srinivas, A. Krause, S. Kakade, and M. Seeger. Gaussian process optimization in the bandit setting: No regret and experimental design. In ICML, 2010.
  • 13
    • J. Mockus, V. Tiesis, and A. Zilinskas. The application of Bayesian methods for seeking the extremum. In L. C. W. Dixon and G. P. Szego, editors, Towards Global Optimization, volume 2, pages 117-129. North Holland, New York, 1978.
  • 16
    • R. Bardenet and B. Kégl. Surrogating the surrogate: Accelerating Gaussian process optimization with mixtures. In ICML, 2010.
  • 18
  • 20
    • A. Hyvärinen and E. Oja. Independent component analysis: Algorithms and applications. Neural Networks, 13(4-5):411-430, 2000.
  • 21
    • J. Bergstra and Y. Bengio. Random search for hyper-parameter optimization. JMLR, 2012 (accepted).


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.