Volume , Issue , 2016, Pages 1785-1793

The multi-fidelity multi-armed bandit

Author keywords

[No Author keywords available]

Indexed keywords

STOCHASTIC SYSTEMS;

EID: 85018901939    PISSN: 10495258    EISSN: None    Source Type: Conference Proceeding
DOI: None    Document Type: Conference Paper
Times cited: 41

References (16)
  • 1
    • Rajeev Agrawal. Sample Mean Based Index Policies with O(log n) Regret for the Multi-Armed Bandit Problem. Advances in Applied Probability, 1995.
  • 2
    • Jean-Yves Audibert, Rémi Munos, and Csaba Szepesvári. Exploration-exploitation Tradeoff Using Variance Estimates in Multi-armed Bandits. Theor. Comput. Sci., 2009.
  • 3
    • Peter Auer. Using Confidence Bounds for Exploitation-exploration Trade-offs. J. Mach. Learn. Res., 2003.
  • 12
    • W. R. Thompson. On the Likelihood that one Unknown Probability Exceeds Another in View of the Evidence of Two Samples. Biometrika, 1933.
  • 13
    • Long Tran-Thanh, Lampros C. Stavrogiannis, Victor Naroditskiy, Valentin Robu, Nicholas R. Jennings, and Peter Key. Efficient Regret Bounds for Online Bid Optimisation in Budget-Limited Sponsored Search Auctions. In UAI, 2014.
  • 15
    • Yingce Xia, Haifang Li, Tao Qin, Nenghai Yu, and Tie-Yan Liu. Thompson Sampling for Budgeted Multi-Armed Bandits. In IJCAI, 2015.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.