Volume 2, 2008, Pages 1049-1055

Towards faster planning with continuous resources in stochastic domains

Author keywords

[No Author keywords available]

Indexed keywords

CUMULATIVE DISTRIBUTION FUNCTIONS; DISCRETE VARIABLES; DOMAINS; DUAL SPACES; EXPERIMENTAL EVALUATIONS; FORWARD SEARCHES; MARKOV DECISION PROCESSES; OPTIMALITY; OTHER ALGORITHMS; POLICY GENERATIONS; POOR PERFORMANCES; SCALE-UP; SPEED-UP; STATE SPACES; STOCHASTIC DOMAINS; SUPERIOR PERFORMANCES; TIME PERFORMANCES

EID: 57749176370     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 6

References (15)
  • 2  Boutilier, C.; Dearden, R.; and Goldszmidt, M. 2000. Stochastic dynamic programming with factored representations. Artificial Intelligence 121(1-2):49-107.
  • 4  Dolgov, D. A., and Durfee, E. H. 2005. Stationary deterministic policies for constrained MDPs with multiple rewards, costs, and discount factors. In Proceedings of IJCAI-05, 1326-1332.
  • 6  Guestrin, C.; Hauskrecht, M.; and Kveton, B. 2004. Solving factored MDPs with continuous and discrete variables. In Proceedings of UAI-04, 235-242.
  • 7  Hauskrecht, M., and Kveton, B. 2004. Linear program approximations for factored continuous-state MDPs. In NIPS 16. MIT Press.
  • 8  Lagoudakis, M., and Parr, R. 2003. Least-squares policy iteration. JMLR 4(Dec):1107-1149.
  • 9  Li, L., and Littman, M. 2005. Lazy approximation for solving continuous finite-horizon MDPs. In Proceedings of AAAI-05, 1175-1180.
  • 10  Marecki, J.; Koenig, S.; and Tambe, M. 2007. A fast analytical algorithm for solving MDPs with real-valued resources. In Proceedings of IJCAI-07.
  • 14  Nikovski, D., and Brand, M. 2003. Non-linear stochastic control in continuous state spaces by exact integration in Bellman's equations. In Proceedings of ICAPS-03: WS2, 91-95.
  • 15  Varakantham, P.; Nair, R.; Tambe, M.; and Yokoo, M. 2006. Winning back the cup for distributed POMDPs: Planning over continuous belief spaces. In Proceedings of AAMAS-06, 289-296.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.