Volume , Issue , 2007, Pages 1879-1884

A decision-theoretic model of assistance

Author keywords

[No Author keywords available]

Indexed keywords

COMPUTATIONAL EFFORT; COMPUTER ENVIRONMENTS; DECISION-THEORETIC; GOAL-DIRECTED; HUMAN SUBJECTS; INTELLIGENT ASSISTANTS; KNOWLEDGE WORKERS; OVERALL COSTS

EID: 80052984053     PISSN: 10450823     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited : (40)

References (17)
  • 1
    • J. L. Ambite, G. Barish, C. A. Knoblock, M. Muslea, J. Oh, and S. Minton. Getting from here to there: Interactive planning and agent execution for optimizing travel. In IAAI, pages 862-869, 2002.
  • 3
    • N. Blaylock and J. F. Allen. Statistical goal parameter recognition. In ICAPS, 2004.
  • 6
    • H. Bui, S. Venkatesh, and G. West. Policy recognition in the abstract hidden Markov models. JAIR, vol. 17, 2002.
  • 10
    • C. Guestrin, D. Koller, R. Parr, and S. Venkataraman. Efficient solution algorithms for factored MDPs. JAIR, pages 399-468, 2003.
  • 11
    • E. Horvitz, J. Breese, D. Heckerman, D. Hovel, and K. Rommelse. The Lumiere project: Bayesian user modeling for inferring the goals and needs of software users. In Proc. UAI, pages 256-265, Madison, WI, July 1998.
  • 12
    • B. Hui and C. Boutilier. Who's asking for help? A Bayesian approach to intelligent assistance. In IUI, pages 186-193, 2006.
  • 13
    • M. J. Kearns, Y. Mansour, and A. Y. Ng. A sparse sampling algorithm for near-optimal planning in large Markov decision processes. In IJCAI, 1999.
  • 15
    • A. W. Moore and C. G. Atkeson. Prioritized sweeping: Reinforcement learning with less data and less time. Machine Learning, 13:103-130, 1993.
  • 16
    • D. V. Pynadath and M. P. Wellman. Probabilistic state-dependent grammars for plan recognition. In UAI, pages 507-514, 2000.
  • 17
    • P. Varakantham, R. T. Maheswaran, and M. Tambe. Exploiting belief bounds: Practical POMDPs for personal assistant agents. In AAMAS, 2005.


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.