Volume WS-11-11, 2011, Pages 113-118

Beyond independent agreement: A tournament selection approach for quality assurance of human computation tasks

Author keywords

[No Author keywords available]

Indexed keywords

HUMAN COMPUTATION; KEY TOPICS; RESEARCH FIELDS; TOURNAMENT SELECTION;

EID: 80055050754     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited : (10)

References (12)
  • 4
    • EID: 80053360508
    • Snow, R., O'Connor, B., Jurafsky, D., and Ng, A.Y. 2008. Cheap and Fast - But is it Good? Evaluating Non-Expert Annotations for Natural Language Tasks. In Proc. of EMNLP-2008.
  • 5
    • EID: 80053402398
    • Callison-Burch, C. 2009. Fast, Cheap, and Creative: Evaluating Translation Quality Using Amazon's Mechanical Turk. In Proc. of EMNLP-2009.
  • 7
    • EID: 80055027910
    • Le, J., Edmonds, A., Hester, V., and Biewald, L. 2010. Ensuring Quality in Crowdsourced Search Relevance Evaluation. In Workshop on Crowdsourcing for Search Evaluation, ACM SIGIR 2010.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS DB.