Volume , Issue , 2014, Pages 37-46

Toward crowdsourcing micro-level behavior annotations: The challenges of interface, training, and generalization

Author keywords

Behavior annotations; Crowdsourcing; Inter-rater reliability; Micro-level annotations; Training crowd workers

Indexed keywords

BEHAVIOR ANNOTATIONS; CROWDSOURCING; CROWDSOURCING PLATFORMS; HUMAN BEHAVIOR ANALYSIS; INTER-RATER RELIABILITIES; MICRO-LEVEL ANNOTATIONS; OBSERVATIONAL STUDY; RELIABILITY IMPROVEMENT

EID: 84897748991     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1145/2557500.2557512     Document Type: Conference Paper
Times cited: 24

References (32)
  • 2
    • Bernstein, M., Brandt, J., Miller, R., and Karger, D. Crowds in two seconds: Enabling realtime crowd-powered interfaces. Proc. UIST 2011, 33-42.
  • 3
    • Biel, J. I. and Gatica-Perez, D. The good, the bad, and the angry: Analyzing crowdsourced impressions of vloggers. Proc. ICWSM 2012, 407-410.
  • 5
    • Downs, J. S., Holbrook, M. B., Sheng, S., and Cranor, L. F. Are your participants gaming the system?: Screening Mechanical Turk workers. Proc. CHI 2010, 2399-2402.
  • 6
    • Gao, Q. and Vogel, S. Consensus versus expertise: A case study of word alignment with Mechanical Turk. Proc. CSLDAMT 2010, 30-34.
  • 7
    • Hsueh, P. Y., Melville, P., and Sindhwani, V. Data quality from crowdsourcing: A study of annotation selection criteria. Proc. ALLNP 2009, 27-35.
  • 8
    • Kang, S., Gratch, J., Sidner, C., Artstein, R., Huang, L., and Morency, L. Towards building a virtual counselor: Modeling nonverbal behavior during intimate self-disclosure. Proc. AAMAS 2012, 63-70.
  • 9
    • Kim, J., Nguyen, P., Weir, S., Guo, P., Miller, R., and Gajos, K. Crowdsourcing step-by-step information extraction to enhance existing how-to videos. Proc. CHI 2014.
  • 11
    • Krippendorff, K. On the reliability of unitizing continuous data. Sociological Methodology 25 (1995), 47-76.
  • 12
    • Le, J., Edmonds, A., Hester, V., and Biewald, L. Ensuring quality in crowdsourced search relevance evaluation: The effects of training question distribution. Proc. SIGIR CSE 2010, 21-26.
  • 14
    • Marge, M., Banerjee, S., and Rudnicky, A. I. Using the Amazon Mechanical Turk for transcription of spoken language. Proc. ICASSP 2010, 5270-5273.
  • 15
    • Mason, W. and Suri, S. Conducting behavioral research on Amazon's Mechanical Turk. Behavior Research Methods 44, 1 (2012), 1-23.
  • 16
    • McKeown, G., Valstar, M. F., Cowie, R., and Pantic, M. The SEMAINE corpus of emotionally coloured character interactions. Proc. ICME 2010, 1079-1084.
  • 17
    • Novotney, S. and Callison-Burch, C. Cheap, fast and good enough: Automatic speech recognition with non-expert transcription. Proc. HLT 2010, 207-215.
  • 18
    • Nowak, S. and Ruger, S. How reliable are annotations via crowdsourcing: A study about inter-annotator agreement for multi-label image annotation. Proc. MIR 2010, 557-566.
  • 19
    • Pantic, M., Pentland, A., Nijholt, A., and Huang, T. Human computing and machine understanding of human behavior: A survey. Proc. ICMI 2006, 239-248.
  • 20
    • Park, S., Mohammadi, G., Artstein, R., and Morency, L.-P. Crowdsourcing micro-level multimedia annotations: The challenge of evaluation and interface. Proc. CrowdMM 2012, 29-34.
  • 21
    • Poppe, R. A survey on vision-based human action recognition. Image Vision Comput. 28, 6 (2010), 976-990.
  • 22
    • Quinn, A. J. and Bederson, B. B. Human computation: A survey and taxonomy of a growing field. Proc. CHI 2011, 1403-1412.
  • 24
    • Riek, L., O'Connor, M., and Robinson, P. Guess what? A game for affective annotation of video using crowd sourcing. Proc. ACII 2011, 277-285.
  • 28
    • Spiro, I., Taylor, G., Williams, G., and Bregler, C. Hands by hand: Crowd-sourced motion tracking for gesture annotation. Proc. CVPRW 2010, 17-24.
  • 29
    • Vondrick, C., Ramanan, D., and Patterson, D. Efficiently scaling up video annotation with crowdsourced marketplaces. Computer Vision - ECCV 2010, vol. 6314, 610-623.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.