Volume 25, Issue 4, 2009, Pages 282-306

Integration of usability evaluation studies via a novel meta-analytic approach: What are significant attributes for effective evaluation?

Author keywords

[No Author keywords available]

Indexed keywords

COGNITIVE WALKTHROUGH; EXPERIMENTAL CONDITIONS; HEURISTIC EVALUATION; META-ANALYSIS; TIME CONSTRAINTS; USABILITY EVALUATION; USABILITY EVALUATION METHODS; USABILITY PROBLEMS;

EID: 70449589332     PISSN: 10447318     EISSN: 15327590     Source Type: Journal
DOI: 10.1080/10447310802629793     Document Type: Article
Times cited: 9
References (50)
  • 1
    • Andre, T. S., Hartson, H. R., & Williges, R. C. (2003). Determining the effectiveness of the usability problem inspector: A theory-based model and tool for finding usability problems. Human Factors, 45, 455-482.
  • 2
    • Catani, M. B., & Biers, D. W. (1998). Usability evaluation and prototype fidelity: Users and usability professionals. Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting (pp. 1331-1335). Santa Monica, CA: Human Factors and Ergonomics Society.
  • 4
    • Cockton, G., & Woolrych, A. (2001). Understanding inspection methods: Lessons from an assessment of heuristic evaluation. In A. Blandford, J. Vanderdonckt, & P. Gray (Eds.), People and computers (Vol. 15, pp. 171-191). London: Springer.
  • 5
    • Connell, I. W., & Hammond, N. V. (1999). Comparing usability evaluation principles with heuristics: Problem instances vs. problem types. In M. Angela Sasse & C. Johnson (Eds.), Proceedings of the IFIP INTERACT '99 Conference on Human-Computer Interaction (pp. 621-629). Amsterdam: IOS.
  • 6
    • Cordes, R. E. (2001). Task-selection bias: A case for user-defined tasks. Human-Computer Interaction, 13, 411-419.
  • 7
    • Cuomo, D. L., & Bowen, C. D. (1994). Understanding usability issues addressed by three user-system interface evaluation techniques. Interacting with Computers, 6(1), 86-108.
  • 8
    • Desurvire, H. W., Kondziela, J. M., & Atwood, M. E. (1992). What is gained and lost when using evaluation methods other than empirical testing. In A. Monk, D. Diaper, & M. D. Harrison (Eds.), People and computers (Vol. 7, pp. 89-102). Cambridge, England: Cambridge University Press.
  • 11
    • Fu, L., Salvendy, G., & Turley, L. (2002). Effectiveness of user testing and heuristic evaluation as a function of performance classification. Behaviour & Information Technology, 21, 137-143.
  • 12
    • Gray, W. D., & Salzman, M. C. (1998). Damaged merchandise? A review of experiments that compare usability evaluation methods. Human-Computer Interaction, 13, 203-261.
  • 16
    • Hertzum, M., & Jacobsen, N. E. (1999). The evaluator effect during first-time use of the cognitive walkthrough technique. In H.-J. Bullinger & J. Ziegler (Eds.), Human-computer interaction: Ergonomics and user interfaces (Vol. 1, pp. 1063-1067). London: Lawrence Erlbaum.
  • 18
    • Hertzum, M., Jacobsen, N. E., & Molich, R. (2002). Usability inspections by groups of specialists: Perceived agreement in spite of disparate observations. CHI '02 Extended Abstracts on Human Factors in Computing Systems (pp. 662-663). New York: ACM.
  • 20
    • Hornbæk, K., & Frøkjær, E. (2005). Comparing usability problems and redesign proposals as input to practical systems development. Proceedings of the ACM CHI '05 Conference (pp. 391-400). New York: ACM.
  • 21
    • Jacobsen, N. E., Hertzum, M., & John, B. E. (1998). The evaluator effect in usability studies: Problem detection and severity judgments. Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting (pp. 1336-1340). Santa Monica, CA: Human Factors and Ergonomics Society.
  • 23
    • John, B. E., & Mashyna, M. M. (1997). Evaluating a multimedia authoring tool with cognitive walkthrough and think-aloud user studies. Journal of the American Society for Information Science, 48, 1004-1022.
  • 24
    • Karat, C.-M., Campbell, R., & Fiegel, T. (1992). Comparison of empirical testing and walkthrough methods in user interface evaluation. CHI Conference on Human Factors in Computing Systems (pp. 397-404). New York: ACM.
  • 27
    • Lavery, D., Cockton, G., & Atkinson, M. P. (1997). Comparison of evaluation methods using structured usability problem reports. Behaviour & Information Technology, 16(4/5), 246-266.
  • 28
    • Law, L.-C., & Hvannberg, E. T. (2002). Complementarity and convergence of heuristic evaluation and usability test: A case study of universal brokerage platform. Proceedings of the Second Nordic Conference on Human-Computer Interaction (pp. 71-80). New York: ACM.
  • 31
    • Lewis, C., Polson, P., Wharton, C., & Rieman, J. (1990). Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. Proceedings of the ACM CHI '90 Conference (pp. 235-242). New York: ACM.
  • 32
    • Lewis, J. R. (1994). Sample sizes for usability studies: Additional considerations. Human Factors, 36(2), 368-378.
  • 33
    • Lewis, J. R. (2001). Evaluation procedures for adjusting problem-discovery rates estimated from small samples. International Journal of Human-Computer Interaction, 13(4), 445-479.
  • 36
    • Nielsen, J. (1990). Evaluating the thinking aloud technique for use by computer scientists. In H. R. Hartson & D. Hix (Eds.), Advances in human-computer interaction (pp. 69-82). Norwood, NJ: Ablex.
  • 37
    • Nielsen, J. (1992). Finding usability problems through heuristic evaluation. CHI Conference on Human Factors in Computing Systems (pp. 373-380). New York: ACM.
  • 38
    • Nielsen, J. (1994). Estimating the number of subjects needed for a thinking aloud test. International Journal of Human-Computer Studies, 41, 385-397.
  • 39
    • Nielsen, J. (1997). Usability testing. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (2nd ed., pp. 1543-1568). New York: John Wiley & Sons.
  • 41
    • Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. CHI '90 Conference Proceedings (pp. 249-256). New York: ACM.
  • 43
    • Sears, A. (1997). Heuristic walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction, 9, 213-234.
  • 44
    • Sears, A., & Hess, D. J. (1999). Cognitive walkthroughs: Understanding the effect of task description detail on evaluator performance. International Journal of Human-Computer Interaction, 11, 185-200.
  • 45
    • Virzi, R. A. (1990). Streamlining the design process: Running fewer subjects. Human Factors Society 34th Annual Meeting (pp. 291-294). Santa Monica, CA: Human Factors and Ergonomics Society.
  • 46
    • Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34, 457-468.
  • 48
    • Virzi, R. A., Sorce, J., & Herbert, L. B. (1993). A comparison of three usability evaluation methods: Heuristic, think-aloud, and performance testing. Proceedings of the Human Factors Society 36th Annual Meeting (pp. 309-313). Santa Monica, CA: Human Factors and Ergonomics Society.
  • 49
    • Wixon, D. (2003). Evaluating usability methods: Why the current literature fails the practitioner. Interactions, 10(4), 28-34.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.