DOI: 10.5555/772072.772142

Inspection-based evaluations

Published: 01 January 2002

Abstract

No abstract available.






Published In

The human-computer interaction handbook: fundamentals, evolving technologies and emerging applications
January 2002
1310 pages
ISBN: 0805838384

Publisher

L. Erlbaum Associates Inc.

United States


Qualifiers

  • Chapter


Cited By
  • (2021) On evaluating online teaching and learning experience: A usability evaluation study of synchronous teaching platforms. CHI Greece 2021: 1st International Conference of the ACM Greek SIGCHI Chapter, pp. 1-7. DOI: 10.1145/3489410.3489435. Online publication date: 25-Nov-2021.
  • (2019) Validating a heuristic evaluation method: An application test. Proceedings of Mensch und Computer 2019, pp. 593-597. DOI: 10.1145/3340764.3344465. Online publication date: 8-Sep-2019.
  • (2014) Optimizing usability studies by complementary evaluation methods. Proceedings of the 28th International BCS Human Computer Interaction Conference on HCI 2014 - Sand, Sea and Sky - Holiday HCI, pp. 110-119. DOI: 10.14236/ewic/hci2014.12. Online publication date: 9-Sep-2014.
  • (2011) A web usability evaluation process for model-driven web development. Proceedings of the 23rd International Conference on Advanced Information Systems Engineering, pp. 108-122. DOI: 10.5555/2026716.2026730. Online publication date: 20-Jun-2011.
  • (2010) Work-domain knowledge in usability evaluation. Journal of Systems and Software, 83(11), 2019-2030. DOI: 10.1016/j.jss.2010.02.026. Online publication date: 1-Nov-2010.
  • (2010) The usability inspection performance of work-domain experts. Interacting with Computers, 22(2), 75-87. DOI: 10.1016/j.intcom.2009.09.001. Online publication date: 1-Mar-2010.
  • (2009) Expert system for supporting conformity inspections of software application interfaces to the ISO 9241. Proceedings of the 2009 ACM Symposium on Applied Computing, pp. 110-115. DOI: 10.1145/1529282.1529305. Online publication date: 8-Mar-2009.
  • (2009) First experimentation of the ErgoPNets method using dynamic modeling to communicate usability evaluation results. Proceedings of the 7th IFIP WG 13.5 International Conference on Human Error, Safety and Systems Development, pp. 81-95. DOI: 10.1007/978-3-642-11750-3_7. Online publication date: 23-Sep-2009.
  • (2008) Introducing item response theory for measuring usability inspection processes. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 893-902. DOI: 10.1145/1357054.1357196. Online publication date: 6-Apr-2008.
  • (2008) Metaphors of human thinking for usability inspection and design. ACM Transactions on Computer-Human Interaction, 14(4), 1-33. DOI: 10.1145/1314683.1314688. Online publication date: 19-Jan-2008.
