DA 306: Inspection Methods

Size (in hours): 40

Assignor: Bekker

Assignment Introduction

Determining whether a product is usable can be done in different ways: through usability testing, which involves users, or through inspection methods, in which experts predict usability problems. Inspection is often a very cost-effective way to find usability problems before committing to a more expensive method such as usability testing. It also allows evaluation when access to users is difficult. One of the most commonly used inspection methods is the walkthrough. This period we will apply a walkthrough method to computer games for children and assess them on usability and fun!

Learning Objectives

Learning the advantages and disadvantages of inspection methods. Learning how to conduct one inspection method and how to interpret its results. Building up competency in the "User Focus and Interaction" area.

Learning Activities

- 2-hour lecture on inspection methods (usability inspection methods for non-game products and a specific inspection method for evaluating games)

- Reading for the SEEM walkthrough method:

  o Chapters 1-3 of Norman, D.A., The Design of Everyday Things.

  o Malone, T.W., and Lepper, M.R. (1987) Making learning fun: a taxonomy of intrinsic motivations for learning. In: Snow, R.E., and Farr, M.J. (eds.) Aptitude, learning, and instruction. Vol. 3: Conative and affective process analyses. Hillsdale, N.J.: Erlbaum.

  o Manual on the walkthrough method.

- Apply the SEEM walkthrough method.

- Individually prepare for conducting the evaluation.

- Each individual conducts the evaluation.

- 2-hour lecture on combining evaluation data from different experts.

- Combine and discuss the findings from the individual evaluators in a group.

- Write a report on the findings.
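The merging step above can be sketched in code. This is a minimal illustration, not part of SEEM itself: the problem labels, the `merge_findings` helper, and the any-two agreement measure are assumptions for the sketch. In practice, deciding that two reports describe the same problem is a judgment call made in the group discussion, not a string comparison.

```python
# Minimal sketch of merging problem lists from several evaluators (hypothetical data).
from itertools import combinations


def merge_findings(findings):
    """Map each unique problem description to the set of evaluators who reported it."""
    merged = {}
    for evaluator, problems in findings.items():
        for problem in problems:
            merged.setdefault(problem, set()).add(evaluator)
    return merged


def any_two_agreement(findings):
    """Average, over all evaluator pairs, of |shared problems| / |union of problems|."""
    ratios = []
    for a, b in combinations(findings, 2):
        set_a, set_b = set(findings[a]), set(findings[b])
        union = set_a | set_b
        ratios.append(len(set_a & set_b) / len(union) if union else 1.0)
    return sum(ratios) / len(ratios) if ratios else 1.0


# Example: three evaluators' (hypothetical) individual problem lists.
findings = {
    "evaluator_1": ["menu icon unclear", "no feedback after click"],
    "evaluator_2": ["no feedback after click", "level goal not explained"],
    "evaluator_3": ["menu icon unclear", "no feedback after click"],
}
merged = merge_findings(findings)
agreement = any_two_agreement(findings)
```

The `merged` dictionary gives the combined group list together with how many evaluators found each problem, and a low agreement score signals that the group discussion of differences between evaluators deserves extra attention.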


Deliver a report of approximately 1500 words including:

- Description of usability in this context (to help you interpret the usability trade-offs).

- Description of user profile(s), to make explicit what knowledge you assume they have.

- Description of the tasks that were considered (including possible errors) for each user profile, described in separate steps from the start of the task to reaching the related goal.

- Separate lists of problems found by each evaluator.

- Combined list of the problems found by the evaluators in the group.

- Discussion of differences between evaluators (if any), and discussion of the interpretation of the SEEM questions.

- Suggestions for fixing the problems found.

Presentation of findings per group.