Archives & Museum Informatics


published: March 2004

Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 License

Investigating Heuristic Evaluation: A Case Study
Laura Bendoly, Atlanta History Center, USA
Kate Haley Goldman, Institute for Learning Innovation, USA

Session: Evaluation

When museum professionals speak of evaluating a web site, they usually mean formative evaluation, and by that they usually mean testing the usability of the site. In the for-profit world, usability testing is a multi-million dollar industry; in non-profits, we often rely on far too few dollars to do too much. Hence, heuristic evaluation is one of the most popular methods of usability testing in museums.

Previous research has shown that the ideal usability evaluation is a mixed-methods approach, combining qualitative and quantitative, expert-focused and user-focused methods. But some within the online museum field have hypothesized that heuristic evaluation alone is sufficient to identify most usability issues. To date there have been no studies on how reliable or valid heuristic evaluation is for museum web sites. Establishing this is critical if heuristic evaluation is to be used alone rather than in tandem with other methods.

This paper focuses on work being done at the Atlanta History Center as a case study of the effectiveness of heuristic evaluation in a museum web site setting. The project is currently in the early stages of development. The Center is applying a thorough mixed-methods approach to evaluation, including heuristic evaluation. The results of this project will assess how complete and how useful a rigorous heuristic evaluation is, both alone and in conjunction with other methods, in the development and implementation of an online educational resource.