Published: March 15, 2001.
Making good use of user input: Incorporation of public evaluation into the system development process

Fiona Marshall, The British Museum, UK

Abstract: Predicting the structure and function of a public system without user input is fraught with danger, but this user input is very difficult to obtain without a working prototype. Traditional system design methods make little provision for evaluation and testing by public users. Weaving the results of evaluation and testing into the end of the system development process presents risks: how, for example, can we address unforeseen navigation problems identified during user evaluation, while at the same time ensuring that tight deadlines are met? Another challenge is that we often work with teams of specialists (interface designers, HTML developers, database specialists, content creators etc.): alteration of one element often has knock-on effects on other team members. Drawing on experience with the COMPASS project at the British Museum, the author will present practical ideas for incorporating evaluation into the system development process.

Keywords: Evaluation, Prototyping, Usability testing, User interface design, COMPASS, British Museum.

Introduction

In this paper, I shall consider how the results of usability and prototype testing can be fed into the development process most effectively. Much has been written about the need for user input in system development, but do we test our systems with prospective users at the optimum time in the development process? I shall discuss a number of alternative methodologies for evaluation and prototyping, highlighting the challenges they present. Finally, I shall suggest a possible way forward, based on my experience with the COMPASS Project.

User interface design and testing

A plethora of published information exists on the design of user interfaces. A gem amongst these is Nielsen (2000), which provides excellent advice - including many examples - on designing usable websites.
Schofield & Flute (1997) provide similarly practical advice on designing kiosk software and hardware. The case for usability testing and prototyping is well made by Nielsen (1993) and Preece (1994), both of which describe alternative procedures. Mitchell and Economou (2000) advise on the practice of formative evaluation, prototyping and summative evaluation in museums. These references show that there are many methods of evaluation and prototyping; Shackel in Preece (1994) explains that you will probably need to use different methods at different stages of the project.

There are two basic types of prototype: evolutionary and throw-away. The best prototypes are probably those that are produced quickly and thrown away. They are produced near the start of the development cycle, before too much costly work has been invested in interface design. In the case of complex systems using a range of software tools, the further you get into development without prototyping, the less chance you have to change the interface and the greater the risk of failure.

Costs are always a major factor in any project, but they seem to be dwelt on most particularly when working with a commercial supplier. My perception from talking to a number of suppliers is that they are often reluctant to prototype. In the context of competitive bids and cost cutting, they may feel that prototyping is too expensive. Some developers seem to think that the client should know how the end users of their system will want it to function. Other developers claim they know what system users want, based on previous projects. Neither assumption is true: both client and developer need to find out what prospective users want, and the best way to do that is by prototype testing.

Below I shall describe different types of usability testing and prototyping, with examples of how they were applied to the COMPASS project at the British Museum. First, though, some brief information about COMPASS.
COMPASS

COMPASS (Collections Multimedia Public Access System) is a database-driven multimedia guide to a selection of around 5,000 of the Museum's objects. The system is aimed at adults with no prior knowledge of the collections, as well as school children aged 8-11 and family groups. More information is given in the Appendix to this paper.

The full COMPASS system will be available in the British Museum on wide-screen-format touchscreens linked to the Museum's internal high-speed network. The content includes high-quality images, 3d reconstructions, animations and specially-written text narratives. The interface - designed to be used from a touchscreen only - uses DHTML templates accessing System Simulation Ltd's Index+ database.

A web version, using a different design of user interface (HTML templates), was launched on the web in June 2000. (Note that this is currently separate from the British Museum's main website, although linked - if you can find the link. The Museum plans to bring the two systems closer together.) The internet interface is deliberately low tech, ensuring that it can be used without plug-ins on low-end PCs with low-speed internet links, as are still widely used in UK schools and homes. Accessibility for users with visual impairments was a key factor in the design of the graphical and text-only web interfaces.

The commercial software supplier System Simulation Ltd (SSL) provided the database (based on Index+) and the user interfaces for the Reading Room and the internet. Due to the number of contributors involved in the project, development occurred in three separate, but linked, strands: content production, development of the production database, and specification and design of the user interface.

All content on COMPASS has been produced with our different audiences' needs in mind. On many public systems, the user needs prior knowledge of the content to know what they can search for.
To make things easier, we are providing indexes which use terms in common usage, rather than technical or curatorial terms. On the touchscreen version, users simply point at the word they want. In addition, on both versions we provide virtual 'tours' of groups of objects in familiar themes, objects of the month available from the Home page, and thousands of links between the objects and to background information.

The Museum's emphasis has been on developing high-quality content rather than a high-tech user interface. We wanted a clean design through which our rich content would shine, without the need for plug-ins and frames. Intuitive and readable were two key requirements. The Museum worked closely with SSL, different target groups and experts in evaluation in an effort to realise its aims of ease of use and accessibility. Development and revision of the user interface has been one of the most challenging aspects of the project. Based in part on this experience, I shall now describe some evaluation and prototyping methods and the challenges they present.

Evaluation and prototyping

Informal interviews

The first stage in any system development is, of course, to find out about the target users and what they want from your system. This could involve talking to prospective users (e.g. teachers, visitors) directly and referring to organisations (including museums) which have already done this type of research.

Challenges: The main challenge with asking potential users what they want is obtaining a reliable response.
Surveys

A more reliable approach is to analyse things like public enquiries to your information desk. With COMPASS, we were able to use the results of general visitor surveys, which indicated that a significant proportion of visitors to the Museum would like more information about the collections - including online.

Formative prototype

In an ideal world, everyone would produce prototypes before they embark on any development work at all, in order to see whether the public like their initial ideas. In 1998, the British Museum produced a 'throw-away' prototype COMPASS system which was evaluated by museum visitors and professional colleagues. The main emphasis of this early prototype was the evaluation of different technical options for delivery of multimedia. The prototype enthused curators about the potential of multimedia and helped to obtain management - and, most importantly, donor - support.

Challenges:
Written specifications

These are supposed to describe exactly what the system must and should do, often for the purposes of producing a contract between museum and supplier.

Challenges: Designed to ensure that the client and the developer know what to expect and that neither side changes the requirement halfway through the contract, a written specification for user interfaces and screen designs can nevertheless cause all sorts of misunderstandings and problems.
Mock-up screens

My own preference is for sketches produced as a result of brainstorming sessions, but some designers will prefer to work up full graphical designs on screen.

Challenges:
Horizontal prototypes

A horizontal prototype provides most of the user interface, but without the depth (e.g. content, links). It could be useful for testing usability.

Challenges: The lack of content could distract testers. (With COMPASS, the developer produced a full working system, to which we added limited content for testing with visitors (see below). Although the tests were generally useful, testers were frustrated that their efforts frequently yielded 'no hits'.)

Vertical prototypes

A vertical prototype is a working system developed for one specific part of the required functionality, e.g. searching. It is good for testing particularly complex functionality.

Challenges: Production of such a prototype could be very time-consuming for complex systems.

Scenarios

A scenario is a way of describing how a system is used in a particular situation, i.e. when a user carries out a specific task or set of tasks. It might be used to demonstrate a system or to see whether testers can carry out one particular course of action. Scenarios can be presented in text format or can be built in systems such as Macromedia Director. We used the latter approach to test the COMPASS touchscreen searches. The COMPASS designer mocked up screen designs, initially in Photoshop, and then stitched them together into a Director movie. We asked visitors to try to search for specific things using these; if they pressed the correct buttons in the correct order, they found the records in question. This test yielded very useful results about usability in this specific area.

Challenges:
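The logic of a click-through scenario test of this kind can be sketched as a tiny state machine: each static mock-up screen simply maps its buttons to the next screen, as the Director movie did. The sketch below is illustrative only; the screen and button names are hypothetical, not the actual COMPASS designs.

```python
# Hypothetical click-through scenario prototype: each "screen" maps
# button labels to the screen that the mock-up would show next.
# Screen and button names are invented for illustration.
SCREENS = {
    "home":        {"Search": "search", "Tours": "tours"},
    "tours":       {"Back": "home"},
    "search":      {"By date": "date_search", "Back": "home"},
    "date_search": {"1500-1600": "results", "Back": "search"},
    "results":     {"Back": "date_search"},
}

def run_scenario(presses, start="home", goal="results"):
    """Replay a visitor's button presses; report the screen reached
    and whether the goal screen (the target record) was found."""
    screen = start
    for press in presses:
        options = SCREENS[screen]
        if press not in options:
            break  # wrong button: in a static mock-up, nothing happens
        screen = options[press]
    return screen, screen == goal

# A visitor who presses the right buttons in the right order succeeds;
# a visitor who wanders off the expected path does not reach the record.
final, found = run_scenario(["Search", "By date", "1500-1600"])
```

The appeal of this approach is that the "system" is nothing more than the designer's mock-ups plus a transition table, so search-screen usability can be tested long before any database work is done.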
Full working prototypes / Observation and interviews

A full working prototype generally uses the same software tools as the final version and is not thrown away at the end of testing.

Challenges: With a very complex system, much effort goes into production of this prototype. Subsequent changes can have major implications for the developer, who would have to redesign screens and HTML templates and recode links to the database. Changes can also affect content developers.

For the COMPASS website, our developer produced full graphical designs, converted them to HTML and linked the templates to the database. This full working system was released for evaluation, complete in most respects apart from the content. The COMPASS team set up standalone PCs in one of the Museum galleries. Visitors were asked to try out the system and were observed (at a distance) and then interviewed by COMPASS staff. We found the observation results particularly useful. Both the graphical interface and the text-only version were also evaluated by blind and partially sighted users using screen reader software at the Sensory Disability Research Unit at Hertfordshire University, where testers were interviewed and recorded on video as they used the system. Results were generally positive, but many useful observations were made. Many of the required changes were addressed, although this required screen redesigns and much effort from the developer. The complexity of producing a working prototype of a system that knits together a database, HTML templates, designs and evolving content should not be underestimated.

Despite some minor issues, the team was keen to release COMPASS on the web, partly to elicit feedback (via a questionnaire and email) which would inform ongoing content development and the design of the touchscreen interface.
Secondly, we were able to test our publication process from database to live site before facing the greater complexity of publishing from the database to two live sites (internet and intranet). Thirdly, difficulties with the user interface design process for the web system led us to try alternative methodologies for the touchscreen development, including the scenarios explained above. We are now planning the development of the Educational COMPASS website. What lessons have we learnt, and how will we change the methodology for the Education site?

Advice on the system development process

Although the COMPASS team wants to make some changes to the web interface, user reaction so far has been very positive. The team is very pleased with the touchscreen interface, and early testing with potential users was encouraging. However, the actual process of user interface design and development has been very challenging and quite painful for both client and developer. If we were starting again today, we would do things differently in places. Some of the following is therefore based on experience of trying it another way (in other words: do as I say, not as I did...!).
Again, observe testers trying the system and interview them afterwards. Final sign-off of the prototype screens should be conditional on the developer ensuring that the live system will reflect the results of usability testing. If sign-off has to be done by senior managers, they must be kept up to date with the test results.
Conclusion

The above process is not, of course, the only way to develop a system. I hope, however, that this paper will encourage museums to think about the process and how best to work with their software developer. Whether to test at all is no longer in question, but museums should consider what sort of prototype to develop and when to test it with prospective users. Too early, and the target audience may not understand what you can offer them. Too late, and it can cost a fortune to put your system right. The timing of the usability testing may be up for discussion, but one thing is not: go live without testing at all and you could end up with an ignominious flop on your hands.

References

Marshall, F. (2001). COMPASS at the British Museum: Tailoring content for particular audiences. Multimedia and Information Technology, March 2001 (forthcoming).

Mitchell, W.L. & Economou, D. (2000). Methodological Approaches to Developing Content for the Cultural Grid. mda Information 5(1), 5-10.

Nielsen, J. (1993). Usability Engineering. San Diego: Academic Press.

Nielsen, J. (2000). Designing Web Usability: The Practice of Simplicity. Indianapolis: New Riders Publishing.

Petrie, H. et al. (2000). An Evaluation of the British Museum COMPASS Website with Visually Impaired People. University of Hertfordshire, Sensory Disabilities Research Unit. (Unpublished)

Preece, J. et al. (1994). Human-Computer Interaction. Harlow, England: Addison-Wesley.

Schofield, J. & Flute, J. (1997). Use and Usability: A Guide to Designing Interactive Multimedia for the Public. Melbourne: Multimedia Victoria.

Credits

I should like to thank the following British Museum staff for their invaluable help and advice on this paper: David Jillings (COMPASS Project Manager), Mark Timson (Senior Designer, COMPASS), Matthew Cock (Creative Editor, COMPASS), Carolyn Howitt (Education Officer, COMPASS), Sue Gordon (Editorial Assistant, COMPASS) and Ray Devlin (Collections Documentation Management Section).
APPENDIX: COMPASS

COMPASS is a database-driven multimedia guide to a selection of around 5,000 of the Museum's objects, mainly those on display. The system is aimed at adults with no prior knowledge of the collections, as well as school children aged 8-11 and family groups. The full COMPASS system will be available in the Museum's recently restored Reading Room from Spring 2001, where visitors will be able to use the 50 wide-screen-format touchscreens linked to the Museum's internal high-speed network. The content includes high-quality images, 3d reconstructions, animations and specially-written text narratives. The interface uses DHTML templates accessing the Index+ database.

Fig. 1: An object record from the touchscreen version of COMPASS.
Fig. 2: General Index from the touchscreen version of COMPASS.
Fig. 3: Date search from the touchscreen version of COMPASS.
Fig. 4: Freetext search from the touchscreen version of COMPASS.

A web version, using a different design of user interface (HTML templates), was launched on the web in June 2000. The internet interface is deliberately low tech, ensuring that it can be used without plug-ins on low-end PCs with low-speed modems, as are still widely used in UK schools and homes. Accessibility for users with visual impairments was a key factor in the design of the graphical and text-only interfaces.

We have incorporated several features to simplify access for users with no prior knowledge of the collections. Online 'tours', for example, group together objects into familiar themes. Some have been written for children aged 8-11 and for family groups; one (on the Year of the Dragon) was written by pupils from a Chinese school in London. A full educational version of COMPASS is being developed for use on the web. As well as using pre-set groups of objects assembled into subject areas to support the UK National Curriculum, teachers are expected to make strong use of the 'personal folder' facility.
This will enable all users - both on the web and in the Museum - to store personal selections of objects in their own folder. Selections made on the web will be retrievable in the Museum and can be printed on purchase of a BM smartcard.

The COMPASS project has been wholly funded by a very generous donation from the Annenberg Foundation in the US. As well as funding hardware and software, this donation paid for all content creation. A team of specialists in text editing, imaging, educational multimedia, 3d reconstruction and animation has been appointed, and the COMPASS budget compensates curatorial departments, photographers etc. for their time. All content has been produced by or under the direction of over 90 museum curators and must be 'signed off' by the curators before it is published. Guidance on text production etc. was provided by the COMPASS team, who edit all material prior to sign-off. For more information on COMPASS content creation and audiences, see Marshall (2001).
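The personal folder facility described above (selections made on the web, retrievable later in the Museum against a smartcard) might work along the following lines. This is a hypothetical sketch under assumed names, not the actual COMPASS implementation: the class, methods and identifiers are invented for illustration.

```python
# Hypothetical sketch of a cross-site personal folder: selections are
# keyed to a visitor identifier (e.g. a smartcard number) so that a
# folder built on the web can be opened again on a Museum touchscreen.
class PersonalFolders:
    def __init__(self):
        self._folders = {}  # visitor id -> ordered list of object ids

    def add(self, visitor_id, object_id):
        """Record a selection, ignoring duplicates of the same object."""
        folder = self._folders.setdefault(visitor_id, [])
        if object_id not in folder:
            folder.append(object_id)

    def retrieve(self, visitor_id):
        """Return the visitor's selections, e.g. at an in-Museum kiosk."""
        return list(self._folders.get(visitor_id, []))

folders = PersonalFolders()
folders.add("card-0001", "OBJ123")   # selected on the web
folders.add("card-0001", "OBJ456")
selection = folders.retrieve("card-0001")   # later, in the Museum
```

In a real deployment the folder store would of course live in the shared database behind both the web and touchscreen interfaces, so that the same visitor identifier resolves to the same selections on either site.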