This is a short paper I wrote for the course "Evaluation of Technology-Based Learning": a usability evaluation of the online course Best Managers on the Net, completed on 23 March 2009.
Usability Evaluation of E-Learning Environment
of Best Managers on the Net ®
Faculty of Education, University of Wollongong, Australia
1 Introduction
Best Managers on the Net ® ("Best Managers" for short below) is a self-paced online course for potential or existing small business managers and owners who want to build or improve their business on the Internet. It is offered by Technology Business Services, Inc. and taught by Prof. Richard Dowell. The e-learning environment of the course is hosted at bestmanagers.net, where all the learning materials and support are provided. In addition, e-mail consultation and video-conferencing sessions are available to subscribers. The author subscribed to this course for a 30-day risk-free trial, which gave him full access to the e-learning environment. He then conducted a usability evaluation using two instruments: Cognitive Walkthrough and Heuristic Evaluation. The procedures and results of the evaluation are documented in this paper, followed by the author's reflections on the evaluation experience and a discussion of the two instruments.
2 Selection of Evaluation Instruments
The only human resource available for the usability evaluation was the author himself; no additional users were available (e.g., no multiple testers). This limitation narrowed the scope of instruments the author could choose from. In addition, the author's account in the e-learning environment was at the end-user level, meaning that he had no administrator privileges to change anything on the website (e.g., to release a questionnaire) or to access other users' information (e.g., activity logs, contact details). Based on these preconditions and the requirements of various usability evaluation instruments (Nielsen 1993; Nielsen and Mack 1994), the author selected Cognitive Walkthrough and Heuristic Evaluation as the two main instruments for the evaluation, because neither requires multiple actual users, testers, website developers or administrators to participate, nor does the evaluator need administrator privileges over the website.
3 Cognitive Walkthrough
Cognitive Walkthrough involves one or more evaluators inspecting a user interface by going through a set of tasks and evaluating its understandability and ease of learning. The user interface is often presented in the form of a paper mock-up or a working prototype, but it can also be a fully developed interface. The input to the walkthrough also includes the user profile and the task cases. The evaluators may include human factors engineers, software developers, or people from marketing, documentation, etc. As the walkthrough proceeds, the evaluator asks four questions: (1) Will the user try to achieve the right effect? (2) Will the user notice that the correct action is available? (3) Will the user associate the correct action with the effect to be achieved? (4) If the correct action is performed, will the user see that progress is being made toward the solution of the task? (Nielsen and Mack 1994).
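As a minimal sketch of how the four-question procedure above can be recorded (the task data and step names below are hypothetical illustrations, not findings from the original evaluation), each action step is checked against the four standard questions and any "no" answer becomes a finding:

```python
# Minimal sketch of a cognitive-walkthrough record: each action step is
# checked against the four standard questions; any "no" answer is a finding.
from dataclasses import dataclass

QUESTIONS = (
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect?",
    "If the correct action is performed, will the user see progress?",
)

@dataclass
class StepResult:
    action: str          # e.g. "enter username and password"
    answers: tuple       # one bool per question in QUESTIONS
    note: str = ""       # evaluator's comment on any failure

def walkthrough_failures(steps):
    """Return (action, failed question) pairs for every 'no' answer."""
    return [(s.action, q)
            for s in steps
            for q, ok in zip(QUESTIONS, s.answers)
            if not ok]

# Hypothetical data echoing the "log on" task evaluated later in this paper:
steps = [
    StepResult("locate the Login Form", (True, True, True, True)),
    StepResult("enter username and password", (True, True, True, False),
               "form silently changes to 'User Menu'; no success message"),
]
for action, question in walkthrough_failures(steps):
    print(f"{action}: {question}")
```

The point of such a structure is simply that every step is checked against every question, so no question is silently skipped during the walkthrough.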
The users of the online course were defined as educated adult learners from business fields, some of whom may have only basic computer skills while others may have intermediate e-business skills (e.g., building and operating an online store).
Five typical tasks that most users would perform while participating in the online course were identified: (1) logging on to the student account, (2) accessing the course material, (3) joining the online conference, (4) participating in group discussion, and (5) communicating with the instructor. The rest of this section documents the correct action sequence for each task and the corresponding evaluation results.
3.1 Log on Student Account
To log on to the student account, a user needs to (1) locate the login portal and (2) enter the username and password. In Best Managers, the "Login Form" is located on the left side below the main menu, which is apparent to users. Those with previous login experience on other online services will easily identify this as the first step before they can access the membership area. Once they enter the correct login details, however, they are not directed to a different page; they remain on the same page, where the "Login Form" changes to a "User Menu". As a result, users may not realize whether they have successfully logged in.
3.2 Get access to the course material
To access the course material, a user needs to (1) locate the link to the course material, (2) click the link, (3) identify the sequencing of the material, (4) open the presentations in order, and (5) play the instructor's voice recording for each slide. After logging in to Best Managers, users may not immediately find the course material because, instead of displaying a "course material" link or navigation section, the system directly presents the unit names of the course. "Internet Success" and "Business Success Tips" are actually two links to the course material (see Figure 1), but users may not realize this until they try clicking them. The sequencing of the material itself is easy to understand. The system helpfully suggests that users use the Firefox browser, because the presentations and voice recordings only display properly in Firefox. However, habitual IE users may not know how to respond to this warning, because the system does not provide a link to download Firefox.
Figure 1: User Menu
3.3 Join the online conference
To join the online conference, a user needs to (1) locate the link to the third-party online conferencing website, (2) click the link, and (3) wait for the conference organizer to approve the access. The link "Join Class Meeting Online" is apparent, but it uses the same document-like icon as the course material (see Figure 2), which makes it difficult for users to quickly identify that the link leads to the synchronous conference.
Figure 2: Links in Course Material
3.4 Participate in group discussion
To participate in group discussion, a user needs to (1) locate the link to the group discussion, (2) apply to join the Google Group (signing up for a Gmail account first if necessary), (3) wait for the instructor's approval, and (4) log in to the Google Group. Instead of displaying the group discussion link in the user menu, the system puts it at the bottom of the course contents page, so users may not notice the service until they scroll down. Those who have no previous experience with Google Groups, or who have no Gmail account, may feel confused or frustrated because no step-by-step instructions for signing up for the service are provided.
3.5 Communicate with the instructor
To communicate with the instructor, a user needs to (1) identify the available communication methods and (2) use them. Best Managers provides two main methods: e-mail and telephone. The contact details are located in the middle of the course material contents page and on the last slide of each session, so it is easy for users to find this information and use these two simple communication tools. However, when a user contacts the instructor by e-mail or telephone, s/he may be asked to provide further details for verification, because these two methods do not make use of the user's login status. Users may therefore have to introduce themselves every time they contact the instructor, yet the system does not advise this: first-time users may not know they should include their personal information (e.g., username) in their first e-mail to the instructor.
4 Heuristic Evaluation
Heuristic Evaluation is a technique for identifying usability issues. It is cheaper to conduct than formal usability testing and can be completed in a very short period at any stage of design. A heuristic is a guideline, general principle or rule of thumb that can guide a design decision or be used to critique a decision that has already been made. Heuristic Evaluation, developed by Jakob Nielsen and Rolf Molich, is a method for structuring the critique of a system using a set of relatively simple and general heuristics (Nielsen and Mack 1994; Gaffney 2000). For the evaluation of Best Managers, the author adopted and revised the heuristic evaluation instrument developed by Reeves and Hedberg (2008).
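As an illustrative sketch only (the example entries and the use of Nielsen's 0-4 severity scale are the editor's assumptions, not part of the original instrument), findings from a heuristic evaluation like the one below can be recorded against a fixed list of heuristics and ranked by severity:

```python
# Sketch of recording heuristic-evaluation findings with Nielsen-style
# severity ratings (0 = not a problem ... 4 = usability catastrophe).
# The entries below are hypothetical examples, not the paper's actual data.
HEURISTICS = {
    "Visibility of system status",
    "User control and freedom",
    "Recognition rather than recall",
    "Interactivity",
    "Message design",
    "Learning management",
}

findings = []  # list of (heuristic, severity, comment) tuples

def record(heuristic, severity, comment):
    """Validate and store one finding against the heuristic list."""
    if heuristic not in HEURISTICS:
        raise ValueError(f"unknown heuristic: {heuristic}")
    if not 0 <= severity <= 4:
        raise ValueError("severity must be between 0 and 4")
    findings.append((heuristic, severity, comment))

record("Learning management", 2,
       "no way for learners to monitor their own progress")
record("Visibility of system status", 3,
       "updates announced only by e-mail, not on the site")

# Present the most severe problems first in the report.
report = sorted(findings, key=lambda f: f[1], reverse=True)
for heuristic, severity, comment in report:
    print(f"[{severity}] {heuristic}: {comment}")
```

Constraining each finding to a named heuristic keeps the critique structured, which is the core idea of the method.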
4.1 Visibility of system status
Guideline: The e-learning program keeps the learner informed about what is happening, through appropriate feedback within reasonable time.
Comments: Best Managers generally does not inform users about updates through the website itself. Instead, the instructor sent e-mails to inform users of updates, usually asking them to click certain links to reach the content he wanted them to access. Without knowing the exact URL, however, users may not know how to navigate to that content step by step from the website.
4.2 User control and freedom
Guideline: The e-learning program allows the learner to recover from input mistakes and provides a clearly marked “emergency exit” to leave an unwanted state without having to go through an extended dialogue.
Comments: Best Managers requires little input from users. It allows users to move around the website freely, such as going back to review previous sections of the course materials, and users can leave the website or a session whenever they wish; there is no restriction or extended dialogue. However, because the system does not remember a user's learning progress (e.g., which session has been completed), it is not easy to return to the closest logical point in the course.
4.3 Recognition rather than recall
Guideline: The e-learning program makes objects, actions, and options visible so that the user does not have to remember information from one part of the program to another. Instructions for use of the program are visible or easily retrievable.
Comments: For each session, users of Best Managers can easily recognise how to go through the materials from beginning to end as expected. However, the system does not provide hints or directions when the user requests assistance. Although contextual assistance is not available, a section called "Ask Sarah", supported by Virtual Smart Agent, can answer users' questions and forwards any unanswered questions to the instructor. In practice, however, the robot could not answer many questions properly when the author asked it questions about the course materials.
4.4 Interactivity
Guideline: The e-learning program provides content-related interactions and tasks that support meaningful learning.
Comments: Best Managers does not have many interactions that support meaningful learning. The main learning activity is watching the instructor's PowerPoint presentations accompanied by his voice recordings. Users and the instructor do not interact except via e-mail and the online conference, and there appears to be little human-machine interaction throughout the course.
4.5 Message Design
Guideline: The e-learning program presents information in accord with sound principles of information-processing theory.
Comments: According to Cognitive Load Theory (Sweller 1988; Sweller 1989), learning is more effective when different sources of information are related but not redundant copies of each other. The presentations of Best Managers accord with this principle: the instructor's speech elaborates on what the user is watching, whether that is a slide or a screen recording of operations on a software or website interface. Such a combination of information can facilitate learning effectively.
4.6 Learning Management
Guideline: The e-learning program enables learners to monitor their progress through the material.
Comments: Best Managers does not enable users to monitor their progress through the material. Users may not have a clear picture of what they have completed and what remains to be done within any given session. Assessment is also not available.
Guideline: The e-learning program provides feedback that is contextual and relevant to the problem or task in which the learner is engaged.
Comments: In Best Managers, feedback is provided when users ask the instructor questions via e-mail, within the Google Group, or in the online conference. Because users' learning progress is neither monitored nor recorded, feedback may not be given at a specific time tailored to the content being studied, the problem being solved, or the task being completed.
5 Reflection and Discussion
5.1 Cognitive Walkthrough
Cognitive Walkthrough can be carried out by a single usability expert. It is independent of end users and helps designers take on a potential user's perspective. Using this instrument, the author could effectively identify problems in the e-learning environment directly related to the typical tasks that end users will perform. Such problems may be fixed at an early stage, preventing end users from encountering critical system errors. Applying the instrument also helped the author define users' goals and assumptions (i.e., in identifying the typical tasks). The method uses fewer resources (time, personnel and equipment) than actual user performance testing. However, Cognitive Walkthrough is more dependent on evaluators' biases and subjective comments. When applying this instrument, the author was sometimes not completely sure whether his subjective comments were correct, and he wondered whether the errors and problems found using the Cognitive Walkthrough might be less severe than those found in user testing. Previous experience of similar situations will always influence the evaluator's decisions, but that influence will not always lead to a correct conclusion; if the evaluator's memory is incorrect or incomplete, potential errors may be missed or, in some cases, new errors introduced. Furthermore, untrained evaluators have been found to produce poorer results (Polson, Lewis et al. 1992; Lewis and Wharton 1997; Barnum 2002).
5.2 Heuristic Evaluation
Heuristic Evaluation can identify problems very early in development. The heuristic statements or guidelines are based on experts' experience, and using a heuristic evaluation to identify problem areas can provide a focus for a later user performance evaluation. Heuristic evaluations can produce high-quality results in a limited time because the method involves no detailed scripting or time-consuming participant recruiting; the author could quickly comment on Best Managers from various aspects that previous usability experts have found important and critical. However, Desurvire (1994) points out that experts using heuristic evaluation found 80% of the minor annoyances that users might experience, but only 29% of the problems likely to cause task failure (the most severe problems). Expert evaluation does not produce actual "primary" user data: real users often have problems experts do not expect, and do not have problems where experts might expect them, so the method does not necessarily indicate which problems users will encounter most frequently (Desurvire 1994; Sears 1997). Furthermore, while using the instrument, the author found it difficult to determine which heuristic statements were really suitable for the case, and he worried that some heuristic statements essential for Best Managers might be missing.
By using the two instruments, the author found that Best Managers has a number of problems. However, both instruments involve a great deal of subjective judgment, and the results may differ from those of end-user testing; this is the common drawback of the two evaluation methods. As a result, further testing with actual users is needed for more accurate evaluation results.
6 References
Barnum, C. M. (2002). Usability testing and research. New York, Longman Publishing Group.
Desurvire, H. W. (1994). Faster cheaper! Are usability inspection methods as effective as empirical testing? Usability inspection methods. J. Nielsen and R. L. Mack. New York, John Wiley & Sons: 185.
Lewis, C. and C. Wharton (1997). Cognitive Walkthroughs. Handbook of Human-Computer Interaction. M. Helander. Amsterdam, Elsevier: 717-732.
Nielsen, J., Ed. (1993). Usability Engineering. London, Academic Press.
Nielsen, J. and R. Mack, Eds. (1994). Usability Inspection Methods. New York, John Wiley and Sons, Inc.
Polson, P. G., C. Lewis, et al. (1992). "Cognitive walkthroughs: a method for theory-based evaluation of user interfaces." International Journal of Man-Machine Studies 36: 741-773.
Reeves, T. C. and J. G. Hedberg (2008). Evaluating e-Learning: a user-friendly guide (in press).
Sears, A. L. (1997). "Heuristic walkthroughs: Finding problems without the noise." International Journal of Human-Computer Interaction 9(3): 213-234.
Sweller, J. (1988). "Cognitive load during problem solving: Effects on learning." Cognitive Science 12: 257-285.
Sweller, J. (1989). "Cognitive technology: Some procedures for facilitating learning and problem solving in mathematics and science." Journal of Educational Psychology 81: 457-466.