Heuristic Analysis of the Current Add-Rule Method


On the other side of quantitative analytical methods like GOMS lie many theories and guidelines designed for assessing interfaces qualitatively and ensuring usability. One such set of principles is the eight "golden rules" of interface design (Shneiderman & Plaisant, Designing the User Interface, 2005, p. 74).

The following section discusses each principle and how well the previously described OE task follows it:

  1. Strive for consistency - In most cases, Microsoft applications excel at maintaining consistent interface semantics across applications. As was mentioned briefly in step 3, though, the add-new-mail-rule dialogue box superimposes a high-level semantic convention of the Internet, text linking, onto the low-level language of interface widgets (buttons and boxes).
    Impact assessment: Albeit minor, this swapping of information-space conventions is a mode change. Mode changes interfere with the user's construction of a stable mental model, thus increasing time-on-task.
  2. Cater to universal usability - Microsoft applications often include features designed to provide access to users with diverse needs (e.g., Sticky Keys can assist some users with difficulties related to fine motor skills, and nationally localized versions are available for most products). However, one of the most common faults of their applications is that the difference between novice and expert users is not sufficiently allowed for. So many features are added that new users can take a disproportionately long time to complete a task compared to knowledgeable ones.
    Impact assessment: As stated in step 4a, novice users are at an especially great disadvantage if they have not somehow made note of the exact conditions (spelling, grammar, addresses, etc.) under which a new rule is to apply. It is nearly impossible to view any model email beneath the task dialogue boxes that will have opened at each stage of the task. Closing each window, finding and noting the criteria to use, and then returning to message-rule creation is extremely time consuming. As a user to whom this has happened many times, I can vouch that it can as much as double time-on-task.
  3. Offer informative feedback - The current task process gives the user little feedback. What feedback is given (the rule criteria and action replace the links 'rule' and 'do the following' in the New Rule screen) is barely noticeable.
    Impact assessment: Lack of feedback may slow the user's formation of an accurate and useful mental model of the create-rule task. This could result in small delays for all but the most advanced users.
  4. Design dialogs to yield closure - While the main Message Rule window does allow one to view the rules, it may not be evident to all users where to look, or even what the message displayed there means.
    Impact assessment: There are probably few repercussions for most users. Because of the information clutter in the main rule-making dialogue box, though, and the lack of any real indication of success, novices might feel somewhat lost.
  5. Prevent errors - OE does return error messages if the required information is not filled out correctly. It does not, however, provide any sort of rule-conflict notification, which could lead to unexpected results; a minimal sketch of such a check appears after this list.
    Impact assessment: The error messages provided when incomplete information is entered are good, especially the highlighting of uncompleted steps, and are very beneficial to the overall task. Rule-overlap conflicts, though, have the potential to be damaging enough to offset this benefit.
  6. Permit easy reversal of errors - At any stage of the task it is possible to go back and edit any part of a rule.
    Impact assessment: This principle is difficult to judge. When a person catches an error before a rule is submitted, easy reversal is beneficial, saving time and preventing future frustration. If a faulty rule makes it all the way to completion without being noticed, though, it is bound to have some negative effect; the only difference is how bad that effect is. It would be better to prevent the error in the first place, but if an error must be made, reversal is beneficial in the end.
  7. Support internal locus of control - This task is very strictly defined and moderated. The OE task interface essentially takes over and does not let the application be used for anything else until rule-making is successfully completed. While this wizard-like interface might be thought advantageous to certain beginning users, as was shown in step 4a, it is actually so strict that it becomes an impediment.
    Impact assessment: This loss of control can be just as frustrating for advanced users as for novices. Because this broken principle negatively affects the interface three times over, marring compliance with two other "golden rules," it is fair to say that breaking it effectively ruins the whole task interface.
  8. Reduce short-term memory load - As mentioned under principle 2, if the user forgets the exact words to be filtered, she is out of luck and must either abandon her goal or start over from the beginning. This is far too much reliance on human memory to be useful.
    Impact assessment: The negative impression caused by this kind of reliance on human memory cannot be overstated.
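
To make the rule-overlap risk noted under principle 5 concrete, here is a minimal, hypothetical sketch of how a rule editor could warn about two rules that can match the same message yet prescribe different actions. The MailRule model and find_conflicts helper are invented for illustration only and do not reflect how Outlook Express actually stores its rules.

    from dataclasses import dataclass

    # Hypothetical model of a mail rule: words to look for in the subject
    # line plus the action to take when any of them appears.
    @dataclass(frozen=True)
    class MailRule:
        name: str
        subject_words: frozenset  # rule fires if any of these words appear
        action: str               # e.g. "move:Receipts" or "delete"

    def find_conflicts(rules):
        """Return pairs of rules that can fire on the same message but
        prescribe different actions."""
        conflicts = []
        for i, first in enumerate(rules):
            for second in rules[i + 1:]:
                shared = first.subject_words & second.subject_words
                if shared and first.action != second.action:
                    conflicts.append((first.name, second.name, sorted(shared)))
        return conflicts

    rules = [
        MailRule("Receipts", frozenset({"order", "invoice"}), "move:Receipts"),
        MailRule("Junk", frozenset({"invoice", "winner"}), "delete"),
    ]

    for a, b, words in find_conflicts(rules):
        print(f"Warning: rules '{a}' and '{b}' both match {words} but act differently")

A warning of this kind, shown when a new rule is saved, would address the conflict risk without changing the rest of the task sequence.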

From all of these observations (and more that will be discussed later), it can be seen that a number of improvements can be made to this interface and its task-action sequence. The next task sequence shows an alternative interface designed with these improvements in mind.

 
