
December 19, 2016

Evaluating User Experience with Heuristics


“Designers shooting for usable is like a chef shooting for edible.” – Aaron Walter, VP of Design Education at InVision

Collaboration amongst designers and non-designers alike is a vital component of a team’s success. Amidst a plethora of UX tools, we’ve found that a heuristic evaluation is the golden nugget to (inexpensively) help your designs stay true to user goals and encourage team collaboration. Trent Hone noted in a recent post that a heuristic is “a guide that aids decision making” and “explicitly defining heuristics can accelerate the decision-making process in Agile teams.”

In a heuristic evaluation, a team evaluates an application or design against a group of decision-making guides. Here we’ll discuss the four steps our design team uses to perform a heuristic evaluation:

  1. Context Discovery
  2. Defining Content
  3. Executing the Evaluation
  4. Feedback Implementation

Context Discovery

Identifying user goals lays the groundwork for defining our heuristics. As user advocates, it's essential that we create a list of terms aligned with what's most valuable to our users and with usability best practices.

Define the context of the system by forming relationships with your users through interviews, usability testing, satisfaction surveys, and the like. It's impractical and irresponsible for a User Experience practitioner to make assumptions about what matters most to users before speaking with them. Mark Twain said it best: “Supposing is good, but finding out is better.” Immerse yourself in the user's experience before agreeing on your heuristic guides.

Defining Content

Based on your understanding of user goals and usability best practices, you’ll be able to brainstorm some terms that align with these values. On our team, we’ve identified eight heuristics:

  1. Findable: Can the user find what they’re looking for?
  2. Accessible: Can the target audience use it?
  3. Clear: Are visual cues obvious? Is it easy to understand? Are distractions removed so the user has a clear path to their goal?
  4. Useful: Is it capable of producing the desired or intended result?
  5. Trustworthy: Can your users trust it? Do inconsistencies raise credibility concerns?
  6. Forgiving: How forgiving is the process? Are there warnings before critical actions or potential errors? How much control do your users have over their experience?
  7. Learnable: Can users quickly grasp it?
  8. Delightful: How are user expectations not just met, but exceeded? Are aesthetics polished in a way that brings happiness to the user? Or do these “delightful” additions detract from usability?

These heuristics, complemented by in-depth user research, help ensure our designs speak the same language as our users.
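For teams that like to keep design artifacts alongside code, here's a minimal, hypothetical sketch (in Python) of how these heuristics and their one-line definitions could be captured as plain data to seed an evaluation checklist. The structure and wording are illustrative assumptions, not a prescribed tool.

  # Illustrative only: the eight heuristics as plain data, ready to seed
  # a checklist or survey template. Definitions paraphrase the list above.
  HEURISTICS = {
      "Findable": "Can the user find what they're looking for?",
      "Accessible": "Can the target audience use it?",
      "Clear": "Are visual cues obvious, with distractions removed?",
      "Useful": "Is it capable of producing the desired or intended result?",
      "Trustworthy": "Can users trust it, or do inconsistencies raise credibility concerns?",
      "Forgiving": "Are there warnings before critical actions or potential errors?",
      "Learnable": "Can users quickly grasp it?",
      "Delightful": "Are expectations exceeded without detracting from usability?",
  }

  def checklist():
      """Print a blank checklist an evaluator can walk through."""
      for name, definition in HEURISTICS.items():
          print(f"[ ] {name}: {definition}")

  if __name__ == "__main__":
      checklist()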

Executing the Evaluation

The evaluation measures how well your design, or an existing application, holds up against the defined heuristics. The process varies from one evaluation to the next: we've evaluated an existing site, incorporated heuristics into a satisfaction survey, and plan to include all Excella UXers in an evaluation in the near future.

At this point, you’ve had a good deal of discussion around the heuristics and have an actual definition tied to each term. Repurpose these definitions into questions to be used in the evaluation.

Here’s an example of what we use for the useful heuristic:

  1. Is it usable? Does it work?
  2. Is the content presented in a way that helps a user finish the task?
  3. Is the user flow efficient? Is the purpose of each element obvious? Are there a few navigational options that lead where users may want to go next? Are they clearly labeled?
  4. Are users able to complete the tasks that they set out to without frustration or abandonment?

How you measure a heuristic evaluation is up to your discretion. We use a Likert scale for each question and encourage evaluators to provide comments and screenshots. Feel free to keep the questions open-ended, but make sure you have a system in place that surfaces the most pressing usability concerns (one way to sketch such a system follows the list below). You'll likely see that each finding from the evaluation falls into one of four buckets:

  1. Non-Issue: Works great!
  2. Minimal: Non-impacting issues that would be nice to fix.
  3. Moderate: Affects the reputation of the experience.
  4. Critical: Is an impediment to users completing their desired task.
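To make the scoring concrete, here's a hypothetical sketch of how a single evaluator's response could be recorded: a Likert rating, an optional comment, and one of the four severity buckets per question. The field names and the 1-to-5 scale are assumptions for illustration, not a required format.

  from dataclasses import dataclass

  SEVERITIES = ("Non-Issue", "Minimal", "Moderate", "Critical")

  @dataclass
  class Finding:
      heuristic: str     # e.g. "Useful"
      question: str      # the definition repurposed as a question
      rating: int        # Likert scale: 1 (poor) to 5 (excellent)
      severity: str      # one of SEVERITIES
      comment: str = ""  # optional note or screenshot reference

      def __post_init__(self):
          if not 1 <= self.rating <= 5:
              raise ValueError("rating must be on the 1-5 Likert scale")
          if self.severity not in SEVERITIES:
              raise ValueError(f"severity must be one of {SEVERITIES}")

  # Example entries for the 'useful' questions listed above.
  findings = [
      Finding("Useful", "Is the user flow efficient?", rating=2,
              severity="Critical", comment="Checkout requires re-entering the address."),
      Finding("Useful", "Does the content help the user finish the task?",
              rating=4, severity="Minimal"),
  ]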

Feedback Implementation

Discuss the results of the heuristic evaluation with your team. The evaluation is a conversation starter, intended to stir the pot just enough to reenergize the team's focus on the end user. Work with your team, designers and non-designers alike, to determine the most critical issues to tackle.
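If it helps structure that conversation, a small helper like the one below (building on the hypothetical Finding records sketched earlier) can tally findings by severity and put Critical and Moderate items at the top of the discussion list. Again, this is an illustrative sketch, not a required workflow.

  from collections import Counter

  def summarize(findings):
      """Return (tally of findings per severity, findings ordered for discussion)."""
      tally = Counter(f.severity for f in findings)
      to_discuss = sorted(
          (f for f in findings if f.severity in ("Critical", "Moderate")),
          # Critical before Moderate; within a bucket, lowest-rated first.
          key=lambda f: (f.severity != "Critical", f.rating),
      )
      return tally, to_discuss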

As an aside, heuristic evaluations shouldn’t be done in lieu of user engagement efforts (usability testing, user interviews, etc.). Rather, the results of the evaluation will allow the team to have an educated hypothesis for what will be received well by the users. Always, always, always validate this hypothesis.

Strive for excellence when it comes to your heuristics. Aaron Walter, VP of Design Education at InVision, said it best, “Designers shooting for usable is like a chef shooting for edible.” The difference between a usable and delightful application could be the very thing that puts you ahead of your competitors.
