“Designers shooting for usable is like a chef shooting for edible.” – Aaron Walter, VP of Design Education at InVision
Collaboration amongst designers and non-designers alike is a vital component of a team’s success. Amidst a plethora of UX tools, we’ve found that a heuristic evaluation is a golden nugget that (inexpensively) helps your designs stay true to user goals and encourages team collaboration. Trent Hone noted in a recent post that a heuristic is “a guide that aids decision making” and that “explicitly defining heuristics can accelerate the decision-making process in Agile teams.”
In a heuristic evaluation, a team evaluates an application or design against a group of decision-making guides. Here we’ll discuss the four steps our design team uses to perform a heuristic evaluation:
1. Identify user goals
2. Define your heuristics
3. Conduct the evaluation
4. Discuss the results
Identifying user goals lays the groundwork for defining our heuristics. As user advocates, it’s essential that we create a list of terms that align with what’s most valuable to our users and with usability best practices.
Define the context of the system by forming relationships with your users through interviews, testing, satisfaction surveys, etc. It’s impractical and irresponsible for a User Experience practitioner to make assumptions about what’s most important to users before speaking with them. Mark Twain said it best: “Supposing is good, but finding out is better.” Immerse yourself in the user’s experience before agreeing on your heuristic guides.
Based on your understanding of user goals and usability best practices, you’ll be able to brainstorm terms that align with these values. On our team, we’ve identified eight such heuristics.
These heuristics, complemented by in-depth user research tactics, force our designs to speak the same language as our users.
The evaluation is a way to measure the effectiveness of your design, or of a current application, against the defined heuristics. The process varies from evaluation to evaluation: we’ve conducted evaluations on an existing site, incorporated heuristics into a satisfaction survey, and plan to include all Excella UXers in an evaluation in the near future.
At this point, you’ve had a good deal of discussion around the heuristics and have an actual definition tied to each term. Repurpose these definitions into questions to be used in the evaluation.
Take the useful heuristic as an example: its definition becomes questions that probe whether the design actually helps users accomplish their goals.
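To make that concrete, here’s a minimal sketch of how a heuristic’s definition can be repurposed into evaluation questions; the definition and question wording below are illustrative assumptions, not a definitive template:

```python
# Illustrative sketch only: the definition and question wording below are
# hypothetical stand-ins, not an actual team's evaluation instrument.
evaluation_template = {
    "useful": {
        "definition": "The design helps users accomplish their stated goals.",
        "questions": [
            "Does each screen support a clearly identified user goal?",
            "Can users complete their primary tasks without workarounds?",
            "Does any feature exist that serves no identified user goal?",
        ],
    },
    # ...the remaining heuristics follow the same shape.
}
```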
Measuring a heuristic evaluation is up to your discretion. We use a Likert scale for each question and encourage evaluators to provide comments and screenshots. Feel free to keep the questions open-ended, but make sure you have a system in place that surfaces the most pressing usability concerns. You’ll likely see that each finding of the evaluation can be grouped into one of four buckets.
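As a rough sketch of the scoring mechanics, assuming a 1-to-5 Likert scale and an arbitrary flagging threshold (both assumptions, not prescriptions), aggregating responses might look like this:

```python
from statistics import mean

# Hypothetical responses: evaluator -> heuristic -> one 1-5 Likert rating
# per question. The data, scale, and threshold are illustrative assumptions.
responses = {
    "evaluator_a": {"useful": [4, 2, 3], "learnable": [5, 4, 4]},
    "evaluator_b": {"useful": [3, 2, 2], "learnable": [4, 5, 3]},
}

FLAG_THRESHOLD = 3.0  # arbitrary cut-off for surfacing pressing concerns

def score_heuristics(responses):
    """Average each heuristic's ratings across all evaluators and questions."""
    ratings_by_heuristic = {}
    for per_evaluator in responses.values():
        for heuristic, ratings in per_evaluator.items():
            ratings_by_heuristic.setdefault(heuristic, []).extend(ratings)
    return {h: mean(r) for h, r in ratings_by_heuristic.items()}

scores = score_heuristics(responses)
pressing = [h for h, s in scores.items() if s < FLAG_THRESHOLD]
print(scores)    # useful averages ~2.67, learnable ~4.17
print(pressing)  # heuristics most in need of discussion: ['useful']
```

From there, the low-scoring heuristics, along with the evaluators’ comments and screenshots, become the natural agenda for the team conversation that follows.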
Discuss the results of the heuristic evaluation with your team. The evaluation is a conversation starter, intended to stir the pot just enough to reenergize the focus on the end user. Work with your team, designers and non-designers alike, to determine the most critical issues the team should try to tackle.
As an aside, heuristic evaluations shouldn’t be done in lieu of user engagement efforts (usability testing, user interviews, etc.). Rather, the results of the evaluation will allow the team to have an educated hypothesis for what will be received well by the users. Always, always, always validate this hypothesis.
Strive for excellence when it comes to your heuristics. Aaron Walter, VP of Design Education at InVision, said it best, “Designers shooting for usable is like a chef shooting for edible.” The difference between a usable and delightful application could be the very thing that puts you ahead of your competitors.