5 Tips for Writing Better Usability Tasks

Have you ever watched a usability session without much “using” going on? While we encourage all team members to get involved in usability testing, we have also learned the importance of experience when it comes to crafting the language of usability tasks and scenarios in a way that prompts users to take action instead of verbally answering a question.

The task or scenario set up by the moderator is one of the most important elements in ensuring accurate, meaningful results, but it must be crafted with careful consideration so that research questions are answered and users’ behavior remains natural.

But how do you craft a good usability task? Are there really any clear guidelines?

This article will share 5 tips for developing effective usability tasks and scenarios, including illustrative examples.

#1 Establish a Clear Goal for Each Task

A successful usability test provides actionable insights that support a current goal or objective; however, without in-depth background on that goal or objective, it is almost impossible to ensure the right data is captured. Depending on how the task is set up, users can complete the same action but provide insights into completely different areas. Let us share an example.

Let’s imagine we are conducting a study for Chase Quick Pay, a platform that enables users to transfer money to friends or businesses using basic information, like an email address or phone number. Below are two different ways to write a task related to a payment transfer feature – what do you think is the impact of using one versus the other?

Image of Index Cards with Task Prompts

These tasks both focus on the same payment feature but will yield different insights: one focuses on navigation, the other on awareness and expectations.

V1 is most appropriate when the goal is to gain insights into navigation. For this task, we inform the user that the function exists within the website but leave out how to get there. Based on the user’s behavior, we can test the effectiveness of the navigation elements and the payment flow within the site as the user attempts to make the payment.

In V2 we do not reveal that this payment function exists within the website, making it the better option for gaining a deeper understanding of users’ awareness and expectations. If the user does not look for a way to pay Anna using her phone number, we know awareness is low, or at the very least not top of mind. Additionally, we have the potential to gain insights into users’ expectations for finding new features through their engagement with the website.

#2 Provide Context to Make it Realistic

While it is important to keep tasks brief so that users can comprehend them, it is also important to ensure they are relevant. Asking a participant to complete a task they wouldn’t normally engage in will not provide natural insights into their behavior. That is why it is best to provide a bit of context when creating tasks.

Give a Little Background Information 

For example, if testing the flight search function on the Expedia website, instead of asking users to “Please find two round-trip tickets to New York”, prompt users by saying: “Let’s imagine you are planning a trip for you and a friend to visit New York this fall. Use this website to find round-trip tickets for you and your friend to visit New York in November.” Here users are thinking about their needs and expectations as a traveler, rather than simply finding a product to complete a task.

Have the User Create the Task 

In some instances, it may be necessary to prime participants to come up with realistic scenarios. Imagine conducting research on the Pacer Health mobile app, which doubles as a pedometer and weight loss coach. If the Pacer Health team were interested in learning more about use cases and desired functionality for the app, we would recommend that participants create their own tasks. To do this, the moderator would ask the participant to describe when and how they currently use the mobile app at the start of the session. The scenarios users mention might be activities such as “To see how many steps I’ve taken today.” The activities participants mention are then used as task prompts (e.g., “Show me how you would use this app to check how many steps you have taken so far today”) for the duration of the session. By creating tasks based on participants’ personal use cases, the exploration is more likely to mimic natural usage.

#3 Ask the User to Show Not Tell

The advice for writers to “show, not tell” holds true for usability testing as well. The ideal usability task prompts users to show their behavior rather than recite a story of how they would behave. While this also depends on the user’s personality, the researcher can nudge the user to “do” rather than “say.”

For example, if you are conducting research on the Fry’s Electronics website and ask the participant, “What would you do if you needed to purchase a new laptop under $1,000 using this website?”, the response will likely be a verbal explanation rather than an interaction with the website.

When tasks use language such as “What would you do” or “Tell me how you would,” the moderator is encouraging the participant to answer verbally. If the task is instead reworded to “Show me how you would use this website to purchase a laptop under $1,000,” the moderator is encouraging a behavior. To capture the most accurate results, the task should prompt users to interact with the website or app rather than explain how they would use it.

#4 Remove Any Navigation Clues

One of the benefits of usability testing is seeing first-hand how people navigate through a site; it is not just a click-path diagram, but rather a real-time recording of their actions, which allows researchers, designers, and product teams to observe any hesitations or confusion throughout the user’s experience. However, this benefit is lost if the task setup reveals clues about the navigation or site experience.

When usability tasks provide clues, such as vocabulary that mirrors the navigation, the effectiveness of the usability study goes down since the user’s behavior has been influenced. While there is no way to completely remove influence during a usability test, the level of influence can be managed.

Avoid Using the Same Verbiage as Seen on the Website/App

In testing, we’ve seen that with the same end goal in mind, such as paying a bill or learning about a program, the success rate and level of confusion differ depending on the language used in the task scenario.

Image of Verizon Wireless Homepage

Two tasks for locating the Verizon Trade In Program; Example 1 is leading and uses the exact language of the call-to-action, while Example 2 does not provide any clues about where to find this information.

To demonstrate, imagine we are conducting research on the Verizon Wireless website. Above are two examples of tasks related to learning more about Verizon’s “Trade In Devices” program; one directs users to a clear call-to-action by mimicking the language of the website, while the other encourages exploration and natural behavior.

To get a realistic understanding of task success, the setup should remove any biases by using language and terminology that does not mimic the navigation or guide the user into a certain area of the site.

#5 Pay Attention to the Order of Tasks

Clues do not just appear in the language used for the task; the flow of the session itself can also hint at different areas or functionality of a digital property. This is especially important to keep in mind when discoverability and/or insights into users’ awareness of content or functionality are of interest.

When deciding on the sequence of tasks, we pay special attention to which areas of the website or app will be revealed by each task, to ensure we are able to answer our research questions.

Consider Beginning with Natural Exploration

Imagine we are testing the recently updated Spotify mobile app; the updates range from changes in the navigation structure to completely new features. Our goal is to understand how customers utilize the app in general as well as the findability and awareness of the recently added features. In this case, as in any study where we are interested in natural exploration, we place the open exploration at the beginning of the session.

This ensures that we can elicit feedback on how users utilize the app without revealing the new features to them. We do this because it is not uncommon for participants to become aware of new content or features during a usability session. We often have participants say to us “I didn’t know this existed” as they discover new features or content.

To gain insights into natural exploration and usage of the website, we ensure that any tasks related to specific content and/or features are prompted later in the session so as not to influence the users’ experience. This also ensures that users rely on recognition rather than recall, providing additional insights into the overall design and information architecture.

Summary: 5 Tips for Improving Usability Tasks

There are a variety of things to keep in mind when creating usability tasks; however, we have found that focusing on these 5 areas increases the quality of the results, both in the depth of information received and in its alignment with the research objectives. Below is a recap of the five essential practices for creating effective usability tasks.

  • Establish a clear goal with all parties involved in the research project
  • Provide context to make it realistic and cover the common use cases
  • Ask users to show, not tell, to gain insights into actual behavior
  • Remove any navigation clues in the task setup that would guide or encourage users to complete the task in a specific way
  • Pay attention to the order of tasks so that insights into users’ natural exploration remain trustworthy