How-to guide for intranet task testing

Over the past several years, task testing has become one of the most popular tools for building an intuitive intranet information architecture (sometimes referred to more simply as a sitemap). Task testing can be implemented quickly and easily, and typically follows card sorting in the process of designing a user-centered intranet navigation.

Task testing fills a critical gap that card sorting was not designed to cover: it evaluates the findability of content. If you are committed to building a user-friendly navigation for your employees, intranet task testing plays an important role.

Screenshot: example of basic analysis of an individual task

What is task testing & when do I use it?

Like intranet card sorting, intranet task testing doubles as a user engagement opportunity and can help build a foundation for sustained intranet adoption.

Task testing is an evaluative exercise used to assess how well a site navigation matches users’ perspectives. Task testing helps answer the question, “Can users navigate the new sitemap to easily find the information they need?” Intranet teams typically employ task testing after first creating a proposed sitemap based on completing a content audit and card sorting, as well as other activities to gather information about content and user needs.

A content audit helps to determine the scope of the intranet, and card sorting is an early-stage design tool used to draft a navigation. Task testing is used to evaluate and improve an information architecture rather than create it.

Task testing is implemented using a simplified mockup of a site’s draft navigation. An intranet team comes up with specific tasks related to information on the intranet and participants then identify where they would expect to find the information needed to complete those tasks.

An example task given to participants might look like this: “You just returned from a business trip and you need to document your expenses and submit them for reimbursement. Where would you expect to find the necessary information?”

A participant would then click through the navigation mockup looking for the correct content to complete that task. The point of the test is to gauge how well the navigation’s labelling and organization help participants complete important tasks.

It’s important to note that task testing is an evaluation of the site navigation, not the participant. The participant is never wrong, and low success rates in testing convey weaknesses in the navigation’s design.

Task testing is also commonly known as “tree testing,” which describes how the activity examines the hierarchical navigation “tree” typical of most information architecture schemes.

Why is task testing important?

Like card sorting, task testing is about building an intranet navigation structure using terms and groupings that match users’ perspectives.

Unlike card sorting, though, task testing can validate an information architecture and provide feedback on how to improve it. Task testing lets you know whether your site navigation makes sense to users in real-world contexts, rather than just from an abstract perspective of organizing information.

Task testing also evaluates whether participants can browse to find information, rather than relying on search. If your intranet IA is built in a way that can accommodate successful browsing, you are on your way to a sustainable sitemap that can easily scale over time.

Task testing also focuses an intranet team on tasks rather than just information. As web usability expert Gerry McGovern says, information is a task. This means that for whatever content you put on your intranet, you must consider why it is there, what tasks employees are trying to complete when using that information, and how to facilitate easier, faster task completion.

A task-based approach to information architecture can hone an intranet team’s focus, connect them to real-world uses, and improve the quality of information and design. Task testing helps get you there.

For intranets: Added benefit of user engagement

Again like card sorting (and other intranet planning activities that involve your users), task testing is an important opportunity for user engagement. This characteristic differentiates it from task testing run on public websites, where the participants may never interact with you again.

Every moment in which you engage your intranet users in an activity meaningful to the intranet’s development can build a positive emotional connection to the new intranet. An employee who participates in task testing can feel she had a hand in building this new tool. She can feel a sense of involvement, and participants collectively can feel a shared sense of purpose.

As a rule of thumb, the more you engage your users in building a new intranet, the easier your road to adoption. The engagement builds excitement and shared ownership, and the usability improvements make the new intranet easy to use.

Online & in-person task testing

There are a number of ways to conduct task testing. These include hard copy index cards, simple HTML mockups, live intranet software with a draft navigation already built, or online task testing software.

The easiest and most streamlined approach is to use online task testing software. We rely on Optimal Workshop’s online TreeJack tool, which is easy to use and provides oodles of helpful data and analysis. Dave O’Brien of Boxes and Arrows wrote an interesting history of the evolution of task testing, which tells the story of task testing’s beginnings and its development into an online exercise.

Online task testing lets you engage a geographically dispersed audience much more easily than hard copy, in-person task testing. It also offers greater flexibility and built-in analysis.

Some usability experts prefer in-person task testing for the richer contextual data it can provide. While observing testers live, you are able to ask questions, have the testers narrate their thought processes, and ask for unstructured feedback after the testing. Stephen Byrne of Step Two Designs wrote a helpful guide to tree testing for effective navigation which provides details on index card-based task testing.

The beauty of online task testing is that you can do it in person if you are so inclined. It’s as easy as reserving a small conference room, setting up a laptop, and inviting individual users in to complete the online exercise under your observation.

With ThoughtFarmer, it’s so easy to build and change a site navigation that you can actually set up your draft sitemap in the live software on a development instance. We’ve seen clients do this and conduct in-person task testing using a ThoughtFarmer test site on their servers.

Most ThoughtFarmer clients have short timelines and limited resources, so we often implement only online task testing. Because of this experience and its benefits over hard-copy task testing, we only cover online task testing in this how-to guide.

Step-by-step guide to intranet task testing

Now that we’ve covered the basics, it’s time to provide the details of how to implement successful intranet task testing.

STEP 1: Identify participants

This step could theoretically come later in the process. However, teams that leave it to the last minute may end up with whatever participants they can round up in a hurry, rather than a broad spectrum of employees who accurately represent your user base.

Task testing aims to capture employees’ perspectives on how easy a site navigation is to use, but not all employees have the exact same perspective. The way people see and use a specific piece of content may vary based on their job levels, departments, locations, etc. It is, therefore, important to involve employees who span the spectrum on a number of different criteria.

Employee characteristics to consider:

  • Hierarchy
  • Location
  • Department
  • Role

Try to involve employees from throughout the hierarchy, from admins up to executives, as well as people from every department and location. Not only does this help you capture a full array of perspectives, but it also supports your user engagement efforts. Imagine having a couple of employees in every department and location who participated in this fascinating exercise and who can talk up the new intranet project to colleagues.
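
If you keep a basic list of employees with attributes like department and location, a few lines of code can help you pull a spread of participants rather than a convenience sample. The sketch below is only a minimal illustration, assuming a hypothetical in-memory employee list; the names and fields are placeholders, not a prescribed format.

```python
import random
from collections import defaultdict

# Hypothetical employee records; in practice these might come from an HR export.
employees = [
    {"name": "A. Lee", "department": "Finance", "location": "Toronto"},
    {"name": "B. Singh", "department": "Finance", "location": "Vancouver"},
    {"name": "C. Diaz", "department": "Sales", "location": "Toronto"},
    {"name": "D. Okoro", "department": "Engineering", "location": "Remote"},
    {"name": "E. Park", "department": "Engineering", "location": "Vancouver"},
]

def stratified_sample(people, key, per_group=2, seed=1):
    """Pick up to `per_group` people from each value of `key` (e.g. department)."""
    random.seed(seed)
    groups = defaultdict(list)
    for person in people:
        groups[person[key]].append(person)
    sample = []
    for members in groups.values():
        sample.extend(random.sample(members, min(per_group, len(members))))
    return sample

for p in stratified_sample(employees, key="department"):
    print(p["name"], "-", p["department"], "-", p["location"])
```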

How many participants do you need?

We recommend a minimum of 20 user testers for task testing. More user testers won’t hurt the results and will broaden your reach for user engagement. Fewer users can result in a lack of clear trends as well as inconclusive results.

STEP 2: Finalize your draft navigation, note assumptions

The whole point of task testing is to validate a draft intranet site navigation. That draft navigation is typically produced through card sorting. (See our how-to guide for intranet card sorting for complete instructions.)

In order to properly conduct task testing, you need to clearly document the assumptions that resulted from card sorting. You’ll use those assumptions to build the draft site navigation.

For example, what clear content groupings did you observe in the card sorting results? How did you translate those observations into a draft site navigation?

Keep in mind that the point of a draft navigation is not to try to get the IA completely right, but rather to implement a clear set of assumptions to test. So, no matter how you’ve drafted your site navigation, document your assumptions and finalize an official first draft of the navigation that you can use for task testing.

Example of a simple draft sitemap, created using a spreadsheet

In order to conduct task testing you’ll need a navigation at least two levels deep, with a third level in at least some areas of the sitemap.
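
If you draft the sitemap in a spreadsheet, it can also help to capture it as a simple nested structure so you can sanity-check its depth before building it in a testing tool. The sketch below is only an illustration, assuming a hypothetical sitemap; some section names mirror the example later in this guide, while others (such as “Travel & Expenses”) are placeholders.

```python
# A hypothetical draft sitemap as nested dicts: each key is a navigation label,
# each value holds that section's children (an empty dict means no children yet).
draft_sitemap = {
    "Home": {
        "About Acme": {},
        "HR": {
            "Performance Management": {
                "Annual Performance Reviews": {},
            },
        },
        "Admin Resources": {
            "Travel & Expenses": {},  # placeholder section for illustration
        },
        "Water Cooler": {},
    },
}

def depth(tree):
    """Return how many levels deep a navigation tree goes."""
    if not tree:
        return 0
    return 1 + max(depth(children) for children in tree.values())

# Task testing needs at least two levels below the home page,
# with a third level in at least some areas of the sitemap.
print("Levels below Home:", depth(draft_sitemap["Home"]))  # prints 3 for this draft
```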

STEP 3: Identify navigation issues to test

In addition to the resulting assumptions about how content should be grouped, it’s also critical to clearly document the uncertainties and questions that arise from card sorting. Together these form the basis for the tasks you select for the task testing exercise.

Task testing is directly linked to card sorting because it is used to test and validate the assumptions and questions that resulted from the card sorting exercise.

For example: The results of card sorting may suggest that top level navigation sections such as “About the company” and “Administrative tools” would make sense to users. But card sorting may have also shown that “Branding guidelines” didn’t fit conclusively in one of these groupings. So with task testing you would aim to test which location worked better. You could do this by placing “Branding guidelines” under “Administrative tools” in the navigation tree and creating a task around finding that information. The task testing results could illuminate whether or not that location for that information makes sense to users.

So, start your task testing project by listing the key issues with the draft navigation around which you feel the most uncertainty. Alongside those issues list the key assumptions you’ve made about how content should be grouped.

An example assumption might be that “users grouped most HR-related topics together in card sorting, so we need a top-level navigation item for all HR content.” That’s a fairly common assumption, but you may still want to design a task that tests this.

From your list of questions and assumptions make a final list of 8-10 issues to test. You will base your tasks on these issues.

Why test 8-10 issues? 

There are two main reasons to use 8-10 specific testing scenarios for task testing. First, user testers could lose interest if you ask them to complete more than 10 tasks.

Second, after completing 10 tasks against your draft site navigation, a user will likely have explored the navigation enough to be very familiar with it. That familiarity will lead to easier task completion and skewed results.

For these reasons, a collection of 8-10 scenarios for task testing seems to work quite well.

STEP 4: Write task testing scenarios

For each of the 8-10 issues you plan to test, write out a task scenario. A task scenario explains a real-world situation that would require an employee to complete a specific task using information on the intranet.

For example, if you want to test the concept of an “HR” section in the global navigation, you could create a task related to a common activity such as annual performance reviews.

A task scenario about annual performance reviews might be written like this: “It is almost time for your annual meeting with your manager to review your work from the past year. Where would you find the information needed to prepare for this?”

This example lays out a very specific HR-related task, but it isn’t about that task. Rather it is about whether or not the “HR” term in the global navigation will resonate with users when they are looking for the type of content you plan to put in that section. In order to test that broader navigation scheme, you need a very specific real-world task that makes sense to users.

It is important to note that each of the task scenarios you write should be universally relevant to all employees. You can’t create a sales-related scenario if some of your testers have no familiarity or involvement with the sales process.

The goal of task testing as we’ve explained it here is to test the top two to three levels of navigation for content that is relevant to all employees. You can also run task testing on a specific section of the intranet and select testers from a related user group. But that would be a separate exercise.

STEP 5: Confirm “correct” answers in navigation

Once you have come up with your 8-10 task scenarios, clearly identify the location within your draft navigation where you will put the “correct” content for each scenario. You might complete this as part of STEP 4, but being clear about the location of the “correct” answers is important enough to warrant its own step.

Using the “HR” example listed in STEP 4, you might note that the correct content would live under: Home > HR > Performance Management.

In a simple sitemap, the correct answer could be denoted like this:

  • Home
    • People
    • About Acme
    • Offices
    • Projects
    • Sales & Marketing
    • Admin Resources
    • HR
      • Performance Management
        • Annual Performance Reviews
    • Water Cooler
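
One lightweight way to keep track of these decisions is to record each task scenario alongside the path to its “correct” answer, so the team can review them together before setting up the online test. Here is a minimal sketch, assuming hypothetical task wording and the example sitemap above:

```python
# Each task pairs the scenario text shown to participants with the navigation
# path where the "correct" content will live in the draft sitemap above.
tasks = [
    {
        "scenario": ("It is almost time for your annual meeting with your manager "
                     "to review your work from the past year. Where would you find "
                     "the information needed to prepare for this?"),
        "correct_path": ["Home", "HR", "Performance Management",
                         "Annual Performance Reviews"],
    },
    # ... add the remaining 7-9 task scenarios here ...
]

for number, task in enumerate(tasks, start=1):
    print(f"Task {number}: {' > '.join(task['correct_path'])}")
```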

It could be that as you are completing STEP 5 you discover weaknesses or holes in your navigation. Perhaps as you write out real-world tasks you notice that some important content is not accounted for in your draft navigation.

This happens because the results of card sorting may not lead to a complete site navigation. That’s fine. The process of creating an intranet information architecture is iterative by nature. Discovering holes in the IA while you prepare for task testing can be very helpful.

STEP 6: Examine scenarios for leading language

It’s surprisingly easy to give away the correct answer via the language in the task question itself. Since task testing puts the language of your draft navigation on trial, it’s important to craft questions that don’t lead your testers too much.

Before you set up your online task testing, review each of your task scenarios for words that match the terms in the draft site navigation.

For example, the following task includes topical keywords that could provide too many clues to users: “Your annual performance reviews will be due to the HR Department soon and you must meet with your manager to review your work. Where would you find the forms needed to prepare for this?”

This task could be re-written to remove navigation keywords and use everyday language that people will still understand: “It is almost time for your annual meeting with your manager to review your work from the past year. Where would you find the information needed to prepare for this?”

Removing leading language from your task scenarios may require restructuring sentences and will require some creative thinking. You’ll have to think of lay terms to describe the tasks at hand. This effort can actually help you broaden your perspective on the information on the intranet and see it through the lens of a new hire.
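
A quick, mechanical check can complement this review: compare the words in each task scenario against the labels in your draft navigation and flag any overlap for a closer look. The sketch below is a rough illustration of that idea (simple word matching against hypothetical labels, ignoring common words), not a substitute for human judgment.

```python
import re

# Hypothetical navigation labels from the draft sitemap and two task wordings to check.
navigation_labels = ["HR", "Performance Management", "Annual Performance Reviews",
                     "Admin Resources", "Sales & Marketing"]

scenarios = [
    "Your annual performance reviews will be due to the HR Department soon and you "
    "must meet with your manager to review your work. Where would you find the forms "
    "needed to prepare for this?",
    "It is almost time for your annual meeting with your manager to review your work "
    "from the past year. Where would you find the information needed to prepare for this?",
]

# Common words we don't care about when looking for overlap.
STOPWORDS = {"the", "and", "to", "for", "your", "you", "this", "with", "from", "would"}

def keywords(text):
    """Lowercased words in the text, minus stopwords."""
    return {word for word in re.findall(r"[a-z]+", text.lower())} - STOPWORDS

label_words = set()
for label in navigation_labels:
    label_words |= keywords(label)

for number, scenario in enumerate(scenarios, start=1):
    overlap = keywords(scenario) & label_words
    if overlap:
        print(f"Scenario {number}: review possible leading terms -> {sorted(overlap)}")
    else:
        print(f"Scenario {number}: no obvious overlap with navigation labels")
```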

Jakob Nielsen wrote an article about “terminology matching” in card sorting, which expands on this topic further.

STEP 7: Set up online task testing

As mentioned before, we like to use TreeJack, the online task testing tool from Optimal Workshop.

TreeJack, and any other well designed online task testing software, will let you build a draft site navigation that users can click through. You must build your draft site navigation in the software before you create your tasks. In the online software the top level of the navigation will look something like this:

Example of the initial global navigation shown to a task tester

Once you click on one of the main navigation tabs, TreeJack expands the sub-page navigation underneath that tab and hides the other tabs. It will look something like this:

Example of a sub-page navigation expanded during task testing

Once you’ve created the navigation mockup in your online software, you add tasks. For each task you will type in the text of the task scenario and then select the location in the navigation where the “correct” answer lies.

After creating a navigation mockup and adding tasks, you’ll be ready to start the online task testing.

STEP 8: Open task testing & invite users

This step is rather self-explanatory. Agree with your team on the time period during which you will run the task testing exercise, then set it “live” in the online software.

Depending on how responsive your users are you may be able to collect enough responses within just two days. On the other hand, you may need to leave the task testing open for a week to ensure users from distant time zones have time to complete the exercise.

In order to begin gathering responses, craft an email that you can send to all participants with a link to the task testing. This instructional email should include the following information:

  • Introduction to the intranet project
  • Explanation of the important role employees will play in crafting the structure of the new intranet
  • A link to the task testing itself
  • Expected amount of time required to complete the task testing
  • The deadline by which the task testing must be completed
  • Who to contact with questions

While we often provide clients with a template for this email, it is important to use your own language and style that matches your company culture and norms.

Additionally, it may be useful to send all participants an email (or reach out to them in some other way) a week or two prior to the task testing as a heads up. By alerting colleagues before sending the invitation, you can surface important questions early, find out which potential participants will be out of the office during the task testing period, and create one more opportunity to engage with your users.

STEP 9: Analyze results & update the navigation

And now it’s time for the big payoff.

As with other information architecture techniques, task testing analysis is both art and science. The sophisticated data visualization tools available in online task testing software make it easier than ever to conduct task testing. However, you’ll still have to make assumptions and guesses about what the results mean and how to address problems.

At the most basic level of analysis you’ll be looking for agreement among users. How many testers succeeded in the task—how many people selected the correct location within the navigation for the information needed to complete the task?

A basic task testing results table looks like this:

Example of a “destination table” showing results for all tasks

Each of the ten tasks is listed at the top, with the navigation listed down the left side of the table. Start by reading the table vertically to understand the data presented.

In this table a green box shows the correct answer with the number of correct responses. Each white box is an incorrect answer, and a red box shows an incorrect answer that 10% or more of participants selected.

For task #1, in the leftmost column, note that all 17 participants selected the correct answer. These results suggest that the organization of the navigation and the terms related to task #1 are right on the mark.

For task #4, however, note that answers were spread across three different locations in the navigation, with almost a full third of testers selecting the wrong location. While a majority of users selected the correct answer, the related area of the navigation needs more work. It is here that the deeper analysis begins.

For task #4, why did users select the incorrect answers? Were the navigation terms too similar? Are the “incorrect” locations just as reasonable as the “correct” answer? Did people navigate successfully through the top level of the navigation or make a wrong turn with their very first click?
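
If your tool lets you export raw results, you can reproduce this first pass of analysis yourself. The sketch below is a minimal illustration, assuming a hypothetical export where each response records the task number and the path the participant ended on; it is not tied to any particular tool’s export format.

```python
from collections import Counter

# Hypothetical exported responses: (task number, path the participant ended on).
responses = [
    (1, ("Home", "HR", "Performance Management", "Annual Performance Reviews")),
    (1, ("Home", "HR", "Performance Management", "Annual Performance Reviews")),
    (4, ("Home", "Admin Resources", "Branding Guidelines")),
    (4, ("Home", "Sales & Marketing", "Branding Guidelines")),
    (4, ("Home", "About Acme", "Branding Guidelines")),
    # ... remaining responses for all tasks and participants ...
]

# The "correct" destination for each task, as decided in STEP 5.
correct = {
    1: ("Home", "HR", "Performance Management", "Annual Performance Reviews"),
    4: ("Home", "Admin Resources", "Branding Guidelines"),
}

for task in sorted(correct):
    chosen = [path for number, path in responses if number == task]
    successes = sum(1 for path in chosen if path == correct[task])
    first_clicks = Counter(path[1] for path in chosen if len(path) > 1)
    print(f"Task {task}: {successes}/{len(chosen)} correct "
          f"({successes / len(chosen):.0%}); first clicks: {dict(first_clicks)}")
```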

In order to respond to low success rates for a task, you may simply have to change the wording for one or several sections of the navigation. You may have to go further than that and restructure the navigation by moving sections, re-grouping content, or even changing or adding to the global navigation. Or perhaps you simply need to cross-link between different sections that will hold related content.

Once you have analyzed the results from task testing and adjusted your intranet site navigation accordingly, you can either move on with your intranet project or conduct another round of task testing.

Because online task testing is quick and easy to implement, you can run several rounds and make adjustments after each round. As long as you carefully track success rates for your tasks for each iteration of the navigation that you test, you can quickly home in on clear improvements.
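
For example, a simple table of success rates per task, per round, makes it easy to see whether a change helped. Here is a small sketch with hypothetical numbers:

```python
# Hypothetical success rates (fraction of correct responses) per task, per round.
rounds = {
    "Round 1": {1: 1.00, 2: 0.85, 3: 0.70, 4: 0.65},
    "Round 2": {1: 1.00, 2: 0.90, 3: 0.95, 4: 0.80},
}

for task in sorted(rounds["Round 1"]):
    before, after = rounds["Round 1"][task], rounds["Round 2"][task]
    print(f"Task {task}: {before:.0%} -> {after:.0%} ({after - before:+.0%})")
```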

Typically, once you have completed a round of task testing (or several rounds, if you have the time and resources), the next step is to finalize your intranet navigation and move on to content migration. For planning this next stage of your intranet project see Bryan Robertson’s helpful post Content Migration: The Iceberg of Intranet Projects.

A final word on intranet task testing

Task testing is another inexpensive, yet critical piece of the intranet puzzle. It plays a crucial role in building a user-friendly intranet and can aid intranet adoption.

It’s important to highlight that success with task testing lies in the details. If the assumptions behind your draft navigation aren’t clearly stated, the purpose of the task scenarios you use will be unclear and the results inconclusive. If the tasks are worded poorly, they may give away answers and skew results. Finally, if task testing is not combined with effective content audit and card sorting efforts, you may not see its full potential.

But with these precautions in mind, any intranet team, no matter how small, can implement task testing and gain valuable insights into employees’ perspectives. The resulting improvements in usability will be well worth the effort.
