Emma Boettcher

Iterative Testing for a Link Resolver

(iterative, guerrilla-style testing of one page in a larger workflow)

the challenge

When students access resources through the University of Chicago, they're more likely than not to pass through the Library's "Find It!" link resolver page. Starting from a record in Google Scholar, WorldCat, or one of the Library's databases, they arrive at the Find It! page while trying to track down a PDF or physical copy of a resource.

In response, the Library used to provide links to any number of resources on that Find It! page: full-text options from every database that might cover the item, the Library catalog, and three separate interlibrary loan services.

On top of that, the page linked out to more specialized research tools and included a complicated search form.

Old link resolver
The Library's old "Find It!" page. I did not design this.

To paraphrase Pascal: Apologies for the page being too long. We didn't have time to make it shorter.

But then, three backend services were developed that had the potential to make this page a lot simpler.

  1. A programmer within the Library developed logic to determine whether the databases shown on the page had identical coverage, and to hide any redundant options (sketched below).
  2. A service that pre-checked the catalog was introduced, so that students could tell from the Find It! page itself whether the Library had the item, instead of clicking through to the catalog and meeting with disappointment.
  3. The Library consolidated its three interlibrary loan queues into one. Instead of students picking one of three services, the Library would make the choice for them.
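
Of the three, the first service is the easiest to illustrate in code. Below is a minimal sketch, not the Library's actual implementation: every type, field, and provider name is invented for the example, and the real logic presumably compared coverage in more detail.

```typescript
// Hypothetical sketch of the redundancy check: if two full-text providers
// cover exactly the same date range for a resource, show only the first.
interface FullTextOption {
  provider: string;      // e.g. "Provider A" (illustrative name)
  coverageStart: string; // ISO date where the provider's coverage begins
  coverageEnd: string;   // ISO date where coverage ends ("" for "to present")
}

function dedupeByCoverage(options: FullTextOption[]): FullTextOption[] {
  const seen = new Set<string>();
  return options.filter((opt) => {
    const key = `${opt.coverageStart}|${opt.coverageEnd}`;
    if (seen.has(key)) {
      return false; // identical coverage to an earlier option: hide it
    }
    seen.add(key);
    return true;
  });
}

// Two providers with identical coverage collapse to one link on the page.
const shown = dedupeByCoverage([
  { provider: "Provider A", coverageStart: "1990-01-01", coverageEnd: "" },
  { provider: "Provider B", coverageStart: "1990-01-01", coverageEnd: "" },
]); // shown contains only Provider A
```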

Now that it was possible on the backend to simplify the page, how could the UI be adapted to reflect this?

approach

statistics and web analytics

One of the first questions the Library's web team asked was, "How can we simplify this even further?" From the statistics provided by the vendor, we knew that at least three areas of the page (online, in the library, and interlibrary loan) were heavily used. There was no question of taking those options away from our users, and the backend services being developed would go a long way toward making them easier to use.

Other sections of the page were used only a few times a year, and often by librarians rather than by general users. Options that saw so little use weren't worth the confusion they added, so we decided to remove them.

One section of the page that got a surprising amount of use was the search form, and we had no information on how it was being used. I asked for the link resolver to be added to our Google Analytics instance so that I could track what was being searched from that page. With a year between the initial analysis and development, I had time to run reports on how that search was being used. The conclusion? People were using the search box to search again for the very resource that had led them to the page. Either they didn't understand what the search form did, or they didn't understand the page they were on. Based on that, I recommended that the search fields be removed from the page.
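
For readers curious what that kind of tracking involves: the sketch below is not the Library's actual instrumentation, just a minimal illustration of reporting each submitted search as a Google Analytics event with the classic analytics.js API. The form id and field name are invented for the example.

```typescript
// Hypothetical sketch: report each search submitted from the Find It! page
// as a Google Analytics event, using the classic analytics.js API.
declare const ga: (...args: unknown[]) => void; // global set up by the GA snippet

const form = document.querySelector<HTMLFormElement>("#findit-search"); // id invented
if (form) {
  form.addEventListener("submit", () => {
    const title = form.querySelector<HTMLInputElement>('input[name="title"]'); // field invented
    if (title) {
      // The query becomes the event label, so GA reports can show what users typed.
      ga("send", "event", "FindIt search", "submit", title.value);
    }
  });
}
```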

prototype testing

With this proposal for the scope of the page in hand, I could turn to testing it. The page was just one point along a longer journey: it's linked from any number of databases, and even with the streamlining, it could still link out to many different resources.

Paper prototyping seemed too cumbersome. Ideally, users would be able to just glance over the page to get where they needed to go, and the low fidelity of paper prototypes might invite closer scrutiny than the page would get in real use.

Instead, the Library's graphic designer created a mockup in HTML, with links to live resources wherever possible. To cover the step of the workflow where users arrive at the page, I took screenshots of some of the most popular types of records from which people clicked the Find It! button, and used InVision to create linked hotspots from the screenshots to the prototype.

A new version of the link resolver page
The revised link resolver page: one column, three sections, and clear Library branding. Designed by Kathy Zadrozny.

I recruited students for a think-aloud usability test, which I documented using Morae. After the students completed the tasks (all along the lines of "Show me how you would get a copy of X resource"), I gave them a short survey about labeling choices.

guerrilla-style usability testing

That round of usability testing showed there were still some flaws in the design, but mostly it uncovered issues in the resources that the link resolver page linked out to. Those issues were triaged and dealt with at a later date, so the research was valuable, but future rounds of testing had to be more focused to make the best use of this project's resources.

Instead of recruiting participants ahead of time and asking for 30-45 minutes of their time, I set up a table in a heavily used area and flagged down students as they entered. Since I was recruiting them on the spot, I stopped each session once the participant had made it to the next step beyond the link resolver page. I also dispensed with Morae and recording: it didn't seem worth it for such a short session, and explaining the recording and its use would have taken up more time than the test itself. Instead, another librarian helped by taking notes.

The second round of testing still uncovered flaws - the page's styling, which now clearly signaled a next step, was working almost too well, prompting users to click on links that wouldn't advance their goals - but the method itself worked well for what was needed. I conducted a third round of testing with another, more refined prototype, and after that, the page seemed ready.

reporting and impact

For each round of testing, I created detailed reports that itemized the usability issues uncovered and posted them to the Library's intranet. Those reports, however, were mostly for posterity. It was more important to share the results in person with the web team and with the committees that were stakeholders in the redesigned page. I updated these groups whenever I had the opportunity, explaining each time what types of tasks were used and how they reflected real-world use of the page.

When we finally launched the redesigned page, more than a year after the backend services were initially proposed, the design was smoothly incorporated into student and staff workflows. In the month after the launch, the Library received only one complaint. The biggest sign of the design's acceptance was the absence of late-stage questions from stakeholders: as the rest of the web team and I kept presenting refinements at meetings, we showed our colleagues that though these changes were sizable, we were pursuing them on behalf of our users and not just for cosmetic reasons.