Matthew Reidsma

Work Notes

Updates from the GVSU Libraries’ Web Team.

Usability Test Roundup, April 2, 2015


For various reasons, we haven't run a usability test in quite a while. But this morning we had two students come in to put our new LibGuides Database A-Z page through its paces. I also had them work through some scenarios on retrieving books while looking at paper wireframes, to better understand how to display location information for items in the retrieval system.

I also tried something new this time: I tailored the questions to each participant's major, to see if we could avoid some of the "I don't know enough about this subject to answer the question" responses we've gotten in the past. The first three questions centered on these ideas:

  • You've been assigned a paper and need to find some peer-reviewed sources on a topic.
  • Your professor recommended a particular database; find that database.
  • You heard about a database with content in your discipline, but don't know its name; find the database.

Overall, the first question didn't turn up much that was new. Our two students represented polar opposites in how they handled this research: one didn't know what peer-reviewed meant, and so hunted around for that specific label. The other went right for the facet, and we were able to watch the process of narrowing down a large search result, which was instructive. This is a "warm-up" question, one I often start with: it helps us make sure one of the most common tasks doesn't have any huge usability gaps, while also getting the student used to the format of the test.

The questions about the LibGuides database list went over pretty well, with students gravitating toward the new search feature. Unfortunately, the search only matches titles, so it isn't always effective (although one student lucked out in searching the keyword "demographics," since the database under review was "DemographicsNow"). I'll submit a feature request to Springshare to make this search titles, descriptions, and perhaps subjects, or at least offer some options for what it searches.
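To make concrete what I'll be asking Springshare for, here's a minimal sketch of matching a query against all three fields instead of the title alone. The `DatabaseEntry` shape and field names are my own invention for illustration, not Springshare's actual data model:

```typescript
interface DatabaseEntry {
  title: string;
  description: string;
  subjects: string[];
}

// Match a query against title, description, and subjects rather
// than the title alone.
function matchesQuery(entry: DatabaseEntry, query: string): boolean {
  const q = query.trim().toLowerCase();
  return [entry.title, entry.description, ...entry.subjects]
    .some((field) => field.toLowerCase().includes(q));
}

// "demographics" still matches DemographicsNow on the title, but the
// same search would also surface databases that only mention the term
// in a description or subject.
const example: DatabaseEntry = {
  title: "DemographicsNow",
  description: "Demographic and market data for the United States.",
  subjects: ["Business", "Sociology"],
};
console.log(matchesQuery(example, "demographics")); // true
```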

There were some questions about the database subjects (whether they should map to GVSU disciplines/departments or to subjects), but that will come up in the LibGuides Advisory Committee meeting on the 14th. We also talked about reviewing the database descriptions to remove item-specific counts and to replace specific coverage end dates with relative times where applicable (e.g., instead of "through 2013," say "through 2 years ago").
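As a rough illustration of the relative-date idea (nothing in LibGuides does this; the helper is just my sketch of the arithmetic):

```typescript
// Hypothetical helper: convert a fixed coverage end year into a
// relative phrase so database descriptions don't go stale.
function relativeCoverage(endYear: number, now: Date = new Date()): string {
  const yearsAgo = now.getFullYear() - endYear;
  if (yearsAgo <= 0) return "through the present";
  return `through ${yearsAgo} year${yearsAgo === 1 ? "" : "s"} ago`;
}

// As of April 2015, a database whose coverage stops at 2013:
console.log(relativeCoverage(2013, new Date("2015-04-02"))); // "through 2 years ago"
```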

As it stands, we’ll plan on migrating our database list to the LibGuides list at the end of this semester. As we approach that, Jeff and I will finish marking all the databases with the appropriate subjects.

A few things came up that didn't come directly from the test, but were comments from folks about LibGuides or Summon generally. Bob suggested removing the Summon search header from LibGuides, since it could be confused with a "guide-specific" search. We didn't observe that confusion in the test: one student gave up looking in a guide, but said she was going to the "everything search" instead. My own observations suggest that consistency across the systems is important, and our test participants all seem to know that the search box on every screen does the same thing everywhere. Still, I told Bob that was a decision we could ask the LibGuides owners about, so I'll send out a survey question. Related was whether to add guide-specific searches back into the template, another topic I'll take up with the Advisory Committee.

Gayle also suggested hiding the "Full Text Online" facet in Summon, since the thought was that students gravitate toward it and artificially limit their possible results, especially given how fast our Document Delivery department is. (I think the average turnaround time for articles is 6 hours. Go RapidILL!) The sentiment was shared by the other instruction librarians in the room, who all teach students not to use the Full Text limiter.

I agreed to put the question about the Full Text limiter in the survey as well, but after playing around with the idea this afternoon, it doesn't look particularly feasible. Removing a filter from Summon's facets affects the way some of the scripts run on the page, so we can't make it go away without breaking all of the facets. Sorry, gang!
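For the curious, this is the sort of thing I was trying; the selector below is a placeholder rather than Summon's real markup, since I'm not reproducing their page here. Pulling the facet out of the DOM like this is exactly what interfered with Summon's own scripts:

```typescript
// Naive approach: remove the "Full Text Online" facet after the page
// loads. The data-facet selector is a hypothetical stand-in, not
// Summon's actual markup. Summon's scripts expect every filter to be
// present, so removing one broke the rest of the facet panel, and we
// abandoned this approach.
document
  .querySelectorAll<HTMLElement>('[data-facet="full-text-online"]')
  .forEach((el) => el.remove());
```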

The final part of the session tested alternative location descriptions for items in the Automated Storage and Retrieval System (ASRS) at either Steelcase or Mary Idema Pew. Kristin came to me with an issue recently: students were coming to the service desk frustrated after wandering the stacks looking for items that were actually in the ASRS. We decided to try a better way of labeling those items.

I gave each student four scenarios related to a book they needed, and showed paper wireframes in place of screens. They described what they would do to get the book, and if it involved clicking a link, I was ready with a new piece of paper with that screen drawn on it. Below are the four search results screens:

  • Mockup of current call number results
  • Mockup of current ASRS results
  • Mockup of ASRS without call numbers
  • Mockup of request button right in the results

The last two mockups represented two ways we tried to change the labeling. The first removed the call number but left everything else the same. This seemed to freak out one of our students, who couldn't understand where the call numbers went. (After a moment, the student clicked the link and requested the book from the details screen, so it wasn't a deal-breaker.) The last mockup ditched the tabular item record statements and added action buttons right into the results. Both students "clicked" the right request button immediately, and seemed excited about not having to open the full record to do that.

I had a hard time coming up with a better term for the ASRS, though. I ran the test with "Storage," but Brian felt that implied slowness. "On-site storage" was also suggested, as was doing something similar to what we do with Gov Docs Microfiche ("Ask at Service Desk"), but modified to reflect the workflow, like "[Request] and pick up at the service desk." Another suggestion was just saying "Available at MIP [Request]" with no location listed at all. (I wonder if that would have the same effect as the missing call number? It would be interesting to test that.) In the end, I agreed to do some more work on the label, and I'll run some informal guerrilla-style tests over the next week to see what works and what doesn't, and I'll get back to you.

Thanks to everyone who came (we had a full house!), and thanks also to Suzanne and Tori from Lansing Community College for coming up to observe. I grabbed a quick lunch with them before they left, and we compared notes on our very different campuses and schools, and our remarkably similar student research behaviors.

As always, let me know if you have any questions or concerns, and LibGuides owners, keep an eye out for a survey about the Summon search in the header.