Getting useful quantitative data about how library users search online is difficult. The vendors who make our catalogs and discovery services often provide search logs, but those logs don’t tell us how people behave after the search has been performed. After all, what someone searches for doesn’t tell us whether they found any of the results worth looking at, or whether they used the sources in their research.

A few months ago, I had an idea for how to get data on what our users were clicking on in Summon. I figured I could assign a unique, incremental value to each result in our search, and then record that value when a link is clicked. If we had that kind of data, we might start to understand our users’ behavior at a point as close as we can get to their use of our materials (without getting creepy or breaking laws).

Since Summon relies on JavaScript, I knew that all of our Summon users would have JavaScript enabled, so we wouldn’t be missing any usage data by relying on it. I wrote a simple script that uses jQuery to assign a unique value to each results link. Since Summon has four links for each result that take you to the full text, I assigned a different value to each link. The last step was to add an event listener to each of those links that triggers a function whenever one of them is clicked. In that function, I also grab the page number and the type of item the user is clicking on (Book, Journal Article, eBook, etc.). I started running it at the end of April.
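To give you the flavor of it, here’s a minimal sketch of that approach. The selectors, the endpoint URL, and the function names are all placeholders of my own invention, not the production script (which you can find on GitHub):

```javascript
// Each result has a fixed number of full-text links, so link j of
// result i gets the unique value i * linksPerResult + j.
function linkValue(resultIndex, linkIndex, linksPerResult) {
  return resultIndex * linksPerResult + linkIndex;
}

// In the browser, jQuery stamps each link with its value and wires up
// a single click handler. '.document', 'a.full-text', and the logging
// endpoint below are hypothetical stand-ins for the real markup.
function trackResultClicks($) {
  $('.document').each(function (i, result) {
    $(result).find('a.full-text').each(function (j, link) {
      $(link).data('clickValue', linkValue(i, j, 4)); // Summon: 4 links/result
    });
  });

  $('.document a.full-text').on('click', function () {
    var $result = $(this).closest('.document');
    var payload = {
      value: $(this).data('clickValue'),
      page: $('.current-page').text(),            // results page number
      type: $result.find('.content-type').text()  // Book, Journal Article, etc.
    };
    $.get('https://example.edu/track', payload);  // record the click
  });
}
```

The numbering scheme means a single integer tells you both which result was clicked and which of its four links was used.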

It worked pretty well.

We’ve learned a lot about how our users interact with the Summon search results from collecting this data. You can even see a live snapshot of the data we’ve collected since last Monday, if you like.[1] If you’re a Summon customer, check out the latest revisions to the tracking code. You can grab the source off GitHub, and with a few quick changes, have it running on your setup.

Dave Pattern from the University of Huddersfield has been running the script almost as long as I have, and he’s done a nice analysis of what he learned after just a week of collecting data. (I’m sure more revelations will be forthcoming.)

Last month, Dave and I were honored to speak at a Summon event at ALA Annual about our experiences with Summon and our little analytics project. I recorded my part of the talk and posted it below for you to watch. In addition, I gave a little lightning talk at Code4Lib Midwest last week on how to make a generic script that works on any system to track this kind of data. In it I talked a bit about how I dealt with cross-domain requests in JavaScript, which is always a headache. There is also a Gist on GitHub of the code I used for that talk.
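One common way to sidestep same-origin restrictions for fire-and-forget logging is an image beacon: encode the data in a query string and let the browser request a tiny image from the logging server, since image GETs aren’t blocked by the same-origin policy. This is a sketch of that general technique, not necessarily what’s in the Gist; the endpoint is made up:

```javascript
// Serialize a flat object into a URL query string.
function buildBeaconUrl(endpoint, data) {
  var pairs = [];
  for (var key in data) {
    if (Object.prototype.hasOwnProperty.call(data, key)) {
      pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(data[key]));
    }
  }
  return endpoint + '?' + pairs.join('&');
}

// Fire the request by loading the URL as an image. The response body
// is ignored; the server just logs the query string.
function sendBeacon(endpoint, data) {
  new Image().src = buildBeaconUrl(endpoint, data);
}
```

The obvious limitation is that it’s one-way: you can’t read anything back from the server, which is fine for click logging.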

If you have any suggestions or feedback, as always, I’d love to hear them. Twitter is the best place, although you are welcome to email me.

GVSU and Summon at Three Years

ALA Annual, June 23, 2012

Watch on Vimeo

Guerrilla Analytics

Code4Lib Midwest, July 24, 2012

Watch on Vimeo


[1] Last Monday I deployed a new version of the script that stores its data in a different format. So until I clean up the previous data, it won’t appear in the live snapshot.