Session Description

Our community college library undertook a usability-testing project running proprietary, subscription-based analytics tools; when these tools did not work as expected the solution was to build our own testing and analytics environment that was easier to maintain and debug. We collected and analyzed custom data to make recommendations on the library’s webpage design.

Type of Library

Community College Library


Keywords

usability testing, data analysis, library website design, custom-built tools


May 4th, 10:00 AM – 10:50 AM

Website Usability Testing with Custom Tools in a Community College Environment

Carver Room

Library webpages are at the core of contemporary library services and should be updated frequently, based on well-defined, user-generated data. This means that technical considerations should not be the only factors weighed when updating web content. When building a library web presence, we should aim to overcome the mistaken idea that gathering user input on design is difficult or unwieldy.

To bring this mindset to our community college library, in 2016 we received a grant to fund a usability study. The study aimed to gather quantitative data to support data-based decisions about our library website. The ultimate goal was a measurable improvement in the use of our college library webpages.

The methodological approach was to build a testing environment (mostly in JavaScript) where students could interact with prototype webpages to execute a set of pre-defined tasks. We also used a subscription-based analytics tool to record users’ interactions with these prototypes. This allowed us to gather a significant amount of data on how users carried out tasks on the various prototype library webpages.

Yet despite our best efforts at pre-testing our technology setup, the combination of technologies that we initially chose failed dramatically in a real testing environment. Specifically, the proprietary subscription-based analytics software we used did not work as intended, and was almost impossible to debug. Our project came to a sudden stop because of technical issues that we found very difficult to resolve.

After much consternation and some time spent reflecting on the problem, the solution was to build our own tool to gather usability analytics, rather than rely on a third-party solution. We built a service that uses Flask, a Python web micro-framework, to transform the data created by participants’ interactions with the prototypes into a CSV file, a format that is easy to open and work with in spreadsheet software such as Excel.
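The core of such a service is the step that flattens interaction events into CSV rows. The sketch below illustrates that transformation only; the field names (`participant_id`, `task`, and so on) are hypothetical, not the project's actual schema, and the Flask wiring is noted in comments rather than implemented.

```python
import csv
import io

# Illustrative event schema; the real project's field names may differ.
FIELDS = ["participant_id", "task", "event", "target", "timestamp"]

def events_to_csv(events):
    """Flatten a list of interaction-event dicts into CSV text."""
    buffer = io.StringIO()
    # extrasaction="ignore" silently drops any event keys outside FIELDS.
    writer = csv.DictWriter(buffer, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    for event in events:
        writer.writerow(event)
    return buffer.getvalue()

# In a Flask service, a route could accept POSTed JSON events and pass
# them to events_to_csv() before saving or returning the result.
```

Writing to an in-memory buffer keeps the transformation testable on its own, independent of any web framework.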

We learned several important lessons from this experience. There is value in building your own tools, which are often easier to use, understand, and debug than proprietary ones. Homegrown tools can also create a more user-friendly environment for participants, and, most importantly, they can increase the quality and reliability of the collected data. The tools we built are now openly licensed, allowing others to benefit from our work.

Once we had built the technologies we needed, we were able to move forward and successfully complete our grant project. After several rounds of testing and refining our prototypes, we had gathered significant data and developed recommendations that will ultimately move our library toward a more user-friendly, well-tested web interface that better suits the needs of our stakeholders. Our students, faculty, and administrators will all benefit from our study and the improvements implemented as a result.
