Saturday, February 4, 2012

Evaluating Academic Libraries

(Originally posted February 4, 2012) 

Whether simply updating an existing Web site or undertaking a full migration to a new Integrated Library System (ILS), information specialists and library staff make a conscious effort to reach out to users and conduct evaluations, with the goal of producing a Web layout that improves the user experience. Evaluation methods will vary, however, depending on factors such as library type, user demographics, and the amount of online content up for revision. Through the evaluations conducted by Robert Fox & Ameet Doshi, Beth Thomsett-Scott, and Robert L. Tolliver and his colleagues, readers are exposed to different strategies which can serve as templates when planning future evaluations.
 

Note that, as case studies, the articles lack a central argument; they serve instead as documentation of experiments. Though they formulate conclusions about "what works," much is left to the reader. For instance, Fox & Doshi (2011) functions primarily as a reference source, transcribing the answers to a 31-part questionnaire that Fox & Doshi sent to members of the Association of Research Libraries. In compiling the answers from the 74 responding participants, Fox & Doshi provide a view of the degree to which research libraries interact with their patrons and how these libraries incorporate feedback from the public into updates and revisions of their Web presence.

More hands-on are the studies conducted by Thomsett-Scott (2004) and Tolliver et al. (2005). In the case of Thomsett-Scott, undergraduate, graduate, and online learners were hired to assist in evaluating the Web site at the University of North Texas. Through usability studies, focus groups, and a cognitive walkthrough, students provided feedback on what worked and what did not (with a little assistance from chocolate). Ultimately, Thomsett-Scott's focus is on the evaluation methods more so than the results, as she explains in her closing: "Libraries serving off-campus users will find user satisfaction with their Web site increase through the use of Web site usability testing" (Thomsett-Scott, 483).

Tolliver et al. (2005) conducted a similar but distinct evaluation. With the help of a consultant, the University of Michigan Art, Architecture & Engineering Library examined how best to make the transition from a dated Web site to one which moved all the data into a content management system (CMS) (Tolliver et al., 157). After assessing costs, the library settled on a testing approach which included sector interviews, card sorting, and a paper prototype test (Tolliver et al., 161). By drawing participants from both library staff and students, the testing process effectively sought feedback from the groups who would actually use the Web site. The most confusing and interesting process was the card sorting. Almost a game, it afforded participants an opportunity to organize cards under subjects, groupings which ultimately would be used in designing the site map.
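To make the card-sorting step concrete, here is a minimal sketch (in Python) of how open card-sort results might be tallied once each participant's piles are recorded; the card labels and groupings below are invented for illustration and are not drawn from the Tolliver et al. data. The idea is to count how often two cards land in the same pile across participants; pairs grouped together by a majority become candidates for sharing a heading on the site map.

    from collections import defaultdict
    from itertools import combinations

    # Each participant's sort: the piles of cards they grouped together.
    # Card labels are hypothetical examples, not the study's actual cards.
    sorts = [
        [["Hours", "Directions"], ["Databases", "E-Journals"], ["Ask a Librarian"]],
        [["Hours", "Directions", "Ask a Librarian"], ["Databases", "E-Journals"]],
        [["Databases", "E-Journals", "Ask a Librarian"], ["Hours", "Directions"]],
    ]

    # Count how often each pair of cards appears in the same pile.
    pair_counts = defaultdict(int)
    for sort in sorts:
        for pile in sort:
            for a, b in combinations(sorted(pile), 2):
                pair_counts[(a, b)] += 1

    # Report pairs grouped together by a majority of participants;
    # these suggest cards that belong under the same site-map heading.
    majority = len(sorts) / 2
    for (a, b), n in sorted(pair_counts.items(), key=lambda kv: -kv[1]):
        if n > majority:
            print(f"{a} + {b}: grouped together in {n} of {len(sorts)} sorts")

Real card-sort analyses typically feed a similarity matrix like this into hierarchical clustering, but even a simple tally makes the participants' shared mental model visible.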

Ultimately, Tolliver et al. make an excellent point in noting that consultants can serve as a valuable resource because they come to the project without bias (Tolliver et al., 165).

The methods practiced in each article are similar to those used by exhibit specialists and museum educators when either anticipating possible revisions to an existing exhibit or creating one from scratch. When creating a new exhibit, specialists work through three steps: preliminary, formative, and summative evaluation. First, specialists conduct interviews with the public at large, based on the targeted demographics, to gauge both interest in and knowledge of an idea or theme. Then, a mock-up exhibit is created and specialists conduct a secondary survey to see how visitors respond to the collection and display, an effort to determine what works and draws interest and what does not. After reviewing the results, the final exhibit is crafted, and, over time, educators conduct studies examining the number of visitors and the amount of time spent in the exhibit or at a particular display. Additionally, some visitors are asked to participate in surveys to, again, determine what they liked and what they did not.

Comparing the task of museum specialists and educators with that of library Web (re)designers, the designers take on a stronger role, one with greater authority and responsibility, when they are implementing a new program or exhibit from scratch and seeing the process through to its final phase.

For library and information scientists, these articles touch on topics which must always remain embedded in their minds: just as technology continues to evolve, the public at large continues to adapt and seek out the latest trends, and those who fail to stay abreast will be considered obsolete and discarded. With that in mind, because libraries ultimately serve the public, their survival depends on their ability to stay in touch with public tastes and innovation.


_________________________________________________________________

Fox, R., & Doshi, A. (2011). SPEC Kit 322: Library user experience. Washington, DC: Association of Research Libraries.
 

Thomsett-Scott, B. (2004). Yeah, I found it! Performing web site usability testing to ensure that off-campus students can find the information they need. Journal of Library Administration, 41, 471–483.
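
Tolliver, R. L., et al. (2005). Website redesign and testing with a usability consultant: Lessons learned. OCLC Systems & Services, 21(3), 156–166.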
 
