(Originally posted February 18, 2012)
The focus of class this week was understanding the fundamentals behind
the System Development Life Cycle (SDLC). Though there are varying types
of SDLCs, the traditional method involves five essential steps:
investigation, analysis, design, implementation, and maintenance/review
(Kim, Lecture Notes, 4/16/12). In his article on SDLCs, H. Frank Cervone
(2007) mentions eight steps (expanding on analysis and adding a
construction step) as he examines SDLCs in the context of a digital
library. Ever pragmatic, Cervone asks the questions any reasonable
person assessing an information system should take into
account--particularly the constraints that may hinder a project, such as
time and budget, and whether the public would be able to adapt
(Cervone, 2007). In short, researchers must determine whether a project
is feasible.
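As a side note for the technically inclined, the five phases and Cervone's feasibility questions could be sketched in a few lines of code. The example below is purely my own illustration--the field names for time, budget, and user adaptability are invented, not drawn from Cervone or the lecture notes.

```python
from dataclasses import dataclass
from enum import Enum, auto


class SDLCPhase(Enum):
    """The five phases of the traditional SDLC."""
    INVESTIGATION = auto()
    ANALYSIS = auto()
    DESIGN = auto()
    IMPLEMENTATION = auto()
    MAINTENANCE_REVIEW = auto()


@dataclass
class FeasibilityCheck:
    """Hypothetical constraints to weigh before committing to a project."""
    months_available: int
    months_estimated: int
    budget_available: float
    budget_estimated: float
    users_expected_to_adapt: bool

    def is_feasible(self) -> bool:
        # The project is worth pursuing only if time, money, and user
        # adoption all look workable.
        return (
            self.months_estimated <= self.months_available
            and self.budget_estimated <= self.budget_available
            and self.users_expected_to_adapt
        )


# Example: a digital library project with tight but sufficient resources.
check = FeasibilityCheck(
    months_available=12, months_estimated=9,
    budget_available=50_000, budget_estimated=42_000,
    users_expected_to_adapt=True,
)
print(check.is_feasible())  # True
```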
Zhang et al.'s "Integrating Human Computer Interaction into the Systems
Development Life Cycle" (2005) does not advance a particularly
revolutionary idea. In short, end users need to be involved in the SDLC
process. This only makes sense, because just as businesses regularly
seek feedback from their customers on ways to improve service, so too
should IT specialists and developers look to feedback from the patrons
of the information system. What is startling is that this does not
happen enough. Zhang et al. provide charts and explanations for why users
should be involved in the experience--making heavy use of the term
"human computer interaction" (HCI) (Zhang et al., p. 512). Recognizing
that the emphasis was not necessarily on just humans but on the end users,
I was fascinated to see that the latter term was not used even once in the
article. Written in 2005, the article already feels slightly dated, because
a field has since emerged which focuses on just such a study of HCI.
Combining qualities of computer science and cognitive science is "human
factors." This field does not study just HCI, but the interaction
between humans (with regard to all the senses and the body as a whole)
and all devices. Through this field, examining how a human interacts
with a car can lead to improved ergonomics, increased comfort, and a
safer, more pleasant driving experience. The same can be said with
regard to computers. It is still an emerging field, and there remains a
level of disconnect between developers eager to push a new idea and a
wary public attempting to adjust.
Tony Drewry's "Data Flow Diagrams" is an online instructional tool that
breaks down the complexities of information exchange on the Internet
through diagrams. Reassuring readers that there is indeed a system by
which data is received and disseminated after it is sent, Drewry reminds
us that the Internet has an almost magical way of turning online
exchanges into an electronic bureaucracy.
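For anyone who prefers code to diagrams, the building blocks of a DFD--external entities, processes, data stores, and the flows between them--can be sketched roughly as follows. The library-catalog example is entirely my own invention and is not taken from Drewry's tutorial.

```python
from dataclasses import dataclass, field


@dataclass
class DataFlow:
    """A labeled arrow in a DFD: data moving from a source to a destination."""
    source: str
    destination: str
    data: str


@dataclass
class DataFlowDiagram:
    external_entities: set = field(default_factory=set)  # people/systems outside the system
    processes: set = field(default_factory=set)          # steps that transform data
    data_stores: set = field(default_factory=set)        # places where data rests
    flows: list = field(default_factory=list)            # the arrows connecting them

    def add_flow(self, source: str, destination: str, data: str) -> None:
        self.flows.append(DataFlow(source, destination, data))


# A toy example: a patron searches a library catalog.
dfd = DataFlowDiagram(
    external_entities={"Patron"},
    processes={"Search catalog", "Check availability"},
    data_stores={"Holdings database"},
)
dfd.add_flow("Patron", "Search catalog", "search terms")
dfd.add_flow("Search catalog", "Holdings database", "query")
dfd.add_flow("Holdings database", "Check availability", "item records")
dfd.add_flow("Check availability", "Patron", "availability report")

for f in dfd.flows:
    print(f"{f.source} --[{f.data}]--> {f.destination}")
```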
To be honest, much of this information is common sense, and I suppose it
derives from a basic understanding of physics and of how information is
exchanged. However, tying a simple idea to computers and the Internet
muddies things and, while the philosophy is easy to grasp, it becomes
unnecessarily complicated by increasingly
"pedantic semantics." Because IT specialists/Web developers are
anatomically human and naturally involved in the update and revision
process, a more accurate term than HCI would be "EUI" (End User
Involvement).
To paraphrase Mark Twain, "eschew obfuscation."
_______________________________________________________
Cervone, H. F. (2007). The system development life cycle and digital
library development. OCLC Systems & Services, 23(4), 348-352.
Drewry, T. (2005). Data flow diagrams.
Zhang, P., et al. (2005). Integrating human computer interaction
development into the systems development life cycle: A methodology. Communications of the Association for Information Systems, 15, 512-543.
Evaluating Academic Libraries
(Originally posted February 4, 2012)
Whether simply updating an existing Web site or conducting an overhaul to a new Integrated Library System (ILS), information specialists and library staff make a conscious effort to reach out to users and conduct evaluations in an effort to produce a new Web layout which adds to the user experience. However, depending on factors such as library type, user demographics, and the amount of online content up for revision, evaluation methods will vary. Through the evaluations conducted by Robert Fox & Ameet Doshi, Beth Thomsett-Scott, and Robert L. Tolliver and his colleagues, readers are exposed to different strategies which serve primarily as templates to take into account when planning future evaluations.
Note that, as case studies, the articles lack a central argument, but serve as documentation of experiments. Though they formulate conclusions about "what works," much is left to the reader. For instance, Fox & Doshi (2011) functions primarily as a reference source--compiling the answers to a 31-part questionnaire that Fox & Doshi sent to members of the Association of Research Libraries. In transcribing the answers from the 74 responding participants, Fox & Doshi provide a view of the degree to which research libraries interact with their patrons and how these libraries incorporate feedback from the public into updates and revisions of their Web presence.
More hands-on are the studies conducted by Thomsett-Scott (2004) and Tolliver et al. (2005). In the case of Thomsett-Scott, undergraduate, graduate, and online learners were hired to assist in evaluating the Web site at the University of North Texas. Through usability studies, focus groups, and a cognitive walkthrough, students provided feedback on what worked and what did not (with a little assistance from chocolate). Ultimately, Thomsett-Scott's focus is on the evaluation methods more so than the results, and she says as much in her closing: "Libraries serving off-campus users will find user satisfaction with their Web site increase through the use of Web site usability testing" (Thomsett-Scott, 483).
Conducting similar-but-different evaluations is Tolliver et al. (2005). With the help of a consultant, the University of Michigan Art, Architecture & Engineering Library examined how best to make the transition from a dated Web site to one which moved all the data to a content management system (CMS) (Tolliver et al., 157). After assessing costs, the library went with a testing approach which included sector interviews, card sorting, and a paper prototype test (Tolliver et al., 161). Drawing on both library staff and students, the testing process effectively sought feedback from the groups who would actually use the Web site. The most confusing and interesting process was the card sorting. Almost a game, it afforded participants an opportunity to organize cards under subjects (which would ultimately be used in designing the site map).
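For the curious, the way card-sort results feed into a site map can be sketched in a few lines of code: count how often participants place two cards in the same group, and the most frequently paired cards become candidates for the same section. The cards and groupings below are made up for illustration and are not Tolliver et al.'s data.

```python
from collections import Counter
from itertools import combinations

# Each participant's sort is a list of groups; each group is a set of card labels.
# These cards and groupings are invented for illustration.
sorts = [
    [{"Hours", "Directions"}, {"Databases", "E-journals", "Catalog"}],
    [{"Hours", "Directions", "Catalog"}, {"Databases", "E-journals"}],
    [{"Hours"}, {"Directions", "Catalog"}, {"Databases", "E-journals"}],
]

# Count how many participants placed each pair of cards in the same group.
pair_counts = Counter()
for sort in sorts:
    for group in sort:
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

# Pairs grouped together most often are candidates to share a section of the site map.
for pair, count in pair_counts.most_common():
    print(f"{pair[0]} + {pair[1]}: grouped together by {count} of {len(sorts)} participants")
```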
Ultimately, Tolliver et al. make an excellent point by noting that consultants can serve as a valuable resource in that they come to the project without bias (Tolliver et al., 165).
Comparing the task of museum specialists and educators with that of library Web (re)designers, one sees a stronger role--one with greater authority and responsibility--when the designers are implementing a new program/exhibit from scratch and seeing the process through to its final phase.
For library and information scientists, these articles touch on topics which must always remain embedded in their minds: just as technology continues to evolve, the public at large continues to adapt and seek out the latest trends--those which fail to stay abreast will be considered obsolete and discarded. With that in mind, because libraries ultimately serve the public, their survival depends on their ability to stay in touch with public tastes and innovation.
_________________________________________________________________
Fox, R., & Doshi, A. (2011). Spec kit 322: Library user experience. (p. 199). Washington, DC: Association of Research Libraries.
Thomsett-Scott, B. (2004). Yeah, I found it! Performing web site usability testing to ensure that off-campus students can find the information they need. Journal of Library Administration, 41, 471–483.