
The Resource Navigation Task Force MetaLib Interim Implementation Report and Recommendations

Status

The Resource Navigation Task Force has met several times in the past weeks to explore the features of MetaLib Version 3 and discuss implementation issues. We have given the interface a more modern tabbed appearance. Although Version 3 allows for more customization than earlier versions of MetaLib, we found that substantial changes to the interface are still difficult; as a result, the look of MetaLib tends to be similar across libraries. Most of our licensed databases have now been created in the UF MetaLib knowledgebase. We created a number of Quick Sets to demonstrate different approaches to the Quick Search function. We also developed provisional lists of subject categories and subcategories and began designating databases for each subcategory.

Recommendations

  1. Conduct a period of intense in-house testing and evaluation followed by beta testing open to student and faculty testers.

    • The task force is very concerned about releasing a service that has proven problematic at some of our sister institutions. The University of North Florida’s decision to reject MetaLib for the outsourced Serials Solutions product (Central Search) was, according to Bob Jones, driven by frustration with MetaLib’s “terminology, unintuitive navigation, and workflows of the product for public use,” as well as the tedium and time required to set it up and maintain it. The problems encountered at the University of South Florida remain somewhat vague, but the UF task force members were very concerned over the reports of faculty losing search results in “My Space” and the resulting bad publicity for the USF libraries. We felt the product should be thoroughly tested against criteria defined by the Resource Navigation Team to identify any problems that may be lurking within our implementation of MetaLib 3.

    • The task force members did not reach total consensus on how the testing should be conducted. Based on the history of new product implementations at our library, the concerns with this particular product’s problems at the University of South Florida, and the decisions at the University of North Florida and Gulf Coast to use another meta-search product, most members of the task force felt that a period of in-house testing was necessary before debuting the product before the public. Others stressed the importance of eliciting feedback from actual users, arguing that testing by students and faculty would simulate real-world conditions and use. The task force recommends a period of intense in-house testing and evaluation. The in-house staff testing will be followed by beta testing open to students and faculty, but only if the alpha test shows MetaLib to be acceptable to the library faculty and to meet defined criteria.

    • The in-house testing would begin July 1, and feedback would be requested from the entire staff. In addition, the task force has identified a few likely candidates among the staff to be “super-testers,” and we will ask for other volunteers at the staff sessions. While we hope the entire staff will participate in the evaluation and provide feedback, the task force will rely on the super-testers to thoroughly test and document all aspects of the product. The task force is particularly interested in feedback on the areas that were a problem at USF, notably the “My Space” function, the “MetaSearch” portal functions, and the “Quick Search” function. We will also encourage searchers to do comparison searching between the meta-search portal and individual databases in their native interfaces. Our goal is to identify and correct the most obvious problems before releasing the product. Once any problems are identified and corrected, we can make the service available to members of the university community for beta testing.

    • It is difficult to project a date for beta testing and for the ultimate goal of full implementation for the public. Much depends on the library staff’s reactions and findings. We recommend spending at least the month of July on this portion of the testing. Few students and faculty will be available in early August, and public service and instruction library staff would be unlikely to support any public usability testing during the intensely busy beginning of the fall semester. A possible public “beta” testing period might be later in the fall semester.

  2. Conduct Staff Sessions

    • The task force will conduct at least two sessions open to all staff. At these sessions, we will explain what a federated search is, briefly demonstrate a search across several databases, and demonstrate the other functions of the service. We will stress the importance of exploring the unique capabilities of MetaLib and thinking about how we might integrate it into our instruction and reference activities. We will also briefly touch on the problems at USF and the UNF decision, explain the need for the in-house test, encourage the staff to participate, and ask for more super-tester volunteers.

    • The task force also hopes to meet with collection management, public services, and the instruction groups. We have already talked to some of the instruction staff and agreed with them that a reinvention of the ENC1101 and ENC1102 class scripts is not advisable at this time. We are particularly interested in input from the collection managers concerning the categories and subcategories, as well as which databases should be assigned to each. Our library staff members know a good (or bad) product when they see one, and only with their invaluable input can we make the decision on whether to move forward with MetaLib.

  3. Continue to Research MetaLib 3 Results at Other Institutions.

    • During the staff testing and evaluation period, the task force will identify criteria for the evaluation of MetaLib and continue to identify other MetaLib 3 institutions and investigate any testing results or evaluative reports at those institutions. We are interested in discovering whether other libraries have had problems similar to those encountered by USF. We would also like to identify any usability testing conducted by others.

LeiLani Freund
Tom Minton
6/20/05
