
NFAIS Humanities Roundtable VII

October 20, 2008

9:05am - 9:45am:   Digital Initiatives in the Humanities

The opening keynote will provide an overview of the proliferation of digital information initiatives in the Humanities that is taking place in academic institutions across the United States.  Issues such as the impact of collaboration and social scholarship, the rise of new information tools such as From the Page, and efforts in digital modeling, text mining, and the visualization of search results will be discussed.  Content providers and users of humanities information will be given a glimpse of the new types of content and technologies that will need to be integrated into existing products and services in the not-too-distant future.

Julia Flanders, Director, Women Writers Project, and Associate Director for Textbase Development, Scholarly Technology Group, Brown University


History:  The naturalization of digital methods, through the ability to use diacritics and images.  The shift from individual to group and back: from one person creating an idiosyncratic product, to groups of scholars, to the aggregate layer and cloud computing.  The emergence of standards such as TEI, METS, and MODS.  Eventually, a digital discourse has arisen from within the humanities itself.


Current: awareness of needs for standards, and an urgency about the need for long-term preservation.  There is a lot of excitement about data mining and social networking, and an interest in credibility and reward.


Key topics today: scale, genres of scholarship, and collaboration.


Scale: Small-scale projects are individual scholarly monographs, such as the Map of Early Modern London, which was indexed by one scholar who then invited other scholars to add their knowledge to the work.  A digitized scholarly edition is another example of a small-scale project.  Beyond that is a website on Saint-Jean-des-Vignes covering the architecture, archaeology, and literature of the area; its narrow focus in space and time renders it still a small-scale project.


Medium-scale projects are thematic in nature: the Women Writers Project, the Walt Whitman Archive, the Blake Archive, or the Chymistry of Isaac Newton.  They operate locally; their use of standards is inward-looking, and the habits of scholarship are local rather than global.


Large-scale projects function as global initiatives, looking for an aggregator effect.  Some examples are MONK, Fedora Commons, JSTOR, Portico, and Aluka.  Here there is an obviously larger field of study and diminished access barriers.  There are also federated, unified interfaces such as NINES and Bamboo, and Shibboleth, which handles authentication and access across institutions.  Flanders described products like NINES and Aluka as the "tip of the iceberg"; MONK "contributes," Portico "assimilates," and Fedora "interconnects."  The common thread is that these projects go beyond individual works or themes to operate at an umbrella level.


Genres of scholarship: the monograph; the edition (which has always been a database, even in print); the database; structured data (XML); and the rhizomatic, which Flanders described as connected from the bottom up, an example being Zotero.

Major problems: poor searching of digital collections, lack of economic sustainability, and the complications of licensing and fundraising.


In response to a question about institutional repositories, Flanders remarked that they are a fallback, but only a fallback.


9:45am - 10:15 am:    Search and Retrieval Expectations in a Google Environment 
Today’s students and young professionals are digital natives.  Their information search and retrieval skills are based on a lifetime of use of intuitive search engines and web-based information.  What products do they use and why?  What do they see as the barriers to utilizing traditional information products and services?  Do they use libraries?  Learn the answers to these questions and more.  Gleaned from the results of surveys and focus groups, the data discussed in this session will give attendees a better sense of the information search and retrieval expectations, assumptions, and needs of the rising population of Google-trained research students.

John Law, Vice President, Discovery Services, ProQuest


ProQuest conducted an 18-month project to investigate database use.  The major findings were: the user is in the driver’s seat, and Google and Amazon set the bar.  Users want seamless access and continuity of searching; the online library environment must keep pace with escalating user expectations or risk losing its audience.  ProQuest used an unobtrusive method of observation: the only sign was a toolbar at the top of users’ screens, and users soon forgot they were being observed.  The observers were not identified with ProQuest or the university library.  Law described it as ethnography, field studies of users in their native environment.  Users were observed for a total of 90 minutes.


One good finding was that library outreach works.  In a follow-up survey, ProQuest (still unidentified) asked a graduate student whether he had long been using scholarly resources in the way he had been observed.  He replied no: a librarian had visited his class six weeks prior.  Unfortunately, sometimes this outreach backfires, as in the case of a student using JSTOR to search for biology resources.  When asked why, the student replied that his English 101 instructor had recommended JSTOR, so he used it for everything, obviously with less than good results.  This last example also shows that students sometimes have brand awareness, but little to no database awareness.


A mixed blessing for libraries: the vast majority of users attempted to use the library website.  If they could figure out the way in, they tended to have no further problems; however, most had real trouble navigating the site to find appropriate e-resources.  ProQuest found that the catalog search box is front and center on most library websites, and users did not understand that they had to go to a different search screen to access articles.  When a catalog search returned no articles, they assumed the library had nothing of interest to them.  Federated search was far too slow; users moved away from it while waiting and tended not to come back.


A further ProQuest survey investigated how users were using Google.  Unfortunately for libraries, Google is the primary research tool when the library catalog has failed the user.  Otherwise it is used for supplemental research and for handy look-ups, most frequently to locate known resources or to get specific answers.  Another finding was a feeling that Google suffices when quality is not a concern.


Most important for libraries:  Users are insufficiently aware of library e-resources, and they have had bad experiences with them.  The end-user survey had 10,000 respondents from 7 universities.  62% felt that resources outside the library website were more useful; 90% preferred Google for quick look-up and fact-finding.  While it was true that the library has a vast amount of resources, one needed to work very hard to get to them.  Law then mentioned the Ithaka report, sharing the news that faculty are decreasingly dependent on libraries and that the library’s role as a gateway to knowledge has declined with the rise of disintermediated searching.  Another report Law mentioned was by Simon Inger; an audience member thought it was entitled something like “How users navigate scholarly resources.”


What libraries need to do:  make resources discoverable; simplify the e-resources web page design; address the misperception that the catalog has everything; build awareness of the resources.


10:15am - 10:45am:   Article Discovery in a 21st Century Library Environment

Today’s libraries are evolving to meet the needs of a digital information society and they are redesigning the functionality of library information management systems.  As a result, new business opportunities are emerging.  By highlighting access to books, videos, music, articles and other media for individual users, OCLC is creating in WorldCat a new kind of resource-sharing facility.  The WorldCat platform encourages and incorporates user-generated contributions, and provides applications for social networking sites such as Facebook.  Learn more about how a traditional library service can be transformed into an information discovery tool for digital natives in the 21st century.

Janet Weber, OCLC Online Computer Library Center


Users want information where they are; they do not recognize authoritative content.  In 2007, OCLC launched WorldCat.org.  Now there are 2 million unique users per month, 13 million page views, and 6 million full views.


WorldCat Local has been customized for local discovery and interoperability.  The University of Washington was a pilot site for WorldCat Local; it has experienced a 70% increase in borrowing and a 100% increase in ILL requests.  UW saw 2 million pageviews from WorldCat Local and is investigating adding a metasearch component.  WorldCat is the second most used e-resource there, after Web of Science.  MIT, Harvard, Cornell, UCLA, and Ohio State will add WorldCat Local in the near future.  In July 2009, OCLC will merge four different platforms for reference services into one user interface.


11:00 am - 11:45am:   Building Information Products for Today’s Users: Leveraging Student Input 

With the born digital generation now moving into faculty and research positions, it is essential that information products and services meet the needs and expectations of this new generation of information seekers.  An earlier session in the agenda focused on the information search and retrieval needs of young users.  This session will focus on what information resources need to look like in order to engage the modern user and how information providers can elicit useful feedback from advisory groups for their own products and services.

Kate Wittenberg, Manager, Publishing, Center for Digital Research & Scholarship, Columbia University; and Megan French, graduate student in history at Columbia


Wittenberg wanted to look at student use of scholarly communication; with French, she was able to investigate usage where the students live (somewhat like the ethnography that ProQuest did).  French was able to connect with and ask questions of undergraduates without skewing the results.  Wittenberg and French found that the library is a social networking and work space; the wireless network is one of the major reasons students go to the library, and it is also a community of their peers.  Wittenberg feels that lessons can be learned from multi-player gaming environments to enable libraries and publishers to meet the needs of digital natives, furthering the use of electronic devices for information access.

Issues: New ways to organize, store, and deliver information

Tools and functionality are as important as the content

Need to rethink publishing models


Scholarly publishing’s past: Control of content discovery and delivery as well as content creation; focus on protection of content for traditional uses; avoidance of partnerships with commercial enterprises; disapproval of students’ use of technology as “entertainment.”


Scholarly publishing’s future:  Digital publications that allow exploration of Web resources with selection and quality guidance; storage and delivery for remote access on multiple devices; interoperability with online networking and gaming communities; close communication with users.


How to move forward: Partnerships among educators, technology creators, and the publishing industry; guidance from the user community through intensive observation and conversation; understanding of the users’ other online environments for searching, social networking, and gaming.


Of prime importance is to show the credibility of content.  Students are very experienced at finding resources, but much less experienced at assessment.  What is the role, if any, of teachers and librarians? 


What we have at present is top down assessment—experts evaluate the quality of content prior to publication; what we need is peer-to-peer review, where the community decides the value of content, whether it be music, research materials or learning resources.  The traditional model leaves the end user out of the process; the peer-to-peer puts assessment in the hands of the community of users.  Is there some way to combine the approaches, to acknowledge the peer-to-peer as well as the top-down?


Possible approaches to the credibility issue:  educational resources that combine teachers’ materials, digital library holdings, the open Web, and a collaborative community space.  Students gain skills in examining the provenance, authenticity and context of content that they use in their learning.


Case Study: the Amistad Digital Resource.  New York State legislated that black history must be taught at the K-12 level, but there were no materials for teachers to use.  The Amistad Digital Resource was created to fill the void.  The project has created 9 modules with 15 units.  It provides a basic narrative, created by experts in the field, as well as local portals for information, allowing for the use of multimedia.  The idea was to create something teachers could feasibly use to build a curriculum.


Google is a model, not an end.  We must adapt to the ways of providing feedback that our target audience is used to; we need to work backwards from the student/teacher perspective so that we create new digital models that better serve the needs of our users.  We need to allow the communities of scholars to talk back to the digital products so that they can be modified.


Next Generation Publishing:  It is no longer possible to be a publisher without a close partner in librarianship and technology.  There is a lot of opportunity to understand our users.  We need facilitators, who can work in collaborative partnerships, and translate across the lines.


12:45pm - 1:30pm: Social Networks, Online Communities, and Immersive Virtual Worlds 

Social networks are part of the mainstream, attracting participants across all age groups to a life observed at least partially through such branded services as Facebook, LibraryThing, and Google as well as services offered through such white label platforms as Ning.  Learn about the potential for these services in the context of the Web and your targeted audience.  Jill O'Neill will share observations and insights from one participant's experience in worlds where grey-suited executives and animated figures formed through computer graphics share equally in the benefits of the online environment.  

Jill O'Neill, Director, Planning & Communication, NFAIS

“Participating at the Renaissance Faire”

Some social media are open, such as Amazon or Blogger; others are closed, like LawLink or the American Chemical Society’s network, where one has to be a member to participate.  Some, like LibraryThing, have a membership, but no credentials are required to join.


Social networking is moving from the individual level to distributed networking, with more aggregated than proprietary platforms.  There is also a move to mobile social networking (Twitter can be accessed by mobile phone, and the like), to social web browsers, and to aggregators such as FriendFeed, which combines several social networking sites into one view.


Concerns:  Control, ownership, privacy, and governance.  One accretes one’s presence over time and across platforms in these social networks.  There has been a move to be more authentic in the interests of productivity and sociability, but there is a loss of privacy.  While privacy controls exist, the data remain in the control of the provider.  However, some users feel that the benefits outweigh these problems.  In 2006, three people on LibraryThing started a virtual pub called the Green Dragon; by 2008, membership had grown to over 1,500.  At some point, people reached a level of trust at which they began using their real names and set up face-to-face meetings.


Metrics are uncertain, although some sites increasingly include some statistics of activity in the user’s profile.


One of the benefits is the ability to communicate asynchronously across time zones and borders; a further benefit is an ability to expand an existing network and exchange information.  The 2008 Digital Future Project showed that, of those surveyed, 54% visited an online community at least once a day; 71% said that the community was very important to them, and 55% said they felt strongly about its importance.  There was no discernible difference between those over 55 years of age and those under that age.


On Ning, one can set up one’s own community.  The major point to remember is that these communities grow organically and cannot be forced.  Half of the impetus can come from the organizer, but the other half will come from the participants, as in the Green Dragon example above.


Questions to ask before setting up an online community:

Why are you creating this? For example, LinkedIn is a community of professionals in a certain discipline.

Who is going to manage this community?  It takes tech support, responding to problems quickly, fostering interactions, and ensuring a pleasant environment (no trolls).  One has to be willing to relinquish control, making the space available to participants, and not getting in their way.

One has to build a compelling site, with easy navigability, functionality, and effective use of graphics.  One must have compelling content, and provide plenty of options that allow users to choose the depth of their involvement and participation.  There is also a need to build a way for people to meet physically if they are interested in doing so.


Immersive virtual worlds like Norien or Second Life:

1.5 million registered users in Second Life; there are 68,000 simultaneous users.  There is more than $1,000,000 in user-to-user transactions.  It has been predicted that 80% of internet users will have a Second Life presence by 2011, although O’Neill thinks that prediction is somewhat inflated.


Current barriers:  There is a steep learning curve for some of these online communities; they lose people within the first 3.5 hours.

The systems are not yet browser-based.

Metrics need to indicate ROI (return on investment).

Standards and interoperability need to be hammered out.

Security of data is a major concern.


1:30pm - 3:00pm:   Integrating Web 2.0 with Content Delivery

Web 2.0 technologies are essential channels for both communication and the distribution of information in today’s web-based environment. This session will highlight three innovative organizations that have incorporated Web 2.0 technologies (wikis, blogs, podcasts, etc.) into their regular content delivery and communication channels.  They will discuss the challenges that they faced in the implementation, the opportunities that have resulted, and the response of their users to date.

Scott Jaschik, Editor, InsideHigherEd; Michael Ross, Senior Vice President/Education General Manager, Encyclopaedia Britannica; John Houser, Senior Technology Consultant, PALINET


InsideHigherEd is a website with news and free job ads for jobseekers to peruse; the employer pays for the job ads.  It is a community with a diversity of issues, and they receive comments on every story.  The commenters are overwhelmingly male and conservative.  However, the biggest enhancement has been the addition of blogs on the site, which are not as rigid as the stories.  There is no one approach; some of the bloggers are anonymous, and Jaschik is the only person who knows who they are.  The most popular are Mama Ph.D. and Confessions of a Community College Dean.  InsideHigherEd has also looked into Facebook, which it uses mostly as an interview tool to get a large audience response to a particular question; Twitter, which does not garner a big response; and RSS feeds, which are subscribed to by fewer than 1% of the membership.

Britannica takes a much more traditional approach.  As Ross said, Google and Wikipedia are agnostic about Britannica, but Britannica cannot be agnostic about them.  Trends are critically important, and a disturbing one is the downward trend of library use.  Ross remarked that schools are far too wary of students’ experience with technology: high school students walk into school with more technology than the school has and are told to put it all away.  He mentioned that there is “50 times more stuff behind firewalls than outside of them.”


Britannica uses blogs, but in a much less free-form manner.  The blogs are written by experts in the field, so that Britannica’s authority and trustworthiness stand behind them.  The blogs actually function more like the expert narrative that French talked about in the Amistad project at Columbia, offering a professional narrative plus websites seen through the filter of authority.  Ross was concerned that users still go to Google; he also voiced concern that the subscription model may not survive as users become less interested in buying ad-free information.


Houser talked about setting up podcasts at PALINET, a non-profit organization employing about 20 people and serving about 600 libraries in the mid-Atlantic region.  Why podcast?  Their clients wanted to access information on a time-shifted schedule, in a way that suited learning styles other than having everyone in one place at one time.


PALINET began interviewing experts in 2005 and moved to a blog format in 2006.  The podcasts can be accessed via iTunes or RSS feed.  There are 41 podcasts in 6 series, ranging in length from 3:52 to 82:34.  To date, PALINET podcasts have been accessed 13,317 times via the web and 1,685 times via RSS.  They are distributed through blog entries: PALINET has 9 blogs, 3 wikis, and one podcast series.
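RSS distribution of podcasts, as described above, relies on the RSS 2.0 enclosure element, which is how clients such as iTunes discover the audio file behind a feed entry. A minimal sketch in Python follows; all titles and URLs are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Build a minimal RSS 2.0 feed with one podcast episode.
# The <enclosure> element carries the audio file's URL, size, and MIME type.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Library Podcast"
ET.SubElement(channel, "link").text = "https://example.org/podcasts"
ET.SubElement(channel, "description").text = "Interviews with experts"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Episode 1: Expert Interview"
ET.SubElement(item, "enclosure",
              url="https://example.org/audio/ep1.mp3",
              length="12345678", type="audio/mpeg")

print(ET.tostring(rss, encoding="unicode"))
```

A feed like this, linked from a blog entry, is all a podcast client needs to offer subscription and automatic download.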


Assumptions about users who access the podcasts: working professionals, older, may not have an MP3 player, unlikely to have a portable video player, overwhelmingly English speakers.  They use the podcasts for lectures, support, and distance education.


Reasons not to podcast:  Make sure you have a user population that wants it, because creating each podcast is time-consuming (about 3 hours).  It is not particularly expensive as far as materials: a USB microphone and the Audacity audio-editing software.


Make sure you have a feedback loop—you need a prominent way for people to comment on the usefulness (or not) of the podcasts.


PALINET has heard from its clientele that the podcasts are their most popular offering.  They have had a positive effect on PALINET’s consulting and educational programs.


3:15pm - 4:00pm:    Authoritative Content: The Value of Abstracts

There is much discussion, particularly in the abstracting and indexing community, about the user-perceived value of abstracts and indexes in a world dominated by search engines and the Web.  In this session the results of a recently completed survey regarding the user-perceived value of bibliographic records and abstracts as a contribution to the body of authoritative content will be discussed. The research is based upon a survey of users of the International Bibliography of Art.

Terence Ford, Head, Research Databases, Getty Research Institute


A study was done to justify continuing the work of creating the Bibliography of the History of Art (BHA) to the Getty administrators.  The BHA adds 24,000 records a year and is 17 years old.  It covers articles in journals, monographs, and exhibition catalogues, and is distributed via Ovid, NISC, and OCLC.


There were over 2,000 respondents to the survey: museum professionals, scholars, and students.  The databases they searched most frequently were Google, JSTOR, BHA, and Art Index.  Neither age, role, nor research objective was an indicator of search-tool choice.  Users indicated that the most useful features were the abstracts, index terms, and source lists; the most used documents were peer-reviewed journals, exhibition catalogues, and books.


The survey results did not surprise the BHA creators, but they found it useful to have numbers to give the administrators in order to justify continuing the BHA.  Ford felt that the results showed that users value authoritative content, and that scholars need scholarly products.


4:00pm - 4:30pm:    Closing Keynote:  Authoritative Content in the Digital Age.

We live in a new information order.  Social networks abound, collaboration among scholars is becoming the norm, and the born-digital generation is now assuming the roles of faculty, scholar, researcher, and business leader.  As a result, the way in which authoritative content is presented, accessed, and retrieved must change to meet the expectations of the new generation who will now use it.  This session will discuss the future of collaborative approaches to information discovery, the design of intuitive and transparent search tools, and the need for information environments that engage the user and encourage information exploration.

Stephen Francoeur, Digital Reference and Information Services Librarian, Newman Library, Baruch College


Francoeur began by delineating the user population he serves at Baruch: mostly commuter, most diverse community in the CUNY system, immigrants or children of immigrants, speaking over 100 different native languages.


Web 2.0 is about sharing and re-using.  The 2008 Horizon Report from Educause shows growing use of Web 2.0 and social networking combined with collective intelligence and mass amateurization, which is gradually but inexorably changing scholarship.


Web 1.0 gave us the URL, the coin of the realm, which we bookmarked, emailed, cited, saved, or linked to.  We are no longer giving users the kind of URLs they need, which is the reason for the rise of TinyURL.  Francoeur looked at some of the URLs in our bibliographic databases.  Academic Search Premier and ABI/Inform had huge URLs that were impossible for users to cut and paste efficiently into a browser.  Library Literature was even worse, including the entire citation in the URL.  JSTOR had a small, easily copied URL, and one vendor’s results page has a handy little box: “bookmark this.”


The “last mile” problem is the biggest: we need to find ways to help users figure out how they can access things through their library.  If Cambridge can tell that I cannot access an article, why can’t it tell me how I can access it?  We are surrounded by networks like delicious, Connotea, and Nature Network, where one can take a tag and subscribe to it, find a community one trusts, and add to one’s network.  Why can’t our library websites function in the same way?


Clay Shirky said: “It’s not information overload, it’s filter failure.”  Publishers used to be the filter, deciding what got published or not.  Now other people’s recommendations become the filter, which is dangerous unless there is a trust community.  Digg shows the ratings of sites; even the New York Times shows the most emailed and the most blogged.  Why not the top ten downloaded articles for industrial psychology?  Amazon gives user reviews; why don’t we?  Users would recognize this kind of filtering and ranking, because they see it every day.


In BiblioCommons, you can find someone with the same tastes (perhaps just in poetry, not science fiction) and get notified every time they add something that might interest you.  Widgetbox is a clearinghouse of widgets; we should deploy widgets to draw users to the library website, and vendors should offer search widgets.


Recommendations:  More usable URLs for 2.0 uses

Prepend authentication to URLs rather than stopping users in their tracks

Expose content to the web, then help the users get back to their library

Help searchers make connections
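The second recommendation, prepending authentication, is the pattern used by library proxy products such as EZproxy, which commonly accept the target content URL as a query parameter on a login gateway. A minimal sketch, with a hypothetical proxy host:

```python
from urllib.parse import quote

# Hypothetical library authentication gateway (an assumption, not a real host).
PROXY_PREFIX = "https://login.proxy.example.edu/login?url="

def proxied(url: str) -> str:
    """Prepend the library's authentication gateway to a content URL,
    so an off-campus user is routed through sign-in instead of hitting
    a paywall dead end. The target URL is percent-encoded in full."""
    return PROXY_PREFIX + quote(url, safe="")

print(proxied("https://www.jstor.org/stable/1234567"))
```

The point of the recommendation is that links exposed on the open web can carry this prefix up front, rather than stopping users in their tracks at the publisher's site.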


4:30pm:      Adjourn

