Measuring the Quality of OpenURLs: An Interview with Adam Chandler
In December 2009, NISO’s Business Information Topic Committee approved the establishment of a new working group called IOTA—Improving OpenURL Through Analytics. Chaired by Adam Chandler, E-Resources & Database Management Research Librarian in Central Library Operations at Cornell University, the working group will build on work previously conducted by Adam at Cornell. Jim LeBlanc, Director of Delivery & Metadata Management Services and Adam’s colleague at Cornell, talked to him about the work he had already done and the follow-up project at NISO.
Let’s start with something simple, Adam. What are OpenURLs?
Back in the 1990s, the only way to link from an article citation to a full-text document was through something called bilateral linking. Each vendor needed to pre-compute and maintain all the links between their site’s content and every other vendor site they linked out to. Then Herbert Van de Sompel and his colleagues at Ghent University came along and figured out a way to pass metadata to software that knows something about a library’s collection—a method for exchanging the information needed to help a patron answer the question: does the library have access to this resource, print or electronic, and if so, where? They essentially moved the job of maintaining the links to a brand new node in the supply chain, one optimized for the task: the “link resolver.” Then they proposed a standard for the syntax of this “OpenURL” that would allow for predictable transfer of the resource’s metadata.
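To make the idea concrete, here is a minimal sketch of how a source site might assemble an OpenURL in the standard’s key/encoded-value (KEV) form, packing the citation metadata into query parameters aimed at a library’s resolver. The resolver base URL is hypothetical—each library runs its own—and the citation values are just the article at hand used as sample data.

```python
from urllib.parse import urlencode

# Hypothetical resolver address; in practice each library configures its own.
RESOLVER_BASE = "https://resolver.example.edu/openurl"

def build_openurl(atitle, jtitle, issn, volume, issue, spage, date):
    """Assemble an OpenURL 1.0 query string in KEV (key/encoded-value) format."""
    params = {
        "ctx_ver": "Z39.88-2004",                      # ContextObject version
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # journal-article metadata format
        "rft.atitle": atitle,   # article title
        "rft.jtitle": jtitle,   # journal title
        "rft.issn": issn,
        "rft.volume": volume,
        "rft.issue": issue,
        "rft.spage": spage,     # starting page
        "rft.date": date,       # year of publication
    }
    return RESOLVER_BASE + "?" + urlencode(params)

url = build_openurl("Measuring the Quality of OpenURLs",
                    "Information Standards Quarterly",
                    "1041-0031", "22", "2", "10", "2010")
print(url)
```

Because all the metadata travels in the URL itself, any source site can emit such a link without knowing anything about the target library’s holdings—that knowledge lives entirely in the resolver.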
The development of OpenURLs was hugely successful, because it addressed what was known as the “appropriate copy problem,” a term that refers to the inadequacy of standard URLs to lead a user from the citation of an article to the most suitable full-text copy of that article. Commercial link resolver software was developed in the early 2000s to take an incoming OpenURL and: (1) determine if the library has a subscription to the journal in question, and (2) if so, present a new URL to the library patron that will connect him or her to full text—or to the library catalog or an interlibrary loan request form, if full text is not available. In 2004, the original OpenURL specification was generalized into a formal standard, ANSI/NISO Z39.88-2004, The OpenURL Framework for Context-Sensitive Services.
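The two-step resolver logic described above can be sketched in a few lines. This is an illustrative toy, not any vendor’s implementation: the holdings table, coverage rule, and target URLs are all invented for the example.

```python
# Toy knowledge base mapping ISSN to the library's subscription coverage.
# Real resolvers consult a maintained holdings database.
HOLDINGS = {
    "1041-0031": {"coverage_start": 2005},  # invented coverage data
}

def resolve(citation):
    """Return the best target URL for a parsed OpenURL citation dict."""
    holding = HOLDINGS.get(citation.get("issn"))
    year = int(citation.get("date", 0))
    if holding and year >= holding["coverage_start"]:
        # Step (1) succeeded: the library subscribes and the year is covered,
        # so step (2) links the patron to full text.
        return (f"https://fulltext.example.com/"
                f"{citation['issn']}/{citation['volume']}/{citation['spage']}")
    # No appropriate copy: fall back to the catalog / interlibrary loan form.
    return "https://library.example.edu/ill-request"

print(resolve({"issn": "1041-0031", "date": "2010",
               "volume": "22", "spage": "10"}))
```

The value of the design is that this decision runs at the library’s own node, where the holdings knowledge lives, rather than at every vendor site.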
What’s your specific interest in OpenURLs and quality metrics?
OpenURL was a genuine breakthrough and innovation for libraries. In 2009, Cornell patrons alone clicked on about half a million OpenURL citation links. In a talk last year, Herbert mentioned that a conservative estimate is that over a billion OpenURL requests are made by library patrons every year. The access these links provide can be very satisfying for library patrons, but bad links can be extraordinarily frustrating. Many vendors offer OpenURL links on their sites, but after the links go out to library link resolvers, the vendors have no idea what happens. They get no systematic feedback and don’t know if library patrons are able to successfully access resources from their links. The aim of my project is to devise a method to provide feedback to vendors regarding the quality of the metadata content they’re sending out, because the reality is OpenURLs don’t work 100% of the time. Some OpenURL providers are better at supplying complete and accurate data than others. Nobody knows how often patrons are successful when they click on an OpenURL.
Publication data
DOI: 10.3789/isqv22n2.2010.10
Volume 22, Issue 2 (Spring 2010)