By Scott Steward
Welcome back to the Gale Technical Solutions blog series!
This is part two of a two-part series.
In my last issue (Usage Part 1), I focused on Gale's definition of the usage metrics we report and how they are counted. If you haven't read it yet, you can find it here: http://blog.gale.com/raise-your-hand-if-you-want-to-know-more-about-usage/.
For this issue, I will continue my discussion of usage, focusing on the factors that affect it, such as discovery services, learning management systems (LMS), OpenURL, and direct links.
So let’s get started, shall we?
In the last issue we learned that the three most common usage metrics for electronic resources are Sessions, Searches, and Retrievals. Many factors in a library's ecosystem affect these numbers; let's look at a few.
The LMS (learning management system)
Integrating with an LMS is one of the best ways to drive usage. When your teaching staff (K12 or academic) understands the value of the resources provided by the library, they will use them in their lesson plans. They can add links directly to articles they would like their students to read for class. Not only will the students have access to great content, creating a greater understanding of the course material, but each time a student opens an article to read it, they generate a session and a retrieval.
Make navigating your website easy
Making it as easy as possible for your users to find the resources you've chosen for them will positively affect usage; a poorly organized resource page will do the opposite. For tips on organizing your electronic resource page, refer back to issue #3: http://blog.gale.com/gale-technical-solutions-organizing-electronic-resource-pages/.
Remove barriers to access
Make sure anyone who is a valid user can access your resources wherever they are (in the library or remotely) and by whatever method they want to use (desktop, tablet, or phone). This will increase usage across the board.
Add MARC records for all your resources to your catalog
If you allow your users to search for physical items in your collection, make sure they can find your digital collections as well. You can download your MARC records here: http://support.gale.com/.
Train your staff
You know the old saying about leading a horse to water? If your staff doesn't know which resource best fits different user needs, they won't have much success getting the horse (the user) to drink.
Make sure your staff understands how to use the resources and what content they contain. We have online training for almost all our resources available here: http://support.gale.com/training/?auth=1&page=/&loc=training.
So far, I have been talking about things you can do that will (for the most part) positively affect usage. Now let's head to the "dark side" and look at some things that negatively affect usage.
Factors that decrease usage: Discovery services
Many libraries invest in a discovery service because they know their users have come to expect a Google-like, single-search-box approach to research. But what many libraries don't realize is that for any resource not provided by the same vendor as the discovery service, usage will go down!
A few factors contribute to this. One is relevance ranking, akin to search engine optimization (SEO). Every search engine provider (Google, Gale, Yahoo, Bing, EDS, Summon, etc.) has its own proprietary algorithms ("secret sauce") for finding and ordering content in a result list.
Discovery services aggregate content from multiple sources (vendors), and in many cases they hold the same piece of content from several sources. When they need to choose which copy to push up in the result set, they use their secret sauce to decide which article should be shown. Many factors (metadata, weighting, source) are taken into account when determining which article to display (or link to) for any given result set; if you know what those factors are, you can make sure your content has all of them, and bip a dee bop a dee boo, your article is at the top. So again, in general, usage will go down for any resource that is not provided by the same vendor as the discovery service. This rule holds for any discovery service.
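To make the idea concrete, here is a toy sketch of how a discovery layer might pick one copy of a duplicated article. The factor names and weights are entirely hypothetical; every vendor's real "secret sauce" is proprietary and far more complex than a weighted sum.

```python
# Toy illustration of duplicate-record ranking in a discovery layer.
# Factor names and weights are invented for this sketch, not any
# vendor's actual algorithm.

def score(record, weights):
    """Weighted sum of the ranking factors a record satisfies."""
    return sum(weights[factor] for factor in record["factors"])

weights = {"rich_metadata": 3, "full_text": 2, "preferred_source": 5}

# The same article held by two different vendors:
copies = [
    {"source": "Vendor A", "factors": ["rich_metadata", "full_text"]},
    {"source": "Vendor B (same as discovery vendor)",
     "factors": ["rich_metadata", "full_text", "preferred_source"]},
]

best = max(copies, key=lambda r: score(r, weights))
print(best["source"])  # the discovery vendor's own copy wins
```

If one factor (here, coming from the discovery vendor's own platform) carries enough weight, that copy will surface first every time, which is exactly why usage shifts away from other vendors' resources.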
Another reason electronic resource usage goes down when a discovery service is in use is that no usage is recorded in the electronic resource until the user actually opens the article there. For example, say you have 20 resources being cross-searched in a discovery service. The user is not a very good researcher and searches five different times to find what they are looking for. At first thought, you would expect to see five searches in each of the 20 resources, right? Unfortunately, this is not true if the sources are not provided by the same vendor as the discovery service. Instead, you will see no usage at all in any of the 20 individual resources. Additionally, if you use EDS and have smart links turned on, then for content that is available in a subscribed EBSCO resource, you will see the EBSCO version by default, even if you filter the content set to a specific non-EBSCO resource.
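The counting behavior above can be sketched in a few lines. This is an illustration of the accounting logic only (the resource names and function names are made up, and real vendor logging is of course more involved): searches happen in the discovery layer, so the native resources never see them, and only opening an article registers anything.

```python
# Sketch of why discovery-layer searching records nothing in the
# underlying resources. The counter names mirror the three metrics
# from part one; this is an illustration, not any vendor's real logs.

from collections import Counter

native_usage = {name: Counter() for name in ["Resource-1", "Resource-2"]}

def discovery_search(query):
    """Cross-search in the discovery layer: no native usage is recorded."""
    return f"results for {query!r}"   # only the discovery layer logs this

def open_article(resource):
    """Only landing in the native resource counts there."""
    native_usage[resource]["sessions"] += 1
    native_usage[resource]["retrievals"] += 1

for _ in range(5):                    # the user searches five times...
    discovery_search("lincoln")

open_article("Resource-1")            # ...then opens one article

print(native_usage["Resource-1"])     # one session and one retrieval, zero searches
print(native_usage["Resource-2"])     # nothing at all
```

Five searches in the discovery layer produced exactly one session and one retrieval in one resource, and nothing anywhere else, which is why the library's native usage reports look like they have cratered.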
I recorded a screencast of searching in EDS for the term "Lincoln" and filtering to a specific Gale resource (Biography In Context). When I click on the title of the article (the most common user behavior), it takes me to a page that shows the article metadata at the top (including "database: Biography In Context"). Below the metadata is the full text of the document, which came from an EBSCO source. From a user perspective, this is good: they got to the full text, which is most likely what they wanted. From a library perspective, however, this is bad, because NO usage is recorded in Biography In Context, even though the user filtered to that specific resource. In fact, the only way to see this article in Biography In Context is to click on the link that says "View record from Gale's Biography In Context".
Which brings me to the next problem: failed links. In the example above, when I click the "View record in Gale's Biography In Context" link, I am taken to Biography In Context (which records a session), a search runs for the article (which records a search), and if the article is found and clicked on in the result list, a retrieval is created. However, in this example, the OpenURL does not pass the required pieces of metadata and fails, so the user never gets to the full text.
We have found that approximately 30% of all OpenURL links fail due to metadata mismatches (differences between the OpenURL source and the article in our system). This decreases retrievals, first from the failed links themselves, and then from users learning that links fail and choosing different resources. To fix failing OpenURL links, we are asking the discovery service vendors to change from OpenURL to direct links. A direct link is essentially an OpenURL with a single piece of metadata being passed: the article ID. Summon has switched over to these links, and we are seeing nearly 100% success rates for links between our systems. This is good because the user gets to the full text they wanted, but bad (from a usage perspective) because the link goes directly to the article, meaning searches will go down. EDS is in the process of switching to direct links. Our MARC records will change to this format in the 856 field, as will the metadata returned from an SRU/Z39.50 request.
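The difference between the two link styles is easy to see side by side. The sketch below is hypothetical: the resolver address and parameter names are placeholders, not Gale's actual endpoints, but the shape is the point. An OpenURL carries several metadata fields, any one of which can mismatch the target system; a direct link carries one unambiguous identifier.

```python
# Hypothetical comparison of an OpenURL (many metadata fields, any of
# which can mismatch) with a direct link (a single article ID). The
# base URL and parameter names are illustrative placeholders.

from urllib.parse import urlencode

BASE = "https://link.example.com/resolve"   # placeholder resolver address

def openurl(title, journal, volume, issue, start_page):
    """Classic OpenURL: the target must match every field passed."""
    params = {"atitle": title, "jtitle": journal, "volume": volume,
              "issue": issue, "spage": start_page}
    return f"{BASE}?{urlencode(params)}"

def direct_link(article_id):
    """Direct link: one identifier, little room to fail."""
    return f"{BASE}?{urlencode({'id': article_id})}"

print(openurl("Abraham Lincoln", "Biography Today", "12", "3", "45"))
print(direct_link("A123456789"))
```

With five fields in play, a stray subtitle, a renumbered issue, or a pagination difference between systems breaks the match; with one ID there is essentially nothing to mismatch, which is consistent with the near-100% success rate described above.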
Factors that decrease usage: Efficiency
As search algorithms become more and more efficient at finding content and pushing the correct content to the top of a result list, usage will go down. Let me say that a different way: if the user has to search less to find what they want, usage will go down, but (in theory) user satisfaction will go up. The problem we all have is that for the last 20 or so years we have used sessions, searches, and retrievals as measures of user satisfaction. We falsely assume that if those metrics go up, so do user satisfaction and, therefore, the value of the resource.
So what are the best measures of user satisfaction and of the value of a resource? As a vendor, I don't think we are best suited to make that decision. We can make recommendations, yes, but I think the decision should come from the library community.
If you have any questions or comments about what I have discussed, leave me a reply below.