Transformative agreements represent a new way of doing business with publishers. Effective cost reduction proposals require deep insight into the value of content to a library’s users and into the marketplace in which the publisher operates. To support negotiations toward these dual aims of cost reduction and open access, libraries must develop data analysis tools and strategies that go beyond the standard return-on-investment measures commonly used to assess the value of traditional subscriptions.
Recommendations
1. Assess your analytics capability and determine whether you need additional help in this area.
Many libraries have limited capacity and expertise for the agile data analytics needed to develop and support transformative open access negotiation goals in a measured, deliberate way. Resources outside the library may be available to help with these efforts (e.g., financial analysts from the budget office).
2. Incorporate data analysis early in the strategy-setting process.
Data analysis can play an important role in shaping negotiation goals. The development of a negotiation strategy and data analytics framework should be iterative and mutually reinforcing.
3. Be prepared to gather diverse sets of data.
To create informed and persuasive transformative open access models, negotiation teams and data analysts will need both the data routinely used in licensed content negotiations and less commonly used types of data. Familiar datasets include COUNTER-compliant usage statistics, citation data, and journal subscription costs. Additional data necessary for transformative open access analyses include article-level data (such as corresponding author affiliation, grant acknowledgement statements, and open access status) and detailed information about journals (such as business models, list-price APCs, and the portion of content available open access).
4. Build flexible and customizable analyses and tools.
To enable iterative and responsive data analyses, develop diverse datasets and flexible, customizable analytics tools and models, ideally from the start. Dynamic worksheets can support a wide variety of parameters that can be added to or removed from analyses as needed (a minimal sketch follows these recommendations). When built well, such tools are transferable across publishers and scenarios, allowing the data work to evolve as priorities, knowledge, and circumstances change.
5. Consider all perspectives.
It is important to identify and investigate all stakeholder points of view and build outputs and visualizations accordingly. Analyses are most impactful when the outputs are tailored to the audience (for example, separate outputs from a single analysis can be provided to negotiation team members, the publisher, and interested faculty).
6. Verify publisher-provided data with local analyses.
Develop your own approach to traditional metrics, and perform your own analyses to confirm or refute any data the publisher may provide. Usage statistics, publication data, and other data points can be influenced by many factors, and local verification, including processes such as normalization and contextual interpretation, can allow for a deeper, more nuanced understanding of the available information.
7. Be as objective as possible.
Resist the urge to overlook or downplay aspects of the data that do not fit a chosen narrative. Address all analysis outcomes, even those that present challenges to the model, so the university and its stakeholders can move forward with confidence in the model.
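As a minimal illustration of the kind of flexible, parameterized tooling described in recommendation 4 (all names and figures below are hypothetical, not drawn from any actual UC tool), a cost model can be written so that scenario-specific adjustments are easy to add or drop:

```python
# Hypothetical sketch: a cost model whose adjustments can be added or
# removed per scenario, so one tool serves multiple publishers.
from typing import Callable

# Each adjustment takes a projected annual cost and returns a revised one.
Adjustment = Callable[[float], float]

def inflation(rate: float) -> Adjustment:
    return lambda cost: cost * (1 + rate)

def offset_credit(credit: float) -> Adjustment:
    return lambda cost: cost - credit

def project_cost(base_cost: float, adjustments: list[Adjustment]) -> float:
    """Apply each scenario-specific adjustment in order."""
    for adjust in adjustments:
        base_cost = adjust(base_cost)
    return base_cost

# Scenario 1: 3% inflation only; Scenario 2: inflation plus an APC offset.
print(project_cost(1_000_000, [inflation(0.03)]))
print(project_cost(1_000_000, [inflation(0.03), offset_credit(150_000)]))
```

Because each adjustment is an independent piece, a new parameter can be introduced without reworking the model, which is what makes the same tool reusable across publishers and scenarios.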
Key UC documents and resources
- UC and Elsevier: A blueprint for publisher negotiations, April 8, 2019
UC’s experience
In preparing for and throughout its negotiations with Elsevier and other publishers, advanced analytics helped UC understand what transformative agreements would mean for various stakeholders, how potential agreements would correspond with local priorities, and where there might be risk or uncertainty to address. Extensive data analysis supported stakeholder engagement (internally, with faculty, library leadership, and administrators, and externally, such as with colleagues at other university libraries) and helped the negotiating team operate on an equal footing with the publishers.
Analyses conducted in support of UC’s negotiation with Elsevier and other publisher negotiations included the following:
Value-based analyses, such as:
- Competitor comparison between Elsevier and a major competitor with a similar mix of high-prestige and more modestly valued journal titles: UC mapped journal cost against usage and impact factor, finding that Elsevier had more high-cost titles than the competitor, particularly at the lower ends of the scales (less-used and lower-impact-factor titles). Elsevier was also compared directly to its competitor on select metrics, including cost per article, which was found to be 15% higher for Elsevier.
- Pricing regression analysis: Conducted across all of UC’s major journal package holdings, this analysis measured and quantified the strength of the correlations between price and various journal quality metrics (including title count, article count, usage, citations, and impact factor); a sketch of this kind of regression appears after this list.
- Authorship impact analysis: UC used data from Clarivate’s Journal Citation Reports and Web of Science to recalculate the impact factor of Elsevier’s top journals with UC-authored papers removed from the calculation, thereby quantifying the value of UC authorship to Elsevier’s journals (the arithmetic is sketched after this list). Removing UC-authored content reduced impact factors by an average of 4.4% across Elsevier’s top ten journals, most significantly in Lancet-branded and Cell Press journals.
- Usage and alternative access analysis: Because UC had perpetual rights to pre-2019 content for most of the journals to which it subscribed, the team used JR5 COUNTER reports to isolate current-year usage, normalized to eliminate unusual download activity. This data was further normalized to account for repeat use by individual users, based on log analyses performed and reported by Ted Bergstrom at UC Santa Barbara for other projects [1], as well as UC’s own research on user interfaces and the relationship of HTML and PDF downloads [2]. The team then studied the literature on Big Deal cancellations to estimate how much of this usage might translate into interlibrary loan requests [3], and built projection tools to estimate potential post-cancellation access costs; a simplified version of this chain of adjustments is sketched after this list. (UC’s estimates have so far been based on the cost of access if supplied through Reprints Desk.)
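A pricing regression of the kind described above might be sketched as follows. The journal metrics and prices are invented placeholders, and UC’s actual models may have used different variables and functional forms; this is only a sketch of the technique:

```python
# Sketch of a pricing regression: how strongly do quality metrics
# (usage, citations, impact factor) predict journal price?
import pandas as pd
from sklearn.linear_model import LinearRegression

# Invented example data; real inputs would come from license and usage records.
journals = pd.DataFrame({
    "usage":         [12000, 3400, 560, 8900, 150],
    "citations":     [4200, 900, 120, 3100, 40],
    "impact_factor": [6.1, 2.3, 0.8, 4.7, 0.4],
    "price":         [8200, 3100, 1400, 6800, 900],
})

X = journals[["usage", "citations", "impact_factor"]]
y = journals["price"]

model = LinearRegression().fit(X, y)
print(f"R^2 = {model.score(X, y):.2f}")              # share of price variance explained
print(dict(zip(X.columns, model.coef_.round(3))))    # per-metric price sensitivity
```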
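The authorship impact recalculation reduces to simple arithmetic on the standard two-year impact factor definition. The counts below are invented for a single hypothetical journal; UC’s actual method, built on Journal Citation Reports and Web of Science data, may have handled edge cases differently:

```python
# Hypothetical sketch: recompute a journal's two-year impact factor with
# one institution's papers (and the citations they received) removed.
def impact_factor(citations: int, citable_items: int) -> float:
    # Standard definition: citations in year Y to items from Y-1 and Y-2,
    # divided by the number of citable items from Y-1 and Y-2.
    return citations / citable_items

# Invented counts, not real journal data.
total_citations, total_items = 52000, 4000
uc_citations, uc_items = 3100, 180   # citations to, and count of, UC-authored items

baseline = impact_factor(total_citations, total_items)
without_uc = impact_factor(total_citations - uc_citations, total_items - uc_items)
print(f"IF {baseline:.2f} -> {without_uc:.2f} "
      f"({(baseline - without_uc) / baseline:.1%} attributable to UC authorship)")
```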
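The usage and alternative access projection chains several adjustments together. A minimal sketch follows, with every rate and cost invented for illustration (the studies cited in the notes report their own figures, and UC’s projection tools were certainly more detailed):

```python
# Hypothetical sketch of a post-cancellation access cost projection:
# current-year downloads -> normalized unique uses -> projected ILL
# requests -> estimated document delivery cost.
def projected_access_cost(current_year_downloads: int,
                          repeat_use_factor: float,
                          ill_conversion_rate: float,
                          cost_per_request: float) -> float:
    unique_uses = current_year_downloads * repeat_use_factor  # discount repeat and HTML+PDF double counting
    ill_requests = unique_uses * ill_conversion_rate          # share of lost uses that become ILL requests
    return ill_requests * cost_per_request

# All parameters below are invented placeholders, not UC's figures.
print(projected_access_cost(
    current_year_downloads=250_000,
    repeat_use_factor=0.6,      # e.g., deflator derived from log analyses
    ill_conversion_rate=0.05,   # e.g., estimate from Big Deal cancellation literature
    cost_per_request=35.0,      # e.g., per-article document delivery fee
))
```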
Publication modeling, such as:
- Institution publishing analyses: UC analyzed publisher-specific publishing patterns to inform strategy and to choose appropriate publisher partners for transformative agreements (a schematic of the underlying data assembly is sketched after this list). Data for this analysis came from Web of Science (bibliographic, affiliation, and grant funding data), Unpaywall (article open access status), Crossref (to impute data for journals not covered by Web of Science), Essential Science Indicators (subject data), and a combination of ad hoc sources and manual work to normalize and classify journal and publisher names.
- Offsetting worksheets by publisher: These worksheets combined UC publishing data with journal title lists and business data (such as the applicable open access model and list-price APC); a simplified version of the calculation is sketched after this list.
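As a schematic of the data assembly behind the institution publishing analyses (sources, field names, and records here are simplified and hypothetical), article records from one source can be joined to open access status from another on DOI:

```python
# Hypothetical sketch: join bibliographic records to OA status by DOI,
# then count corresponding-author articles per publisher.
import pandas as pd

# Invented stand-ins for Web of Science and Unpaywall extracts.
articles = pd.DataFrame({
    "doi": ["10.1/a", "10.1/b", "10.2/c"],
    "publisher": ["PubX", "PubX", "PubY"],
    "uc_corresponding_author": [True, False, True],
})
oa_status = pd.DataFrame({
    "doi": ["10.1/a", "10.1/b", "10.2/c"],
    "is_oa": [True, False, False],
})

merged = articles.merge(oa_status, on="doi", how="left")
summary = (merged[merged["uc_corresponding_author"]]
           .groupby("publisher")
           .agg(articles=("doi", "count"), already_oa=("is_oa", "sum")))
print(summary)
```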
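In its simplest form, a per-publisher offsetting worksheet reduces to the arithmetic below; the fee structure and all numbers are invented for illustration and do not reflect any actual agreement:

```python
# Hypothetical sketch: combine projected UC article output with list-price
# APCs to estimate the publish-side cost of an offsetting agreement.
journal_apcs = {"Journal A": 3200.0, "Journal B": 2100.0}   # list-price APCs
projected_articles = {"Journal A": 40, "Journal B": 95}     # expected UC output

reading_fee = 200_000.0   # residual access fee under the hypothetical deal
apc_discount = 0.10       # negotiated discount off list price

publish_fee = sum(journal_apcs[j] * n * (1 - apc_discount)
                  for j, n in projected_articles.items())
print(f"Projected total: ${reading_fee + publish_fee:,.2f}")
```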
Notes
1. Bergstrom, T., Uhrig, R., & Antelman, K. (2018). Looking under the COUNTER for overcounted downloads. UC Santa Barbara: Department of Economics.
2. Li, C., & Wilson, J. (2015). Inflated journal value rankings: Pitfalls you should know about HTML and PDF usage. California Digital Library, University of California. Presented at the ALA Annual Conference, San Francisco, 2015.
3. Nabe, J., & Fowler, D. C. (2015). Leaving the “Big Deal”…five years later. The Serials Librarian, 69(1), 20-28. DOI: 10.1080/0361526X.2015.1048037; Pedersen, W. A., Arcand, J., & Forbis, M. (2014). The Big Deal, interlibrary loan, and building the user-centered journal collection: A case study. Serials Review, 40(4), 242-250. DOI: 10.1080/00987913.2014.975650.