Library Impact
The following is a summary of notes from a guided discussion on demonstrating library impact, held during the UNT Library Assessment Committee meeting on April 28, 2025.
What are ways you can demonstrate the impact of your work?
- Public Services
- Subject Librarians
- This often comes in the form of patron feedback about an interaction.
- Example: "I wouldn't have gotten an A on this paper without your help…"
- These are typically kept within the department that receives them.
- Feedback is usually received via email or in person.
- If in person, it can be difficult to capture.
- Are these logged in RefStats? No, we haven't been logging them, but the division AUL has asked us to start using the comment tag.
- Front Desk
- Front desk staff usually don't hear this kind of feedback directly from patrons, though occasionally someone will come up and share a comment.
- They often hear this sort of feedback secondhand from librarians returning from events.
- These comments are often shared up the chain of command.
- Jennifer Rowe, Julie Leuzinger, Carol Hargis, and Karen Harker’s 2020 study of the Impact of Library Instruction.
- Link to article: https://digital.library.unt.edu/ark:/67531/metadc1852289/
- The study compared card swipes from classes that received library instruction against classes that did not.
- It looked at differences in traditional student success metrics such as retention, graduation, and grades.
- Discovery Park Library
- For workshops at Discovery Park, they run post-workshop surveys to see whether the session was beneficial.
- These surveys also give a sense of what students like.
- Digital Libraries
- There are many different approaches to demonstrating Digital Libraries' impact, as each collection has its own characteristics.
- Metric Examples:
- How many outside groups did we present to?
- How many events?
- Usage numbers, in general
- What courses have been built off a collection? Where are the resources embedded?
- Who reaches out via the online feedback portal?
- Scholarly Works: the context of the materials, the types of submissions, usage numbers, and the level of engagement
- Who do we hear from? Who don't we hear from?
- Collection Management
- Collection Assessment
- The Collection Assessment Department affects selection. We want to do a project to see whether our reports improve the usage of the resources selected.
- Does this save effort on Collection Development's part?
- We look at the quality of resources selected in order to meet needs.
- We want to try to tie online library resource usage to student success using EZproxy server logs (see the sketch after this list).
- We do write accreditation reviews.
- Cataloging and Metadata
- More resources cataloged = more things that can be discovered
- We want to know whether quality affects discovery. In cataloging, there is a tension between timeliness and quality.
- They are getting feedback from other librarians to see what is helpful, particularly regarding search terms.
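The EZproxy idea under Collection Assessment would start with parsing the proxy's server logs. Below is a minimal sketch, assuming a common NCSA-style EZproxy LogFormat (%h %l %u %t "%r" %s %b) with the authenticated username in the %u field; the file name and aggregation are hypothetical, and any real join to student success data would need de-identification and coordination with the registrar.

```python
"""Minimal sketch: aggregate EZproxy usage per user from server logs.

Assumes an NCSA-style EZproxy LogFormat (%h %l %u %t "%r" %s %b), where
%u is the authenticated username. The file path is a placeholder; real
use would require de-identification before any join to student records.
"""
import re
from collections import Counter
from urllib.parse import urlparse

LOG_LINE = re.compile(
    r'^(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def usage_by_user(log_path: str) -> Counter:
    """Count successful proxied requests per (user, resource host) pair."""
    counts: Counter = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m or m.group("user") == "-":
                continue  # skip unparsable lines and unauthenticated requests
            if not m.group("status").startswith("2"):
                continue  # count only successful responses
            host = urlparse(m.group("url")).hostname or "unknown"
            counts[(m.group("user"), host)] += 1
    return counts

if __name__ == "__main__":
    # "ezproxy.log" is a hypothetical path to one day's log file.
    for (user, host), n in usage_by_user("ezproxy.log").most_common(10):
        print(f"{user}\t{host}\t{n}")
```

A per-user usage table like this is only the first step; linking it to retention or grades would happen downstream, on de-identified data.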
What kinds of qualitative and quantitative data do you collect?
- Qualitative Data
- Anecdotes and stories
- Sometimes anecdotal is the best info you have.
- Post a question on a whiteboard for students to answer after an event, then take a photo of the responses.
- Collect thank-you cards from the Paws and Relax event and take pictures of them.
- Ask subject librarians whether a random sample of records is well cataloged.
- Quantitative Data
- Attendance counts for events
- For reference interactions, we could send a short one- or two-question survey after the interaction; it could be automated.
- After implementing a new workflow, send a follow-up email a month later to see how it is going, or run time trials to see whether the change affected the speed of the workflow (see the sketch after this list).
- Perhaps send feedback surveys to subject liaisons to see how they feel about the resources purchased.
- Project Outcome is an option. It is a survey method that asks a satisfaction/immediate-impact question right after an event, then follows up about two weeks later to ask whether attendees have put what they learned into action.
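For the time-trial idea above, a simple before/after comparison is enough to get started. Below is a minimal sketch using only the Python standard library; the timings are hypothetical placeholders, and a real analysis might use a formal significance test such as scipy.stats.ttest_ind.

```python
"""Minimal sketch: compare workflow time trials before and after a change.

The numbers are hypothetical placeholders (minutes per item); a real
analysis would substitute actual measurements.
"""
from statistics import mean, stdev

before = [14.2, 12.8, 15.1, 13.6, 14.9]  # minutes per item, old workflow
after = [11.4, 10.9, 12.2, 11.8, 10.5]   # minutes per item, new workflow

diff = mean(before) - mean(after)
print(f"old workflow:   {mean(before):.1f} min/item (sd {stdev(before):.1f})")
print(f"new workflow:   {mean(after):.1f} min/item (sd {stdev(after):.1f})")
print(f"average saving: {diff:.1f} min/item "
      f"({diff / mean(before):.0%} faster)")
```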
Who do you communicate your results with and why?
How do you communicate your results/impact?
- Personal
- Report results to supervisors
- Annual evaluations of staff or faculty
- Internal
- Friday Frags
- Pictures of whiteboards with assessment questions will sometimes go in Friday Frags
- We are about to assess impact through a survey about the changes made around transparency in communication: Have the changes had an impact? What has changed? To what extent?
- External
- Good for public-facing reports: look at who has benefited and who would be interested in the results.
- Conference presentations or journal articles
- Maybe communicate these impacts to the budget office: "We saved X amount via negotiations…" or "We cataloged X records… so we need a new position."
What challenges have you faced?
- It is tricky to connect action and outcome, particularly as the impact of something like an instruction session might not be immediate.
- It can be tricky to distinguish between outputs and outcome measures such as satisfaction.
- It can be difficult to determine the change or desired outcome.
- A lot goes into an outcome like retention. How do we know that it was us that made the difference?
- The connection between library usage and student success can be tenuous.
- It can be tricky to differentiate between correlation and causation (a toy illustration follows this list).
- For example, is a student doing well academically and using the library because they already have a tendency to do both, or is library use a meaningful factor in improving their academic performance?
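One way to probe the correlation-versus-causation problem is to check whether a usage-success correlation survives after controlling for an obvious confounder such as prior academic performance. Below is a minimal sketch on entirely synthetic data, assuming numpy is available; the variables and effect sizes are invented purely to illustrate confounding.

```python
"""Minimal sketch: raw vs. confounder-adjusted correlation on synthetic data.

We simulate students whose prior GPA drives both library visits and current
GPA, then show that the raw visits-GPA correlation shrinks once prior GPA is
regressed out (a partial correlation). All values are hypothetical.
"""
import numpy as np

rng = np.random.default_rng(42)
n = 500

prior_gpa = rng.normal(3.0, 0.5, n)           # confounder
visits = 5 * prior_gpa + rng.normal(0, 3, n)  # visits driven by prior GPA
gpa = 0.8 * prior_gpa + rng.normal(0, 0.3, n) # current GPA also driven by it

def residuals(y, x):
    """Residuals of y after a simple linear regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

raw = np.corrcoef(visits, gpa)[0, 1]
partial = np.corrcoef(residuals(visits, prior_gpa),
                      residuals(gpa, prior_gpa))[0, 1]

print(f"raw correlation (visits, GPA):       {raw:.2f}")
print(f"partial correlation given prior GPA: {partial:.2f}")
```

In this toy setup the raw correlation is sizable while the adjusted one is near zero, which is exactly the pattern the committee's concern describes.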
Conclusions
- We should do more to examine impact by collecting and sharing both qualitative and quantitative data.
- We could do better at communicating why libraries matter.