Streamlining LibGuide Assessment: A Practical Approach Using Artificial Intelligence

Type of Presentation

Individual paper/presentation

Conference Strand

Assessment

Target Audience

Higher Education

Second Target Audience

Other

Any library using LibGuides

Location

Ballroom B

Relevance

This proposal relates to the teaching and learning of information literacy by showing how we can use AI to efficiently improve the quality of a key instructional tool—LibGuides—which are crucial for guiding student learning and reinforcing skills from one-shot instruction.

Proposal

Confronting an abundance of outdated “legacy” LibGuides is a significant challenge for many libraries. LibGuides are a crucial vehicle for targeted information literacy instruction, providing students with curated research strategies and library resources. Yet the decentralized LibGuide creation process is often guided only by a branded template, and many guides are poor-quality instructional tools, with broken links, outdated information, and designs not conducive to effective learning. Manual assessment of LibGuides is laborious, and budget cuts and staffing shortages make large-scale assessment unworkable.

In this presentation, we will detail a project that addresses this challenge by leveraging artificial intelligence (AI) to streamline the assessment process and make it feasible in today’s climate of reduced budgets and staffing. We chose the Moukhliss and McCowan (2024) LibGuide Assessment Standard Rubric for Quality-Checked Review for its comprehensive evaluation of usability and its framework for measuring the impact of the changes it prescribes. Moukhliss and McCowan interviewed ten students and had them complete tasks using both an original LibGuide and a version modified to fit the rubric; their findings showed marked improvement in LibGuide use.

Our project initially used student interns to assess our guides, an approach that proved time-intensive and ultimately unsustainable, so we pivoted to an AI-driven approach. We will detail our methodology for transitioning from human labor to AI labor: defining project goals, testing several AI tools, and evaluating their appropriateness. We will discuss the roadblocks and opportunities encountered in moving the project workflow to AI tools, demonstrate effective data input processes and prompt engineering, and compare the AI’s accuracy and decision-making to the summer intern’s. This project provides a valuable case study for librarians facing similar LibGuide evaluation and maintenance challenges, and we will present a blueprint for a more sustainable approach to future LibGuide maintenance.
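To make the comparison concrete, a minimal sketch of the kind of workflow described above is shown below. The prompt template, rubric criterion, guide scores, and agreement measure are all invented for illustration; the presenters’ actual tools, rubric items, and data may differ.

```python
# Hypothetical sketch of comparing AI rubric judgments to intern judgments.
# All scores and the criterion wording are invented for illustration.

# A prompt template of the kind one might send to an AI tool per guide.
PROMPT_TEMPLATE = (
    "Evaluate this LibGuide against the rubric criterion: {criterion}. "
    "Answer 'pass' or 'fail' with a one-sentence rationale.\n\n"
    "Guide content:\n{guide_text}"
)

# Example pass/fail judgments for five guides (invented data).
intern_scores = ["pass", "fail", "pass", "pass", "fail"]
ai_scores = ["pass", "fail", "fail", "pass", "fail"]

def percent_agreement(a, b):
    """Share of guides where the AI and the intern gave the same score."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

agreement = percent_agreement(intern_scores, ai_scores)
print(f"AI/intern agreement: {agreement:.0%}")  # 4 of 5 match -> 80%
```

Simple percent agreement is only a starting point; a fuller comparison would also weigh chance agreement (e.g., Cohen’s kappa) and examine the rationale behind each disagreement.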

Short Description

Struggling with a large collection of outdated LibGuides? This presentation showcases a practical solution using AI to assess and improve the instructional quality of LibGuides. We'll share our project methodology, key lessons, and how we compare AI accuracy to human assessment, offering a sustainable blueprint for your own library.

Keywords

Artificial Intelligence, LibGuides, Assessment, Library Technology, Project Management, Instructional Design

Publication Type and Release Option

Presentation (Open Access)

Feb 7th, 10:00 AM to 10:45 AM
