GUEST POST: Leveraging AI in a Research-Driven Way: Augmenting feedback during spaced retrieval practice using ChatGPT


If you use edtech in your classroom, you’ve probably seen at least one of your tools recently advertise a brand-new “AI feature.”

At Podsie, the core of what we’re building has always been research-driven, so as the AI hype rages on, we’ve been asking ourselves, “How can we leverage the recent advances in AI and LLMs in a research-driven way?” Put more concretely, are there research-backed teaching practices that AI makes easier to facilitate? 

Before diving into that question, let me first zoom out a bit to explain what Podsie is.

Podsie is a free web app that allows teachers to provide personalized, automated spiraled review for each student throughout the school year. It’s grounded in the highly effective, evidence-based strategies of spacing (1), retrieval (2), interleaving (3), and personalized review (4).

At a high level, here’s how it works:

  1. On the Podsie web app, teachers assign questions to students on something they’d recently learned.

  2. Students practice these questions (retrieval).

  3. Each question is then inserted into a cumulative Personal Review Deck that tracks student mastery of each question over time. It then determines the optimal next time for the student to review that question (spacing). 

  4. Over time, as students learn more in a class and get assigned more questions on Podsie, each question for that subject cumulatively gets mixed in with other questions they’ve already learned (interleaving).

  5. By the end of the school year, the Personal Review Deck may have hundreds of questions accumulated (depending on how much content your course covers). Still, at any given moment, each student would only focus on practicing the questions that they need to practice (personalized review). 
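The steps above can be illustrated with a minimal spaced-repetition sketch. This is not Podsie’s actual scheduling algorithm (which isn’t described here); it’s a hypothetical Leitner-style scheduler where the `Card`, `INTERVALS`, `record_attempt`, and `due_cards` names are all invented for illustration. Each correct retrieval attempt bumps a question into a longer review interval (spacing), incorrect attempts reset it, and at any moment a student only sees the questions that are currently due, shuffled together across topics (personalized review and interleaving):

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
import random

@dataclass
class Card:
    """One question in a student's Personal Review Deck."""
    question: str
    box: int = 0                              # mastery level: higher = longer wait
    due: date = field(default_factory=date.today)

# Hypothetical review intervals in days, one per mastery box.
INTERVALS = [1, 2, 4, 8, 16]

def record_attempt(card: Card, correct: bool, today: date) -> None:
    """Update mastery and schedule the next review (spacing)."""
    if correct:
        card.box = min(card.box + 1, len(INTERVALS) - 1)
    else:
        card.box = 0                          # missed it: review again soon
    card.due = today + timedelta(days=INTERVALS[card.box])

def due_cards(deck: list[Card], today: date) -> list[Card]:
    """Personalized review: only the cards that are due right now,
    shuffled so questions from different units mix (interleaving)."""
    due = [c for c in deck if c.due <= today]
    random.shuffle(due)
    return due
```

The key design point the sketch captures is that the deck grows all year, but `due_cards` keeps each session focused on the small subset a student actually needs to practice that day.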

Quick aside: If you want to learn more about Podsie, feel free to check out any of the following: our 1-minute overview video, our podcast episode on The Learning Scientists, or our former guest post on The Learning Scientists.

Podsie’s area of growth: providing better feedback for students

While we think we’re doing a good job of enabling teachers and students to use the strategies mentioned above easily, there’s one other evidence-based principle we’d like to facilitate better: providing feedback.

Research has shown that retrieval practice yields better learning outcomes if relevant feedback, such as the correct answer, is provided (5). That’s why on Podsie, the answer is shown to students after each retrieval attempt. However, further studies indicate that just showing the correct answer isn’t enough when it comes to helping students tackle new inference questions that require a deeper understanding of the original underlying concept (6). Instead, providing an explanation is needed to ensure that students can succeed on other related questions. Concretely, we’ve also noticed many instances on Podsie where, even after seeing the correct answer, a student will still miss that question on subsequent attempts, especially for questions above simple recall on Bloom’s Taxonomy (7).

To address this problem, Podsie allows teachers to add an “explanation” that gets displayed after the student answers the question. Unfortunately, writing explanations is time-consuming: of the 72k questions teachers have created on Podsie so far, only 26% (19k) have explanations.

As a result, we often see cases like the one below, where a student is putting forth a lot of effort but still not making progress. In this case, the student has now missed the question 4 times in a row since first seeing it 19 days ago.