
Advanced Data-Driven Student Feedback

  • stevencmlee72
  • Aug 12, 2025
  • 4 min read

Part 1: The Challenge with Gathering Student-Provided Feedback


In a class of more than 30 students, it can be difficult to check in with every learner and understand how they are progressing each day. Tools like exit tickets, conversations, and observations are helpful, but there is only so much time to log and review responses.


Exit tickets and class surveys, while great, have limits. They may not capture how confident a student feels with the material, their mindset toward learning, or subtle gaps in understanding. Research shows that students’ self-perception plays a major role in their learning outcomes (Zimmerman), and without asking them directly, we are missing a piece of the picture.


There is also the issue of perception versus reality. Students can sometimes appear more comfortable with the material than they actually are. As Black and Wiliam note in their influential work on formative assessment, timely and accurate feedback is crucial for adjusting instruction and supporting student growth. Without reliable, honest input from students, our ability to make those adjustments is limited.



Part 2: My Solution


Inspired by the happy- and sad-face buttons at airport bathrooms, I got to thinking that a button system would be an effective way to quickly gather anonymous feedback about lessons. A simple, anonymous feedback system could encourage students to be honest, remove the social pressure of speaking up, and give teachers an easy way to gather feedback that is both measurable and trackable.


At first, I thought that creating an app and using an old phone would be sufficient. I used MIT App Inventor to design a simple app that could gather student feedback. However, during testing, the app took a few seconds to register selections, and it compromised anonymity because it was easy to see which button was pressed.



That experience got me thinking that physical buttons would be both faster and more anonymous. With that idea in mind, I pulled out my ESP32 and started prototyping.

My ESP32 breadboard

The prototype is simple, but it gets the job done. Students press one of five buttons, each representing a number from 1 to 5. A 1 (on the left) means “I don’t understand,” while a 5 (on the right) means “I’ve got it.” When a button is pressed, the ESP32 records the response directly to a Google Sheet by submitting it through a Google Form link. This allows the data to be automatically stored and ready for analysis without any extra work on my part. The simplicity makes it easy for students to respond on their way out the door, so I don't have to give up class time for it.
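The submission step itself is just a request to the form's response URL with the rating filled in. Here is a minimal sketch of that idea; the form ID and entry field name below are placeholders (they come from your own form's pre-filled link), and on the ESP32 itself the request would go through MicroPython's urequests rather than the desktop standard library:

```python
# Sketch: build the Google Forms submission URL for one 1-5 rating.
# FORM_ID and ENTRY_ID are placeholders -- copy the real values from
# your own form's "Get pre-filled link" output.
from urllib.parse import urlencode

FORM_ID = "YOUR_FORM_ID"      # placeholder
ENTRY_ID = "entry.123456789"  # placeholder field name from the pre-filled link

def build_submit_url(rating: int) -> str:
    """Return the formResponse URL that records a single rating."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    query = urlencode({ENTRY_ID: rating})
    return f"https://docs.google.com/forms/d/e/{FORM_ID}/formResponse?{query}"

# On the ESP32 (MicroPython), a button press would then trigger roughly:
#   import urequests
#   urequests.get(build_submit_url(pressed_rating)).close()
```

Because each button maps to a fixed rating, the firmware only needs to watch five GPIO pins and fire one request per press.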



Part 3: Analysis of the Data


Now I have a tool that can get student feedback, but I still need to design the best prompt to give me actionable insights. Initially, I thought about asking the generic question, “How well did you understand today’s material?” But then, I realized that the more specific the question, the more meaningful the data would be.


I decided to structure my lessons so that each one has a single clear success criterion. At the end of the lesson, I ask students to rate their ability to meet that success criterion on a scale from 1 to 5. This aligns with John Hattie's research on the importance of teacher clarity.

To make the data more useful, I added another sheet to track some details of my lesson plans. This way, when I review the results later, I can see exactly what was taught and connect the feedback to specific lessons. Finally, I built a dashboard to visualize the responses and gather actionable insights.
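The join behind that review step is straightforward: average the button presses per day, then attach the matching lesson-plan row. A small sketch of that logic, where the field names and sample values are illustrative rather than my actual sheet headers:

```python
# Sketch: join daily button responses to lesson-plan details by date.
# Field names and sample rows are illustrative, not the real sheet headers.
from statistics import mean

responses = [  # one row per button press, as the Form writes them
    {"date": "2025-09-01", "rating": 4},
    {"date": "2025-09-01", "rating": 2},
    {"date": "2025-09-02", "rating": 5},
]

lessons = {  # one row per lesson from the lesson-plan sheet
    "2025-09-01": {"unit": "Algebra", "success_criterion": "Solve two-step equations"},
    "2025-09-02": {"unit": "Algebra", "success_criterion": "Graph linear functions"},
}

def summarize(responses, lessons):
    """Average the 1-5 ratings per day and attach that day's lesson details."""
    by_date = {}
    for row in responses:
        by_date.setdefault(row["date"], []).append(row["rating"])
    return [
        {"date": date, "avg_rating": mean(ratings), "n": len(ratings), **lessons[date]}
        for date, ratings in sorted(by_date.items())
    ]
```

Each summary row then feeds the dashboard directly: one average, one response count, and the success criterion it belongs to.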


My dashboard to show student understanding based on what I taught each day

The dashboard above shows the metrics and values I thought were most helpful for informing my teaching practice. The slicers on top let me filter the data by day, which class I was teaching, what type of lesson it was, and which unit I was in (learning outcomes). I can then quickly see the student-reported understanding in contrast to my own perception, and how many students responded in each class. I break this understanding down by class and by success criterion, which helps me identify which lessons I need to change.


I also added a line chart to show student progression throughout the units. My first unit introduced the level 3 problems at the end, so it makes sense that the scores dipped there. This tells me I probably jumped to too many challenging problems too early.


Lastly, I have some logic to tell me which lessons went well and which lessons didn't go as well. This tells me where I can look for inspiration, and what topics I need to change next year and revisit with this class.
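That "went well / didn't go well" logic can be as simple as a threshold on each lesson's average rating. The cutoffs and labels below are illustrative assumptions, not the exact rules in my sheet:

```python
def flag_lesson(avg_rating: float, responses: int, min_responses: int = 5) -> str:
    """Classify a lesson from its average 1-5 self-rating.

    Cutoffs here are illustrative -- tune them to your own classes.
    """
    if responses < min_responses:
        return "insufficient data"  # too few presses to trust the average
    if avg_rating >= 4.0:
        return "went well"          # a source of inspiration for other lessons
    if avg_rating <= 2.5:
        return "revisit"            # reteach with this class, redesign for next year
    return "monitor"
```

Guarding on a minimum response count matters: a single enthusiastic button press shouldn't mark a lesson as a success.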


Part 4: Conclusion and Next Steps


I’m really happy with how this tool has come together. It’s given me clear, useful insight into how my students are actually feeling about the material, and it’s made it much easier to spot what I need to review before we hit summative assessments. The dashboard makes trends pop right out, and the daily ratings have pushed me to think more deeply about my teaching choices.


That said, there’s still room to make it better:

  • Hardware durability – Right now, the buttons just sit in a breadboard, so they can get pulled out pretty easily, and I risk damaging the board when I carry it between classrooms. To fix this, I'm looking into the new ESP32 Touchscreen Edition or soldering the buttons in place so they’re more secure.

  • Streamlining lesson data – At the moment, adding lesson plan details is a manual chore. In the next version, I’d love to hook it up to my teaching calendar so that part happens automatically.


Thanks for taking the time to read this! Let me know what you think, if you have any questions, or if you want some help implementing this for yourself!


