Classroom Sentiment Analysis

February 1, 2018 / By: Eric Tornoe

Last May, a team from STELAR Research at the University of St. Thomas visited the Microsoft Technology Center in Edina, MN for a Technology Showcase. One thing that caught our attention was a screen showing live video of people as they passed, with yellow boxes drawn around their faces and labels indicating their gender and age. We began to experiment and recently presented this technology at the EDUCAUSE Learning Initiative 2018 conference to great interest. A journalist from Inside Higher Ed was in attendance and wrote this article about the technology:

Inside Digital Learning Sentiment Analysis Article

After seeing the demonstration at Microsoft, my graduate assistants and I started reading about the underlying Emotion API, which could also be used to analyze the emotion displayed on a face. Each face would be analyzed and assigned a score for each emotion, with the highest-scoring emotion returned as the result via a JSON (JavaScript Object Notation) string. In this case, the emotion is “Happiness”:

Classroom Sentiment Analysis results

After determining the result, the software draws a bounding box on the original image and adds a label showing the detected emotion, as shown in the first image above. In its current iteration, the software can detect and analyze 42 faces per frame at approximately 10 frames per second.
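For readers curious about what that JSON looks like, the sketch below shows, in Python, how a single face's scores might be reduced to one label. The payload is illustrative only: it mimics the shape of an Emotion API v1.0 response (a faceRectangle plus a score for each emotion) rather than reproducing actual output from our tests.

```python
# Illustrative only: a made-up payload in the shape of an Emotion API
# v1.0 response, reduced to the single highest-scoring emotion per face.
import json

sample_response = """
[
  {
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
    "scores": {
      "anger": 0.003, "contempt": 0.000, "disgust": 0.000, "fear": 0.000,
      "happiness": 0.988, "neutral": 0.001, "sadness": 0.000, "surprise": 0.008
    }
  }
]
"""

for face in json.loads(sample_response):
    # Keep the emotion with the highest score for this face.
    emotion, score = max(face["scores"].items(), key=lambda kv: kv[1])
    print(face["faceRectangle"], emotion, score)   # -> happiness 0.988
```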

Thinking of academic uses for this technology, the team came up with Classroom Sentiment Analysis. Our theory was that this software could be used in a classroom setting to determine the aggregate emotional trajectory of the class over time, which might yield insights into student engagement with the material being presented. Graduate Assistant Shashi Palle coded the initial version in Python, after first teaching himself C# so he could understand the examples! This code, running on a laptop, captured frames from a video camera and sent them to the Emotion API in the Azure cloud. The Emotion API returned the results, and the code drew the bounding boxes and detected emotional states over the original image.
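Shashi's original code is not reproduced here, but a simplified sketch of that capture-analyze-annotate loop, using OpenCV and the requests library, might look like the following. The endpoint URL and subscription key are placeholders, and error handling is omitted for brevity.

```python
# Simplified sketch of the capture-analyze-annotate loop, not the original
# implementation. The endpoint URL and subscription key are placeholders.
import cv2
import requests

EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
HEADERS = {
    "Ocp-Apim-Subscription-Key": "YOUR_KEY_HERE",   # placeholder
    "Content-Type": "application/octet-stream",     # send raw image bytes
}

camera = cv2.VideoCapture(0)                        # default laptop webcam
while True:
    ok, frame = camera.read()
    if not ok:
        break

    # Encode the frame as JPEG and send the bytes to the cloud API.
    _, jpeg = cv2.imencode(".jpg", frame)
    faces = requests.post(EMOTION_URL, headers=HEADERS, data=jpeg.tobytes()).json()

    # Draw a box and the top-scoring emotion over each detected face.
    for face in faces:
        r = face["faceRectangle"]
        emotion = max(face["scores"], key=face["scores"].get)
        cv2.rectangle(frame, (r["left"], r["top"]),
                      (r["left"] + r["width"], r["top"] + r["height"]),
                      (0, 255, 255), 2)             # yellow box
        cv2.putText(frame, emotion, (r["left"], r["top"] - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 255), 2)

    cv2.imshow("Classroom Sentiment Analysis", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):           # press q to quit
        break

camera.release()
cv2.destroyAllWindows()
```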

Now, for this to be truly useful, we needed a way to retain and analyze the data. Enter Power BI, Microsoft's data analysis and visualization tool. It was relatively straightforward to send the data to Power BI in Office 365 at the same time we were sending frames to the Emotion API for analysis, which allowed us to visualize the data in real time as well as retain it for future analysis.
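As a rough sketch of that step, rows could be pushed to a Power BI streaming dataset from the same Python loop. The push URL below is a placeholder you would copy from the dataset's API info in Power BI, and the column names (timestamp, emotion, score) are our own illustrative choices rather than a fixed schema.

```python
# Illustrative sketch: push one row per detected face to a Power BI
# streaming (push) dataset. The URL is a placeholder; the column names
# are illustrative, not a required schema.
import datetime
import requests

POWERBI_PUSH_URL = "https://api.powerbi.com/beta/YOUR_WORKSPACE/datasets/YOUR_DATASET_ID/rows?key=YOUR_KEY"

def push_emotions(faces):
    """Send each face's top emotion, with a timestamp, to Power BI."""
    timestamp = datetime.datetime.utcnow().isoformat()
    rows = []
    for face in faces:
        emotion, score = max(face["scores"].items(), key=lambda kv: kv[1])
        rows.append({"timestamp": timestamp, "emotion": emotion, "score": score})
    # The push endpoint accepts a JSON array of row objects.
    requests.post(POWERBI_PUSH_URL, json=rows)
```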


We are currently arranging to have this technology tested in a live classroom setting at the University of St. Thomas, pending approval of the research study. This study will help us evaluate the accuracy of the emotion detection while building a sufficient body of data for meaningful analysis. 

One thing we learned from this experience is that people have strong reactions to the technology! It taught us that we need to be careful in how we describe and position technology that interacts with humans: the people using it need to understand exactly what it is doing, what data is being collected, what happens to that data, and how it will be used.

STELAR Research will continue to develop Classroom Sentiment Analysis, incorporating feedback from real-world trials to improve the accuracy and range of emotion detection while making functional refinements as well. In the next development cycle, we intend to implement multi-threading, which will allow much faster processing and analysis of video frames. We will also implement time-stamps, which will both speed up the sending of frames to the Emotion API and allow us to do more thorough analysis in Power BI, such as evaluating emotional experience over time. Additionally, we will add the capability to analyze online and hybrid classes. Ultimately, we hope to develop post-processing algorithms that detect more subtle states, such as comprehension or confusion, so we can offer an accurate and easy-to-use tool that helps professors create and deliver educational materials and analyze how a diverse audience of students receives them.
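As one possible shape for that multi-threading work, the sketch below fans captured frames out to several worker threads and tags each with its capture time; analyze_frame is a stand-in for the request logic sketched earlier, not a function from our current codebase.

```python
# Possible shape for the planned multi-threaded version: frames are sent
# to the Emotion API concurrently, each tagged with its capture time so
# results can be ordered and charted over time in Power BI.
import datetime
from concurrent.futures import ThreadPoolExecutor

def analyze_frame(frame):
    # Stand-in for the request logic sketched earlier: encode the frame,
    # POST it to the Emotion API, and return the list of face results.
    return []

def analyze_batch(frames, workers=4):
    """Analyze a batch of frames in parallel, preserving capture timestamps."""
    stamped = [(datetime.datetime.utcnow(), frame) for frame in frames]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda item: (item[0], analyze_frame(item[1])), stamped)
    return list(results)   # [(timestamp, faces), ...], in capture order
```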