
Students in Carrie Moore’s Nonverbal Communication course watch the facial expressions of a classmate when learning how to use the iMotions software in the Schieffer Media Insights Lab.
“There are three areas of the face that have to be engaged to emote anger,” said Carrie Moore, communication studies instructor, highlighting the importance of facial expressions in conveying emotion during an onsite iMotions training session.
“Is that what we see? Did iMotions get it right?” Moore asked her students.
With funding from the dean’s research grants, Moore is using the Schieffer Media Insights Lab to help students grasp nonverbal cues and their broader impact. iMotions, software that analyzes facial expressions, measures how online media affects audience sentiment.
“We tend to weigh facial expressions most when decoding nonverbal communication. As such, iMotions gives us further insight to identify more complex micro-expressions,” Moore told her class.

A student sits at the iMotions station during the training as a test subject for the course.
Eye Motions
In the lab, a nondescript computer with a camera sits to the right. Despite its unassuming appearance, this station is equipped with iMotions software and tools that record viewers’ responses to digital stimuli. Using AI, the software tracks eye movements, assists in facial expression analysis and monitors nonverbal cues.
“iMotions is a way to quantify your facial emotions. It allows you to see how you feel and put a number on it,” Moore said of the ability to quantify what has traditionally been a qualitative method.
For Moore, seeing iMotions in action is a dream come true. The software, powered by Affectiva’s emotion-recognition AI, has been on Moore’s radar since its 2015 launch. Now that the technology is available in the lab, students can explore its applications.
Actions Speak Louder Than Words
Students are testing their skills against AI. During the training, they watched videos and compared their observations with the software’s data, witnessing the analytics firsthand.
“I had a very stoic, non-expressive face while watching, and yet the iMotions technology was able to pick up on my emotional state,” John Sands said of his experience. “For example, despite my remaining stoic, the technology was able to record spikes in anger and disgust.”
Volunteers who were filmed expressed initial nervousness but found it fascinating to see their emotions quantified by the technology. While iMotions captures the range of emotions, it does not explain their underlying causes.
“Interpretation is still needed for the analysis. We can see there is a response, but we can’t always determine why or what caused it,” Moore said.
Students will use iMotions for their final projects, exploring practical applications such as debate preparation, the impact of video games and hot-spot identification on websites, then create research proposals that incorporate the technology.
“This technology was helpful in the nonverbal communication course because it allowed students to see the reading of nonverbal facial cues in action,” Haley Whitworth said. “Students could examine how technology analyzes our faces depending on our emotions when looking at different things.”