Affective Artificial Intelligence: Better Understanding and Responding to Students
Artificial intelligence is recognizing and responding to human emotions, often better than many humans do.
As a long-time professor of communication, I am fascinated with the cognitive characteristics of artificial intelligence as they relate to human communication. Image processing, computer vision, speech recognition, and pattern recognition are among the sophisticated processes that allow artificial intelligence to communicate with humans. But these address only the surface of the communication process.
One of the challenges in person-to-person communication is recognizing and responding to subtle verbal and non-verbal expressions of emotion. Too often, we fail to pick up on the inflections, word choices, word emphases, and body language that reveal emotions, depth of feeling, and less obvious intent. I have known many colleagues who were insensitive to these signals; they often missed nonverbal cues that were obvious to more perceptive people.
Kendra Cherry from Verywell notes that “…research has identified several different types of nonverbal communication. In many cases, we communicate information in nonverbal ways using groups of behaviors. For example, we might combine a frown with crossed arms and unblinking eye gaze to indicate disapproval.”
And that brings me to just how artificial intelligence may soon enhance communication between and among students and instructors. AI systems in many fields now apply affective computing algorithms that help them respond to humans: customer service chatbots can sense when a client is angry or upset, advertising researchers use AI to measure viewers’ emotional responses, and mental health apps can analyze nuances of voice to identify anxiety and mood changes over the phone. “Machines are very good at analyzing large amounts of data,” explained MIT Sloan professor Erik Brynjolfsson. “They can listen to voice inflections and start to recognize when those inflections correlate with stress or anger. Machines can analyze images and pick up subtleties in micro-expressions on humans’ faces that might happen even too fast for a person to recognize.”
Too often we fail to put ourselves in the position of others in order to understand their motivations, concerns, and responses. Mikko Alasaarela posits that we humans are poor at bringing emotional intelligence to our reasoning about one another: “We don’t try to understand their reasoning if it goes against our worldview. We don’t want to challenge our biases or prejudices. Online, the situation is much worse. We draw hasty and often mistaken conclusions from comments by people we don’t know at all, and lash at them if we believe their point goes against our biases.”
That can be a significant challenge in online classes. Too often, I fear, we miss the true intent, the real motivation, and the deeper meaning of posts in discussion boards and in synchronous voice and video discussions. The ability of AI algorithms to tease out these motivations and meanings could provide a much greater depth of understanding (and misunderstanding) in the communication of learners.
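To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how a discussion-board post might be scored for emotional tone. The tiny lexicon, threshold, and function names are invented for illustration; production affective-AI systems rely on trained models over text, voice, and facial data rather than simple word lists.

```python
# Illustrative sketch only: a toy lexicon-based scorer for the emotional tone
# of a discussion-board post. The lexicon and threshold below are hypothetical,
# chosen just to show the shape of the idea.

# Hypothetical mini-lexicon mapping words to a valence score (-1 to 1).
TONE_LEXICON = {
    "frustrated": -0.8, "confused": -0.5, "unfair": -0.7, "worried": -0.6,
    "thanks": 0.6, "helpful": 0.7, "excited": 0.8, "appreciate": 0.7,
}

def tone_score(post: str) -> float:
    """Average the valence of known words in a post; 0.0 means neutral/unknown."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    hits = [TONE_LEXICON[w] for w in words if w in TONE_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def flag_for_instructor(post: str, threshold: float = -0.4) -> bool:
    """Flag posts whose average tone falls below a (hypothetical) threshold."""
    return tone_score(post) < threshold

if __name__ == "__main__":
    post = "I am frustrated and confused by this week's discussion prompt."
    print(tone_score(post))           # -0.65
    print(flag_for_instructor(post))  # True
```

Even this toy version suggests the instructional use case: surfacing posts whose tone an instructor might otherwise skim past, so the human can follow up with the student.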
Sophie Kleber writes in Harvard Business Review: “In January of 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: ‘By 2022, your personal device will know more about your emotional state than your own family.’ Just two months later, a landmark study from the University of Ohio claimed that their algorithm was now better at detecting emotions than people are…. Emotional inputs will create a shift from data-driven IQ-heavy interactions to deep EQ-guided experiences, giving brands the opportunity to connect to customers on a much deeper, more personal level.”
With AI mediating our communication, we can look to a future of deeper communication that acknowledges human feelings and emotions. This could enhance communication in online classes even beyond the quality of face-to-face communication in campus-based classes. Algorithms that enable better “reading” of the emotions behind written, auditory, and visual communication are already at work in other industries. It will not be long before they are available to online educators as well.
Are faculty considering how they might best use this added knowledge? Are you preparing faculty members for this prospect? Is your university prepared to address the privacy concerns this technology raises?
This article originally appeared in Inside Higher Ed’s Inside Digital Learning Blog.
Ray Schroeder is Professor Emeritus, Associate Vice Chancellor for Online Learning at the University of Illinois Springfield (UIS) and Senior Fellow at UPCEA. Each year, Ray publishes and presents nationally on emerging topics in online and technology-enhanced learning. Ray’s social media publications daily reach more than 12,000 professionals. He is the inaugural recipient of the A. Frank Mayadas Online Leadership Award, recipient of the University of Illinois Distinguished Service Award, the United States Distance Learning Association Hall of Fame Award, and the American Journal of Distance Education/University of Wisconsin Wedemeyer Excellence in Distance Education Award 2016.