
EmoNet, a neural network model, accurately classified images into 11 emotion categories.

The EmoNet study demonstrates how AI can measure the emotional significance of images.

Artificial intelligence might one day understand our emotions better than we do. EmoNet, a neural network model developed by researchers at the University of Colorado and Duke University, accurately classified images into 11 different emotion categories.

A neural network is a computer model that maps input signals to an output of interest by learning a series of filters, according to Philip Kragel, one of the researchers on the study. For example, a network trained to detect bananas would learn features unique to them, such as shape and color.
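
For readers who want a concrete picture of what "learning a series of filters" means, here is a minimal sketch of a small convolutional classifier in PyTorch. The name TinyEmotionNet and the layer sizes are invented for illustration; only the idea of filters feeding an 11-category output comes from the study, and this is not EmoNet's actual architecture.

    import torch
    import torch.nn as nn

    # Toy convolutional classifier: each Conv2d layer learns a bank of
    # filters, and a final linear layer maps the learned features to
    # emotion-category scores. Illustrative only; not EmoNet's design.
    class TinyEmotionNet(nn.Module):
        def __init__(self, num_categories=11):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2),   # low-level color/edge filters
                nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # higher-level shape filters
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(32, num_categories)

        def forward(self, x):
            feats = self.features(x).flatten(1)
            return self.classifier(feats)            # one score per emotion category

    model = TinyEmotionNet()
    scores = model(torch.randn(1, 3, 224, 224))      # a fake RGB image as input
    probs = scores.softmax(dim=1)                    # probabilities over the 11 categories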

EmoNet was developed using a database of 2,185 videos representing 27 distinct emotion categories, ranging from anxiety and interest to sadness and surprise. While the model could differentiate images related to “craving,” “sexual desire” and “horror” with high confidence, it was weaker at detecting “confusion,” “awe” and “surprise,” which are more abstract emotions. The neural network used color, spatial power spectra, and the presence of objects and faces in the images to categorize them. The findings were published in the journal Science Advances last week.
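
Two of the image properties mentioned above, average color and the spatial power spectrum, are simple to compute. The sketch below shows one rough way to do so with NumPy; it is an illustration of the concepts, not the study's actual feature pipeline.

    import numpy as np

    def mean_color(image_rgb):
        # image_rgb: H x W x 3 array of pixel values; returns average R, G, B
        return image_rgb.reshape(-1, 3).mean(axis=0)

    def spatial_power_spectrum(image_rgb):
        gray = image_rgb.mean(axis=2)            # collapse to grayscale
        f = np.fft.fftshift(np.fft.fft2(gray))   # 2-D Fourier transform
        return np.abs(f) ** 2                    # power at each spatial frequency

    img = np.random.rand(128, 128, 3)            # stand-in for a real photo
    print(mean_color(img), spatial_power_spectrum(img).shape)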

The study might provide value to researchers who previously depended on participants self-reporting their emotions. Instead of relying only on subjective responses, scientists can use AI to look for activity patterns within the visual cortex to better understand a subject’s feelings, with different patterns “decoding” to different emotional experiences.
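
The "decoding" idea amounts to training a classifier that maps activity patterns to emotion labels. The toy sketch below uses scikit-learn and randomly generated stand-in data; the numbers of trials, voxels and categories are assumptions for illustration, and this is not the study's actual fMRI analysis.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Simulated "visual cortex" patterns: 200 trials x 500 voxel values,
    # each trial labeled with one of 11 emotion categories.
    rng = np.random.default_rng(0)
    patterns = rng.normal(size=(200, 500))
    labels = rng.integers(0, 11, size=200)

    # Fit a simple decoder, then "decode" the first five trials.
    decoder = LogisticRegression(max_iter=1000).fit(patterns, labels)
    predicted = decoder.predict(patterns[:5])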

“When it comes to measuring emotions, we’re typically still limited only to asking people how they feel,” said Tor Wager, one of the researchers on the study. “Our work can help move us towards direct measures of emotion-related brain processes.”

In addition to offering new ways to measure emotions, the research team says AI could help move mental health care away from subjective labels.

“Moving away from subjective labels such as ‘anxiety’ and ‘depression’ towards brain processes could lead to new targets for therapeutics, treatments, and interventions,” said Kragel.

Decoding human emotions is just one of the latest examples of how researchers are exploring AI. Last month, a team of UN researchers trained an open-source language model to write fake, but convincing, UN speeches. And a recent study by MIT suggested that neural networks could be used to make the perfect pizza.

Via CNET