Eyes never lie: New AI can predict your personality type based on eye movements

Researchers in Australia and Germany have developed an artificial intelligence that can predict a person’s personality traits from the way they move their eyes.

“Several previous works suggested that the way in which we move our eyes is modulated by who we are – by our personality,” Andreas Bulling, a professor from Germany’s Max Planck Institute for Informatics, told Digital Trends in a recent interview.

“For example, studies reporting relationships between personality traits and eye movements suggest that people with similar traits tend to move their eyes in similar ways,” Professor Bulling said. “Optimists, for example, spend less time inspecting negative emotional stimuli – (such as) skin cancer images – than pessimists.

“Individuals high in openness spend a longer time fixating and dwelling on locations when watching abstract animations.”
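To make the measurements behind these observations concrete: fixations and dwell times can be extracted from raw gaze samples with a standard dispersion-based algorithm (I-DT). The sketch below is illustrative only; its thresholds and feature set are assumptions, not the study’s actual pipeline.

```python
import numpy as np

def detect_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-based (I-DT) fixation detection.

    x, y: gaze coordinates in degrees of visual angle
    t: timestamps in seconds
    Returns (start, end) index pairs of detected fixations.
    """
    fixations = []
    i, n = 0, len(t)
    while i < n:
        # Grow a window while its dispersion stays under the threshold.
        j = i
        while j + 1 < n and (
            (x[i:j + 2].max() - x[i:j + 2].min())
            + (y[i:j + 2].max() - y[i:j + 2].min())
        ) <= max_dispersion:
            j += 1
        if t[j] - t[i] >= min_duration:
            fixations.append((i, j))
            i = j + 1
        else:
            i += 1
    return fixations

def fixation_features(x, y, t):
    """Summary statistics of the kind correlated with personality traits."""
    fixations = detect_fixations(x, y, t)
    durations = np.array([t[j] - t[i] for i, j in fixations])
    if len(durations) == 0:
        return {"n_fixations": 0, "mean_duration": 0.0, "max_duration": 0.0}
    return {
        "n_fixations": len(durations),
        "mean_duration": float(durations.mean()),
        "max_duration": float(durations.max()),
    }

# Example with synthetic data: 2 seconds of 100 Hz samples, with the gaze
# resting at one location and then jumping to another.
t = np.arange(0, 2, 0.01)
x = np.where(t < 1, 0.0, 5.0) + 0.1 * np.sin(50 * t)
y = np.zeros_like(t)
print(fixation_features(x, y, t))  # two fixations of roughly one second each
```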

But the real challenge for the researchers was finding a way to turn those observations into an artificial intelligence system.

To do so, they turned to machine learning.

Positive applications

The researchers asked 42 students to wear an off-the-shelf head-mounted eye tracker as they ran errands. The students’ personality traits were also assessed using established self-report questionnaires.

With both the input (the eye-movement data) and the output (the questionnaire scores) in hand, the system could learn the patterns linking the two.
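The learning step itself is a standard supervised pipeline. Below is a minimal sketch assuming one row of gaze features per participant and a scikit-learn random-forest classifier as a stand-in for the study’s actual model; the features, labels, and validation setup are illustrative assumptions, not the paper’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: one feature vector per participant (e.g. fixation counts,
# mean/max fixation duration, saccade statistics)...
X = rng.normal(size=(42, 10))
# ...and one questionnaire-derived label per participant, e.g. low/mid/high
# neuroticism. Real labels would come from the self-report scores.
y = rng.integers(0, 3, size=42)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
# Cross-validation estimates how well the learned mapping generalizes to
# people the model has never seen.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```

With random stand-in data the accuracy hovers at chance; the study’s claim is that real gaze features beat that baseline for several traits.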

“We found that we were able to reliably predict four of the big five personality traits — neuroticism, extraversion, agreeableness, conscientiousness — as well as perceptual curiosity only from eye movements,” said Professor Bulling.

While there are definitely potential ethical dilemmas involved (including privacy issues), Professor Bulling says there are also many positive applications.

“Robots and computers are currently socially ignorant and don’t adapt to the person’s non-verbal signals,” he said. 

“When we talk, we see and react if the other person looks confused, angry, disinterested, distracted, and so on. Interactions with robots and computers will become more natural and efficacious if they were to adapt their interactions based on a person’s non-verbal signals.”

A paper describing the work was recently published in the journal Frontiers in Human Neuroscience.
