AI can manipulate how we vote and who we date

In our increasingly digital world, algorithms attempt to exploit our biases and other mental shortcuts to distort our decision-making — and now, a new series of experiments suggests that we are highly suggestible when it comes to dating and electing leaders. 

The study, published Wednesday in PLOS One, shows that we're vulnerable to the influence of machine-driven recommendations in a variety of contexts, said the study's co-author Helena Matute, a professor of psychology at the University of Deusto in Bilbao, Spain. "As psychologists, we're concerned about whether people are influenced by algorithms," she told The Academic Times.

The four experiments involved a total of 1,339 participants — a "huge sample for experimental psychology," Matute said — who were shown pictures of fictitious political candidates and potential dating app matches. The same series of photographs was presented to both experimental and control groups. The researchers promoted candidates either explicitly, by presenting the photos alongside the text "+90% compatibility," or more covertly, by showing photos of a politician or bachelor more often than others. Then participants decided which person they'd prefer to vote into office, or message on a dating app.

"We've followed the methodologies of experimental psychology, which means that we tried to replicate the same result in several experiments and looked at very specific questions in each of them," Matute said. "The important thing is that we had a result from experiment one and went further with experiments two, three and four. It provided a unified picture."

The researchers found that a subtle form of persuasion — repeatedly showing someone's photo — effectively influenced people's dating decisions, whereas explicitly suggesting that a potential dating app match had high "compatibility" did not. In the political context, the pattern reversed: covert methods were less effective at swaying people's opinions, while explicit messaging ("90% compatibility") was more persuasive regarding candidates.

"We find that the explicit recommendation is more effective in the political case, and the subtle recommendation is more effective in the dating case," Matute said. "But I don't see that as the important part. I think what's important is that we can see that algorithms influence people's decisions."

The study builds on previous research establishing that artificial intelligence can influence our mood, political preferences, friendships, romantic relationships, online spending habits and how we spend our time. The authors argue this trend is alarming given how much we trust AI: In a phenomenon known as machine bias, we often believe that our smartphone's picks for restaurants, news articles and dating matches are "objective, efficient and reliable," they write.

"Many times we think that we decide freely, but recommendations from machines influence us because we think it's more objective, more neutral," said Ujué Agudo, a doctoral student with Matute and a co-author of the study. "We think, 'They have our data, they can give us the best options,' but we are not deciding freely."

For example, the order in which political candidates are presented in Google search results could be manipulated, or candidates could appear in search results and social media feeds more frequently than others to boost their familiarity to voters. Both are "strategies that make use of cognitive biases, and thus reduce critical thinking and alerting mechanisms," the study's authors write. 

"I think the implications are huge," Matute said. "On one hand, we need politicians to take this seriously and to regulate what is happening, to take care of people. On the other, you have to teach people to be critical and aware and vigilant about this influence and not to trust machines so blindly as we do." 

The researchers emphasize that we should be wary of obvious and unseen nudges from our devices and avoid automatically believing that machine-generated suggestions are inherently more objective, or based on our best interests. Matute urges people "to be very careful with the decisions they make based on recommendations from algorithms, whether we're talking about political or romantic decisions."

"This is influencing our world a lot," she added. "You should be very careful and critical about what you read, what you do — what you decide."

The study, "The influence of algorithms on political and dating decisions," published April 21 in PLOS One, was authored by Ujué Agudo and Helena Matute, University of Deusto. 
