No matter how cute present-day robots are designed to look, no matter how smiley their virtual faces and chipper their beeps and boops, they will never love you back.
The stories of people mourning robots like Jibo, a smart home assistant that announced its own “death” when its servers were scheduled to shut down last month, are heartbreaking. But according to the Associated Press, they also reveal how marketers could exploit people’s emotions, especially children’s, by programming robots to seem more emotionally savvy than they really are.
Humans will bond with seemingly anything, whether it’s Jibo or a robotic vacuum cleaner pitifully stuck in a corner. That’s because we tend to ascribe intention and consciousness to things that seem to act with purpose, experts told the AP.
“The performance of empathy is not empathy,” MIT researcher Sherry Turkle told the AP. “Simulated thinking might be thinking, but simulated feeling is never feeling. Simulated love is never love.”
When a robot does something adorable or seems to show genuine emotion, Turkle suggests, it’s all the work of a human-written script, not the basis of a true friendship.