by Mark Bünger, Lux Research
Today’s Internet is increasingly a privacy-free dystopia of clickbaity hyperlinks leading to ad-infested pages of copy-pasted content, annotated by moronic and even threatening trolls. It’s worlds apart from the pristine cyberspace of the early web – and possibly also a preview of what the physical world will feel like when robots and artificial intelligence pervade our electronic garments, smart homes, connected cars, and intelligent cities – the Internet of Things (IoT). Glimpses of our future co-existence with embodied artificial intelligence can be seen in recent news stories like the Nazi-trolling of Tay, Microsoft’s teen-girl chatbot, whom humans taught to spew racist and homophobic rants. Humans showed an outpouring of sympathy for Boston Dynamics’ humanoid robot Atlas as he was bullied and shoved on video – a public-relations problem that prompted Google (the parent company) to send an internal memo recommending that the company distance itself from the humanoid bots. And a Stanford researcher found that humans were aroused by touching a robot’s body in areas corresponding to human sensual zones.
In this vein, we recently had one of the most interesting conversations of our lives at the Beam robot store in Palo Alto, California. The company, Suitable Technologies, makes a telepresence bot: a robot operated by a remote human, who is “present” via the robot’s screen, speakers, camera, and microphone. Envision an iPad mounted on a cart, running a Skype-style video conference, that a human drives around from anywhere in the world. The human-robot interactions we witnessed delivered a stream of unexpected impressions:
- The robot is called Beam, and Beams are literally the only animate objects in the store – and right outside, too: two of them were “standing” (?) on the sidewalk, chatting up passersby, like waiters trying to lure tourists into a restaurant. It was entertaining to watch the wide range of human reactions, which spanned from startled and creeped out (“eeew”), to excited (“robot selfies!”), to completely natural (“oh, hai”).
- Intrigued, we went in, spoke with one of the robot “pilots” (the human on the other side of the screen), and asked her what she had learned in her life as a robot. She said that some of the people who talk with her “say and do things that are pretty much like the comments section on an Internet webpage.” Humans hit on her (“wow, you’re cute”), insult her (“back off, weirdo”), and even question her existence (“are you real?”). In other words, in the robotic future, we’ll have to deal with online trolls in real life, too.
- Interestingly, the redefinition of personal interactions extends to the physical realm, too. She said that even while piloting a robot near Stanford from her seat in Berkeley, she felt she had a personal space, and that it was invaded when people came too close to her cyberphysical embodiment in the store: “It feels like an affront when people touch or even stand too close to the robot.”
We hope that the Beam pilots are documenting all this, because these early, awkward experiences will make fascinating reading as the emotional and psychological interactions between people and robots deepen. These observations corroborate our prediction that skeuomorphism – the principle that has guided the increasingly realistic depiction of the physical world inside the virtual world – will be inverted in artificial intelligence user experiences (see the Lux Research report “Artificial Intelligence User Experience: Identifying Early Leaders in the Interface to the IoT”). The physical world will increasingly work like the virtual one, beginning with the redesign of physical objects, actions, and attributes. As more things look and act like computers, they will learn to read human language and actions in unstructured environments, making the entire world an interface. Rather than making a virtual world that works like the real world, artificial intelligence in the IoT makes the real world work more and more like the virtual one – for better and for worse.