How you look at a robot and how it looks at you can make you more comfortable
I think the idea of designing robots that look like humans to better interact with humans is a solid “meh.” The concept is good, but the execution is usually horrible, and the more your robot tries to look like a human, the more horrible it gets. Having said that, I think that using robots with specific human features, like eyes, can be a substantial asset for human-robot interaction, if you know what you’re doing.
Sean Andrist, a PhD student at the University of Wisconsin-Madison (who knows what he’s doing), has been researching social gaze with robots. He’s developed algorithms that help robots look at people at the right times and in the right ways. This doesn’t just make the robots less creepy; it makes them more helpful as well.