Does AI need a humanoid form to interact with us? Some pros and cons

Could AI like Alexa be on the way to humanoid form factors?

Amazon Alexa already works with multiple devices; could a humanoid form be next? Source: Amazon Developer

When most people think about robots, they think of the humanoids and androids of science fiction. Those with more real-world experience can cite industrial robots, drones, or nascent autonomous vehicles. While an expansive definition of robotics can include artificial intelligence and virtual assistants, do smart appliances such as Amazon Alexa need a humanoid form to better interact with or serve human users?

They do, said Rohit Prasad, head scientist at Amazon’s Alexa AI group, at the EmTech Digital conference this week. “Language is complicated and ambiguous by definition,” he told the MIT Technology Review. “Reasoning and context have to come in.”

Proponents of humanoid robots have long noted that people rely on visual cues, in addition to text or audio, to convey rich meaning. Anyone who has been frustrated by an e-mail taken the wrong way, or who has struggled to find an opening in an audio-only conference call, knows this. As with other aspects of robotics, developers should keep the intended use in mind when deciding on a humanoid form.

Pros for humanoid robots

Despite the so-called uncanny valley, the urge to anthropomorphize is irresistible. People name their cars, assume their pets share their emotional states, and name and even put clothing on robots. Why not use that to help human-machine interaction (HMI)?

Researchers have found that people respond well to robots that can move and use nonverbal cues in conversation.

The Pepper service robot from SoftBank Robotics (originally Aldebaran) has a humanoid torso, head, and arms. It can recognize people, speak in a variety of languages, and use gestures as well as a screen on its chest. Pepper has been used in retail locations such as banks, and despite some setbacks, it is one of the robots you’re most likely to encounter in public.

Pepper is an example of a humanoid form factor. Credit: Erico Guizzo

Pepper is an example of a robot that’s designed to be nonthreatening, helpful, and more of a high-end informational kiosk than a replacement for humans. It has used AI from IBM Watson, but its physical capabilities are still somewhat limited.

Like Alexa, Pepper is intended to be easy to use, but what Alexa lacks in mobility, it makes up for by being connected to smart-home features and the Web (and e-commerce).

Another example of a humanoid form is Digit from Agility Robotics. It could someday solve the last-step problem and deliver packages up curbs and stairs. It’s explicitly designed to operate in human-centric environments. A humanoid form could also be useful in construction, space exploration, and emergency response.

Making such robots smarter and more autonomous with AI is a no-brainer, right?

Cons against a humanoid form factor

Designing and building truly humanoid robots is still difficult. “Hardware is hard,” and the mishaps at the last DARPA Robotics Challenge demonstrated that combining bipedal locomotion, fine manipulation, and autonomy in dynamic environments is tough. Robots have come a long way since then, and Nadia from the Institute for Human & Machine Cognition exemplifies the state of humanoid research.

On the consumer robotics side, recent failures such as that of Jibo show that the market has little patience for AI and social robots that don’t meet expectations. There are lots of reasons why robots don’t need to be humanoid, from energy efficiency to being built for specific tasks.

Note that Toyota’s Human Support Robot isn’t humanoid, although the company is also working on the T-HR3, which combines tele-operation with a humanoid form.

Toyota is developing a robot with humanoid form

The Master Maneuvering System and the T-HR3. Source: Toyota

Intuition Robotics’ ElliQ has some gestural interface features, but it is not meant as a substitute for human contact for people aging in place.

Google and Amazon have a clear advantage with AI, with large data sets to train machine learning applications, lots of human coders and money to invest in promising startups, and an interest in household robotics. Google parent Alphabet shut down its humanoid unit last fall, but The New York Times has reported that the company is ready to move forward in robotics again.

Robotics suppliers will need to guarantee connectivity, safety, and affordability for smart humanoid robots.

What’s next?

Whoever perfects the humanoid form first could create a huge new market. Reports dating back at least a year suggest that Amazon is looking to pair Alexa with a mobile robot, reportedly code-named “Vesta.”

At CES 2019, UBTech wowed crowds with its Walker robot, but it’s not clear when it will be available to consumers.

SoftBank, which offers Pepper as a service, also owns Boston Dynamics, whose legged Atlas and SpotMini robots are remarkably agile. SpotMini will be commercially available this year.

Robots with a humanoid form factor are coming, but we should expect them in service roles before they’re affordable enough for household use. The AI in Alexa/Echo and Google Home is powerful, but going from a smart speaker to a general-purpose humanoid form is a major challenge.

Amazon’s savvy in licensing Alexa, along with its AWS RoboMaker service, which now works with NVIDIA’s Jetson AI platform, shows a forward-thinking approach to developing AI and robotics that could help lead to humanoid household robots.

The post Does AI need a humanoid form to interact with us? Some pros and cons appeared first on The Robot Report.