
A Peek Inside Google’s Efforts To Create A General-Purpose Robot

Videos of Google-owned robots, some that look like mechanical bulls and others resembling humanoids from sci-fi movies, have been viewed more than 90 million times online. Despite that YouTube fame, Google is reluctant to discuss its robotics operation publicly. Behind the scenes, the company is laying the groundwork for intelligent, multipurpose robots, according to interviews with people familiar with the projects and research published in academic repositories.

As part of a restructuring effort to separate Google's internet services from its research arms and unrelated businesses, the company will create a separate division for robotics within the renamed umbrella entity Alphabet, says a person familiar with the plans. Boston Dynamics—a maker of robots for the US military, Sony, and others that developed the robots popular on YouTube—will operate with some independence within the newly formed group, says the person, who requested not to be named because the plans aren’t public.

The autonomy within Alphabet may help the company accelerate efforts to develop robotics systems capable of solving a wide range of everyday problems. Google’s robotics group, which has grown substantially in the last two years thanks to at least eight acquisitions including Boston Dynamics, has already made strides. The company is using its immense resources to build machines that could someday serve as domestic helpers or security droids.

Research published periodically reveals technical breakthroughs and provides a road map for where Google robots are headed. For example, the company is experimenting with feeding short video clips into machines’ electronic brains to improve their vision, according to a Google paper published online September 4 in Cornell University’s arXiv repository. The technique uses snippets of footage to give computers a better sense of what, say, a banana looks like from multiple angles. The system can also work with video shot on standard mobile phones, the researchers write.

Google hasn’t discussed how its robotics ambitions may someday turn into a moneymaker. Scott Strawn, an analyst at research firm IDC, doesn’t expect Google to commercialize its technology anytime soon. But robots are a natural extension of artificial intelligence, an area where Google has invested heavily for search and other products, he says. “They have many of the world’s experts in AI on their payroll. You can look to them to be at the forefront of that technology, and it’s that which will enable a robotics program.”

Experts in the field convened in San Jose last week at the RoboBusiness conference to discuss developments in machine capabilities, AI, and other topics. Google was on many people’s minds. “They’re working on generic solutions to very hard problems,” says Melonee Wise, the CEO of startup Fetch Robotics. One area where Google is far ahead of the competition is image and object recognition, says Wise, a former employee of robotics incubator Willow Garage, from which Google acquired assets.

The company’s DeepMind AI team has started applying a version of its software learning system to robots. An earlier incarnation of that system enabled computers to figure out how to play and master retro video games on their own. The robot version, outlined in a paper published September 9, learned how to solve more than 20 simulated tasks, including driving a car, hitting a hockey puck, and walking, by watching low-quality footage. Google recently solved another tricky problem in robotics: teaching robots how to grasp an object. Along with the University of Washington, Google developed a system allowing robots to classify an object within a fraction of a second and find the right spot to grab, according to another recent paper.

Jeff Dean, a senior fellow at Google, says his research team is looking at using “deep learning” techniques, which allow computers to discover patterns within reams of data, to improve how robots see and move around the world. “We have a couple of robots we’re playing with,” Dean says.