When big carmakers talk about using artificial intelligence and advanced robotics to enhance “mobility,” they usually just mean self-driving cars.
Not Toyota.
“We want to move from mobility on the road to mobility in the home,” says Gill Pratt, CEO of the Toyota Research Institute, the Japanese carmaker’s research center that launched last year in Silicon Valley. Speaking at the MIT AI & Machine Learning Disruption Timeline Conference on March 8, Pratt said he expects the same bundle of technologies that makes driverless cars possible to be put to use in homes first, starting with helping healthy older people do everyday tasks like cleaning up after a meal or putting away groceries.
“As the technology improves, we’ll be able to move on to helping the less-healthy elderly” with complex tasks like bathing and moving around their homes, he says.
The MIT conference, organized by the Initiative on the Digital Economy, focused on the question of how fast these advances would come. Pratt said the first wave of machines for healthy seniors could be available within five years.
Robots that help care for infirm people will take considerably longer, because those machines come into direct contact with humans, which leaves far less tolerance for error. They’re also subject to regulation by the Food and Drug Administration. He sees that next wave of robots coming in 15 years. That’s when the “demographic crisis” of retiring Baby Boomers will peak, with 20% of the U.S. population projected to be over 65.
Pratt, who formerly ran robotics programs at DARPA, the Defense Department’s research agency, doesn’t worry about these robots taking human jobs. “I have no belief at all that we’ll end up with an excess of labor—no matter what we do,” he says. Home health care workers are already in short supply, he notes, and the work is arduous and low-paying. “It’s just not a very good job,” he says.
Pratt says AI and machine learning have reached a turning point similar in significance to the Cambrian Explosion, the moment roughly 540 million years ago when complex animals first appeared in the fossil record. Both events, he says, were driven by the development of vision: in the ancient past, the emergence of eyesight enabled animals to evolve and diversify rapidly.
“Something similar is happening today,” he says, “because now for the first time, computers can see.”
Computers have long been able to recognize basic objects, he notes. But deep learning and AI are now enabling them to understand what they’re seeing, opening up new frontiers.
Watch the conference session video here.