Drive.ai uses AI to teach self-driving cars and to give them a voice


Startup Drive.ai is revealing its product and strategy for the first time. The autonomous driving tech company is looking not only to build the hardware and software that enable self-driving cars, but also to make sure those cars communicate as effectively as possible with the people around them.

Deep learning throughout

Core to Drive.ai’s approach is using deep learning across the board in its autonomous driving system, which means the company teaches its self-driving cars much as you would teach a human: provide a host of example situations, objects and scenarios, then let the system extrapolate how the rules it learns might apply to novel or unexpected experiences. It still means logging a huge number of driving hours to provide the system with basic information, but Carol Reiley, co-founder and president of Drive.ai, explained in an interview that it should also help their self-driving vehicles deal with edge cases better.

“We are using deep learning for more of an end-to-end approach. We’re using it not just for object detection, but for making decisions, and for really asking the question ‘Is this safe or not, given this sensor input?’ on the road,” Reiley explained. “A rule-based approach for something like a human on a bicycle will probably break if you see different scenarios or different viewpoints. We think that deep learning is definitely the key to driving because there are so many different edge cases.”

Reiley says that Drive.ai has seen “millions of these cases,” including people doing cartwheels across roads, running around its test cars in circles, and even dogs on skateboards. She argues you could never write a comprehensive rulebook that takes all of that into account, which is why she sees deep learning as necessary to really solve the problem.
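To make the contrast concrete, here is a minimal, purely illustrative sketch (not Drive.ai's actual system) of the two approaches the article describes: a hand-written rule that only covers the cases its author anticipated, versus a learned classifier that maps a sensor-derived feature vector to a probability that the situation is safe. The feature names, thresholds and model are all hypothetical, and the "learned" model is a toy logistic function with random weights standing in for a trained deep network.

```python
import numpy as np


def rule_based_is_safe(obstacle_distance_m: float, obstacle_label: str) -> bool:
    """Hand-coded rule: brittle, because every object type and situation
    needs its own explicit case (cyclist, pedestrian, dog on a skateboard...)."""
    if obstacle_label == "cyclist" and obstacle_distance_m < 10.0:
        return False
    if obstacle_label == "pedestrian" and obstacle_distance_m < 15.0:
        return False
    # Anything not listed falls through as "safe," even if it isn't.
    return True


class LearnedSafetyClassifier:
    """Toy stand-in for a deep network: maps a feature vector derived from
    sensor input to P(safe). Weights here are random; a real model would be
    trained on many labeled driving examples, including rare edge cases."""

    def __init__(self, n_features: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=n_features)
        self.b = 0.0

    def prob_safe(self, features: np.ndarray) -> float:
        # Logistic regression over the features as a simple placeholder.
        return float(1.0 / (1.0 + np.exp(-(features @ self.w + self.b))))


if __name__ == "__main__":
    # Hypothetical feature vector standing in for processed sensor input
    # (e.g., distance, relative speed, detector confidence scores).
    features = np.array([8.5, 1.2, 0.9, 0.1])

    # The rule has no case for this object, so it waves it through.
    print(rule_based_is_safe(8.5, "dog_on_skateboard"))

    clf = LearnedSafetyClassifier(n_features=features.size)
    print(f"P(safe) = {clf.prob_safe(features):.2f}")
```

The point of the sketch is the structural difference: the rule-based function can only be as complete as its list of cases, while the learned model always produces a judgment from whatever features it is given, with quality depending on the breadth of its training data.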

Read the source article at TechCrunch