The new DRIVE AI platform will make in-car AI assistants far easier to build and deploy, with capabilities that draw on both exterior and interior sensor data to interact not only with drivers but also with passengers.
Most automotive manufacturers are expected to build their own proprietary smart assistants into their vehicles, and Nvidia expects its latest platform to play a big role in making that happen.
The other area of focus for Nvidia is augmented reality. The firm’s automotive lead, Danny Shapiro, believes that just as AR has become an increasingly fundamental part of the smartphone experience, so too will it become natural and even essential in cars.
Augmented Reality SDK
DRIVE AR will offer developers an SDK that lets them tap into the graphics, computer vision and AI capabilities of the new platform to overlay points of interest, road conditions and other metrics onto an interactive in-car display.
Nvidia’s latest DRIVE AI platform is built on an updated and revised version of its autonomous taxi brain, ‘Pegasus’. The latest version of Pegasus improves on the previous edition by combining two Nvidia GPUs with two Xavier SoCs in a package roughly the size of a number plate – down from the boot-filling physical footprint of the original iteration.