Qualcomm has revealed a deep learning SDK for devices powered by its Snapdragon 820 SoC. Called the Snapdragon Neural Processing Engine, and unveiled at the Embedded Vision Summit, the SDK is an attempt by Qualcomm to expand beyond its core smartphone markets – by enticing automotive, industrial, and IoT developers to its Snapdragon SoC platform.
Based on the neuromorphic Zeroth Machine Intelligence Platform, Qualcomm's equivalent to IBM's TrueNorth neurosynaptic chips, the kit aims to bring the hardware-specific functions of those brain chips to a more general-purpose Snapdragon platform – so that mobile devices can leverage silicon that mimics, in some respects, how a brain processes information.
While IBM is planning to ship its brain chips to end-customers, who will put them to use in servers, the Qualcomm approach currently relies on software to bring those capabilities to a much more adaptable chipset – one that can be used in mobile devices.
Notably, Qualcomm has moved away from its initial plans to create this functionality in hardware. Back in May 2014, when Qualcomm first introduced the Zeroth project, the goal was to build a Neural Processing Unit that would be a hardware module for its advanced SoCs. There were rumors that it would be a standard component of the 820 chipset, but there’s no sign of it in the spec sheet.
Machine learning is central to the evolution of the IoT. There simply aren’t enough engineers to monitor and manage the billions of devices that are expected to be deployed. There aren’t enough developer resources to teach image recognition to camera systems, or collision avoidance to autonomous cars, and so the machines have to be able to evolve from a base-level intelligence installed in the factory, and adapt to the dynamic world around them.
If a connected traffic camera had to poll a real human being every time it spotted a car license plate it didn’t recognize, entire systems would grind to a halt. Moving from the edge-devices into centralized datacenter deployments, machine learning enables vast libraries of data to be independently analyzed at speeds far in excess of what a human could manage – in theory, creating value from data that is just lying around, waiting to be used.
Qualcomm says that the SDK will allow OEMs to run their neural network models on the Snapdragon 820, which will allow devices like smartphones, drones, and automobiles to carry out image recognition, object tracking, gesture and facial recognition, and natural language processing.
The new SDK’s flagship features include an accelerated runtime for on-device execution of convolutional and recurrent neural networks, configured in software to get the most out of the 820’s cores – the 64-bit Kryo quad-core CPU, Adreno 530 GPU, and the Hexagon 680 DSP.
A core part of the offering is the Snapdragon’s ability to move workloads between the different cores, to balance performance and power efficiency. This heterogeneous compute power should allow for more capable end-devices, and support for Caffe and CudaConvNet is also included, which will be appreciated by developers who have worked in those familiar environments.
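The heterogeneous scheduling idea can be sketched in a few lines. This is a purely illustrative example, not the actual Snapdragon Neural Processing Engine API: all names and the relative speed/power figures are hypothetical, chosen only to show how a runtime might pick between the 820's CPU, GPU, and DSP depending on whether the developer asks for raw performance or power efficiency.

```python
# Illustrative sketch of heterogeneous workload dispatch. All names and
# figures are hypothetical; this is NOT the real SNPE API.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    relative_speed: float   # higher = faster (hypothetical figures)
    relative_power: float   # higher = more power draw

# Hypothetical stand-ins for the Snapdragon 820's three compute blocks.
TARGETS = [
    Target("Kryo CPU",    relative_speed=1.0, relative_power=1.0),
    Target("Adreno GPU",  relative_speed=4.0, relative_power=2.5),
    Target("Hexagon DSP", relative_speed=2.0, relative_power=0.4),
]

def pick_target(prefer: str) -> Target:
    """Choose a compute target: 'performance' maximises speed,
    anything else maximises speed per unit of power."""
    if prefer == "performance":
        return max(TARGETS, key=lambda t: t.relative_speed)
    return max(TARGETS, key=lambda t: t.relative_speed / t.relative_power)

print(pick_target("performance").name)  # Adreno GPU
print(pick_target("efficiency").name)   # Hexagon DSP
```

Under these made-up numbers the GPU wins on throughput while the DSP wins on efficiency, which matches the general intuition behind offloading inference to a low-power DSP when battery life matters.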
“The demand for untethered, mobile learning-driven user experiences is increasing rapidly and Qualcomm Technologies’ customers have been clamoring for tools to help them realize their product visions,” said Gary Brotman, director of product management, Qualcomm Technologies. “With the introduction of the new Snapdragon Neural Processing Engine SDK, we are making it possible for myriad sectors, including mobile, IoT and automotive, to harness the power of Qualcomm Snapdragon 820 and make high-performance, power-efficient on-device deep learning a reality.”
The Zeroth software package is currently used in Qualcomm’s Snapdragon Scene Detect, for spotting objects or faces in a camera view and adjusting the settings accordingly, and its Smart Protect malware detection, which looks for anomalous behavior in the phone that might be a sign of a security compromise.
“The Qualcomm Snapdragon Neural Processing Engine SDK helps us bring deep learning to our connected camera and smart cloud network, and that helps us present a full, contextual picture of the driving environment to our commercial automotive fleet and auto insurance customers,” said Frederick Soo, CTO of NAUTO, an autonomous vehicle startup developing a dashboard camera and driver monitoring system that is targeting adoption among insurance companies.
Soo added that “the Neural Processing Engine SDK means we can quickly deploy our proprietary deep learning algorithms to our Snapdragon-based connected camera devices in the field, which can detect driver distraction and help prevent auto accidents. That same deep learning helps NAUTO’s system deliver valuable information that helps our customers drive more safely, and reduce liability.”
Read original post at Rethink IoT