Intel is coming up on the one-year anniversary of its Movidius acquisition, and to celebrate, the company has shown off a brand spanking new chip, the Myriad X.
The Myriad X is something of a “Pro” version of the Myriad 2: a major redesign of the computer vision-minded chip that also adds sophisticated deep learning capabilities by way of a new “Neural Compute Engine,” making it easier for devices built on the Myriad X to interpret information from their surroundings.
“With this faster, more pervasive intelligence embedded directly into devices, the potential to make our world safer, more productive and more personal is limitless,” Intel Movidius exec Remi El-Ouazzane wrote in a blog post.
A dedicated computer vision chip seems like something just about every electronic device could find a use for, but Intel is mainly targeting its Movidius Myriad chips at drones, VR/AR headsets, robotics and smart cameras. The low-power SoCs let those devices devote more brainpower to identifying objects in their environment and rapidly detecting changes.
While the Myriad 2 maxed out at roughly 1 to 1.5 trillion operations per second, the Myriad X can handle 4 trillion. In practical terms, that means a smart video camera built on Intel’s latest Movidius chip may not just recognize whether there’s a person in the frame; it might identify their gender or age as well. The Neural Compute Engine enables some pretty heavy image processing to take place on the edge.
The AI-optimized VPU (vision processing unit) highlights how the Intel acquisition has expanded Movidius’ ambitions, particularly in how the computer vision SoC approaches AI and deep learning.
The last-gen Myriad 2 chip will continue to be a big part of Intel’s visual processing arsenal. While the company wouldn’t directly comment on pricing, the Myriad X is undoubtedly going to be the pricier option for device manufacturers.
from TechCrunch http://tcrn.ch/2iEQKEg