Product developers, researchers and makers will be able to add AI capabilities to their devices and develop, tune and deploy AI-based applications far more easily.
Think only a large enterprise has the resources to deploy artificial intelligence technology? Think again.
Intel aims to make AI more affordable and accessible, especially to smaller companies and entrepreneurs. Last month, the company introduced the Movidius Neural Compute Stick, which it billed as “the world’s first USB-based deep learning inference kit and self-contained” AI accelerator. The $79 USB stick delivers “dedicated deep neural network processing capabilities to a wide range of host devices at the edge,” Intel says.
With the USB stick, Intel suggests that product developers, researchers and makers will be able to add AI capabilities to their devices and develop, tune and deploy AI-based applications far more easily.
The Power of AI in a Small Space
Movidius announced the first version of the Neural Compute Stick in April 2016, five months before Intel acquired the company, according to The Verge.
The stick is powered by Movidius’s Myriad 2 visual processing unit chipset, which aligns with the company’s goal to move image-based deep-learning technology from the cloud to the network edge, TechCrunch reports. “The chips are used on everything from security cameras and drones to [augmented reality] headsets, enabling them all to recognize and identify objects in the world around them,” TechCrunch notes.
The Myriad 2 VPU “provides powerful, yet efficient performance — more than 100 gigaflops of performance within a 1W power envelope — to run real-time deep neural networks directly from the device,” Remi El-Ouazzane, vice president and general manager of Movidius, says in the Intel statement. “This enables a wide range of AI applications to be deployed offline.”
Machine intelligence development involves two stages, Intel notes: first training an algorithm on large sets of sample data using modern machine learning techniques, then running the trained algorithm in an app that needs to interpret real-world data, a process known as “inference.” Performing inference at the edge, or natively inside devices, lowers latency and improves both power consumption and privacy.
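The split Intel describes — expensive training done offline, lightweight inference run on-device — comes down to the fact that inference is just a forward pass through fixed weights. The toy two-layer network below is purely illustrative (the weights are random stand-ins for parameters that would be learned during training), not Movidius code:

```python
import numpy as np

# A trained network is just a set of fixed weights; inference applies them.
# These random weights stand in for parameters learned offline during training.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), np.zeros(3)  # layer 1: 4 inputs -> 3 hidden
W2, b2 = rng.standard_normal((3, 2)), np.zeros(2)  # layer 2: 3 hidden -> 2 classes

def infer(x):
    """Run one sample through the network: no training, no gradients."""
    h = np.maximum(x @ W1 + b1, 0)                 # ReLU activation
    logits = h @ W2 + b2
    return np.exp(logits) / np.exp(logits).sum()   # softmax over 2 classes

probs = infer(np.array([0.5, -1.0, 2.0, 0.1]))
print(probs)  # two class probabilities summing to 1
```

Because the forward pass needs only multiply-accumulate operations over fixed weights, it fits within the tight power envelope of an edge accelerator, whereas training requires repeated gradient computation over large datasets.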
The USB stick uses the Caffe deep-learning framework to run native neural networks on the devices it’s plugged into. Users can then tune the neural networks to optimize performance.
The key is that the stick can be used as “a discrete neural network accelerator by adding dedicated deep learning inference capabilities to existing computing platforms for improved performance and power efficiency,” Intel says.
What Are the Uses of AI on a USB Stick?
What will the Neural Compute Stick actually enable? Developers and entrepreneurs are still figuring that out, but there are some clear opportunities for those who want to beef up their products with AI.
As The Verge notes, “AI researchers will be able to use the stick as an accelerator — plugging it in to their computers to get a little more local power when training and designing new neural nets.”
Users can also chain multiple sticks together, according to Movidius, enhancing performance linearly with each one.
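Movidius hasn’t detailed the chaining mechanics here, but the near-linear scaling follows from each stick processing an independent slice of the workload. The sketch below is a hypothetical data-parallel illustration, with threads and a dummy per-device function standing in for real accelerators:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def per_device_inference(chunk):
    """Stand-in for inference on one accelerator (here, a trivial transform)."""
    return chunk * 2.0

def infer_batch(batch, accelerators=2):
    """Split a batch evenly across N accelerators and run the slices in
    parallel. Throughput grows roughly linearly with N because each device
    works on an independent slice with no shared state."""
    chunks = np.array_split(batch, accelerators)
    with ThreadPoolExecutor(max_workers=accelerators) as pool:
        results = pool.map(per_device_inference, chunks)
    return np.concatenate(list(results))

out = infer_batch(np.arange(8.0), accelerators=2)
print(out)  # same result as one device, but the slices ran concurrently
```

The linear scaling holds only as long as the workload divides cleanly and the host’s USB bandwidth isn’t the bottleneck.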
Additionally, companies that want to add AI capabilities to physical devices will be able to do so easily. As long as the gadget has a USB port, the stick will enable local neural networking capabilities on the device.
As ZDNet reports, “Movidius suggests it could be used for research and prototyping with end devices like robots or drones. For instance, a researcher could add object recognition capabilities to a smart vacuum cleaner.”
IDC analyst Dave Schubmehl, who specializes in cognitive and AI systems, says that the USB stick could be used for a wide variety of applications. For example, it could be plugged into cameras for image recognition, or into a conference room speaker for voice recognition capabilities that match audio to a particular speaker. Or, it could be placed into a robot that could recognize certain products in a store.
The Movidius stick could be used “any place where you need an onboard machine learning model to run in a localized environment,” Schubmehl says, adding that we’re just “scratching the surface” in terms of the potential of AI in products.
Shared from: BizTech Magazine