Thursday, 19 October 2017
A brief guide to mobile AI chips
Mobile AI chips. What are they actually good for?
In recent months we’ve heard a lot about specialized silicon being used for machine learning in mobile devices. Apple’s new iPhones have their “neural engine”; Huawei’s Mate 10 comes with a “neural processing unit”; and companies that manufacture and design chips (like Qualcomm and ARM) are gearing up to supply AI-optimized hardware to the rest of the industry.
What’s not clear is how much all this benefits the consumer. When you’re buying your phone, should an “AI chip” be on your wish list? If you want to use the latest AI-powered app that (just picking an example at random here) automatically identifies and hides your nude selfies, do you really need an AI chip? Short answer: no, but let’s dig a little deeper.
WHY DO WE NEED AI CHIPS*?
The reason for having mobile AI chips in the first place is pretty straightforward. Regular CPUs found in phones, laptops, and desktops just aren’t well suited to the demands of machine learning, and trying to make them do it results in sluggish performance and a fast-draining battery.
Contemporary AI requires computers to make lots of small calculations very quickly, but CPUs only have a handful of “cores” available to do the math. That’s why the industry loves graphics processing units, or GPUs. These were originally designed to render video game graphics, which, coincidentally, also requires making lots of small calculations very quickly. Instead of a handful of cores, they have thousands.
Now, fitting thousands of cores into a chip for your phone isn’t going to happen. But there are other architectural changes you can make to increase the amount of simultaneous work your chip can do. Qualcomm’s head of AI and machine learning, Gary Brotman, tells The Verge: “I think parallelization is certainly key, and doing it efficiently, especially.” He’s quick to add, though, that dedicated AI compute units aren’t the only way forward — other bits of chip architecture can also be adapted.
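To make that concrete, here’s a minimal sketch (in Python with NumPy, both choices made purely for this example) of why machine learning rewards parallel hardware: a single layer of a neural network boils down to thousands of independent multiply-and-add operations that could, in principle, all run at once. The sizes below are made up for illustration.

```python
# Minimal sketch: a single dense neural-network layer is just a big batch of
# independent multiply-accumulate operations (sizes are hypothetical).
import numpy as np

weights = np.random.rand(1000, 1000).astype(np.float32)  # one million weights
activations = np.random.rand(1000).astype(np.float32)    # input to the layer

# Each of the 1,000 outputs is its own dot product of 1,000 multiply-adds.
# None of them depends on the others, so a GPU or a dedicated "neural" unit
# can compute them side by side, while a CPU with a handful of cores has to
# work through them a few at a time.
outputs = weights @ activations
```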
*”AI chip” is a usefully recognizable term, but it’s also imprecise. In the case of Huawei and Apple, what’s being offered is not a single, self-contained chip, but dedicated processors that come as part of a bigger SoC (or system on chip), such as Apple’s A11 Bionic. SoCs already contain various specialized components for things like rendering graphics and processing images, so adding a few cores for AI is kind of par for the course.
WHAT DO WE GET OUT OF IT?
As mentioned above, specialized AI hardware means — in theory — better performance and better battery life. But there are also upsides for user privacy and security, and for developers as well.
First, privacy and security. At the moment, a lot of machine learning services have to send your data to the cloud to perform the actual analysis. Companies like Google and Apple have come up with methods to do these sorts of calculations directly on your phone, but they’re not widely used yet. Having dedicated hardware encourages more on-device AI, which means less risk of users’ data getting leaked or hacked.
Plus, if you’re not sending data off into the cloud every few seconds, users can access services offline and save on data. That latter part is a boon for developers, too. After all, if the analysis is done on-device, it saves the people running the app from paying for servers. As long as the hardware is up to scratch, everyone benefits.
IS THIS STUFF READY TO USE?
This next section is where things get trickier. Just because a phone has an AI chip doesn’t mean AI-powered apps and services will be able to take advantage of it.
In the case of Huawei and Apple, for example, both companies have their own APIs that developers need to use to tap the power of their respective “neural” hardware. And before they can integrate that API, they have to make sure the AI framework they used to build their models (for example, Google’s TensorFlow or Facebook’s Caffe2) is also supported. If it’s not, they’ll have to convert their models, which also takes time.
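As a rough illustration of what that conversion step can look like, here’s a minimal sketch of turning a trained Keras model into a Core ML model with Apple’s coremltools package. The tiny model, its shape, and the file name are all invented for this example, and the exact arguments depend on which version of coremltools you have installed; this is a sketch of the workflow, not Apple’s recommended pipeline.

```python
# Minimal sketch: converting a trained model into Apple's Core ML format so an
# iOS app can run it on-device. The throwaway model below stands in for a real
# trained network.
import coremltools as ct
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to the classic "neuralnetwork" format so it can be saved as .mlmodel.
mlmodel = ct.convert(model, convert_to="neuralnetwork")
mlmodel.save("ExampleClassifier.mlmodel")
```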
Anthony Mullen, a tech analyst at Gartner, says navigating this patchwork of interfaces “isn’t for the faint-hearted.” Speaking to The Verge, he says: “It’ll be a while yet before people are developing elaborate experiences using this hardware. Until then there’ll be special partnerships between manufacturers and third parties.” That’s why Microsoft is working with Huawei to make sure its Translator app works offline with the company’s NPU, and why Facebook partnered with Qualcomm to integrate the latter’s AI-focused hardware so its augmented reality filters load faster.
But while big companies like these can afford to put in the time, it’s not clear if it’ll be worth the effort for every small app developer. This won’t be a problem for Apple, where developers will only have to adapt their apps once, using the company’s Core ML framework; but it could be a headache for Android, especially if different manufacturers all start introducing their own protocols.
Thankfully, Google is using its power over the ecosystem to combat this problem. Its mobile AI framework, TensorFlow Lite, is already standardizing some experiences on mobile devices, and it’s introducing its own Android-wide APIs to "tap into silicon-specific accelerators."
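For a sense of what that looks like from a developer’s side, here’s a minimal sketch of the TensorFlow Lite workflow: convert a trained model into the compact format that ships inside the app, then run it with the on-device interpreter, which on Android can hand work off to whatever accelerator the phone exposes. The toy model and dummy input are placeholders for illustration, not anything Google ships.

```python
# Minimal sketch of the TensorFlow Lite workflow: convert a trained model to
# the compact on-device format, then run it with the lightweight interpreter.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Step 1: convert to a .tflite flatbuffer that gets bundled with the app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Step 2: run it with the interpreter. On an Android phone this is where
# TensorFlow Lite can delegate layers to the Neural Networks API, which routes
# them to a GPU, DSP, or dedicated "neural" unit if the SoC has one.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

dummy_input = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
```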
“From a developer’s standpoint in the Android environment it won’t mitigate all the fragmentation risks,” says Brotman. “But it’ll certainly provide a construct to make it easier.” He adds that some of the effects of this work won’t be fully felt until Android P is ready.
SO DO I NEED AN AI CHIP IN MY PHONE?
No, not really. So much work is being done on making AI services run well on the hardware currently available that, unless you’re a real power user, you don’t need to worry about it.
In both Huawei and Apple’s cases, the primary use of their shiny new hardware is just generally making their phones... better. For Huawei that means monitoring how the Mate 10 is used over its lifetime and reallocating resources to keep it from slowing down; for Apple that means powering new features like Face ID and animoji.
Having computing power dedicated to AI tasks is neat, sure, but so are other features of high-end handsets — like dual camera lenses or waterproofing. Boasting about AI chips makes for good marketing now, but it won’t be long before they’re just another component.