Imagine emojis that you can control with your face - the adorable panda or dog emoji that looks and talks like you. One of the highlights of the much-awaited iPhone X is the Animoji feature. Using the TrueDepth camera that analyses more than 50 facial muscle movements, users will be able to lend their face and expressions to a dozen emojis, edit and send them out in real time. Powering this fancy new feature on the Apple iPhone X is the A11 Bionic chip with neural engine that can recognise people, places and objects.
A neural engine, explains Apple, is hardware that's "purpose-built for machine learning, a type of artificial intelligence that enables computers to learn from observation. It's capable of incredibly fast computations needed by neural networks while also being incredibly efficient."
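To see why this workload suits dedicated silicon, consider what a neural network actually computes. Inference is dominated by multiply-accumulate operations, millions of which can run in parallel on purpose-built hardware far more efficiently than on a general-purpose CPU. The sketch below shows one fully connected layer in plain Python; the layer sizes and values are purely illustrative.

```python
def dense_layer(inputs, weights, biases):
    """One fully connected layer: each output is a weighted sum of inputs."""
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        # One multiply-accumulate (MAC) per (input, weight) pair -- the
        # operation a neural engine is built to run massively in parallel.
        activation = sum(x * w for x, w in zip(inputs, neuron_weights)) + bias
        outputs.append(max(0.0, activation))  # ReLU non-linearity
    return outputs

# Tiny illustrative layer: 4 inputs -> 2 neurons = 8 MACs. Real models used
# for face or speech recognition perform millions of these per inference.
inputs = [0.5, -1.0, 0.25, 2.0]
weights = [[0.1, 0.2, 0.3, 0.4], [0.4, 0.3, 0.2, 0.1]]
biases = [0.0, 0.1]
print(dense_layer(inputs, weights, biases))
```

A CPU executes these multiply-accumulates largely one after another; a neural engine lays out many of them in hardware at once, which is where the speed and battery savings come from.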
Having focused on multiple cores to boost performance in smartphones, chip manufacturers are now adding dedicated hardware for machine learning, so that tasks such as real-time voice processing and image recognition can be handled swiftly. These new components improve the performance and efficiency of tasks associated with AI assistants. Existing processors can be put to these uses, but they slow the device down and drain the battery far faster.
With smartphones shipping chipsets that include dedicated AI hardware, users can expect a significantly better experience. Smartphones will be able to listen, read and understand how users think, and surface more relevant information in real time. For instance, based on the time of day, an AI engine can suggest switching to low-light mode to avoid eye strain.
The neural engine in iPhone X's A11 Bionic chip helps unlock the phone using Face ID, irrespective of changes in the user's appearance. It is also equipped to carry out intuitive tasks such as keeping the screen lit while you're reading, and lowering the volume of an alarm or ringer when it detects you are looking at the device. There is a 'Hey Siri' detector that uses a deep neural network to activate Apple's voice assistant, Siri, without the user having to press a button. Hardware, software and Internet services work seamlessly together to provide this hands-free experience.
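A wake-phrase detector of this kind has to fire on a genuinely spoken phrase while ignoring momentary noise. The sketch below shows only the decision step, under the assumption that an acoustic neural network (not shown) has already produced a per-frame probability that the phrase is being spoken; the window size and threshold are illustrative, not Apple's actual values.

```python
def wake_word_triggered(frame_scores, window=4, threshold=0.8):
    """Return True if any sliding-window average of per-frame scores
    crosses the threshold -- smoothing suppresses one-frame false alarms."""
    for start in range(len(frame_scores) - window + 1):
        window_avg = sum(frame_scores[start:start + window]) / window
        if window_avg >= threshold:
            return True
    return False

# A single noisy spike should not trigger the assistant...
print(wake_word_triggered([0.1, 0.95, 0.2, 0.1, 0.3]))    # False
# ...but a sustained run of high scores should.
print(wake_word_triggered([0.2, 0.85, 0.9, 0.95, 0.88]))  # True
```

Because this check runs continuously against the microphone, it is exactly the kind of always-on, low-power workload that benefits from dedicated neural hardware rather than waking the main CPU.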
Huawei's Kirin 970 chipset has a dedicated neural processing unit (NPU) for advanced object recognition. It identifies objects in real time so the camera can adjust its settings for better photos, and offers real-time translations in augmented reality mode. "The aim is to deliver this in the most efficient way with on-device AI processing in order to ensure an experience with less latency, less power consumption, no network dependence and more privacy. This new generation of smartphones will transform human-to-machine interactions," explains a Huawei spokesperson. Huawei Mate 10 and Mate 10 Pro are the first two smartphones launched with the AI-powered Kirin 970 chip.
Qualcomm has adopted a different approach. It has created a software development kit (SDK) for developers to power immersive and engaging user experiences with machine learning on its mobile platforms using the Snapdragon Neural Processing Engine (NPE). This means developers can take advantage of user experiences across filters (augmented reality), scene detection, facial recognition, natural language understanding, object tracking and avoidance, gesturing and text recognition, to name a few, and optimise their apps to run AI efficiently.
Instead of focusing only on flagship devices, Qualcomm's Snapdragon NPE SDK supports multiple chips, including those that power mid-range smartphones. "The primary benefit of on-device AI is a more seamless, immersive, realistic and real-time experience," says a Qualcomm spokesperson.
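Supporting chips of different capabilities is largely a matter of picking the best available compute core at runtime. The sketch below illustrates that fallback pattern in hypothetical form; the function and runtime names here are illustrative inventions, not the actual Snapdragon NPE API, which similarly lets apps prefer efficient cores such as the DSP and fall back to GPU or CPU.

```python
# Hypothetical illustration of runtime fallback in an on-device ML SDK.
PREFERRED_ORDER = ["DSP", "GPU", "CPU"]  # most power-efficient first

def select_runtime(available, preferred_order=PREFERRED_ORDER):
    """Pick the first preferred compute core the device actually supports."""
    for runtime in preferred_order:
        if runtime in available:
            return runtime
    raise RuntimeError("no supported compute core found")

# A flagship chip might expose all three cores; a mid-range one fewer.
print(select_runtime({"DSP", "GPU", "CPU"}))  # DSP
print(select_runtime({"GPU", "CPU"}))         # GPU
```

This is why one SDK can span flagship and mid-range phones: the developer writes the model once, and the same app simply lands on a slower core where dedicated AI hardware is absent.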
Google's latest - Pixel 2 and Pixel 2 XL - have a special-purpose chip for AI-related tasks and image processing. The Pixel Visual Core co-processor is believed to improve speed and battery life when shooting photos with Google's HDR+ technology. Samsung isn't far behind. Recently, the company invested in Chinese AI firm DeePhi Tech. Samsung's next Galaxy S9 flagship, to be launched in 2018, is rumoured to be powered by an Exynos chip with dedicated AI hardware.
By integrating smart devices with the cloud and big data, on-device AI will continue to advance in security, processing speed and power efficiency. Currently, a lot of data is sent to the cloud for authentication, but with AI-embedded chipsets that enable real-time processing on the device itself, there is less chance of data being leaked or hacked. Smartphones are slated to become smarter and more sensitive in the days ahead.