TensorFlow, alongside frameworks like PyTorch, is currently a go-to solution for ML engineers and data scientists, and it remains hugely popular. There is also TensorFlow Lite, a slimmed-down version for mobile and IoT apps. Now Google is bringing the same capability to Play Services: on-device ML for apps on Android devices, under the name Android ML Platform in Play Services.
Why add TensorFlow Lite to Play Services in the first place?
Google found that bundling a heavyweight library such as TensorFlow Lite into each individual app inflates on-device storage. Shipping one shared copy on top of Play Services, so that every developer can access it without packaging their own, is therefore a plus point and makes it easier for developers to work with Play Services. It remains a choice, though: app developers can use the Play Services copy or keep bundling their own ML package.
This approach has three components. First, TensorFlow Lite itself ships via Play Services. Second, for optimal performance, Google is adding an Automatic Acceleration feature, which will “enable per-model testing to create allowlists for specific devices taking performance, accuracy, and stability into account.” In other words, it checks per model and per device whether hardware acceleration should be used. Third, Google will also update the Neural Networks API independently of Android OS updates, in collaboration with vendors so that they can provide updated device drivers between OS releases.
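To make the storage argument concrete, here is a hedged sketch of what this could look like in an app's Gradle file: instead of bundling the full TensorFlow Lite runtime into the APK, the app would declare a thin Play Services client that resolves to the shared on-device runtime. The artifact names and version below are illustrative assumptions, not confirmed coordinates.

```groovy
// build.gradle (app module) — illustrative sketch only; artifact names are assumptions
dependencies {
    // Before: each app bundles its own copy of the TF Lite runtime,
    // adding several megabytes to every APK that does on-device ML.
    // implementation 'org.tensorflow:tensorflow-lite:2.5.0'

    // After (hypothetical): a lightweight client library that delegates to the
    // single TensorFlow Lite runtime shipped and updated through Play Services.
    implementation 'com.google.android.gms:play-services-tflite-java:x.y.z'
}
```

The payoff of a setup like this is that the runtime is downloaded and updated once per device by Play Services, rather than once per app.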
It remains to be seen how this approach pans out. Play Services already takes up a sizeable chunk of storage, and adding TensorFlow Lite will grow it further. But compared with every developer bundling their own ML package, a single shared runtime should shrink overall app sizes by removing the need for each app to ship its own ML library.