Tuesday, May 8, 2018

Google’s ML Kit is a new Firebase SDK that takes the headache out of machine learning

Machine learning and artificial intelligence have quickly entered our lexicon in recent years, but few truly understand how the technology works or what it's capable of. Even Google's own AI researchers joke that machine learning is akin to alchemy. As a busy developer, you may not have the time to learn about machine learning (ML), but Google doesn't want that to stop you from reaping its benefits. For that reason, the company today announced ML Kit: a new SDK that incorporates years of Google's work on machine learning into a Firebase package that mobile app developers on both iOS and Android can use to enhance their apps.

If you don't know anything about machine learning, don't fret: no prior ML background is required. You're probably familiar with some real-world applications of the technology, such as face detection and image recognition. Google's ML Kit wants your app to benefit from these real-world uses of ML without you needing to understand how the underlying algorithms work. And if you do understand ML, or are willing to learn, you too can take advantage of ML Kit.


Machine Learning for beginners with ML Kit

Google’s new Firebase SDK for ML offers five APIs for some of the most common use cases on mobile:

  • Text recognition
  • Face detection
  • Barcode scanning
  • Image labeling
  • Landmark recognition


All you need to do is pass data to the API, and the SDK returns a response. It's that simple. Examples of ML in apps include music applications that interpret the notes you play and apply echo or noise cancellation, or optical character recognition (OCR) on nutrition labels for calorie-counting apps.
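To give a feel for that "pass data in, get a response back" flow, here is a rough Kotlin sketch of the on-device text recognition API as it looked at launch. It assumes `bitmap` is an Android `Bitmap` you have already captured (say, a photo of a nutrition label); exact class and property names may have shifted in later SDK releases.

```kotlin
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Wrap the Bitmap in the container type ML Kit's detectors accept.
val image = FirebaseVisionImage.fromBitmap(bitmap)

// Grab the on-device text detector -- no network connection required.
val detector = FirebaseVision.getInstance().visionTextDetector

detector.detectInImage(image)
    .addOnSuccessListener { result ->
        // Each block is a paragraph-like region of recognized text.
        for (block in result.blocks) {
            Log.d("MLKit", block.text)
        }
    }
    .addOnFailureListener { e ->
        Log.e("MLKit", "Text recognition failed", e)
    }
```

Note that the call is asynchronous: the detector returns a Task, and your app reacts in the success or failure listener rather than blocking the UI thread.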

The list of available base APIs will expand in the coming months to include a smart reply API, similar to the one in Android P, and a high-density face contour addition to the face detection API.


ML Kit for experienced users

If you have some prior background knowledge, you can also deploy your own custom TensorFlow Lite models. Just upload your model to the Firebase console and you won't have to worry about bundling it into your APK (thus reducing file size). ML Kit serves your model dynamically, so you can update your models without republishing your app.
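Registering a hosted custom model looked roughly like the following Kotlin sketch at launch. The model name `"my_custom_model"` is a hypothetical placeholder for whatever name you gave the model in the Firebase console, and the class names reflect the SDK as announced, so they may differ in later versions.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource

// Point ML Kit at a model hosted in the Firebase console.
// enableModelUpdates lets ML Kit download newer versions of the model
// without an app release.
val cloudSource = FirebaseCloudModelSource.Builder("my_custom_model")
    .enableModelUpdates(true)
    .build()
FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

// Create an interpreter bound to that hosted model; you then feed it
// inputs shaped to match your TensorFlow Lite model's tensors.
val options = FirebaseModelOptions.Builder()
    .setCloudModelName("my_custom_model")
    .build()
val interpreter = FirebaseModelInterpreter.getInstance(options)
```

Because the model is fetched at runtime rather than shipped in the APK, updating it is a console operation, not a Play Store release.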

Even better, Google will automatically compress full TensorFlow models into TensorFlow Lite models, which reduces file size and ensures that more people on limited data connections can enjoy your app.


On-device and Cloud APIs

ML Kit offers both on-device and Cloud APIs. The on-device API processes data without a network connection (like Android Oreo’s text selection feature) whereas the Cloud APIs use Google Cloud Platform to process data for more accuracy.
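The choice between the two is just a matter of which detector you ask for. A minimal sketch, assuming the launch-era property names on `FirebaseVision` (they may have been renamed since):

```kotlin
import com.google.firebase.ml.vision.FirebaseVision

val vision = FirebaseVision.getInstance()

// On-device: fast, free, and works offline, at some cost in accuracy.
val onDeviceLabeler = vision.visionLabelDetector

// Cloud: sends the image to Google Cloud Platform for higher accuracy;
// requires a network connection.
val cloudLabeler = vision.visionCloudLabelDetector
```

Both detectors expose the same detect-and-listen pattern, so switching between them is a small code change rather than a rewrite.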

ML Kit works on both Android and iOS, and on Android in particular, it works with devices running versions as old as Ice Cream Sandwich. If the user is running Android 8.1 Oreo or above, ML Kit will offer better performance thanks to the Neural Networks API already present on the device. On devices with chipsets that have specialized hardware, such as the Qualcomm Snapdragon 845 (and its Hexagon DSP) or the HiSilicon Kirin 970 (and its Neural Processing Unit), on-device processing will be accelerated. Google says it is working with SoC vendors to improve on-device recognition, too.


Conclusion

Developers looking to get started should look for the new SDK in the Firebase console. You can leave feedback in the Google group for Firebase.

Developers with experience in ML who are looking to try Google's algorithm for compressing TensorFlow models can sign up here. Lastly, check out Firebase Remote Config if you want to experiment with multiple custom models; it lets you switch model values dynamically, create population segments, and experiment with several models in parallel.
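Tying Remote Config to model selection can be sketched as below. The parameter key `"active_model"` is a hypothetical name you would define in the Firebase console, and the ML Kit class names reflect the SDK at launch.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
import com.google.firebase.remoteconfig.FirebaseRemoteConfig

val remoteConfig = FirebaseRemoteConfig.getInstance()

// Fetch the latest config values; different user segments can be served
// different values for the same key via the Firebase console.
remoteConfig.fetch().addOnCompleteListener { task ->
    if (task.isSuccessful) {
        remoteConfig.activateFetched()
    }

    // Read which hosted model this user should get, then register it.
    val modelName = remoteConfig.getString("active_model")
    val source = FirebaseCloudModelSource.Builder(modelName)
        .enableModelUpdates(true)
        .build()
    FirebaseModelManager.getInstance().registerCloudModelSource(source)
}
```

This is what makes A/B testing models possible: the segmentation lives in Remote Config, while the app just loads whatever model name it is handed.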



from xda-developers https://ift.tt/2rtqmyS
via IFTTT
