TensorFlow Lite and Android

This year's TensorFlow Developer Summit was really interesting for my day job, but it also got me thinking about the possibilities of Machine Learning (ML) together with Android for my new business.

So I figured I would share with you my thoughts on running the demo app which they showed at the summit, and give some more insights into what’s going on.

I'm going to build using Android Studio, following the instructions on the TensorFlow site.

First things first, clone the TensorFlow git repo:

git clone https://github.com/tensorflow/tensorflow.git

TensorFlow is an open-source framework for defining, training, and running machine learning models. It's heavily pushed and promoted by Google, but there is also some crazy community involvement, and Google do a great service with things like the TensorFlow Developer Summit, with live streaming, Q&As over Twitter, etc.

Once the repo is cloned, fire up Android Studio and import the project from tensorflow/lite/java/demo.

Use the "Import project (Gradle, Eclipse ADT, etc.)" option in the dialog.

The project will then start building automatically.
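While it builds, it's worth poking around the demo source: the heart of it is an image classifier that memory-maps a .tflite model shipped in the app's assets and hands it to the TensorFlow Lite Interpreter. Here's a simplified sketch of that loading step (not the demo's exact code; the class name and model file name are placeholders of mine):

import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Simplified sketch: memory-map a .tflite model bundled in the APK's assets so
// the TensorFlow Lite Interpreter can read it without copying it onto the heap.
public final class ModelLoader {
    public static MappedByteBuffer load(AssetManager assets, String modelPath)
            throws IOException {
        // modelPath is a placeholder, e.g. "mobilenet_quant.tflite"
        AssetFileDescriptor fd = assets.openFd(modelPath);
        try (FileInputStream in = new FileInputStream(fd.getFileDescriptor());
             FileChannel channel = in.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY,
                    fd.getStartOffset(), fd.getDeclaredLength());
        }
    }
}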

And then you can launch it in the emulator. If you didn't bother setting up the camera to use your webcam (or if you don't have a webcam), you will get the slightly lame Swedish house interior with nothing good on TV.

For me, the demo used the back camera by default, and I couldn't find a way to switch cameras in the app. Making the virtual device use the webcam as its back camera (in the AVD's advanced settings) fixed it, though the orientation is not how I'd like it.

Holding up pictures of things seemed to work better than actual real-life objects. It seems to think I am an Italian Greyhound.

Technically, it’s a ukulele, but I’ll settle for acoustic guitar.
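As for what's actually going on when it decides I'm an Italian Greyhound: the demo feeds each camera frame through the TensorFlow Lite Interpreter and reads back a score per label. Here's a rough sketch of that step, simplified from what the demo does and assuming a quantized 224x224 MobileNet-style model; the class name and constants are mine, not the demo's:

import org.tensorflow.lite.Interpreter;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;

// Rough sketch of the classification step, assuming a quantized 224x224 model
// with 1001 output labels (MobileNet-style). Names and sizes are illustrative.
public final class FrameClassifier {
    private static final int INPUT_SIZE = 224;   // assumed model input width/height
    private static final int NUM_LABELS = 1001;  // assumed size of the labels file

    private final Interpreter tflite;
    private final ByteBuffer imgData;
    private final byte[][] labelProb = new byte[1][NUM_LABELS];

    public FrameClassifier(MappedByteBuffer model) {
        tflite = new Interpreter(model);
        // Quantized model: one byte per RGB channel per pixel.
        imgData = ByteBuffer.allocateDirect(INPUT_SIZE * INPUT_SIZE * 3)
                .order(ByteOrder.nativeOrder());
    }

    // rgbPixels is a 224x224 frame as packed ARGB ints (e.g. from Bitmap.getPixels).
    public byte[] classify(int[] rgbPixels) {
        imgData.rewind();
        for (int pixel : rgbPixels) {
            imgData.put((byte) ((pixel >> 16) & 0xFF)); // R
            imgData.put((byte) ((pixel >> 8) & 0xFF));  // G
            imgData.put((byte) (pixel & 0xFF));         // B
        }
        tflite.run(imgData, labelProb);                 // run inference on the frame
        return labelProb[0];                            // one 0-255 score per label
    }
}

The demo then just sorts those scores and overlays the top few labels on the camera preview, which is how my ukulele ends up settling for acoustic guitar.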