After converting the model to the .tflite format, we can add it to the assets folder of our Android application. We can then load the model using a helper function, loadModelFile:
tfliteModel = loadModelFile(activity);
Because the model lives in the app's assets folder, we pass in the current activity, which gives the helper access to those assets. If you are not familiar with Android app development, you can think of an activity as a single screen of an app.
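The body of loadModelFile is not shown here. In the standard TensorFlow Lite Android examples, it memory-maps the model file so the native interpreter can read it without copying it into the Java heap. The core mechanism is FileChannel.map, which we can sketch in plain Java (the file name and dummy contents below are placeholders; on Android the descriptor would come from the activity's assets instead):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class ModelMapping {
    // Memory-map a model file read-only. On Android, the stream and offsets
    // would come from activity.getAssets().openFd("model.tflite") instead
    // of a regular file path.
    static MappedByteBuffer loadModelFile(Path modelPath) throws IOException {
        try (FileInputStream input = new FileInputStream(modelPath.toFile());
             FileChannel channel = input.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a real .tflite file: 16 bytes of dummy data.
        Path dummy = Files.createTempFile("model", ".tflite");
        Files.write(dummy, new byte[16]);

        MappedByteBuffer model = loadModelFile(dummy);
        System.out.println(model.capacity()); // prints 16, the file size
    }
}
```

Memory-mapping matters here because model files can be tens of megabytes; mapping avoids holding a second copy in memory.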
We can then create an Interpreter. In TensorFlow Lite, the interpreter is what runs a model and returns its predictions. In our example, we pass a default Options object; Options can also be used to tune execution, for example changing the number of threads with setNumThreads() or adjusting the precision the model runs at:
Interpreter.Options tfliteOptions = new Interpreter.Options();
tflite = new Interpreter(tfliteModel, tfliteOptions);
Finally, we create a ByteBuffer, the data structure that will hold the input image data:
imgData =
    ByteBuffer.allocateDirect(
        DIM_BATCH_SIZE...
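The allocation above is truncated, but the pattern used in the TensorFlow Lite image-classification examples is to size the buffer as the product of the batch size, the image dimensions, the number of channels, and the bytes per channel, and then to set the buffer to native byte order. A sketch with assumed dimensions (a single 224×224 RGB image stored as 4-byte floats; the constant values are illustrative, and the real ones must match the model's input tensor):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class InputBuffer {
    // Illustrative dimensions for a typical image model; the actual values
    // depend on the model's input tensor shape and type.
    static final int DIM_BATCH_SIZE = 1;
    static final int DIM_IMG_SIZE_X = 224;
    static final int DIM_IMG_SIZE_Y = 224;
    static final int DIM_PIXEL_SIZE = 3;         // RGB channels
    static final int NUM_BYTES_PER_CHANNEL = 4;  // float32

    public static void main(String[] args) {
        // A direct buffer lives outside the Java heap, so the native
        // interpreter can read the pixel data without an extra copy.
        ByteBuffer imgData = ByteBuffer.allocateDirect(
                DIM_BATCH_SIZE * DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y
                        * DIM_PIXEL_SIZE * NUM_BYTES_PER_CHANNEL);
        imgData.order(ByteOrder.nativeOrder()); // match the native side's byte order

        System.out.println(imgData.capacity()); // 1 * 224 * 224 * 3 * 4 = 602112
    }
}
```

Before each prediction, the app would rewind this buffer and write the preprocessed pixel values into it, then pass it to the interpreter's run method.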