Slide 8
TensorFlow (Jan 2017)
// Load model from disk
TensorFlowInferenceInterface tf =
    new TensorFlowInferenceInterface(assetManager, "my_model.pb");
// Copy the input data into TensorFlow
tf.feed(inputName, floatValues, 1, inputSize, inputSize, 3);
// Run the inference call.
tf.run(outputNames, logStats);
// Copy the output Tensor back into the output array.
tf.fetch(outputName, outputs);
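
For context, here is a minimal sketch of how the pieces this snippet assumes (assetManager, inputName, outputNames, floatValues, inputSize, outputs) might fit together using the old TensorFlow Android inference library. The node names, input size, and class count below are illustrative assumptions, not values from the slide.

import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class Classifier {
    // Illustrative values; real node names and sizes depend on your graph.
    private static final String INPUT_NAME = "input";          // assumed input node name
    private static final String OUTPUT_NAME = "final_result";  // assumed output node name
    private static final int INPUT_SIZE = 224;                 // assumed square input size
    private static final int NUM_CLASSES = 1001;               // assumed label count

    private final TensorFlowInferenceInterface tf;

    public Classifier(AssetManager assetManager) {
        // Load the frozen GraphDef bundled in the app's assets.
        tf = new TensorFlowInferenceInterface(assetManager, "my_model.pb");
    }

    public float[] classify(float[] floatValues) {
        float[] outputs = new float[NUM_CLASSES];
        // Feed one NHWC image tensor: [1, height, width, channels].
        tf.feed(INPUT_NAME, floatValues, 1, INPUT_SIZE, INPUT_SIZE, 3);
        // Run the graph up to the requested output nodes; false disables stat logging.
        tf.run(new String[] {OUTPUT_NAME}, false);
        // Copy the output tensor's values back into the Java array.
        tf.fetch(OUTPUT_NAME, outputs);
        return outputs;
    }
}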