TFLiteModel

TensorFlow Lite model

Description

A TFLiteModel object enables support for simulation and code generation for deep learning inference by using TensorFlow™ Lite models.

Use a TFLiteModel object with the predict function in your MATLAB® code to perform inference in MATLAB execution, in generated code, or in a MATLAB Function block in Simulink® models. For more information, see Prerequisites for Deep Learning with TensorFlow Lite Models.

To use this object, you must install the Deep Learning Toolbox Interface for TensorFlow Lite support package.

Creation

To create a TFLiteModel object from a pretrained TensorFlow Lite model file, use the loadTFLiteModel function.

Properties

Name of the TensorFlow Lite model file, specified as a character vector.

Number of inputs the TensorFlow Lite model accepts, specified as an integer-valued numeric scalar.

Number of outputs the TensorFlow Lite model produces, specified as an integer-valued numeric scalar.

Size of the inputs of the TensorFlow Lite model, specified as a cell array containing numeric arrays.

Size of the outputs of the TensorFlow Lite model, specified as a cell array containing numeric arrays.

Number of computational threads used for running inference with the TensorFlow Lite model, specified as an integer-valued numeric scalar.

The default value of this property is equal to the value returned by the maxNumCompThreads function.
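For example, you can lower NumThreads after loading the model to leave cores free for other work on the target. This is a minimal sketch; the model file name is an assumption for illustration.

```matlab
% Load a TensorFlow Lite model (file name assumed for illustration)
net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');

% By default, NumThreads equals maxNumCompThreads.
% Limit inference to two computational threads:
net.NumThreads = 2;
```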

Mean value to which the input data is normalized, specified as a double scalar. If the input data is not normalized, you must set this property to 0. Otherwise, set this property based on how the input data is normalized.

Standard deviation to which the input data is normalized, specified as a double scalar. If the input data is not normalized, you must set this property to 1. Otherwise, set this property based on how the input data is normalized.
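As a sketch of how these two properties fit together: for a model trained on inputs scaled to approximately [-1, 1] as (x - 127.5)/127.5, you would set Mean and StandardDeviation to 127.5; for a model that expects raw, unnormalized input, you would set them to 0 and 1. The model file name below is an assumption.

```matlab
net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');

% Inputs are normalized as (x - 127.5)/127.5 during training,
% so set the properties to match:
net.Mean = 127.5;
net.StandardDeviation = 127.5;

% If the model instead expects raw input, disable normalization:
% net.Mean = 0;
% net.StandardDeviation = 1;
```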

Object Functions

predict — Compute deep learning network output for inference by using a TensorFlow Lite model

Examples

Suppose that your current working directory contains a TensorFlow Lite model named mobilenet_v1_0.5_224.tflite.

Load the model by using the loadTFLiteModel function. Inspect the object this function creates.

net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');
disp(net)
  TFLiteModel with properties:
            ModelName: 'mobilenet_v1_0.5_224.tflite'
            NumInputs: 1
           NumOutputs: 1
            InputSize: {[224 224 3]}
           OutputSize: {[1001 1]}
           NumThreads: 8
                 Mean: 127.5000
    StandardDeviation: 127.5000

Create a MATLAB function that can perform inference using the object net. This function loads the Mobilenet-V1 model into a persistent network object. Then the function performs prediction by passing the network object to the predict function. Subsequent calls to this function reuse the persistent object.

function out = tflite_predict(in)
persistent net;
if isempty(net)
    net = loadTFLiteModel('mobilenet_v1_0.5_224.tflite');
end
out = predict(net,in);
end
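A call to this function might look like the following sketch. It assumes an image file is available on the MATLAB path; the input is resized and cast to match the model's 224-by-224-by-3 single-precision input.

```matlab
% Read an input image and resize it to the model's expected input size
I = imread('peppers.png');                 % example image; any RGB image works
I = imresize(single(I), [224 224]);

% Run inference through the persistent TFLiteModel object
scores = tflite_predict(I);
```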

For an example that shows how to generate code for this function and deploy on Raspberry Pi™ hardware, see Generate Code for TensorFlow Lite (TFLite) Model and Deploy on Raspberry Pi.

Version History

Introduced in R2022a