
Deploy Trained Regression Model as Microservice Docker Image

Supported platform: Linux®, Windows®, macOS

This example shows how to create a microservice Docker® image from the MATLAB® regression tree created in Train Regression Trees Using Regression Learner App (Statistics and Machine Learning Toolbox). In that example, you train a regression tree to predict the fuel economy in miles per gallon of a car model, given other variables as inputs. The microservice image created by MATLAB Compiler SDK™, described in this example, provides an HTTP/HTTPS endpoint to access MATLAB code.

You package a MATLAB function into a deployable archive, and then create a Docker image that contains the archive and a minimal MATLAB Runtime package. You can then run the image in Docker and make calls to the service using any of the MATLAB Production Server™ client APIs.

List of Example Files

  • trainedModel.mat

  • predictValue.m

  • httpTest.m

To download the example files, type the following into your MATLAB command window.

openExample("compilersdk/DeployRegressionExample", workDir=pwd)

Required Products

Type ver at the MATLAB command prompt to verify whether the following products are installed:

  • MATLAB

  • Statistics and Machine Learning Toolbox™

  • MATLAB Compiler™

  • MATLAB Compiler SDK

Prerequisites

Install and configure Docker on your system. For installation instructions, see the Docker documentation. To verify your installation, run docker --version in a system command window.

Create MATLAB Function to Predict Fuel Economy

This example uses a pretrained regression model. To create one by using the Regression Learner app, see Train Regression Trees Using Regression Learner App (Statistics and Machine Learning Toolbox). Follow that example through step 20, where you export a full model under the default name trainedModel. Then, save the model to a MAT-file by using the save function.

save("trainedModel.mat","trainedModel");

Then, create a prediction function.

function prediction = predictValue(x)
% Load the trained regression model exported from the Regression Learner app
load("trainedModel.mat","trainedModel");

% Convert the input cell array to a table with the variable names the model expects
X = cell2table(x,"VariableNames", ...
    {'Acceleration','Cylinders','Displacement','Horsepower','Model_Year','Weight','Origin','MPG'});

% Predict the fuel economy in miles per gallon
prediction = trainedModel.predictFcn(X);
end

Test the function from the MATLAB command line.

predictValue({12, 8, 307, 130, 70, 3504, 'USA', 18})
ans =

    15.7500

Create Deployable Archive

Package the predictValue function into a deployable archive using the compiler.build.productionServerArchive function.

You can specify additional options in the compiler.build command by using name-value arguments.

buildResults = compiler.build.productionServerArchive('predictValue.m',...
'ArchiveName','regressionTreeModel','Verbose',true);
buildResults = 

  Results with properties:

                  BuildType: 'productionServerArchive'
                      Files: {'/home/mluser/work/regressionTreeModelproductionServerArchive/regressionTreeModel.ctf'}
    IncludedSupportPackages: {}
                    Options: [1×1 compiler.build.ProductionServerArchiveOptions]
        RuntimeDependencies: [1×1 compiler.runtime.Dependencies]

The compiler.build.Results object buildResults contains information on the build type, generated files, included support packages, and build options.

Once the build is complete, the function creates a folder named regressionTreeModelproductionServerArchive in your current directory to store the deployable archive.
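To confirm the location of the deployable archive, you can inspect the Files property of the buildResults object. The exact path shown depends on your working folder.

% Display the full path of the generated deployable archive (optional check)
disp(buildResults.Files)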

Package Archive into Microservice Docker Image

Build the microservice Docker image by using the buildResults object that you created. You can specify additional options in the compiler.package command by using name-value arguments. For details, see compiler.package.microserviceDockerImage.

compiler.package.microserviceDockerImage(buildResults,...
'ImageName','regressiontreemodel-microservice',...
'DockerContext',fullfile(pwd,'microserviceDockerContext'));

The function generates the following files within a folder named microserviceDockerContext in your current working directory:

  • applicationFilesForMATLABCompiler/regressionTreeModel.ctf — Deployable archive file.

  • Dockerfile — Docker file that specifies Docker run-time options.

  • GettingStarted.txt — Text file that contains deployment information.
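Optionally, you can verify that these files were generated by listing the Docker context folder from MATLAB.

% List the contents of the generated Docker context folder (optional check)
dir(fullfile(pwd,'microserviceDockerContext'))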

Test Docker Image

In a system command window, verify that your regressiontreemodel-microservice image is in your list of Docker images.

docker images

This command returns a list of Docker images, including your microservice:

REPOSITORY                                                 TAG      IMAGE ID       CREATED          SIZE
regressiontreemodel-microservice                           latest   57f43b6811ce   22 seconds ago   7.31GB
matlabruntime/r2025a/release/update4/f08180002000003010   latest   3d8fedb4189b   5 weeks ago      7.31GB

Run the regressiontreemodel-microservice image from the system command prompt.

docker run --rm -p 9900:9910 regressiontreemodel-microservice -l trace &

Port 9910 is the default port exposed by the microservice within the Docker container. You can map it to any available port on your host machine. For this example, it is mapped to port 9900.

You can specify additional options in the Docker command. For a complete list of options, see Microservice Command Arguments.

Once the microservice container is running in Docker, you can check the status of the service by going to the following URL in a web browser:

http://<hostname>:9900/api/health

Note: Use localhost as the hostname if Docker is running on the same machine as the browser.

If the service is ready to receive requests, you see the following message.

"status:  ok"

Test the running service by using the MATLAB desktop to send a JSON query to the service through port 9900. For more information on constructing JSON requests, see JSON Representation of MATLAB Data Types (MATLAB Production Server).

%% Import MATLAB HTTP interface packages
import matlab.net.*
import matlab.net.http.*
import matlab.net.http.fields.*

%% Set up message body
body = MessageBody;

input_data = '{"mwdata": [[12], [8], [307], [130], [70], [3504], "USA", [18]], "mwsize": [1,8],"mwtype":"cell"}';
payloadText = strcat('{"nargout":1,"rhs":[', input_data,']}');

body.Payload = payloadText;

%% Set up request
requestUri = URI('http://localhost:9900/regressionTreeModel/predictValue');
options = matlab.net.http.HTTPOptions('ConnectTimeout',20,...
    'ConvertResponse',false);
request = RequestMessage;
request.Header = HeaderField('Content-Type','application/json');
request.Method = 'POST';
request.Body = body;

%% Send request & view raw response
response = request.send(requestUri, options);
disp(response.Body.Data)

%% Decode JSON response
lhs = mps.json.decoderesponse(response.Body.Data);

Note: If Docker is running on a remote machine, replace localhost in the request URI with the appropriate hostname. Because Docker is running locally in this example, use localhost.

The output is:

{"lhs":[{"mwdata":[15.750000000000002],"mwsize":[1,1],"mwtype":"double"}]}

To stop the service, use the following command to display the container ID.

docker ps
CONTAINER ID   IMAGE                              COMMAND                  CREATED          STATUS          PORTS                                       NAMES
0662c1e1fa85   regressiontreemodel-microservice   "/opt/matlabruntime/…"   17 minutes ago   Up 17 minutes   0.0.0.0:9900->9910/tcp, :::9900->9910/tcp   ecstatic_torvalds

Stop the service by using the container ID.

docker stop 0662c1e1fa85

Share Docker Image

You can share your Docker image in various ways.

  • Push your image to a container registry such as Docker Hub, or to your private registry. This is the most common workflow.

  • Save your image as a tar archive and share it with others. This workflow is suitable for immediate testing.

For details about pushing your image to Docker Hub or to your private registry, consult the Docker documentation.

Save Docker Image as Tar Archive

To save your Docker image as a tar archive, open a system command window, navigate to the Docker context folder, and type the following.

docker save regressiontreemodel-microservice -o regressiontreemodel-microservice.tar

This command creates a file named regressiontreemodel-microservice.tar in the current folder. Set the appropriate permissions (for example, using chmod) prior to sharing the tarball with other users.

Load Docker Image from Tar Archive

Load the image contained in the tar archive on the end user's machine.

docker load --input regressiontreemodel-microservice.tar

Verify that the image is loaded.

docker images

Run Docker Image

Run the loaded image from the system command prompt.

docker run --rm -p 9900:9910 regressiontreemodel-microservice
