    MATLAB with TensorFlow and PyTorch for Deep Learning

    MATLAB® and Simulink® with deep learning frameworks, TensorFlow and PyTorch, provide enhanced capabilities for building and training your machine learning models. Via interoperability, you can take full advantage of the MATLAB ecosystem and integrate it with resources developed by the open-source community. You can combine workflows that include data-centric preprocessing, model tuning, model compression, model integration, and automatic code generation with models developed outside of MATLAB.

    Explore the options and benefits, along with examples, of the various interoperability pathways available, including:

    • Importing models from TensorFlow, PyTorch, and ONNX into MATLAB, and exporting MATLAB models back to those frameworks
    • Co-executing MATLAB alongside installations of TensorFlow and PyTorch

    Published: 25 May 2022

    [MUSIC PLAYING]

    Hi, everyone, and welcome to the MATLAB Expo session, MATLAB with TensorFlow and PyTorch. My name is David, and with me today are Yann and Sivylla, who will be co-presenting this talk. It will run for about 20 minutes, with a 5-minute Q&A to follow. With that in mind, let's begin.

    [MUSIC PLAYING]

    Oh, hi, Sivylla, we're just waiting for Yann to join. Before he does join, this is your first MATLAB Expo, right?

    Yes, it is. And I'm very happy to be here.

    Oh, that's great. Oh, here's Yann. All right, Yann, we're about to get started. So, in three, two, one, let's go. Thanks for meeting with me today, Sivylla and Yann. I've brought us together because we've got some exciting news to share: our AI-enabled car project has been approved.

    Wow, that's great.

    Yeah, amazing stuff.

    All right, it's pretty exciting. So I wanted to use this time not just to announce the great news, but also to kick the project off, if that's OK. To start, I want to highlight that we've successfully built AI products before. We did it by following the AI system design workflow that you see on the screen. When things weren't working as well as we liked, we iterated over and over until we met the requirements. Is everyone on board with following this again?

    Yeah.

    Yes, definitely.

    That's great. So, look, let's go through the higher-level requirements for this project then. For data preparation, Sivylla, this is mainly on the engineering team. They're the ones who know the systems the data comes from best. And I know you already know this, but make sure they prepare and process it with their domain knowledge as best they can, for example, when it comes to data labeling. If they don't have enough of the data they need, there's always the option of generating synthetic data. And this can be simulated from the system models we already have in Simulink.

    Next is our modeling requirements. Yann, this is where we'll be looking for the data science team to find the best model for the AI application we'll put in the car. I'd also recommend you don't limit yourself to one tool, as this may limit you finding the best model available. Now, these models are going to be put in a car, so it's important for them to be tested in the larger systems in which they will reside. We already have these system models in Simulink, and we can simulate the AI models within them.

    The end goal is to deploy to low-cost, low-power embedded devices. These will require the models to be coded into the language used on the target. It could be C or C++, CUDA, or VHDL. And finally, we need to make sure both teams can work together as efficiently as possible. One way to do this is to choose the right tools to get the job done.

    So, do you have an idea of which tools you want to use?

    Yes, my entire team uses MATLAB.

    Well, mine uses a combination of PyTorch and TensorFlow.

    OK. Do we know if we can use MATLAB, TensorFlow, and PyTorch together?

    I don't know.

    Oh, neither do I.

    OK. So do we need to standardize on one tool only? Sivylla and Yann, what's your feedback on this?

    Sivylla, [NON-ENGLISH SPEECH]

    [NON-ENGLISH SPEECH]

    [NON-ENGLISH SPEECH]

    OK. [NON-ENGLISH SPEECH]

    [NON-ENGLISH SPEECH]

    David, Yann and I agree that we have to find ways to connect MATLAB and Python.

    OK. Just give me a second. I'm just doing a quick search: MATLAB and TensorFlow user story. Oh, here's an article to check out: Mitsui Chemicals deploys AI and automation systems with TensorFlow and MATLAB. Oh, it looks like the tools can be used together. Could both of you go off and investigate the options and report back?

    OK.

    So, David, let me summarize the options we've listed out with Sivylla. We found, essentially, three options, three ways, in which MATLAB can work together with TensorFlow and PyTorch. On my side, I will be investigating co-execution between MATLAB and Python. And on her side, Sivylla will look into model converters for both TensorFlow and PyTorch. She will also look into all the available models in the MATLAB Deep Learning Model Hub.

    So let me start with the first one. This is co-execution with TensorFlow or PyTorch. What is required, first, is that you have the two systems installed and set up on your machine: both MATLAB and whatever framework you choose to use in Python, which could be TensorFlow or PyTorch. This is going to require some datatype conversion between MATLAB and Python, and you could see some performance impact when transferring data between the two.

    But on the plus side, this will allow us to test any kind of model that is available in TensorFlow and PyTorch. Let me show you a short demo of how to do that. In this demo, we will show how to do image classification in MATLAB using TensorFlow. And for that, we will leverage the co-execution between MATLAB and Python.

    You can retrieve this demo's code from our GitHub account, matlab-deep-learning; simply clone it. Or, if you don't have Git installed, you can simply download the zip. Here, you have some details about what is done, and I will walk you through the code.

    So those are the steps that we're going to follow. I've saved this script as a MATLAB live script. Here, we will set up Python, import and convert the data, call TensorFlow directly from MATLAB, and then call a Python user-defined module.

    So the first step is setting up Python. Here, I've done it already, and as you can see, I'm using Python 3.9. Now, you want to make sure that you have TensorFlow installed on your machine and up to date. I've added this bang command using pip show to display the version of TensorFlow that I'm using, which is 2.8, and I can do the same by importing TensorFlow and printing its version. For this, I'm using pyrun, which has been available in MATLAB since R2021b.
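
    As a rough sketch of that setup step (the version numbers are just the ones used in this demo; adjust the Python location for your machine):

        % Point MATLAB at a Python 3.9 installation.
        pyenv(Version="3.9");

        % "Bang" command: shell out to pip to show the installed TensorFlow version.
        !pip show tensorflow

        % The same check through pyrun, which runs Python statements directly from MATLAB.
        pyrun("import tensorflow as tf; print(tf.__version__)")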

    Now, let me show how we can simply call a Python function, one from the math library: the square root. You simply enter py, then the name of the module, then the name of the function. Those are the basics of co-execution.
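
    For example, a trivial call that just illustrates the py.<module>.<function> pattern:

        % Call Python's math.sqrt from MATLAB; the returned Python float becomes a MATLAB double.
        x = py.math.sqrt(42)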

    Now, let's import some data. For this example, we'll be importing an image as a matrix, and we will convert the data before proceeding to any deep learning prediction. So, here, we need to resize the data first, and you want to permute the data to rearrange the dimensions into the order that TensorFlow expects. Quick disclaimer: no bananas were harmed during the conversion.
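
    A minimal sketch of that preprocessing; the file name and input size are assumptions, and the exact scaling and dimension ordering depend on the model you plan to call:

        % Read the image and resize it to the network's expected input size (assumed 224x224 here).
        img = imread("banana.png");                     % hypothetical file name
        img = imresize(img, [224 224]);

        % Keras models typically expect a leading batch dimension (batch x height x width x channels),
        % so add a singleton dimension; use permute/reshape as your particular model requires.
        imgBatch = reshape(single(img), [1 size(img)]);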

    Now, let's move on to calling TensorFlow. First, we will import a pretrained model. You can look at the layers of this model and convert them to a cell array. In this cell array, you can see the different layers from the TensorFlow pretrained model.
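
    A hedged sketch of that step; MobileNetV2 stands in here for whichever pretrained Keras model you want to try:

        % Import TensorFlow through the Python interface and load a pretrained Keras model.
        tf = py.importlib.import_module("tensorflow");
        model = tf.keras.applications.MobileNetV2();

        % The model's layers come back as a Python list, which converts to a MATLAB cell array.
        layers = cell(model.layers);
        disp(layers{1})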

    OK, so in order to pass the data from MATLAB into Python, we will call the numpy module and transform it into an array, like this. Then we will preprocess this input. Now, let's go to the core of the demo: the classification. The results are returned, again, as a numpy array, as you can see. And this array can be converted into different MATLAB datatypes.
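
    Roughly, reusing tf, model, and imgBatch from the sketches above (the preprocessing call shown is specific to MobileNetV2):

        % Hand the MATLAB image to TensorFlow as a NumPy array (direct conversion works in recent releases).
        np = py.importlib.import_module("numpy");
        x = np.array(imgBatch);

        % Apply the model's own preprocessing, then classify.
        x = tf.keras.applications.mobilenet_v2.preprocess_input(x);
        scores = model.predict(x);                      % returned as a numpy.ndarray of class scores

        % Convert the NumPy result back into a MATLAB double array.
        scoresMAT = double(scores);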

    We also retrieve the label, which is a Python list. Inside this list, we have several tuples that are organized as ID, class, and probability. The class that we retrieve here is banana. And you can check it simply by labeling the image, like this.
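
    One way to pull out the top label, again assuming the MobileNetV2 sketch above; the exact indexing depends on how the labels are returned:

        % decode_predictions returns a Python list with (id, class, probability) tuples per image.
        preds = tf.keras.applications.mobilenet_v2.decode_predictions(scores, pyargs("top", int32(1)));
        firstImage = preds{1};                          % predictions for the first (and only) image
        top1 = firstImage{1};                           % highest-scoring (id, class, probability) tuple
        label = string(top1{2})                         % e.g. "banana"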

    Now, let's do the same, but this time calling a Python user-defined module. We've written a quick Python file that you can see, rendered in MATLAB R2022a with syntax highlighting. As you can see, it defines two functions: one for loading the model, and one for preprocessing and predicting. So let's do just that: load the model, preprocess and predict, retrieve the label, and bring up the figure. And here you go, banana.
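
    On the MATLAB side, calling such a file looks roughly like this; the module and function names are hypothetical stand-ins for whatever the Python file defines:

        % Import the user-defined Python file (e.g. tf_predict.py on the Python path).
        mod = py.importlib.import_module("tf_predict");    % hypothetical module name
        py.importlib.reload(mod);                           % pick up edits without restarting Python

        % Call its functions just like any other Python module (hypothetical function names).
        model = mod.load_model();
        label = mod.predict_image(model, imgBatch);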

    So as a summary, co-execution can work both ways. First, you can call Python directly from MATLAB to access any AI framework and models. And the other way around, you can call the MATLAB engine from Python to reuse some domain-specific functions developed by the engineering team.

    OK. Thanks, Yann. Sivylla, over to you.

    I discovered that the Deep Learning Toolbox provides functions that allow me to import models directly from TensorFlow, and from PyTorch via ONNX. And I only have to use MATLAB. The import functionality is enhanced with every MATLAB release. Actually, importing a TensorFlow model is the option that Mitsui Chemicals chose for their AI automation system. Let me show you this video of how you can import a TensorFlow model.

    There are various open source deep learning repositories, and I also have colleagues who create models in TensorFlow and PyTorch. As a MATLAB user, how can I get these open source models into MATLAB? I use the Deep Learning Toolbox import functions. Let's open MATLAB. Now I'll show you how I converted a TensorFlow model to a Deep Learning Toolbox network.

    I use the importTensorFlowNetwork function to import NASNetMobile, which is an image classification model in SavedModel format. And this is one line of code. This might take a few minutes, so I'll go make coffee.
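
    The one-liner looks roughly like this; "NASNetMobile" is assumed to be the folder that holds the SavedModel, and the function requires the Deep Learning Toolbox Converter for TensorFlow Models support package:

        % Import a TensorFlow model in SavedModel format as a Deep Learning Toolbox network.
        net = importTensorFlowNetwork("NASNetMobile", OutputLayerType="classification");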

    I am back, and the software has converted the TensorFlow model to a MATLAB network. To check the network, I use analyzeNetwork. There are no warnings or errors, so I can use the network. Then, I classify an image by running this code. The image was correctly classified as a banana.
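
    Roughly, the check-and-classify steps (the image file name is a placeholder):

        % Inspect the imported network for errors or warnings.
        analyzeNetwork(net)

        % Resize a test image to the network's input size and classify it.
        inputSize = net.Layers(1).InputSize;
        img = imresize(imread("banana.png"), inputSize(1:2));   % hypothetical image file
        label = classify(net, img)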

    I can also classify images in Simulink by loading the imported network into the Predict block. Of course, the engineering team will most likely not be classifying banana images. But we can, for example, integrate an imported object detection network into a larger Simulink system for vehicle lane detection.

    Yann and I described the options for how to get a TensorFlow or PyTorch model into MATLAB, either with co-execution or by importing the model. I might not even have to get the model from an external platform: I can get a MATLAB network directly from the MATLAB Deep Learning Model Hub. The Hub provides over 50 pretrained models, and the model selection keeps growing. These models are the same ones I can find in open source repositories, such as YOLO v3. I will show you a video of how you can get a model from the MATLAB Deep Learning Model Hub.

    I can get a [INAUDIBLE] deep learning model from the MATLAB Deep Learning Model Hub. This online repository provides many models for various deep learning tasks, such as image and video classification and object detection. Let's say I want a model for image classification.

    In this demo, I'm [INAUDIBLE] mobilenetv2. Normally, I would try out different models. There are two ways to get mobilenetv2: from the doc or from the Hub. Then I open MATLAB to get the model. To get the model from the doc, I run this one line of code. I have already installed the corresponding support package. Now I have a network ready to use for image classification.
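
    From the doc, that one line is simply the model's function, assuming the support package is installed:

        % Requires the "Deep Learning Toolbox Model for MobileNet-v2 Network" support package.
        net = mobilenetv2;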

    To get the model from the MATLAB Deep Learning Model Hub, I need to download the relevant folder. I can download the folder directly from the Hub, or download it programmatically, as shown here. Then, I run only this one line of code. Now, again, I have a network ready to use.
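
    A hedged sketch of the programmatic route; the repository location and loader function below are placeholders for whatever the model's Model Hub page points to:

        % Download and unpack the model repository, then put its functions on the path.
        unzip("https://github.com/matlab-deep-learning/<model-repo>/archive/main.zip");   % placeholder URL
        addpath(genpath("<model-repo>-main"));

        % The "one line of code" is then the loader function that the repository provides.
        net = loadPretrainedModel();                    % hypothetical function name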

    OK, now we have a MATLAB network, [INAUDIBLE] from the MATLAB Deep Learning Model Hub (that is option three), or imported from an external platform (that is option two). What can I do with this network? I can perform all the deep learning workflows that MATLAB supports. And [INAUDIBLE] in one of the videos how to do prediction at the command line and in Simulink. I can also use the network for transfer learning, or deploy it to an embedded device or a standalone application.
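
    As one example of those workflows, a minimal transfer-learning sketch; the folder name is an assumption, and the layer names shown are the ones used by mobilenetv2, so adjust them for your network:

        % New training images organized in one subfolder per class (hypothetical folder name).
        imds = imageDatastore("myNewImages", IncludeSubfolders=true, LabelSource="foldernames");
        numClasses = numel(categories(imds.Labels));

        % Swap the final learnable layer and classification layer for the new classes.
        lgraph = layerGraph(net);
        lgraph = replaceLayer(lgraph, "Logits", fullyConnectedLayer(numClasses, Name="newFC"));
        lgraph = replaceLayer(lgraph, "ClassificationLayer_Logits", classificationLayer(Name="newOutput"));

        % Retrain briefly on the new data.
        opts = trainingOptions("adam", InitialLearnRate=1e-4, MaxEpochs=5);
        netNew = trainNetwork(imds, lgraph, opts);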

    When I need guidance, the Deep Learning Toolbox comes with many examples of how to perform these workflows, including examples focused on imported networks. There's also a recent blog post that guides me, as a user, on how to import TensorFlow and PyTorch models.

    OK, great. Thanks for the overview, Yann and Sivylla. I've been taking notes, and I'll summarize our key challenges and the preferred approach. During the R&D phase, there's going to be a lot of iteration, going back and forth between the data prep stage and the modeling stage. For this, I believe the best approach looks to be co-execution. It'll allow both teams to work in their tool of choice.

    Once we've converged on the model we'd like to use, it's critical that we get the model deployed successfully on the target as quickly as possible. Given that MATLAB has a workflow for automatic deployment to embedded systems, we have two options.

    First, check if the model is already available in the MATLAB Deep Learning Model Hub. And if it isn't, let's go ahead and use MATLAB's model converters to bring it in. Doing this will allow us to meet our project requirements in a short amount of time. So, there's one final thing we now need to do. Let's go ahead and get this done.

    Yeah, let's do it.

    Yeah.

    [MUSIC PLAYING]

    OK, great, we're done. I think that was a fantastic job by both of you.

    [INAUDIBLE]

    Thanks.

    Yeah.

    [INAUDIBLE]

    All right, we're about to move to the Q&A.