Installing Prerequisite Products
To use GPU Coder™ for CUDA® code generation, you must install and set up the following products. For setup instructions, see Setting Up the Prerequisite Products.
MathWorks Products and Support Packages
MATLAB Coder™ (required).
Parallel Computing Toolbox™ (required).
Simulink® (required for generating code from Simulink models).
Computer Vision Toolbox™ (recommended).
Deep Learning Toolbox™ (required for deep learning).
Embedded Coder® (recommended).
Image Processing Toolbox™ (recommended).
Simulink Coder (required for generating code from Simulink models).
GPU Coder Interface for Deep Learning support package (required for deep learning).
MATLAB Coder Support Package for NVIDIA® Jetson™ and NVIDIA DRIVE® Platforms (required for deployment to embedded targets such as NVIDIA Jetson and NVIDIA DRIVE).
For instructions on installing MathWorks® products, see the MATLAB installation documentation for your platform. If you have installed MATLAB and want to check which other MathWorks products are installed, enter ver in the MATLAB Command Window. To install the support packages, use the Add-On Explorer in MATLAB.
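For example, you can confirm the required products from the MATLAB Command Window. A minimal sketch; the product name strings must match the names reported by ver:

```matlab
% List all installed MathWorks products and their versions.
ver

% Query a specific product, for example Parallel Computing Toolbox.
ver('parallel')

% Programmatically confirm that a required product is available.
installed = ver;
assert(any(strcmp({installed.Name}, 'MATLAB Coder')), ...
    'MATLAB Coder is required for GPU Coder.')
```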
If MATLAB is installed on a path that contains non-7-bit ASCII characters, such as Japanese characters, GPU Coder does not work because it cannot locate code generation library functions.
NVIDIA GPU enabled for CUDA with a compatible graphics driver. For more information, see CUDA GPUs (NVIDIA).
To see the CUDA compute capability requirements for code generation, consult the following table.
| Target | Compute Capability |
| --- | --- |
| Source code, static or dynamic library, and executables | 3.2 or higher |
| Deep learning applications in 8-bit integer precision | 6.1, 7.0, or higher |
| Deep learning applications in half-precision (16-bit floating point) | 5.3, 6.0, 6.2, or higher |
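You can query the compute capability of the host GPU by using the gpuDevice function from Parallel Computing Toolbox. A minimal check, assuming a single CUDA-enabled device is selected:

```matlab
% Query the currently selected GPU device.
gpu = gpuDevice;

% ComputeCapability is returned as a character vector, for example '7.5'.
cc = str2double(gpu.ComputeCapability);

% Check against the minimum required for source code generation.
if cc < 3.2
    warning('GPU compute capability %s is below the 3.2 minimum.', ...
        gpu.ComputeCapability)
end
```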
ARM® Mali graphics processor.
For Mali devices, GPU Coder supports code generation only for deep learning networks.
GCC C/C++ compiler. For supported versions, see Supported and Compatible Compilers.
Microsoft® Visual Studio® 2017
Microsoft Visual Studio 2019
Microsoft Visual Studio 2022
For CUDA MEX, the code generator uses the NVIDIA compiler and libraries installed with MATLAB. Standalone code (static library, dynamically linked library, or executable program) generation has additional software requirements.
GPU Coder has been tested with CUDA Toolkit v9.x-v11.8.
To download the CUDA Toolkit, see CUDA Toolkit Archive (NVIDIA).
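After installing the CUDA Toolkit and compiler, you can verify the host environment from MATLAB by using the coder.checkGpuInstall function. A minimal sketch; the configuration properties shown check only basic code generation:

```matlab
% Verify the GPU code generation environment on the host computer.
envCfg = coder.gpuEnvConfig('host');
envCfg.BasicCodegen = 1;   % check basic CUDA code generation
envCfg.Quiet = 1;          % report only warnings and errors
coder.checkGpuInstall(envCfg);
```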
NVIDIA Nsight™ Systems
Generate an execution profiling report for the generated CUDA code. The report provides metrics that help you analyze your application algorithms and identify opportunities to optimize performance.
GPU Coder has been tested with Nsight Systems 2022.5.1.
The profiling tools from NVIDIA might not support legacy GPU hardware such as the Kepler family of devices. For information on supported GPU devices, see the NVIDIA documentation.
NVIDIA CUDA deep neural network library (cuDNN) for NVIDIA GPUs
For the host GPU device, GPU Coder has been tested with cuDNN v8.7.
To download cuDNN, see cuDNN (NVIDIA).
NVIDIA TensorRT™ high performance inference optimizer and runtime library
For the host GPU device, GPU Coder has been tested with TensorRT v18.104.22.168.
To download TensorRT, see TensorRT (NVIDIA).
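To target cuDNN or TensorRT during code generation, select the library through a deep learning configuration object and point GPU Coder at the installations through environment variables. A sketch; the installation paths below are placeholders for your own locations:

```matlab
% Select the deep learning library for code generation.
cfg = coder.gpuConfig('lib');
cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');      % cuDNN
% cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt'); % or TensorRT

% GPU Coder locates the libraries through environment variables.
% Replace the placeholder paths with your actual installation folders.
setenv('NVIDIA_CUDNN', '/usr/local/cudnn');
setenv('NVIDIA_TENSORRT', '/usr/local/TensorRT');
```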
ARM Compute Library for Mali GPUs
GPU Coder has been tested with v19.05.
For more information, see Compute Library (ARM).
Open Source Computer Vision Library (OpenCV)
Required for deep learning examples.
For examples targeting NVIDIA GPUs on the host development computer, use OpenCV v3.1.0.
For examples targeting ARM GPUs, use OpenCV v2.4.9 on the ARM target hardware.
For more information, see OpenCV.
- Setting Up the Prerequisite Products
- The GPU Environment Check and Setup App
- Code Generation by Using the GPU Coder App
- Code Generation Using the Command Line Interface
- Code Generation for Deep Learning Networks by Using cuDNN
- Code Generation for Deep Learning Networks by Using TensorRT
- Code Generation for Deep Learning Networks Targeting ARM Mali GPUs