Installing Prerequisite Products

To use GPU Coder™ for CUDA® C/C++ code generation, you must install the following products:

MathWorks Products

  • MATLAB® (required).

  • MATLAB Coder™ (required).

  • Parallel Computing Toolbox™ (required).

  • Deep Learning Toolbox™ (required for deep learning).

  • GPU Coder Interface for Deep Learning Libraries (required for deep learning).

  • Image Processing Toolbox™ (recommended).

  • Computer Vision System Toolbox™ (recommended).

  • Embedded Coder® (recommended).

  • Simulink® (recommended).

Note

If MATLAB is installed on a path that contains non 7-bit ASCII characters, such as Japanese characters, MATLAB Coder does not work because it cannot locate code generation library functions.
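
For example, you can test whether the MATLAB installation path contains only 7-bit ASCII characters. This is a minimal sketch using standard MATLAB functions:

    % Warn if the MATLAB installation path contains non-7-bit-ASCII characters
    p = matlabroot;
    if any(double(p) > 127)
        warning('MATLAB is installed on a path containing non-ASCII characters: %s', p);
    end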

For instructions on installing MathWorks® products, see the MATLAB installation documentation for your platform. If you have installed MATLAB and want to check which other MathWorks products are installed, enter ver in the MATLAB Command Window.
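
For example, the following sketch lists the installed products and spot-checks two of the required ones. The toolbox folder names passed to ver (such as 'parallel' and 'gpucoder') are assumptions here; confirm them against the full ver listing:

    ver                                  % list all installed MathWorks products
    % Spot-check required products by toolbox folder name (names assumed)
    if isempty(ver('parallel'))
        warning('Parallel Computing Toolbox does not appear to be installed.');
    end
    if isempty(ver('gpucoder'))
        warning('GPU Coder does not appear to be installed.');
    end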

Third-party Products

GPU Code Generation from MATLAB

  • NVIDIA® GPU enabled for CUDA with compute capability 3.2 or higher (Is my GPU supported?). A programmatic check of the compute capability is shown after this list.

  • CUDA toolkit and driver. The default installation comes with the nvcc compiler, cuFFT, cuBLAS, cuSOLVER, and Thrust libraries. GPU Coder has been tested with CUDA toolkit v9.1 (Get the CUDA toolkit).

  • C/C++ Compiler:

    Linux®: GCC C/C++ compiler 6.3.x

    Windows®: Microsoft® Visual Studio® 2013, Microsoft Visual Studio 2015, or Microsoft Visual Studio 2017

    The NVIDIA nvcc compiler supports multiple versions of GCC, so you can generate CUDA code with other GCC versions. However, compatibility issues can occur when you run the generated code from MATLAB, because the C/C++ run-time libraries included with the MATLAB installation are compiled for GCC 6.3.
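
To confirm the GPU, CUDA toolkit, and compiler requirements listed above from within MATLAB, a minimal sketch such as the following can be used. gpuDevice requires Parallel Computing Toolbox, and the 3.2 threshold mirrors the compute capability requirement stated above:

    % Check the compute capability of the selected GPU
    g = gpuDevice;
    if str2double(g.ComputeCapability) < 3.2
        warning('GPU compute capability %s is below the required 3.2.', ...
            g.ComputeCapability);
    end

    % Check that the CUDA toolkit's nvcc compiler is on the system path
    [status, out] = system('nvcc --version');
    if status ~= 0
        warning('nvcc was not found on the system path.');
    else
        disp(out);
    end

    % Check which C/C++ compiler MATLAB is configured to use
    mex -setup C++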

Code Generation for Deep Learning Networks

The code generation requirements for deep learning networks depend on the platform you are targeting.

NVIDIA GPUs
Hardware Requirements

CUDA enabled GPU with compute capability 3.2 or higher.

Targeting NVIDIA TensorRT™ libraries with INT8 precision requires a CUDA GPU with minimum compute capability of 6.1.

Software Libraries

CUDA Deep Neural Network library (cuDNN) v7.x.

NVIDIA TensorRT™ v3.0, a high-performance deep learning inference optimizer and run-time library.

Operating System Support

cuDNN is supported on Windows and Linux platforms.

TensorRT is supported only on Linux platforms.

Other

Open Source Computer Vision Library (OpenCV), v3.1.0, is required for the deep learning examples.

Note: The examples require separate libraries such as opencv_core.lib and opencv_video.lib. The OpenCV library that ships with Computer Vision System Toolbox does not include all the required libraries, and the OpenCV installer does not install them. Therefore, you must download the OpenCV source and build the libraries.

For more information, refer to the OpenCV documentation.
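
After the libraries are installed, GPU Coder is typically pointed at them through environment variables. The variable names and paths below are assumptions for illustration; confirm the exact names in the GPU Coder setup documentation for your release:

    % Hypothetical install locations -- replace with your own paths and
    % verify the variable names against the GPU Coder setup documentation
    setenv('NVIDIA_CUDNN', '/usr/local/cudnn');       % cuDNN installation root
    setenv('NVIDIA_TENSORRT', '/usr/local/tensorrt'); % TensorRT installation root
    setenv('OPENCV_DIR', '/usr/local/opencv/build');  % OpenCV build folder for the examples
    % On Linux, the corresponding shared libraries must also be visible at
    % run time, for example through LD_LIBRARY_PATH set before starting MATLAB.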

Code Generation for Embedded GPU Boards - NVIDIA Tegra Based Jetson TX2, TX1, and TK1

  • CUDA toolkit for ARM® and Linaro GCC 4.9 toolchain for the TX2. Use the gcc-linaro-4.9-2016.02-x86_64_aarch64-linux-gnu release tarball.

  • CUDA toolkit for ARM and Linaro GCC 4.9 toolchain for the TX1.

  • CUDA toolkit 6.5 for ARM and Linaro GCC 4.8 toolchain for the TK1. Use the gcc-linaro-arm-linux-gnueabihf-4.8-2013.08_linux release tarball.

To set up the Linaro tools, see the instructions on Cross-Compilation on Linux.
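
As a sketch of what that setup involves, the toolchain is unpacked on the Linux host and its location is made visible to MATLAB before code generation. The environment variable name below is hypothetical; use the name given in the Cross-Compilation on Linux instructions:

    % Hypothetical example for the TX2/TX1 Linaro GCC 4.9 toolchain --
    % replace the variable name and path per the Cross-Compilation on
    % Linux instructions
    setenv('LINARO_TOOLCHAIN_AARCH64', ...
        '/opt/linaro/gcc-linaro-4.9-2016.02-x86_64_aarch64-linux-gnu/bin');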

Note

Embedded GPU targeting is supported only from the Linux platform.