The DNN Plugin sample loads and runs an MNIST network with a custom implementation of the max pooling layer.
PoolPlugin.cpp implements this layer and defines the functions that DriveWorks requires to load a plugin. This file is compiled as a shared library, which the sample_dnn_plugin executable loads at runtime.
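The core computation that such a plugin provides can be illustrated with a standalone sketch. The `maxPool2d` helper below is hypothetical and is not part of the DriveWorks or TensorRT plugin interfaces; it only shows, for a single-channel row-major feature map, the max pooling operation that PoolPlugin.cpp wires into the plugin entry points:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Illustrative 2D max pooling over a single-channel, row-major feature map.
// Hypothetical helper for exposition only -- the real plugin implements this
// computation behind the plugin interface expected by DriveWorks.
std::vector<float> maxPool2d(const std::vector<float>& in,
                             std::size_t height, std::size_t width,
                             std::size_t window, std::size_t stride)
{
    const std::size_t outH = (height - window) / stride + 1;
    const std::size_t outW = (width - window) / stride + 1;
    std::vector<float> out(outH * outW);

    for (std::size_t oy = 0; oy < outH; ++oy) {
        for (std::size_t ox = 0; ox < outW; ++ox) {
            // Scan the pooling window and keep the maximum activation.
            float best = in[oy * stride * width + ox * stride];
            for (std::size_t ky = 0; ky < window; ++ky) {
                for (std::size_t kx = 0; kx < window; ++kx) {
                    const std::size_t idx =
                        (oy * stride + ky) * width + (ox * stride + kx);
                    best = std::max(best, in[idx]);
                }
            }
            out[oy * outW + ox] = best;
        }
    }
    return out;
}
```

For example, pooling a 4x4 map with a 2x2 window and stride 2 yields a 2x2 map holding the maximum of each quadrant.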
The command line for the sample is:
./sample_dnn_plugin
The sample accepts the following optional parameter:
./sample_dnn_plugin --tensorRT_model=[path/to/TensorRT/model]
Where:
--tensorRT_model=[path/to/TensorRT/model] Specifies the path to the NVIDIA<sup>®</sup> TensorRT<sup>™</sup> model file. The loaded network is expected to have an output blob named "prob". Default value: path/to/data/samples/dnn/<gpu-architecture>/mnist.bin, where <gpu-architecture> can be `volta-discrete`, `volta-integrated`, or `turing`.
The sample creates a window and displays a white canvas on which you can draw a digit by hand; the digit is then recognized by the aforementioned MNIST network.
For more information, see DNN Plugins.