Channel: TAO Toolkit - NVIDIA Developer Forums

Which models require TensorRT OSS on DeepStream 6.4?


Please provide the following information when requesting support.

• Hardware (T4/V100/Xavier/Nano/etc): GeForce 4090
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc): Yolo_v4 & Classification
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here): v5.5.0
• Training spec file(If have, please share here): NA
• How to reproduce the issue ? (This is for errors. Please share the command line and the detailed log here.): NA

In GitHub - NVIDIA-AI-IOT/deepstream_tao_apps at release/tao5.1_ds6.4ga, the instructions say that:

The OSS plugins are needed for some models with DeepStream 6.4.

  1. Do Yolo_v4 & Classification engines created from TAO models require TensorRT OSS?

  2. The picture below shows that for DeepStream 6.4, the TRT_OSS_CHECKOUT_TAG column says "binary plugin only". What does that mean?

2 posts - 2 participants


