Is there documentation that explains how TAO can be used in the DeepStream SDK 6.4 docker image deepstream:6.4-gc-triton-devel?
Based on the description I would expect this image to contain all the TAO tools, but I can't find them in the container.
The documentation at some point simply reverts to TAO Toolkit Quick Start Guide - NVIDIA Docs, which describes how to install older docker images.
The whole thing appears to be overly complicated for my simple use case.
Is there an easy/advised way to fine-tune some YOLO/ResNet-like models for end use on Jetson Orin? (Ubuntu 22.04 on x86 is available for training.)
Getting inference running in the DS 6.4 docker was very easy, so I would expect custom model training in that container to be just as easy to set up, but the TAO 5.2.0 documentation does not help me here.
Or maybe TAO is overkill for me and I should train using lower-level APIs instead? TensorRT? (I used to use Keras.)
That said, quantization looks like an important step, and it is probably the main added value of TAO for me.
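For context, this is my mental model of what the INT8 quantization step buys me — a toy sketch of symmetric per-tensor quantization, not TAO's or TensorRT's actual implementation (which also does calibration over real data):

```python
# Toy illustration of symmetric per-tensor INT8 quantization.
# NOT TAO's or TensorRT's implementation -- just the basic idea:
# map float weights into [-127, 127] with a single scale factor.

def quantize_int8(weights):
    """Quantize a list of floats to int8 codes plus a scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return [v * scale for v in q]

weights = [0.5, -1.0, 0.25, 0.75]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

So if TAO's main job for my use case is handling this (plus calibration and the engine export), maybe I only need the quantization tooling and not the whole toolkit?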
Please redirect me to the correct forum for this question if this is not the right place to post it.
TIA
• Hardware (Jetson Orin Nano/Geforce RTX 2060/RTX 4080 Laptop)
• Network Type (Detectnet_v2/Yolo_v4/Classification/etc)