Channel: TAO Toolkit - NVIDIA Developer Forums

Export classification model using classification_tf1


Please provide the following information when requesting support.

• Hardware (T4/V100/Xavier/Nano/etc): I used the T4 GPU.
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc): I used the classification_tf1 model.
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here): Not applicable.
• Training spec file (if available, please share here)
• How to reproduce the issue? (This is for errors. Please share the command line and the detailed log here.) I used this NVIDIA Colab file; we trained, validated, and tested the model, and at the end I tried to export it with this command:

The exported model does not achieve the same accuracy it had during training.
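The export command itself did not survive extraction. For context, a TAO TF1 classification export typically follows the pattern sketched below; the model path, key variable, and output path are placeholders for illustration, not the poster's actual values, and the exact flags should be checked against the TAO version in use:

```shell
# Hypothetical sketch of a classification_tf1 export invocation, not the
# poster's actual command. Verify the available flags with
# `tao classification_tf1 export --help` for your TAO Toolkit version.
# -m: trained model checkpoint, -k: encryption key used during training,
# -o: destination for the exported model.
tao classification_tf1 export \
  -m /workspace/output/weights/final_model.hdf5 \
  -k $KEY \
  -o /workspace/export/final_model.onnx
```

One common cause of an accuracy drop after export is a mismatch between the preprocessing used at training time and the preprocessing applied when evaluating the exported model, so that is worth ruling out before suspecting the export step itself.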

3 posts - 2 participants

