Channel: TAO Toolkit - NVIDIA Developer Forums

Deepstream Onnx inference no output


• Hardware: Ubuntu 20 x86 with RTX 3060
• Network Type: Classification tf1 (VehicleMakeNet - Resnet18)
• TLT Version: http://nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5
• Training spec file:
retrain_car_make.txt (1.0 KB)

• How to reproduce the issue?
I have a modified VehicleMakeNet that classifies 35 car brands. After training it with TAO classification_tf1, I exported the model:

# Generate .onnx file using tao container
!tao model classification_tf1 export \
            -m $USER_EXPERIMENT_DIR/retrain_pruned/weights/resnet_010.hdf5 \
            -o $USER_EXPERIMENT_DIR/export/final_model \
            -e $SPECS_DIR/retrain_car_make.cfg \
            --classmap_json $USER_EXPERIMENT_DIR/retrain_pruned/classmap.json \
            --gen_ds_config

I use this ONNX file to run inference in DeepStream, but I get no output from the network. (If I use the catalog pretrained model it works, but I need more output classes.) When DeepStream generates the engine, I obtain this:

INFO: [FullDims Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       min: 1x3x224x224     opt: 2x3x224x224     Max: 2x3x224x224     
1   OUTPUT kFLOAT predictions     35              min: 0               opt: 0               Max: 0   

But when I run the pretrained model I obtain this:

INFO: [FullDims Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224   
1   OUTPUT kFLOAT predictions/Softmax    35x1x1  

So I do not get the Softmax output in my model's engine. This is my model:
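If the engine really does emit raw logits instead of probabilities, one workaround (not a fix for the export itself, just a labeled assumption about how to post-process) is to apply the missing Softmax in your own post-processing before thresholding. A minimal NumPy sketch, with the DeepStream custom-parser plumbing omitted:

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last (class) axis."""
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

# Stand-in for the raw 35-class `predictions` tensor the engine would emit
# when the Softmax layer is missing from the graph.
raw = np.random.randn(1, 35).astype(np.float32)
probs = softmax(raw)
print(probs.argmax(axis=-1), probs.sum())  # predicted class index; sums to ~1.0
```

In practice this logic would live in a custom classifier parser, since nvinfer's built-in parsing expects probabilities on the output layer.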

27 posts - 2 participants


