Channel: TAO Toolkit - NVIDIA Developer Forums

Inference with TensorRT is different from inference with HDF5


I’ve trained a model using TAO, got the HDF5 model, and exported it to ONNX.
From the ONNX model, I built two TensorRT engines (FP32 and FP16).

I’ve run inference using the HDF5 model and the two TensorRT engines, but I am not getting the same results: for the same sample of 1000 images, both TensorRT engines predict 2569 labels while the HDF5 inference gives me 605.

This is how I exported to ONNX:

!tao model faster_rcnn export --gpu_index $GPU_INDEX \
                        -m $USER_EXPERIMENT_DIR/frcnn_resnet_50.epoch_108.hdf5 \
                        -o $USER_EXPERIMENT_DIR/frcnn_resnet_50_epoch_108.onnx \
                        -e $SPECS_DIR/specs_birds_inference.txt

This is how I deployed to TRT FP32:

!tao deploy faster_rcnn gen_trt_engine --gpu_index $GPU_INDEX \
                        -m $USER_EXPERIMENT_DIR/frcnn_resnet_50_epoch_108.onnx \
                        -e $SPECS_DIR/specs_birds_inference.txt \
                        --data_type fp32 \
                        --batch_size 1 \
                        --max_batch_size 1 \
                        --engine_file $USER_EXPERIMENT_DIR/birds_trt.epoch_108_fp32_bs1.engine \
                        --results_dir $USER_EXPERIMENT_DIR

This is how I ran inference with TRT FP32:

!tao deploy faster_rcnn inference  --gpu_index $GPU_INDEX \
                                   -e $SPECS_DIR/specs_birds_inference.txt \
                                   -m $USER_EXPERIMENT_DIR/birds_trt.epoch_108_fp32_bs1.engine \
                                   -i /workspace/tao-experiments-birds/data/sample_for_deployment/image_2 \
                                   -r /workspace/tao-experiments-birds/data/sample_for_deployment/inf_fp32
  • The same commands were used for FP16 (with --data_type fp16 when generating the engine).
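To quantify the mismatch between the runs, I compared the total number of predicted labels. Assuming each inference run writes a directory of KITTI-style .txt label files (one per image, one detection per non-empty line), a short script like the one below can do the counting; count_labels is a hypothetical helper, and the demo files are synthetic:

```python
import os
import tempfile

def count_labels(label_dir):
    """Count total detection lines across all KITTI-format .txt files in a directory."""
    total = 0
    for name in os.listdir(label_dir):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(label_dir, name)) as f:
            total += sum(1 for line in f if line.strip())
    return total

# Demo with synthetic KITTI-style label files (hypothetical contents):
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "img_0.txt"), "w") as f:
        f.write("bird 0.0 0 0.0 10 10 50 50 0 0 0 0 0 0 0 0.92\n")
        f.write("bird 0.0 0 0.0 60 60 90 90 0 0 0 0 0 0 0 0.41\n")
    with open(os.path.join(d, "img_1.txt"), "w") as f:
        f.write("bird 0.0 0 0.0 5 5 40 40 0 0 0 0 0 0 0 0.88\n")
    print(count_labels(d))  # 3
```

Running this over the HDF5 and TRT output directories is how I arrived at the 605 vs 2569 figures above.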

The specs file is attached.
specs_birds_inference.txt (3.7 KB)

Is there something wrong in these commands that would explain the different results between the three inferences?
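One thing I considered is that a gap this large (2569 vs 605) often comes from a different confidence threshold being applied in each path, rather than from FP32/FP16 precision itself, since the detection count is very sensitive to the cutoff. A minimal sketch of that effect, with made-up scores and a hypothetical filter_by_confidence helper:

```python
def filter_by_confidence(detections, threshold):
    """Keep only detections whose score meets or exceeds the threshold."""
    return [d for d in detections if d["score"] >= threshold]

# Made-up detections: the same raw outputs, filtered at two thresholds.
dets = [{"label": "bird", "score": s} for s in (0.95, 0.62, 0.35, 0.12)]
print(len(filter_by_confidence(dets, 0.6)))  # 2
print(len(filter_by_confidence(dets, 0.1)))  # 4
```

I checked that the same spec file is passed to all three runs, but I am not sure whether the HDF5 and TensorRT inference paths honor the same threshold settings from it.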

4 posts - 2 participants
