Channel: TAO Toolkit - NVIDIA Developer Forums

LPRNet gives different results when running on Jetson

• Hardware Platform: Jetson
• DeepStream Version: 6.3
• JetPack Version: 5.1
• TensorRT Version: 5.1
• Issue Type: questions

The LPRNet model I trained on a custom dataset produces different results when running on Jetson.

In my experiment I process video on the Jetson and save the crops produced by the object detector, then run inference on those same crops in my training environment. In some cases the results are identical, while in others the model outputs a wrong character: sometimes it drops one character, and sometimes it gets one or two characters wrong.
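The comparison described above can be sketched as a small script. Everything here is illustrative (the filenames, dict inputs, and helper name are assumptions, not part of the original post): given the plate strings predicted for each saved crop on the Jetson and in the training environment, it lists the crops where the two runs disagree.

```python
# Hypothetical sketch of the crop-by-crop comparison described above.
# `jetson` and `docker` map crop filenames to predicted plate strings;
# both the data and the function name are illustrative assumptions.

def compare_predictions(jetson: dict, docker: dict) -> list:
    """Return (crop, jetson_pred, docker_pred) for every disagreement."""
    mismatches = []
    for crop, docker_pred in docker.items():
        jetson_pred = jetson.get(crop, "")
        if jetson_pred != docker_pred:
            mismatches.append((crop, jetson_pred, docker_pred))
    return mismatches

if __name__ == "__main__":
    # Example values taken from the post: MLY0005 on Jetson vs MLY9905 in Docker.
    jetson = {"crop_001.jpg": "MLY0005", "crop_002.jpg": "ABC123"}
    docker = {"crop_001.jpg": "MLY9905", "crop_002.jpg": "ABC123"}
    for crop, j, d in compare_predictions(jetson, docker):
        print(f"{crop}: Jetson={j} Docker={d}")
```

A diff like this makes it easy to see whether mismatches cluster on particular characters or crops.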
For example:
On the Jetson I get MLY0005, while in the training Docker container I get the correct value, MLY9905.

I train LPRNet using the nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5 image, export the model to ONNX, and then transfer it to the Jetson, where it is automatically converted to a TensorRT engine once I start the app.
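For reference, the on-device conversion is controlled by the DeepStream nvinfer configuration for the LPR classifier. A minimal sketch follows; the file paths are assumptions, while `onnx-file`, `model-engine-file`, and `network-mode` are documented Gst-nvinfer properties. `network-mode` selects the engine precision, which is one setting worth checking when Jetson results differ from the training environment:

```ini
[property]
# Exported ONNX model copied to the Jetson (path is an assumption)
onnx-file=lprnet.onnx
# Engine file DeepStream builds on first start (name is an assumption)
model-engine-file=lprnet.onnx_b16_gpu0_fp16.engine
# 0=FP32, 1=INT8, 2=FP16; determines the numeric precision of the engine
network-mode=2
```

Deleting a stale engine file forces DeepStream to rebuild it from the ONNX model on the next start.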

What could be the reason for this behavior, and how can I get my model to give consistent results?

10 posts - 2 participants
