Channel: TAO Toolkit - NVIDIA Developer Forums

LPRNet training and deployment


Please provide the following information when requesting support.

• Hardware (T4/V100/Xavier/Nano/etc): RTX 4090 / T4 (DeepStream 6.3) and Nano (DeepStream 6.0)
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc): LPRNet

While training, how can I save weights based on the best metric (e.g., accuracy) rather than after every 5 epochs?
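
For context, TAO drives LPRNet training from its spec file, so the sketch below is not a TAO CLI option; it only illustrates the generic Keras "save only the best weights" pattern being asked about, with placeholder paths and metric names:

# Minimal sketch, NOT a TAO option: the underlying Keras mechanism for
# best-metric checkpointing is a ModelCheckpoint callback. The output path
# and the monitored metric name are placeholders for illustration.
import tensorflow as tf

best_ckpt = tf.keras.callbacks.ModelCheckpoint(
    filepath="weights/lprnet_best.hdf5",  # placeholder output path
    monitor="val_accuracy",               # placeholder metric name
    mode="max",                           # keep the checkpoint with the highest value
    save_best_only=True,                  # overwrite only when the metric improves
)

# model.fit(train_data, validation_data=val_data, epochs=24, callbacks=[best_ckpt])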

I have trained a model with the TAO Toolkit and the weight files are in .hdf5 format. How do I convert them to .etlt so they can be deployed in the pipeline?

I generated the ONNX file from the .hdf5 weights, but when I use that ONNX in the DeepStream pipeline to generate the engine file, the following error pops up:

Using file: ./models/anpr_config.yml
0:00:00.583616331 221656 0x560ff81ece40 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary-infer-engine2> NvDsInferContext[UID 3]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 3]: Trying to create engine from model files
[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format onnx2trt_onnx.ModelProto: 1:1: Invalid control characters encountered in text.
[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format onnx2trt_onnx.ModelProto: 2:11: Invalid control characters encountered in text.
[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format onnx2trt_onnx.ModelProto: 2:17: Already saw decimal point or exponent; can't have another one.
[libprotobuf ERROR google/protobuf/text_format.cc:298] Error parsing text-format onnx2trt_onnx.ModelProto: 2:13: Message type "onnx2trt_onnx.ModelProto" has no field named "keras2onnx".
ERROR: [TRT]: ModelImporter.cpp:688: Failed to parse ONNX model from file: /home/mainak/ms/C++/anpr_kp/models/lprnet_epoch-024.onnx
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:315 Failed to parse onnx file
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:971 failed to build network since parsing model errors.
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:804 failed to build network.
0:00:02.577464454 221656 0x560ff81ece40 ERROR                nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<secondary-infer-engine2> NvDsInferContext[UID 3]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2022> [UID = 3]: build engine file failed
0:00:02.615730998 221656 0x560ff81ece40 ERROR                nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<secondary-infer-engine2> NvDsInferContext[UID 3]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2108> [UID = 3]: build backend context failed
0:00:02.615758235 221656 0x560ff81ece40 ERROR                nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<secondary-infer-engine2> NvDsInferContext[UID 3]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1282> [UID = 3]: generate backend failed, check config file settings
0:00:02.616255703 221656 0x560ff81ece40 WARN                 nvinfer gstnvinfer.cpp:898:gst_nvinfer_start:<secondary-infer-engine2> error: Failed to create NvDsInferContext instance
0:00:02.616263316 221656 0x560ff81ece40 WARN                 nvinfer gstnvinfer.cpp:898:gst_nvinfer_start:<secondary-infer-engine2> error: Config file path: /home/mainak/ms/C++/anpr_kp/models/lpr_config_sgie_us.yml, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Running...
ERROR from element secondary-infer-engine2: Failed to create NvDsInferContext instance
Error details: gstnvinfer.cpp(898): gst_nvinfer_start (): /GstPipeline:ANPR-pipeline/GstNvInfer:secondary-infer-engine2:
Config file path: /home/mainak/ms/C++/anpr_kp/models/lpr_config_sgie_us.yml, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Returned, stopping playback
Deleting pipeline
Disconnecting MQTT Client
Destroying MQTT Client
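
The log above shows that TensorRT's ONNX parser could not read lprnet_epoch-024.onnx as a valid ONNX model ("Failed to parse ONNX model from file"). Before debugging the DeepStream config, it may help to sanity-check the exported file itself with the onnx Python package — a minimal sketch, with the path taken from the log:

# Minimal sketch: check that the exported file is a loadable, structurally valid ONNX model.
# Assumes the `onnx` Python package is installed (pip install onnx).
import onnx

model_path = "/home/mainak/ms/C++/anpr_kp/models/lprnet_epoch-024.onnx"  # path from the log

model = onnx.load(model_path)      # raises if the file is not a readable ONNX protobuf
onnx.checker.check_model(model)    # structural validation of the graph
print("producer:", model.producer_name, model.producer_version)
print("opsets  :", [(imp.domain, imp.version) for imp in model.opset_import])
print("inputs  :", [i.name for i in model.graph.input])
print("outputs :", [o.name for o in model.graph.output])

If onnx.load itself fails, the problem is with the exported file rather than with the DeepStream config.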

@Morganh Any suggestion on this would be highly appreciated.


