Hello,
I exported a model from HDF5 to ONNX and then to a TensorRT engine. It's a FasterRCNN model.
When I try inference with TensorRT (following this notebook), I get an error:
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(TRT_LOGGER)
with open(engine_file, "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
==>
[03/11/2024-09:03:12] [TRT] [E] 3: getPluginCreator could not find plugin: ProposalDynamic version: 1
[03/11/2024-09:03:12] [TRT] [E] 1: [pluginV2Runner.cpp::load::303] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
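From the error, it looks like the ProposalDynamic plugin creator is not registered at deserialization time, so the engine cannot be loaded. My understanding is that the custom plugin library has to be loaded and the plugin registry initialized before deserializing. A minimal sketch of what I mean (the plugin library path is an assumption from my setup, not something the notebook specifies):

```python
import ctypes

# Hypothetical path to the plugin library that provides ProposalDynamic;
# adjust to wherever your TAO/TensorRT plugin .so is installed.
PLUGIN_LIB = "/usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so"

def make_runtime_with_plugins(plugin_lib=PLUGIN_LIB):
    """Load the plugin library and register plugin creators, then build a Runtime.

    This must happen before runtime.deserialize_cuda_engine(), otherwise
    custom plugins like ProposalDynamic are missing from the registry.
    """
    import tensorrt as trt
    # RTLD_GLOBAL so the plugin symbols are visible to TensorRT.
    ctypes.CDLL(plugin_lib, mode=ctypes.RTLD_GLOBAL)
    logger = trt.Logger(trt.Logger.WARNING)
    # Register all plugin creators (empty namespace) with the registry.
    trt.init_libnvinfer_plugins(logger, "")
    return trt.Runtime(logger)
```

Is this the right order of operations, and which library is supposed to provide ProposalDynamic for a TAO FasterRCNN engine?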
TensorRT version: 8.6.1
TAO: 5.0.0