Please provide the following information when requesting support.
• Hardware (T4/V100/Xavier/Nano/etc): any supported dGPU
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc): Object detection, any architecture
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here): v30.2205
• Training spec file(If have, please share here): NA
• How to reproduce the issue ? (This is for errors. Please share the command line and the detailed log here.): NA
From the doc section "Generating a Template DeepStream Config File" and the doc section "Specification File for Inference": does this mean that the order of target_classes in the Cost Function section determines the model output order? For example, with this cost_function_config:
cost_function_config {
  target_classes {
    name: "car"
    ...
  }
  target_classes {
    name: "cyclist"
    ...
  }
  target_classes {
    name: "pedestrian"
    ...
  }
  ...
}
The exported labels.txt for DeepStream is:
car
cyclist
pedestrian
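To make my assumption concrete, here is a rough Python sketch of the class-order relationship I have in mind. The file names and the regex-based parsing are placeholders of my own for illustration; this is not the actual tlt export logic.

import re
from pathlib import Path

# Hypothetical spec file name; adjust to your actual training spec.
spec_text = Path("detectnet_v2_train_spec.txt").read_text()

# Collect target_classes names in the order they appear in the spec.
# A regex parse is only a sketch, not how tlt export builds labels.txt.
class_names = re.findall(r'target_classes\s*\{\s*name:\s*"([^"]+)"', spec_text)

# If the assumed relationship holds, this reproduces the exported labels file.
Path("labels.txt").write_text("\n".join(class_names) + "\n")
print(class_names)  # expected: ['car', 'cyclist', 'pedestrian'] for the spec above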
If I change the cost_function_config to:
cost_function_config {
  target_classes {
    name: "pedestrian"
    ...
  }
  target_classes {
    name: "cyclist"
    ...
  }
  target_classes {
    name: "car"
    ...
  }
  ...
}
The exported labels.txt for DeepStream is:
pedestrian
cyclist
car
Is my understanding of the relationship between cost_function_config and labels.txt correct? It's unclear from the documentation which config file defines the order in which the model predicts its output classes. Can you confirm that it's the cost_function_config that defines this order? If not, where is the output order configured?
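For context, this is the sanity check I would run after export, assuming that the model's class index i maps to line i of labels.txt (the file name and the hard-coded order below are placeholders, not something taken from the docs):

from pathlib import Path

# Assumption under test: model class index i corresponds to line i of labels.txt,
# which in turn follows the target_classes order in cost_function_config.
spec_order = ["pedestrian", "cyclist", "car"]         # order from the second spec above
labels = Path("labels.txt").read_text().splitlines()  # file exported with the model

assert labels == spec_order, f"labels.txt {labels} does not match spec order {spec_order}"

# Under that assumption, a DeepStream detection with class_id 0 is a pedestrian.
class_id = 0
print(f"class_id {class_id} -> {labels[class_id]}")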