Channel: TAO Toolkit - NVIDIA Developer Forums

DINO Retail Object Detection - Distillation


Hi,
I am trying to train a DINO Retail Object Detection model using the fan_tiny backbone. I have successfully fine-tuned fan_base and fan_small on our retail object dataset using the trainable models available on the NGC page.
Now, I want to see if I can reach similar/close accuracy using fan_tiny and get higher inference speed.

First, I could not find a fan_tiny retail object trainable model on NGC. Is there one I can use, or is none available?

Since I could not find a trainable fan_tiny, I have started running knowledge distillation using our fine-tuned fan_small as the teacher and the pre-trained DINO ImageNet fan_hybrid_tiny weights as the student.
At the moment, it is not converging at all.
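For context on the kind of setup being described, here is a minimal sketch of feature-map distillation as it is commonly done (this is a generic illustration, not the actual TAO distillation spec; the function name, shapes, and the assumption that teacher and student features already match in size are all hypothetical):

```python
import torch
import torch.nn.functional as F

def feature_distillation_loss(student_feats, teacher_feats, weight=1.0):
    """L2 (MSE) loss between student and teacher feature maps.

    In practice, when the student backbone (e.g. fan_hybrid_tiny) has a
    different channel width than the teacher (e.g. fan_small), a learned
    1x1 projection maps the student features to the teacher's width first;
    here matching shapes are assumed for simplicity. The teacher is
    detached so gradients flow only into the student.
    """
    return weight * F.mse_loss(student_feats, teacher_feats.detach())

# Toy example: batch of 2, 192 channels, 20x20 feature map (shapes illustrative)
student = torch.randn(2, 192, 20, 20, requires_grad=True)
teacher = torch.randn(2, 192, 20, 20)
loss = feature_distillation_loss(student, teacher)
loss.backward()  # gradients reach the student tensor only
```

In a full training loop this term would be added, with some weighting, to the detector's usual losses; a persistent non-convergence often points at the binding between teacher and student feature levels or at the loss weight rather than at the backbones themselves.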

Before I share my specs etc., I would just like to know whether this distillation is possible and whether I am using the right tiny student model.

Thank you

• Network Type: DINO Retail Object Detection
• TAO Version: 5.5.0

