Channel: TAO Toolkit - NVIDIA Developer Forums

Distillation training


Please provide the following information when requesting support.

• Hardware (T4)
• Network Type (Dino and Yolo)

• How to reproduce the issue?
I am trying to get a small model that can be deployed effectively to Jetson devices, like the YOLO models, but I want it to be as accurate as bigger models like DINO.

There is a distillation approach now available for DINO, as shown in the tutorial below.

I also saw these code bases:
NVIDIA general distillation modules:

NVIDIA DINO distillation models

I want to do distillation training using a DINO model as the teacher and a YOLO model as the student.
Is this currently possible in the TAO pipeline?

Or how could this be implemented inside the TAO repositories?

Or is there another way, using external YOLO training code together with the DINO training pipeline in TAO, to achieve this?
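(For context, the core idea being asked about — regardless of how TAO wires it up internally — is a distillation loss that pulls the student's predictions toward the teacher's softened outputs. Below is a minimal NumPy sketch of the classic logit-distillation loss; the function names and the temperature value are illustrative, not TAO's API.)

```python
import numpy as np

def softmax(logits, T=1.0):
    # temperature-softened softmax; subtract max for numerical stability
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened logits.

    The T**2 factor keeps gradient magnitudes comparable across
    temperatures (Hinton et al.'s convention).
    """
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    kl = (p_t * (np.log(p_t) - log_p_s)).sum(axis=-1)
    return float(kl.mean() * T * T)
```

In a teacher–student setup this term is typically added to the student's ordinary task loss; a detector pairing like DINO→YOLO would additionally need to match the two models' heads (feature maps or box/class predictions), which is where the TAO distillation modules linked above come in.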

Appreciate your support on this



Viewing all articles
Browse latest Browse all 409

Trending Articles