Paper Notes: Evolving Losses for Unsupervised Video Representation Learning

Knowledge Distillation (source: zhihu): distill knowledge from a teacher model (Net-T) into a student model (Net-S). Purpose: to compress the model so it is easier to deploy.
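A minimal sketch of the standard teacher-to-student distillation loss (Hinton-style soft targets), assuming PyTorch; the function name, temperature `T`, and weight `alpha` are illustrative choices, not taken from the original note or the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine a soft-target KL term (teacher -> student) with hard-label cross-entropy.

    T: temperature that softens both distributions; alpha: weight on the soft term.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 so gradients stay comparable across temperatures
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

During training, the teacher (Net-T) is frozen and only provides `teacher_logits`; only the student (Net-S) receives gradients, which is what makes the deployed model smaller than the teacher.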