[Paper Roundup] A Comprehensive List of Knowledge Distillation Papers! Master This New Research Direction in One Article!

knowledge distillation papers

Early Papers
- Model Compression, Rich Caruana, 2006
- Distilling the Knowledge in a Neural Network, G. Hinton, J. Dean, 2015
- Knowledge Acquisition from Examples Via Multiple Mod
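As background for the list above: the Hinton et al. (2015) paper trains a small student network to match the teacher's temperature-softened output distribution. Below is a minimal NumPy sketch of that soft-target loss (function names and the example logits are illustrative, not from any of the listed papers).

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution,
    # exposing the teacher's relative confidence over wrong classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures
    # (the scaling suggested in Hinton et al., 2015).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T ** 2 * np.sum(p * (np.log(p) - np.log(q))))

# Hypothetical logits: the loss is zero when student matches teacher,
# and positive otherwise.
loss = distillation_loss([1.0, 2.0, 0.5], [1.2, 1.9, 0.4])
```

In practice this soft-target term is combined with the usual cross-entropy on the hard labels, weighted by a mixing coefficient.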