(Notes) Model Compression via Distillation and Quantization

Paper: https://arxiv.org/abs/1802.05668
Code:

Contents of this note:
- Abstract
- Algorithm 1: quantized training with an added knowledge-distillation loss (sketched below)
- Algorithm 2: training the set of quantization points p (sketched below)
- Results
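Algorithm 1 combines a knowledge-distillation loss with quantized training: the student's weights are quantized for the forward pass, the loss mixes the softened teacher outputs with the hard labels, and the resulting gradients are applied to a full-precision copy of the weights (a straight-through update). Below is a minimal PyTorch sketch of one training step under those assumptions; the names `uniform_quantize`, `distillation_loss`, and `train_step` are illustrative, not taken from the paper's released code.

```python
import torch
import torch.nn.functional as F

def uniform_quantize(w, bits=4):
    # Uniformly quantize a tensor to 2**bits levels spanning [min, max].
    lo, hi = w.min(), w.max()
    scale = (hi - lo).clamp(min=1e-8) / (2 ** bits - 1)
    return torch.round((w - lo) / scale) * scale + lo

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft term: KL between softened teacher and student distributions,
    # scaled by T^2; hard term: cross-entropy against the true labels.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def train_step(student, teacher, x, labels, optimizer, bits=4):
    # Quantize the student's weights for the forward pass, keep a
    # full-precision copy, and apply the gradients to that copy
    # (straight-through estimator).
    full_precision = [p.data.clone() for p in student.parameters()]
    for p in student.parameters():
        p.data = uniform_quantize(p.data, bits)

    with torch.no_grad():
        teacher_logits = teacher(x)
    loss = distillation_loss(student(x), teacher_logits, labels)

    optimizer.zero_grad()
    loss.backward()  # gradients are taken at the quantized weights

    for p, fp in zip(student.parameters(), full_precision):
        p.data = fp  # restore full precision before the update
    optimizer.step()
    return loss.item()
```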
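Algorithm 2 (differentiable quantization) instead learns where the quantization points sit: each weight is snapped to its nearest point, and because that snap is an index lookup into the point vector, the loss is differentiable with respect to the points themselves, so their locations can be trained with ordinary SGD. A hedged sketch, again with made-up names (the paper also normalizes weights before quantizing, which is omitted here):

```python
import torch

def quantize_to_points(w, points):
    # Snap each weight to its nearest quantization point. The result is
    # differentiable w.r.t. 'points' (an index lookup), so the gradient of
    # the loss w.r.t. points[i] accumulates over all weights assigned to i.
    idx = torch.argmin((w.unsqueeze(-1) - points).abs(), dim=-1)
    return points[idx]

# Usage: start from uniformly spaced points and learn their locations.
points = torch.nn.Parameter(torch.linspace(-1.0, 1.0, 2 ** 4))  # 4-bit set
opt = torch.optim.SGD([points], lr=1e-3)

w = torch.randn(256)                  # stand-in for a layer's weights
w_q = quantize_to_points(w, points)   # use w_q in the forward pass
loss = w_q.pow(2).mean()              # stand-in loss for this sketch
loss.backward()                       # gradients flow into 'points'
opt.step()
```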