Paper Notes — BERT: Bidirectional Encoder Representations from Transformers

1 Introduction

This note is a translated summary of the 2019 paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". BERT stands for Bidirectional Encoder Representations from Transformers. There are two strategies for applying pre-trained language representations to downstream tasks: feature-based and fine-tuning.
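To make the distinction concrete, below is a minimal sketch of the two strategies, assuming the HuggingFace transformers library and the "bert-base-uncased" checkpoint (these names are illustrative assumptions, not part of the original paper). In the feature-based setup BERT is frozen and only a task head is trained; in the fine-tuning setup gradients flow into BERT as well.

```python
import torch
from transformers import BertModel, BertTokenizer

# Illustrative checkpoint; any BERT checkpoint would work the same way.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

# A toy 2-class classification head on top of BERT's [CLS] representation.
classifier = torch.nn.Linear(bert.config.hidden_size, 2)

inputs = tokenizer("BERT learns bidirectional representations.", return_tensors="pt")

# Feature-based strategy: BERT is frozen, its output is used as fixed features,
# and only the task-specific head receives gradients.
with torch.no_grad():
    features = bert(**inputs).last_hidden_state[:, 0]  # [CLS] vector, no gradient into BERT
logits_feature_based = classifier(features)

# Fine-tuning strategy: all BERT parameters are updated together with the head.
outputs = bert(**inputs)
logits_fine_tuning = classifier(outputs.last_hidden_state[:, 0])
loss = torch.nn.functional.cross_entropy(logits_fine_tuning, torch.tensor([1]))
loss.backward()  # gradients flow into both the head and BERT itself
```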