Reading Notes: Improving Language Understanding by Generative Pre-Training

This is part of a series on transfer learning in NLP via pretrained language representations. It is a four-part series, best read in order; by the end, this line of work should be very clear. (1) ELMo: Deep contextualized word representations (2) ULMFiT: Universal Language Model Fine-tuning for Text Classification (3) OpenAI GPT
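As a quick orientation before the notes themselves, here is a minimal sketch of the pretrain-then-fine-tune idea behind GPT: first train a causal (left-to-right) language model on unlabeled text, then reuse the same Transformer body with a small task-specific head on labeled data. This is not the paper's actual code; the model sizes, class names, and two-head setup here are illustrative assumptions.

```python
# Minimal sketch (assumed names/sizes, not the paper's implementation) of
# GPT-style transfer learning: pretrain a causal LM, then fine-tune its body
# with a small classification head.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCausalLM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.body = nn.TransformerEncoder(layer, n_layers)   # shared Transformer body
        self.lm_head = nn.Linear(d_model, vocab_size)        # head for pretraining
        self.cls_head = nn.Linear(d_model, 2)                 # head for fine-tuning

    def forward(self, tokens, task="lm"):
        seq_len = tokens.size(1)
        # Causal mask: True = position may NOT be attended to (future tokens).
        causal_mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
        h = self.body(self.embed(tokens), mask=causal_mask)
        if task == "lm":
            return self.lm_head(h)          # next-token logits, shape (B, T, V)
        return self.cls_head(h[:, -1])      # label logits from last position, (B, 2)

model = TinyCausalLM()
tokens = torch.randint(0, 1000, (8, 16))    # toy batch of token ids

# Stage 1: unsupervised pretraining -- predict each next token.
lm_logits = model(tokens, task="lm")
lm_loss = F.cross_entropy(
    lm_logits[:, :-1].reshape(-1, 1000), tokens[:, 1:].reshape(-1))

# Stage 2: supervised fine-tuning -- reuse the pretrained body for classification.
labels = torch.randint(0, 2, (8,))
cls_loss = F.cross_entropy(model(tokens, task="cls"), labels)
print(lm_loss.item(), cls_loss.item())
```

In the paper, the fine-tuning stage also keeps an auxiliary language-modeling loss alongside the task loss; the sketch above omits that detail for brevity.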