Not-So-Simple NLP Notes: What is BERT?

Contents

1. What is BERT?
2. Structure
   2.1 Self-attention
   2.2 Multi-head attention
   2.3 Positional encoding and positional embeddings
3. Pre-training and fine-tuning
   3.1 Pre-training
   3.2 Fine-tuning
4. Example and practice