BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Q113783)
scientific article from arXiv
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | scientific article from arXiv | |
Statements
publication date: 11 October 2018
arXiv classification: cs.CL