Unverified Commit d303f84e authored by Gunnlaugur Thor Briem, committed by GitHub

fix: wrong architecture count in README

Just say “the following” so that this intro doesn't so easily fall out of date :)
parent f0616062
@@ -132,7 +132,7 @@ At some point in the future, you'll be able to seamlessly move from pre-training
## Model architectures
-🤗 Transformers currently provides 10 NLU/NLG architectures:
+🤗 Transformers currently provides the following NLU/NLG architectures:
1. **[BERT](https://github.com/google-research/bert)** (from Google) released with the paper [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805) by Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova.
2. **[GPT](https://github.com/openai/finetune-transformer-lm)** (from OpenAI) released with the paper [Improving Language Understanding by Generative Pre-Training](https://blog.openai.com/language-unsupervised/) by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever.
......