"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "49d8076fa25d944d18fa4e9413ce838837805701"
Commit 61a2b7dc authored by Manuel Romero, committed by Julien Chaumond

Fix typo

parent 6e261d3a
@@ -7,7 +7,7 @@ thumbnail: https://i.imgur.com/jgBdimh.png

 This model is a fine-tuned on [SQuAD-es-v2.0](https://github.com/ccasimiro88/TranslateAlignRetrieve) and **distilled** version of [BETO](https://github.com/dccuchile/beto) for **Q&A**.

-Distillation makes the model smaller, fasert, cheaper and lighter than [bert-base-spanish-wwm-cased-finetuned-spa-squad2-es](https://github.com/huggingface/transformers/blob/master/model_cards/mrm8488/bert-base-spanish-wwm-cased-finetuned-spa-squad2-es/README.md)
+Distillation makes the model **smaller, faster, cheaper and lighter** than [bert-base-spanish-wwm-cased-finetuned-spa-squad2-es](https://github.com/huggingface/transformers/blob/master/model_cards/mrm8488/bert-base-spanish-wwm-cased-finetuned-spa-squad2-es/README.md)

 This model was fine-tuned on the same dataset but using **distillation** during the process as mentioned above (and one more train epoch).
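For context, here is a minimal sketch of how a distilled Q&A model like the one this card describes could be queried through the transformers question-answering pipeline. The Hub identifier used below is an assumption inferred from the card's repository path, not something confirmed by this diff; verify it before use.

```python
from transformers import pipeline

# Assumed model id, inferred from the model card's path; not confirmed by the diff.
qa = pipeline(
    "question-answering",
    model="mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es",
)

# The model is Spanish, so question and context are given in Spanish.
result = qa(
    question="¿Quién creó BETO?",
    context=(
        "BETO es un modelo BERT entrenado sobre un gran corpus en español "
        "por el grupo DCC de la Universidad de Chile."
    ),
)

# The pipeline returns a dict with the answer span and a confidence score.
print(result["answer"], round(result["score"], 3))
```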