"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "b17963d82ffa1355d222d3377594e61a25acd7aa"
Commit 601e4247 authored by Manuel Romero, committed by Julien Chaumond

Update README.md

parent 1b9e765b
@@ -3,9 +3,9 @@ language: multilingual
 thumbnail:
 ---
 
-# BERT (base-multilingual-uncased) fine-tuned on XQuAD
+# BERT (base-multilingual-uncased) fine-tuned for multilingual Q&A
 
-This model was created by [Google](https://github.com/google-research/bert/blob/master/multilingual.md) and fine-tuned on [XQuAD](https://github.com/deepmind/xquad) for multilingual (`11 different languages`) **Q&A** downstream task.
+This model was created by [Google](https://github.com/google-research/bert/blob/master/multilingual.md) and fine-tuned on [XQuAD](https://github.com/deepmind/xquad)-like data for the multilingual (`11 different languages`) **Q&A** downstream task.
 
 ## Details of the language model ('bert-base-multilingual-uncased')
@@ -77,19 +77,6 @@ As **XQuAD** is just an evaluation dataset, I used `Data augmentation techniques`
 The model was trained on a Tesla P100 GPU and 25GB of RAM.
 The script for fine-tuning can be found [here](https://github.com/huggingface/transformers/blob/master/examples/distillation/run_squad_w_distillation.py)
 
-## Results:
-
-| Metric    | # Value   |
-| --------- | --------- |
-| **Exact** | **93.03** |
-| **F1**    | **94.62** |
-
-## Comparison:
-
-| Model | Exact | F1 score |
-| --------- | ----- | -------- |
-| [bert-multi-cased-finetuned-xquadv1](https://huggingface.co/mrm8488/bert-multi-cased-finetuned-xquadv1) | 91.43 | 94.14 |
-| bert-multi-uncased-finetuned-xquadv1 | **93.03** | **94.62** |
 
 ## Model in action
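
The diff keeps the "Model in action" section, but its body is collapsed in this view. As a minimal sketch of using the card's model, assuming the hub id is `mrm8488/bert-multi-uncased-finetuned-xquadv1` (inferred from the comparison table above, not stated in the diff), a `question-answering` pipeline call would look like this:

```python
# Minimal sketch: querying the fine-tuned checkpoint through the
# transformers question-answering pipeline. The hub id is an assumption
# inferred from the comparison table, not confirmed by this diff.
from transformers import pipeline

qa_pipeline = pipeline(
    "question-answering",
    model="mrm8488/bert-multi-uncased-finetuned-xquadv1",
    tokenizer="mrm8488/bert-multi-uncased-finetuned-xquadv1",
)

# XQuAD covers 11 languages, so the same checkpoint should answer
# questions across them; a Spanish example:
result = qa_pipeline(
    question="¿Quién creó el modelo?",
    context="El modelo fue creado por Google y luego ajustado para Q&A multilingüe.",
)
print(result["answer"], round(result["score"], 4))
```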
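
The Exact and F1 scores in the removed tables come from extractive QA: the model predicts a start and an end position over the context tokens, and the decoded span is compared against the gold answer. A hedged sketch of that decoding step, with the same assumed hub id:

```python
# Hedged sketch of extractive-QA decoding: pick the highest-scoring start
# and end token positions and decode the span between them. The hub id is
# the same assumption as above.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "mrm8488/bert-multi-uncased-finetuned-xquadv1"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

inputs = tokenizer(
    "Where was the model trained?",
    "The model was trained on a Tesla P100 GPU and 25GB of RAM.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Greedy span decoding (real evaluation scripts also rule out spans that
# start in the question or end before they start).
start = int(torch.argmax(outputs.start_logits))
end = int(torch.argmax(outputs.end_logits))
span = inputs["input_ids"][0][start : end + 1]
print(tokenizer.decode(span, skip_special_tokens=True))
```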