NOTE: This is version 2 of the model. See [this github issue](https://github.com/deepset-ai/FARM/issues/552) from the FARM repository for an explanation of why we updated. If you'd like to use version 1, specify `revision="v1.0"` when loading the model in Transformers 3.5.
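For example, a minimal sketch of pinning the older revision (the `revision` argument is the mechanism mentioned above; the exact class you load with is up to you):

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "deepset/roberta-base-squad2"

# Default: loads version 2, the current head of the model repo
model_v2 = AutoModelForQuestionAnswering.from_pretrained(model_name)

# Pin version 1 explicitly via the git revision tag
model_v1 = AutoModelForQuestionAnswering.from_pretrained(model_name, revision="v1.0")
tokenizer_v1 = AutoTokenizer.from_pretrained(model_name, revision="v1.0")
```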
## Overview
**Language model:** roberta-base
...
## Hyperparameters
```
batch_size = 96
n_epochs = 2
base_LM_model = "roberta-base"
max_seq_len = 386
learning_rate = 3e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
...
```
## Performance
Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).
```
"exact": 78.49743114629833,
"exact": 79.97136359807968
"f1": 81.73092721240889
"f1": 83.00449234495325
"total": 11873
"HasAns_exact": 78.03643724696356
"HasAns_f1": 84.11139298441825
"HasAns_total": 5928
"NoAns_exact": 81.90075693860386
"NoAns_f1": 81.90075693860386
"NoAns_total": 5945
```
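As a sanity check on how the overall number relates to the answerable/unanswerable splits, the overall exact score is simply the count-weighted average of the two subset scores. A small sketch, using only the values reported above:

```python
# Recompute the overall "exact" score from the HasAns/NoAns splits reported above.
has_ans_exact, has_ans_total = 78.03643724696356, 5928
no_ans_exact, no_ans_total = 81.90075693860386, 5945
total = has_ans_total + no_ans_total  # 11873

overall_exact = (has_ans_exact * has_ans_total + no_ans_exact * no_ans_total) / total
print(round(overall_exact, 2))  # ~79.97, matching the reported "exact" value
```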
## Usage
...