@@ -27,7 +27,7 @@ Available datasets: `copy`, `sort`, `wmt14`, `multi30k`(default).
 ## Test Results
-### Transfomer
+### Transformer
 - Multi30k: we achieve BLEU score 35.41 with default setting on Multi30k dataset, without using pre-trained embeddings. (if we set the number of layers to 2, the BLEU score could reach 36.45).
 - WMT14: work in progress
...
...
@@ -38,7 +38,7 @@ Available datasets: `copy`, `sort`, `wmt14`, `multi30k`(default).
 ## Notes
-- Currently we do not support Multi-GPU training(this will be fixed soon), you should only specifiy only one gpu\_id when running the training script.
+- Currently we do not support Multi-GPU training (this will be fixed soon); you should specify only one gpu\_id when running the training script.
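
For illustration, a single-GPU launch might look like the sketch below; the script name `train.py` and the `-gpu_id` flag are assumptions for this example and are not confirmed options of the repository's training script.

```bash
# Hypothetical invocation: pass exactly one GPU id when launching training
# (script name and flag names are assumed for illustration only).
python train.py -gpu_id 0
```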