# Understanding Back-Translation at Scale (Edunov et al., 2018)

This page includes pre-trained models from the paper [Understanding Back-Translation at Scale (Edunov et al., 2018)](https://arxiv.org/abs/1808.09381).

## Pre-trained models

Description | Dataset | Model | Test set(s)
---|---|---|---
Transformer ([Edunov et al., 2018](https://arxiv.org/abs/1808.09381); WMT'18 winner) | [WMT'18 English-German](http://www.statmt.org/wmt18/translation-task.html) | [download (.tar.bz2)](https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.bz2) | See NOTE in the archive
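## Example usage (torch.hub)

The archive linked above contains an ensemble of model checkpoints, and the NOTE file inside describes the accompanying test sets. As a minimal sketch of loading the model for inference, the snippet below uses fairseq's `torch.hub` integration (the `transformer.wmt18.en-de` entry); the exact checkpoint file names are assumptions here, so substitute the `.pt` files actually found in the downloaded archive.

```python
import torch

# Load the WMT'18 English-German ensemble via fairseq's torch.hub interface.
# The colon-separated checkpoint names below are assumed; adjust them to the
# files shipped in the archive downloaded from the table above.
en2de = torch.hub.load(
    'pytorch/fairseq',
    'transformer.wmt18.en-de',
    checkpoint_file='wmt18.model1.pt:wmt18.model2.pt:wmt18.model3.pt:wmt18.model4.pt:wmt18.model5.pt',
    tokenizer='moses',
    bpe='subword_nmt',
)
en2de.eval()  # disable dropout for inference

print(en2de.translate('Machine learning is great!'))
# e.g. 'Maschinelles Lernen ist großartig!'
```

Alternatively, extract the archive and pass the checkpoints directly to `fairseq-generate`; see the NOTE in the archive for the evaluation setup.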
## Citation

```bibtex
@inproceedings{edunov2018backtranslation,
  title = {Understanding Back-Translation at Scale},
  author = {Edunov, Sergey and Ott, Myle and Auli, Michael and Grangier, David},
  booktitle = {Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year = 2018,
}
```