FSMT
-----------------------------------------------------------------------------------------------------------------------

**DISCLAIMER:** If you see something strange, file a `Github Issue
<https://github.com/huggingface/transformers/issues/new?assignees=&labels=&template=bug-report.md&title>`__ and assign
@stas00.

Overview
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

FSMT (FairSeq MachineTranslation) models were introduced in `Facebook FAIR's WMT19 News Translation Task Submission
<https://arxiv.org/abs/1907.06616>`__ by Nathan Ng, Kyra Yee, Alexei Baevski, Myle Ott, Michael Auli, Sergey Edunov.

The abstract of the paper is the following:

*This paper describes Facebook FAIR's submission to the WMT19 shared news translation task. We participate in two
language pairs and four language directions, English <-> German and English <-> Russian. Following our submission from
last year, our baseline systems are large BPE-based transformer models trained with the Fairseq sequence modeling
toolkit which rely on sampled back-translations. This year we experiment with different bitext data filtering schemes,
as well as with adding filtered back-translated data. We also ensemble and fine-tune our models on domain-specific
data, then decode using noisy channel model reranking. Our submissions are ranked first in all four directions of the
human evaluation campaign. On En->De, our system significantly outperforms other systems as well as human translations.
This system improves upon our WMT'18 submission by 4.5 BLEU points.*

The original code can be found `here <https://github.com/pytorch/fairseq/tree/master/examples/wmt19>`__.

Implementation Notes
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

- FSMT uses separate source and target vocabularies, which aren't combined into one. It doesn't share embedding
  tokens either. Its tokenizer is very similar to :class:`~transformers.XLMTokenizer` and the main model is derived
  from :class:`~transformers.BartModel`.


FSMTConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.FSMTConfig
    :members:
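
Because the source and target vocabularies are distinct, :class:`~transformers.FSMTConfig` exposes
``src_vocab_size`` and ``tgt_vocab_size`` instead of a single ``vocab_size``. A minimal sketch (the sizes below are
made up for illustration, not those of any released checkpoint):

```python
from transformers import FSMTConfig, FSMTForConditionalGeneration

# A tiny, randomly initialized model; real checkpoints use much larger dimensions.
config = FSMTConfig(
    langs=["en", "ru"],
    src_vocab_size=1000,  # source-side embedding table size
    tgt_vocab_size=1000,  # target-side embedding table size
    d_model=64,
    encoder_layers=2,
    decoder_layers=2,
    encoder_ffn_dim=128,
    decoder_ffn_dim=128,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
)
model = FSMTForConditionalGeneration(config)
```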


FSMTTokenizer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.FSMTTokenizer
    :members: build_inputs_with_special_tokens, get_special_tokens_mask,
        create_token_type_ids_from_sequences, prepare_seq2seq_batch, save_vocabulary


FSMTModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.FSMTModel
    :members: forward


FSMTForConditionalGeneration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.FSMTForConditionalGeneration
    :members: forward