Unverified Commit 52b3a05e authored by Patrick von Platen, committed by GitHub

[Bart doc] Fix outdated statement (#9299)

* fix bart doc

* fix docs
parent 7777db15
@@ -55,9 +55,8 @@ Implementation Notes
 - Bart doesn't use :obj:`token_type_ids` for sequence classification. Use :class:`~transformers.BartTokenizer` or
   :meth:`~transformers.BartTokenizer.encode` to get the proper splitting.
-- The forward pass of :class:`~transformers.BartModel` will create decoder inputs (using the helper function
-  :func:`transformers.models.bart.modeling_bart._prepare_bart_decoder_inputs`) if they are not passed. This is
-  different than some other modeling APIs.
+- The forward pass of :class:`~transformers.BartModel` will create the ``decoder_input_ids`` if they are not passed.
+  This is different than some other modeling APIs. A typical use case of this feature is mask filling.
 - Model predictions are intended to be identical to the original implementation when
   :obj:`force_bos_token_to_be_generated=True`. This only works, however, if the string you pass to
   :func:`fairseq.encode` starts with a space.
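The behavior the changed note describes (``BartModel`` deriving ``decoder_input_ids`` itself when the caller does not pass them) boils down to shifting the target token ids one position to the right and prepending the decoder start token. The following is a minimal pure-Python sketch of that shifting step, not the library's exact implementation; the function name and list-based signature are illustrative only.

```python
def shift_tokens_right(input_ids, decoder_start_token_id):
    """Sketch: build decoder inputs by shifting target ids right by one.

    The decoder start token is prepended and the last token is dropped,
    so position i of the decoder input holds the token the decoder should
    have already emitted when predicting position i of the target.
    """
    return [decoder_start_token_id] + input_ids[:-1]


# Example: target ids [10, 20, 30, 2] with start token id 2
print(shift_tokens_right([10, 20, 30, 2], 2))  # [2, 10, 20, 30]
```

Because the model fills this in automatically, a caller doing mask filling only needs to provide encoder inputs containing the mask token.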