chenpangpang / transformers · Commit 52b3a05e (Unverified)

[Bart doc] Fix outdated statement (#9299)

* fix bart doc
* fix docs

Authored Dec 24, 2020 by Patrick von Platen; committed by GitHub on Dec 24, 2020
Parent: 7777db15
Showing 1 changed file with 2 additions and 3 deletions.

docs/source/model_doc/bart.rst (+2, −3)
@@ -55,9 +55,8 @@ Implementation Notes
 - Bart doesn't use :obj:`token_type_ids` for sequence classification. Use :class:`~transformers.BartTokenizer` or
   :meth:`~transformers.BartTokenizer.encode` to get the proper splitting.
-- The forward pass of :class:`~transformers.BartModel` will create decoder inputs (using the helper function
-  :func:`transformers.models.bart.modeling_bart._prepare_bart_decoder_inputs`) if they are not passed. This is
-  different than some other modeling APIs.
+- The forward pass of :class:`~transformers.BartModel` will create the ``decoder_input_ids`` if they are not passed.
+  This is different than some other modeling APIs. A typical use case of this feature is mask filling.
 - Model predictions are intended to be identical to the original implementation when
   :obj:`force_bos_token_to_be_generated=True`. This only works, however, if the string you pass to
   :func:`fairseq.encode` starts with a space.
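The updated doc line says the model derives ``decoder_input_ids`` itself when they are not passed; conceptually this is done by shifting the target sequence one position to the right, so the decoder predicts token *t* from tokens before *t*. A simplified, framework-free sketch of that shift (the real helper in ``modeling_bart`` operates on tensors and also handles padding and the start-token choice, so this is only an illustration):

```python
def shift_tokens_right(input_ids, decoder_start_token_id):
    """Create decoder inputs by shifting each target sequence one position
    to the right and prepending the decoder start token.

    Simplified illustration of the idea only; the actual Bart helper works
    on whole tensors and replaces padding correctly.
    """
    return [[decoder_start_token_id] + ids[:-1] for ids in input_ids]

# Toy example: 2 plays the role of the eos token in this made-up vocabulary.
batch = [[42, 17, 99, 2]]
decoder_inputs = shift_tokens_right(batch, decoder_start_token_id=0)
# decoder_inputs is [[0, 42, 17, 99]]
```

Because the shift is derived from the targets, callers such as mask filling can invoke the forward pass with only ``input_ids`` and let the model build the decoder side, which is the convenience the changed doc line describes.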