chenpangpang / transformers / Commits / f38c4ad3

Unverified commit f38c4ad3, authored Dec 20, 2020 by Stas Bekman, committed by GitHub on Dec 20, 2020

better logging and help (#9203)
Parent: e0e255be

Showing 2 changed files, with 3 additions and 1 deletion:

examples/seq2seq/finetune_trainer.py  +1 -0
examples/seq2seq/utils.py             +2 -1
examples/seq2seq/finetune_trainer.py

@@ -98,6 +98,7 @@ class DataTrainingArguments:
         metadata={
             "help": "The maximum total sequence length for validation target text after tokenization. Sequences longer "
             "than this will be truncated, sequences shorter will be padded."
+            " This argument is also used to override the ``max_length`` param of ``model.generate``, which is used during ``evaluate`` and ``predict``"
         },
     )
     test_max_target_length: Optional[int] = field(
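The expanded help text states that `val_max_target_length` also overrides the `max_length` param of `model.generate` during `evaluate` and `predict`. A minimal sketch of that override pattern; the `resolve_gen_max_length` helper is hypothetical (not part of the actual script) and only illustrates "data argument wins over the config default":

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class DataTrainingArguments:
    # Mirrors the argument touched by this commit: the validation target
    # length doubles as the generation-length cap during evaluate/predict.
    val_max_target_length: Optional[int] = field(
        default=142,
        metadata={
            "help": "The maximum total sequence length for validation target text after tokenization. "
            "Also used to override the ``max_length`` param of ``model.generate``."
        },
    )


def resolve_gen_max_length(config_max_length: int, data_args: DataTrainingArguments) -> int:
    # Hypothetical helper: prefer the data/CLI argument over the model
    # config's default max_length, as the new help text describes.
    if data_args.val_max_target_length is not None:
        return data_args.val_max_target_length
    return config_max_length


print(resolve_gen_max_length(20, DataTrainingArguments()))  # → 142
```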
examples/seq2seq/utils.py

@@ -434,7 +434,8 @@ def use_task_specific_params(model, task):
     if task_specific_params is not None:
         pars = task_specific_params.get(task, {})
-        logger.info(f"using task specific params for {task}: {pars}")
+        logger.info(f"setting model.config to task specific params for {task}:\n{pars}")
+        logger.info("note: command line args may override some of these")
         model.config.update(pars)
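The patched `use_task_specific_params` copies any task-specific entries from the model config into the live config and now logs that in more detail. A runnable sketch of the same logic; the `DummyConfig` stub below is a hypothetical stand-in for a real transformers model config, with just enough surface (`task_specific_params`, `update`) to exercise the function:

```python
import logging
from types import SimpleNamespace

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def use_task_specific_params(model, task):
    # Same logic as the patched function: pull any params registered for
    # this task out of the config and merge them into the live config.
    task_specific_params = model.config.task_specific_params
    if task_specific_params is not None:
        pars = task_specific_params.get(task, {})
        logger.info(f"setting model.config to task specific params for {task}:\n{pars}")
        logger.info("note: command line args may override some of these")
        model.config.update(pars)


class DummyConfig(SimpleNamespace):
    # Hypothetical stub: a real transformers config also exposes .update().
    def update(self, pars):
        self.__dict__.update(pars)


model = SimpleNamespace(
    config=DummyConfig(task_specific_params={"summarization": {"max_length": 142}})
)
use_task_specific_params(model, "summarization")
print(model.config.max_length)  # → 142
```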