OpenDAS / Megatron-LM

Commit 00129014 authored Apr 28, 2022 by Vijay Korthikanti

avoid 'reverse checkpointed' term in argument description

parent 5d2e13a6
Showing 1 changed file with 2 additions and 2 deletions.

megatron/arguments.py (+2, −2)
@@ -492,9 +492,9 @@ def _add_training_args(parser):
                        help='Checkpoint activatins to allow for training '
                        'with larger models, sequences, and batch sizes. '
                        'It is supported at two granularities 1) full: '
-                       'whole transformer layer is reverse checkpointed, '
+                       'whole transformer layer is checkpointed, '
                        '2) selective: core attention part of the transformer '
-                       'layer is reverse checkpointed.')
+                       'layer is checkpointed.')
     group.add_argument('--distribute-checkpointed-activations',
                        action='store_true',
                        help='If set, distribute checkpointed activations '
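For context, here is a minimal sketch of the two granularities the help text describes, written with torch.utils.checkpoint. This is not Megatron-LM's implementation; ToyTransformerLayer and all of its members are invented names used only to illustrate the idea.

import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

class ToyTransformerLayer(nn.Module):
    # Hypothetical stand-in for a transformer layer; not Megatron-LM code.
    def __init__(self, hidden, heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden, 4 * hidden), nn.GELU(),
            nn.Linear(4 * hidden, hidden))

    def _core_attention(self, x):
        out, _ = self.attn(x, x, x, need_weights=False)
        return out

    def _full_layer(self, x):
        x = x + self._core_attention(x)
        return x + self.mlp(x)

    def forward(self, x, granularity='full'):
        if granularity == 'full':
            # full: the whole layer's activations are discarded and the
            # entire layer is recomputed during the backward pass.
            return checkpoint(self._full_layer, x, use_reentrant=False)
        # selective: only the core attention part is checkpointed; the
        # MLP's activations are stored as usual.
        x = x + checkpoint(self._core_attention, x, use_reentrant=False)
        return x + self.mlp(x)

layer = ToyTransformerLayer(hidden=64, heads=4)
x = torch.randn(2, 16, 64, requires_grad=True)
layer(x, granularity='selective').sum().backward()

Full recompute saves the most activation memory but redoes the entire layer's forward pass during backward; selective recompute saves less memory but only recomputes the core attention, whose activations are large relative to its recomputation cost.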