chenpangpang / transformers · Commits

Commit c425d60b authored Jan 12, 2022 by Sylvain Gugger
Fix link to deepspeed config
parent 68209044
Changes: 1

Showing 1 changed file with 2 additions and 2 deletions

docs/source/main_classes/deepspeed.mdx (+2, -2)
docs/source/main_classes/deepspeed.mdx

@@ -1707,13 +1707,13 @@ Work is being done to enable estimating how much memory is needed for a specific

 ## Non-Trainer Deepspeed Integration

-The [`~integrations.HfDeepSpeedConfig`] is used to integrate Deepspeed into the 🤗 Transformers core
+The [`~deepspeed.HfDeepSpeedConfig`] is used to integrate Deepspeed into the 🤗 Transformers core
 functionality, when [`Trainer`] is not used.

 When using [`Trainer`] everything is automatically taken care of.

 When not using [`Trainer`], to efficiently deploy DeepSpeed stage 3, you must instantiate the
-[`~integrations.HfDeepSpeedConfig`] object before instantiating the model.
+[`~deepspeed.HfDeepSpeedConfig`] object before instantiating the model.

 For example for a pretrained model:
 ...
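The trailing ellipsis elides the pretrained-model example that follows in the documentation. For orientation, here is a minimal sketch of the pattern the changed lines describe: keep an `HfDeepSpeedConfig` object alive before the model is instantiated so that ZeRO stage 3 weight partitioning is applied at `from_pretrained` time. The `ds_config` dict and the `"gpt2"` checkpoint are illustrative assumptions, not part of this commit; the import path `transformers.deepspeed` is the module the corrected link points to.

```python
# Minimal sketch (not the commit's own example): non-Trainer DeepSpeed stage-3 setup.
from transformers import AutoModel
from transformers.deepspeed import HfDeepSpeedConfig  # module referenced by the fixed link

# Hypothetical, minimal DeepSpeed config enabling ZeRO stage 3.
ds_config = {
    "zero_optimization": {"stage": 3},
    "train_micro_batch_size_per_gpu": 1,
    "train_batch_size": 1,
}

# Must be created, and kept alive, *before* the model is instantiated,
# so that from_pretrained() can partition the weights under ZeRO stage 3.
dschf = HfDeepSpeedConfig(ds_config)

model = AutoModel.from_pretrained("gpt2")
```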