chenpangpang / transformers · Commit b85b8806

Unverified commit b85b8806, authored Aug 24, 2023 by Sourab Mangrulkar; committed by GitHub on Aug 24, 2023.
fix ram efficient fsdp init (#25686)
Parent: 68fa9a59

Showing 1 changed file with 2 additions and 2 deletions.
src/transformers/modeling_utils.py (+2, -2)
...
@@ -113,11 +113,11 @@ _init_weights = True

 def is_fsdp_enabled():
-    return strtobool(os.environ.get("ACCELERATE_USE_FSDP", "False")) == 1
+    return torch.distributed.is_initialized() and strtobool(os.environ.get("ACCELERATE_USE_FSDP", "False")) == 1


 def is_fsdp_enabled_and_dist_rank_0():
-    return is_fsdp_enabled() and torch.distributed.is_initialized() and torch.distributed.get_rank() == 0
+    return is_fsdp_enabled() and torch.distributed.get_rank() == 0


 if is_sagemaker_mp_enabled():
...
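
For context, below is a minimal, self-contained sketch of how the two helpers read after this commit. It assumes a standalone strtobool helper in place of the one imported in modeling_utils.py. The visible effect of the change is that is_fsdp_enabled() now also requires the torch.distributed process group to be initialized, so is_fsdp_enabled_and_dist_rank_0() no longer repeats that check before reading the rank.

import os

import torch


def strtobool(val: str) -> int:
    # Stand-in (assumption) for the strtobool helper imported in modeling_utils.py.
    return 1 if val.lower() in ("y", "yes", "t", "true", "on", "1") else 0


def is_fsdp_enabled() -> bool:
    # After this commit, FSDP is only reported as enabled once the distributed
    # process group has actually been initialized.
    return (
        torch.distributed.is_initialized()
        and strtobool(os.environ.get("ACCELERATE_USE_FSDP", "False")) == 1
    )


def is_fsdp_enabled_and_dist_rank_0() -> bool:
    # The is_initialized() check now lives in is_fsdp_enabled(), so only the
    # rank test remains here.
    return is_fsdp_enabled() and torch.distributed.get_rank() == 0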