chenpangpang / transformers — Commit ee8e80a0 (unverified)

Authored Apr 07, 2023 by Sourab Mangrulkar; committed via GitHub on Apr 07, 2023.
fix FSDP version related issues (#22489)
fix fsdp
Parent: c7ec71ba
Showing 1 changed file with 6 additions and 3 deletions:

src/transformers/trainer.py (+6 −3)
--- a/src/transformers/trainer.py
+++ b/src/transformers/trainer.py
@@ -1481,6 +1481,11 @@ class Trainer:
                 mixed_precision_policy = MixedPrecision(param_dtype=dtype, reduce_dtype=dtype, buffer_dtype=dtype)
             if type(model) != FSDP:
                 # XXX: Breaking the self.model convention but I see no way around it for now.
+                signature = inspect.signature(FSDP.__init__).parameters.keys()
+                kwargs = {}
+                for arg in ["limit_all_gathers", "forward_prefetch", "backward_prefetch"]:
+                    if arg in signature:
+                        kwargs[arg] = getattr(self, arg)
                 self.model = model = FSDP(
                     model,
                     sharding_strategy=self.fsdp,
@@ -1488,9 +1493,7 @@ class Trainer:
                     auto_wrap_policy=auto_wrap_policy,
                     mixed_precision=mixed_precision_policy,
                     device_id=self.args.device,
-                    backward_prefetch=self.backward_prefetch,
-                    forward_prefetch=self.forword_prefetch,
-                    limit_all_gathers=self.limit_all_gathers,
+                    **kwargs,
                 )
             else:
                 try:
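The core technique in this fix is to introspect the constructor of the installed FSDP class and forward only those keyword arguments its signature actually accepts, so that options added in newer PyTorch releases (`limit_all_gathers`, `forward_prefetch`, `backward_prefetch`) no longer raise a `TypeError` on older ones. A minimal standalone sketch of that pattern, assuming two hypothetical stand-in classes (`OldFSDP`, `NewFSDP`) and a helper `build_fsdp_kwargs` that are illustrative only, not from transformers or PyTorch:

```python
import inspect

# Hypothetical stand-ins emulating FSDP.__init__ across PyTorch versions
# (illustrative only; not the real torch.distributed.fsdp API).
class OldFSDP:
    def __init__(self, module, sharding_strategy=None):
        self.module = module

class NewFSDP:
    def __init__(self, module, sharding_strategy=None, limit_all_gathers=False,
                 forward_prefetch=False, backward_prefetch=None):
        self.module = module

def build_fsdp_kwargs(fsdp_cls, config):
    """Keep only the options the given constructor's signature accepts,
    mirroring the inspect.signature check in the commit."""
    signature = inspect.signature(fsdp_cls.__init__).parameters.keys()
    return {arg: value for arg, value in config.items() if arg in signature}

config = {"limit_all_gathers": True, "forward_prefetch": True, "backward_prefetch": None}

print(build_fsdp_kwargs(OldFSDP, config))  # old signature: all options filtered out
print(build_fsdp_kwargs(NewFSDP, config))  # new signature: all options passed through
```

The filtered dict can then be splatted into the constructor as `**kwargs`, exactly as the patched call site does, instead of passing each option unconditionally.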