chenpangpang/transformers
Unverified commit 05de038f, authored Sep 14, 2023 by Abhilash Majumder, committed by GitHub Sep 13, 2023
Flex xpu bug fix (#26135)
flex gpu bug fix
parent 9709ab11
Showing 1 changed file with 2 additions and 1 deletion.

src/transformers/training_args.py (+2 -1)
@@ -1425,12 +1425,13 @@ class TrainingArguments:
             and is_torch_available()
             and (self.device.type != "cuda")
             and (self.device.type != "npu")
+            and (self.device.type != "xpu")
             and (get_xla_device_type(self.device) != "GPU")
             and (self.fp16 or self.fp16_full_eval)
         ):
             raise ValueError(
                 "FP16 Mixed precision training with AMP or APEX (`--fp16`) and FP16 half precision evaluation"
-                " (`--fp16_full_eval`) can only be used on CUDA or NPU devices."
+                " (`--fp16_full_eval`) can only be used on CUDA or NPU devices or certain XPU devices (with IPEX)."
             )

         if (
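In effect, the commit lets the FP16 guard accept Intel XPU devices (exposed through IPEX) in addition to CUDA and NPU. A minimal sketch of the updated behavior follows; the standalone check_fp16_device helper and its device_type/fp16/fp16_full_eval arguments are illustrative stand-ins for the real self.device.type and TrainingArguments flags, and the XLA-GPU clause is omitted for brevity:

# Sketch only: a simplified version of the device guard after this commit.
# The real check lives in TrainingArguments.__post_init__ and also allows
# XLA "GPU" devices via get_xla_device_type(self.device).
def check_fp16_device(device_type: str, fp16: bool, fp16_full_eval: bool) -> None:
    allowed = {"cuda", "npu", "xpu"}  # "xpu" is the backend newly allowed by this change
    if (fp16 or fp16_full_eval) and device_type not in allowed:
        raise ValueError(
            "FP16 Mixed precision training with AMP or APEX (`--fp16`) and FP16 half precision evaluation"
            " (`--fp16_full_eval`) can only be used on CUDA or NPU devices or certain XPU devices (with IPEX)."
        )

# Before the fix an "xpu" device would have raised; with the extra clause it passes.
check_fp16_device("xpu", fp16=True, fp16_full_eval=False)

try:
    check_fp16_device("cpu", fp16=True, fp16_full_eval=False)
except ValueError as err:
    print(err)  # CPU still rejects --fp16, as before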