chenpangpang / ComfyUI
Commit 66e28ef4, authored Feb 04, 2024 by comfyanonymous

Don't use is_bf16_supported to check for fp16 support.

Parent: 24129d78
Showing 1 changed file with 4 additions and 1 deletion.

comfy/model_management.py (+4 -1)
@@ -722,10 +722,13 @@ def should_use_fp16(device=None, model_params=0, prioritize_performance=True, ma
     if is_intel_xpu():
         return True
 
-    if torch.cuda.is_bf16_supported():
+    if torch.version.hip:
         return True
 
     props = torch.cuda.get_device_properties("cuda")
+    if props.major >= 8:
+        return True
+
     if props.major < 6:
         return False
...
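The change replaces the bf16-support test with two explicit checks: ROCm/HIP builds pass outright, and CUDA devices pass when their compute-capability major version is at least 8 (Ampere or newer). Presumably this is because bf16 support is a stricter condition than fp16 support, so the old check could wrongly report fp16 as unavailable on GPUs that do support it. A minimal standalone sketch of the resulting decision order, using plain booleans in place of torch.version.hip and the device properties; the function name and the fall-through default for the truncated remainder of the diff are my assumptions, not ComfyUI's actual code:

```python
def should_use_fp16_sketch(is_xpu: bool, is_hip: bool, cc_major: int) -> bool:
    """Mirror the commit's check order (sketch, not ComfyUI's real function)."""
    if is_xpu:            # Intel XPU: treat fp16 as supported
        return True
    if is_hip:            # ROCm/HIP build: treat fp16 as supported
        return True       # (replaces the old torch.cuda.is_bf16_supported() check)
    if cc_major >= 8:     # NVIDIA Ampere (sm_80) or newer
        return True
    if cc_major < 6:      # pre-Pascal: no usable fp16
        return False
    return True           # assumed default; the diff truncates the remaining checks
```

Note that the ordering matters: the HIP check runs before any CUDA property query, since `torch.cuda.get_device_properties` semantics differ under ROCm.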