chenpangpang / transformers · Commits

Commit 9a220ce3 (unverified)
Clarify 4/8 bit loading log message (#25134)
Authored Jul 27, 2023 by Bram Vanroy, committed via GitHub on Jul 27, 2023
Parent: 9429642e

* clarify 4/8 bit loading log message
* make style
Changes: 1 changed file, with 4 additions and 2 deletions

src/transformers/modeling_utils.py  (+4, −2)
src/transformers/modeling_utils.py @ 9a220ce3

@@ -2734,8 +2734,10 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
         llm_int8_skip_modules = quantization_config.llm_int8_skip_modules
         load_in_8bit_fp32_cpu_offload = quantization_config.llm_int8_enable_fp32_cpu_offload
-        logger.info("Detected 8-bit loading: activating 8-bit loading for this model")
+        if load_in_8bit:
+            logger.info("Detected 8-bit loading: activating 8-bit loading for this model")
+        else:
+            logger.info("Detected 4-bit loading: activating 4-bit loading for this model")

         # We keep some modules such as the lm_head in their original dtype for numerical stability reasons
         if llm_int8_skip_modules is None:
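The point of the change is that the log message now branches on `load_in_8bit` instead of unconditionally reporting 8-bit loading, so 4-bit loads are no longer mislabeled. A minimal standalone sketch of that branching logic, with a hypothetical helper name `quantization_log_message` (not part of transformers) so it can run without the library:

```python
def quantization_log_message(load_in_8bit: bool) -> str:
    """Return the log message matching the detected quantization mode.

    Mirrors the branch added in commit 9a220ce3: the 8-bit message is only
    emitted when 8-bit loading was requested; otherwise the code path is the
    4-bit one, and the message says so.
    """
    if load_in_8bit:
        return "Detected 8-bit loading: activating 8-bit loading for this model"
    return "Detected 4-bit loading: activating 4-bit loading for this model"


# Before this commit, both paths logged the 8-bit message; now each mode
# reports itself accurately.
print(quantization_log_message(True))
print(quantization_log_message(False))
```

In the real code path these strings are passed to `logger.info(...)`; the branch itself is the whole fix.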