chenpangpang / transformers · Commits · 75a208ef

Unverified commit 75a208ef, authored Feb 10, 2023 by Younes Belkada, committed via GitHub on Feb 10, 2023
[`Blip2`] Add int8 support for `blip2-flan-t5-xxl` (#21574)
add int8 support
parent b47a1674
Changes: 1 changed file, 1 addition and 0 deletions (+1 −0)
src/transformers/models/blip_2/modeling_blip_2.py  (view file @ 75a208ef)

@@ -285,6 +285,7 @@ class Blip2PreTrainedModel(PreTrainedModel):
         r"language_model.decoder.embed_tokens.weight",
     ]
     _no_split_modules = ["Blip2Attention", "T5Block", "OPTDecoderLayer"]
+    _keep_in_fp32_modules = ["wo"]

     def _init_weights(self, module):
         """Initialize the weights"""
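The single added line, `_keep_in_fp32_modules = ["wo"]`, asks the loading machinery to keep T5's feed-forward output projection (`wo`) in float32 even when the rest of the model is loaded in int8/fp16, since those weights are known to be numerically unstable in half precision. A minimal sketch of the name-matching rule such a list implies (the helper `should_keep_in_fp32` is hypothetical, not the actual transformers implementation):

```python
# Hypothetical sketch of how a _keep_in_fp32_modules list can gate dtype
# casting during model loading. The module name "wo" mirrors the commit;
# the parameter names below are illustrative T5-style names.

def should_keep_in_fp32(param_name, keep_in_fp32_modules):
    """Return True if the parameter belongs to a module pinned to fp32."""
    # Match on whole dotted-path components so "wo" does not match "word".
    return any(m in param_name.split(".") for m in keep_in_fp32_modules)

keep = ["wo"]
# The T5 feed-forward output projection stays in fp32:
print(should_keep_in_fp32("encoder.block.0.layer.1.DenseReluDense.wo.weight", keep))  # True
# Everything else is eligible for low-precision casting:
print(should_keep_in_fp32("encoder.block.0.layer.0.SelfAttention.q.weight", keep))    # False
```

With a guard like this in place, `blip2-flan-t5-xxl` can be loaded with `load_in_8bit=True` without the T5 feed-forward blocks overflowing.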