Project: chenpangpang/transformers
Commit: 6a707cf5 (unverified)
Author: Sylvain Gugger
Date: Dec 06, 2022
Parent: 97a51b0c

Commit message: Repo consistency
Showing 1 changed file with 8 additions and 1 deletion.
src/transformers/models/biogpt/modeling_biogpt.py (+8, -1)

@@ -153,7 +153,14 @@ class BioGptAttention(nn.Module):
         # get query proj
         query_states = self.q_proj(hidden_states) * self.scaling
         # get key, value proj
-        if is_cross_attention and past_key_value is not None:
+        # `past_key_value[0].shape[2] == key_value_states.shape[1]`
+        # is checking that the `sequence_length` of the `past_key_value` is the same as
+        # the provided `key_value_states` to support prefix tuning
+        if (
+            is_cross_attention
+            and past_key_value is not None
+            and past_key_value[0].shape[2] == key_value_states.shape[1]
+        ):
             # reuse k,v, cross_attentions
             key_states = past_key_value[0]
             value_states = past_key_value[1]
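The change adds a guard on reusing cached cross-attention keys and values: the cache is only reused when its sequence length matches the incoming `key_value_states`. With prefix tuning, the cache holds a short learned prefix rather than projections of the full encoder output, so the lengths differ and the check fails. The following sketch is not part of the commit; it assumes the usual `(batch, num_heads, seq_len, head_dim)` cache layout used by this attention module, with random tensors as stand-ins, and shows how the new condition separates the two cases:

# Minimal sketch (not from the commit) of the shape check added above.
import torch

batch, num_heads, head_dim = 2, 4, 8
encoder_len, prefix_len = 10, 3

# Encoder hidden states passed to cross-attention as `key_value_states`:
# shape (batch, seq_len, hidden) with hidden = num_heads * head_dim.
key_value_states = torch.randn(batch, encoder_len, num_heads * head_dim)

# Case 1: a full cross-attention cache from a previous decoding step.
full_cache = (
    torch.randn(batch, num_heads, encoder_len, head_dim),  # cached keys
    torch.randn(batch, num_heads, encoder_len, head_dim),  # cached values
)

# Case 2: a prefix-tuning cache, which holds only `prefix_len` learned
# key/value vectors rather than projections of the encoder output.
prefix_cache = (
    torch.randn(batch, num_heads, prefix_len, head_dim),
    torch.randn(batch, num_heads, prefix_len, head_dim),
)

for name, past_key_value in [("full cache", full_cache), ("prefix cache", prefix_cache)]:
    # The condition added by this commit: only reuse the cache when its
    # sequence length matches the encoder sequence length.
    reuse = past_key_value[0].shape[2] == key_value_states.shape[1]
    print(f"{name}: reuse cached key/value -> {reuse}")
# full cache: reuse cached key/value -> True
# prefix cache: reuse cached key/value -> False

When the condition fails, the surrounding code (not shown in this hunk) falls through to the cross-attention branch that recomputes the key and value projections from `key_value_states` instead of reusing a cache of the wrong length.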