chenpangpang / transformers · Commit c77092a5 (Unverified)

[FlaxGPTJ] Fix bug in rotary embeddings (#16298)

Authored Mar 21, 2022 by Suraj Patil; committed via GitHub on Mar 21, 2022.
Parent: 4b277483
Showing 1 changed file with 1 addition and 1 deletion:

src/transformers/models/gptj/modeling_flax_gptj.py (+1, -1)
```diff
@@ -122,7 +122,7 @@ def create_sinusoidal_positions(num_pos, dim):
 def rotate_every_two(tensor):
-    rotate_half_tensor = jnp.stack((tensor[:, :, :, 1::2], tensor[:, :, :, ::2]), axis=-1)
+    rotate_half_tensor = jnp.stack((-tensor[:, :, :, 1::2], tensor[:, :, :, ::2]), axis=-1)
     rotate_half_tensor = rotate_half_tensor.reshape(rotate_half_tensor.shape[:-2] + (-1,))
     return rotate_half_tensor
```
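The one-character change matters: the rotary-embedding convention rotates each feature pair (x1, x2) to (-x2, x1) before it is scaled by the sine term, and the old line dropped the negation. A minimal NumPy sketch of the fixed function (np.stack and the strided slicing behave the same as the jnp calls in the diff) shows the interleaving:

```python
import numpy as np

def rotate_every_two(tensor):
    # Odd-indexed elements are negated, then paired with the even-indexed
    # elements along a new trailing axis: (x1, x2) -> (-x2, x1).
    rotate_half_tensor = np.stack((-tensor[:, :, :, 1::2], tensor[:, :, :, ::2]), axis=-1)
    # Flatten the trailing pair axis back into the feature dimension.
    rotate_half_tensor = rotate_half_tensor.reshape(rotate_half_tensor.shape[:-2] + (-1,))
    return rotate_half_tensor

x = np.arange(8, dtype=np.float32).reshape(1, 1, 1, 8)  # features [0..7]
out = rotate_every_two(x)
# pairs (x1, x2) become (-x2, x1): [-1, 0, -3, 2, -5, 4, -7, 6]
print(out[0, 0, 0])
```

Applying the function twice negates the input ((x1, x2) -> (-x2, x1) -> (-x1, -x2)), the same invariant as a 90-degree rotation applied twice; the pre-fix version, with no minus sign, fails this check.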