OpenDAS / ColossalAI · Commits

Commit ff373a11
Authored Oct 18, 2022 by xyupeng; committed by Frank Lee, Oct 19, 2022
[NFC] polish tests/test_layers/test_sequence/checks_seq/check_layer_seq.py code style (#1723)
Parent: 7e62af28
Showing 1 changed file with 3 additions and 8 deletions.

tests/test_layers/test_sequence/checks_seq/check_layer_seq.py (+3 -8)
@@ -12,15 +12,10 @@ def check_selfattention():
     BATCH = 4
     HIDDEN_SIZE = 16
 
-    layer = TransformerSelfAttentionRing(
-        16,
-        8,
-        8,
-        0.1
-    )
+    layer = TransformerSelfAttentionRing(16, 8, 8, 0.1)
     layer = layer.to(get_current_device())
 
     hidden_states = torch.rand(SUB_SEQ_LENGTH, BATCH, HIDDEN_SIZE).to(get_current_device())
-    attention_mask = torch.randint(
-        low=0, high=2, size=(BATCH, 1, 1, 1, SUB_SEQ_LENGTH * WORLD_SIZE)).to(get_current_device())
+    attention_mask = torch.randint(low=0, high=2,
+                                   size=(BATCH, 1, 1, 1, SUB_SEQ_LENGTH * WORLD_SIZE)).to(get_current_device())
     out = layer(hidden_states, attention_mask)
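For context, here is a minimal sketch of what the polished check looks like as a self-contained snippet. The import paths, the WORLD_SIZE and SUB_SEQ_LENGTH values, and the standalone setup are assumptions: the hunk above shows only part of check_selfattention(), and the real test runs under ColossalAI's sequence-parallel launcher.

# Hedged sketch of the polished check. Import paths and the module-level
# constants below are assumptions; the hunk above shows only part of the file.
import torch

from colossalai.nn.layer.parallel_sequence import TransformerSelfAttentionRing  # assumed import path
from colossalai.utils import get_current_device  # assumed import path

WORLD_SIZE = 1       # assumed: size of the sequence-parallel process group
SUB_SEQ_LENGTH = 8   # assumed: per-rank slice of the full sequence


def check_selfattention():
    BATCH = 4
    HIDDEN_SIZE = 16

    # The four positional arguments are copied verbatim from the diff above.
    layer = TransformerSelfAttentionRing(16, 8, 8, 0.1)
    layer = layer.to(get_current_device())

    # Hidden states are laid out (sub_seq_len, batch, hidden): each rank holds
    # one sub-sequence of the full sequence of length SUB_SEQ_LENGTH * WORLD_SIZE.
    hidden_states = torch.rand(SUB_SEQ_LENGTH, BATCH, HIDDEN_SIZE).to(get_current_device())
    # The mask spans the *full* sequence, hence SUB_SEQ_LENGTH * WORLD_SIZE.
    attention_mask = torch.randint(low=0, high=2,
                                   size=(BATCH, 1, 1, 1, SUB_SEQ_LENGTH * WORLD_SIZE)).to(get_current_device())
    out = layer(hidden_states, attention_mask)
    return out

In the actual test suite this runs inside a sequence-parallel process group initialized by colossalai.launch, so calling it as-is on a single process is only a shape-level illustration of the forward pass being checked.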