Commit a7c4159d authored by Gustaf Ahdritz

Restore layer_norm_m to its rightful place

parent c1c8999c
@@ -254,6 +254,7 @@ class MSAAttention(nn.Module):
                 use_lma=use_lma,
             )
         else:
+            m = self.layer_norm_m(m)
             m = self.mha(
                 q_x=m,
                 kv_x=m,
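The restored line applies layer normalization to the MSA embedding `m` before it is used as both query and key/value input to attention. A minimal sketch of that pre-norm pattern, assuming illustrative names and dimensions (`TinyMSAAttention`, `c_m`, and `nn.MultiheadAttention` stand in for the real OpenFold module):

```python
import torch
import torch.nn as nn

# Minimal sketch (not the OpenFold implementation) of the pattern the
# commit restores: normalize the MSA embedding `m`, then attend with
# q_x = kv_x = m. Class name and sizes are illustrative assumptions.
class TinyMSAAttention(nn.Module):
    def __init__(self, c_m: int = 8, n_heads: int = 2):
        super().__init__()
        self.layer_norm_m = nn.LayerNorm(c_m)
        self.mha = nn.MultiheadAttention(c_m, n_heads, batch_first=True)

    def forward(self, m: torch.Tensor) -> torch.Tensor:
        # The restored step: layer-norm before attention.
        m = self.layer_norm_m(m)
        out, _ = self.mha(m, m, m)  # query, key, and value are all m
        return out

x = torch.randn(1, 5, 8)          # (batch, sequence, c_m)
y = TinyMSAAttention()(x)
print(y.shape)  # torch.Size([1, 5, 8])
```

Skipping the normalization, as the parent commit apparently did, feeds unnormalized activations into attention and can destabilize training, which is why restoring the single line matters.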