Unverified Commit f7e80721 authored by tznurmin, committed by GitHub

Fixed the default number of attention heads in Reformer Configuration (#6973)

parent e20d8895
@@ -160,7 +160,7 @@ class ReformerConfig(PretrainedConfig):
         lsh_num_chunks_before=1,
         lsh_num_chunks_after=0,
         max_position_embeddings=4096,
-        num_attention_heads=2,
+        num_attention_heads=12,
         num_buckets=None,
         num_hashes=1,
         pad_token_id=0,
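
For context, the changed value is a class default, so it takes effect whenever the config is built without an explicit num_attention_heads. A minimal sketch of that behavior, assuming a transformers version that includes this fix:

from transformers import ReformerConfig

# With no arguments, the config picks up the class defaults,
# including the num_attention_heads default changed by this commit.
config = ReformerConfig()
print(config.num_attention_heads)  # 12 after this commit (was 2 before)

# The default can still be overridden per instance.
small = ReformerConfig(num_attention_heads=2)
print(small.num_attention_heads)  # 2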