"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "d7b3bf547c9d488192cb4a7ac394907a23dc8bae"
Unverified commit e33929ef, authored by Erick Rocha Fonseca and committed by GitHub

Fix in Reformer Config documentation (#5138)

parent 84be482f
@@ -97,7 +97,7 @@ class ReformerConfig(PretrainedConfig):
Number of following neighbouring chunks to attend to in LocalSelfAttention layer in addition to itself.
local_attention_probs_dropout_prob (:obj:`float`, optional, defaults to 0.1):
The dropout ratio for the attention probabilities in LocalSelfAttention.
-lsh_chunk_length (:obj:`int`, optional, defaults to 64):
+lsh_attn_chunk_length (:obj:`int`, optional, defaults to 64):
Length of chunk which attends to itself in LSHSelfAttention. Chunking reduces memory complexity from sequence length x sequence length (self attention) to chunk length x chunk length x sequence length / chunk length (chunked self attention).
lsh_num_chunks_before (:obj:`int`, optional, defaults to 1):
Number of previous neighbouring chunks to attend to in LSHSelfAttention layer in addition to itself.
......
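
For reference, a minimal sketch (assuming a transformers release that includes `ReformerConfig`; the sequence length of 4096 is an arbitrary illustration) of the corrected parameter name in use, together with the memory reduction the docstring describes:

```python
from transformers import ReformerConfig

# The corrected name is `lsh_attn_chunk_length` (the old docstring wrote `lsh_chunk_length`).
config = ReformerConfig(lsh_attn_chunk_length=64, lsh_num_chunks_before=1)

# Chunked LSH self-attention scores scale as
# chunk_length * chunk_length * (sequence_length / chunk_length)
# instead of sequence_length * sequence_length.
seq_len = 4096                            # hypothetical sequence length
chunk_len = config.lsh_attn_chunk_length  # 64
full_attention = seq_len * seq_len                                  # 16,777,216 score entries
chunked_attention = chunk_len * chunk_len * (seq_len // chunk_len)  # 262,144 score entries
print(full_attention, chunked_attention)
```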