chenpangpang / transformers, commit c618ab4f (unverified)
Authored May 18, 2023 by joaoareis; committed by GitHub on May 18, 2023
Fix DecisionTransformerConfig docstring (#23450)
Parent: 5777c3cb
Showing 1 changed file with 2 additions and 2 deletions (+2 / -2).
src/transformers/models/decision_transformer/configuration_decision_transformer.py
@@ -57,9 +57,9 @@ class DecisionTransformerConfig(PretrainedConfig):
         n_positions (`int`, *optional*, defaults to 1024):
             The maximum sequence length that this model might ever be used with. Typically set this to something large
             just in case (e.g., 512 or 1024 or 2048).
-        n_layer (`int`, *optional*, defaults to 12):
+        n_layer (`int`, *optional*, defaults to 3):
             Number of hidden layers in the Transformer encoder.
-        n_head (`int`, *optional*, defaults to 12):
+        n_head (`int`, *optional*, defaults to 1):
             Number of attention heads for each attention layer in the Transformer encoder.
         n_inner (`int`, *optional*):
             Dimensionality of the inner feed-forward layers. If unset, will default to 4 times `n_embd`.
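The change only touches the docstring: it brings the documented defaults in line with what DecisionTransformerConfig actually initializes (n_layer=3, n_head=1, rather than the GPT-2-style 12). A minimal sketch, assuming a transformers release that ships the Decision Transformer model, to check the defaults in code:

# Minimal sketch: confirm the documented defaults against the actual config values.
# Assumes a transformers version that includes DecisionTransformerConfig.
from transformers import DecisionTransformerConfig

config = DecisionTransformerConfig()   # all arguments left at their defaults
print(config.n_layer)      # expected: 3 (hidden layers in the Transformer encoder)
print(config.n_head)       # expected: 1 (attention heads per attention layer)
print(config.n_positions)  # expected: 1024 (maximum sequence length)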