gaoqiong/flash-attention @ 0bf5e50038ee341ece03bfd0c8ff45a6c57aed5a
File: training/configs/model/gpt2model/gpt2-medium.yaml (85 Bytes)
Commit: "Release training code", Tri Dao, Nov 28, 2022
# @package _global_
model:
  config:
    n_embd: 1024
    n_head: 16
    n_layer: 24
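These values match the standard GPT-2 medium architecture (24 transformer layers, hidden size 1024, 16 attention heads). As a minimal sketch of what the config encodes, the snippet below represents the same values as a plain Python dict and derives the per-head dimension; the variable names are illustrative and not part of the flash-attention codebase.

```python
# Values from gpt2-medium.yaml, as a plain dict (names are illustrative).
config = {
    "n_embd": 1024,   # hidden (embedding) size
    "n_head": 16,     # number of attention heads
    "n_layer": 24,    # number of transformer blocks
}

# Multi-head attention splits the hidden size evenly across heads,
# so each head attends over n_embd / n_head dimensions.
head_dim = config["n_embd"] // config["n_head"]
print(head_dim)  # 64
```

Note that `n_embd` must be divisible by `n_head` for this split to work, which holds here (1024 / 16 = 64).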