Commit 475a6b99 authored by Steven Steinke

Fix embeddings and seq length

parent 5cb2d1ea
@@ -30,8 +30,8 @@ python -m torch.distributed.launch $DISTRIBUTED_ARGS ./tasks/main.py \
     --num-attention-heads 16 \
     --batch-size 8 \
     --checkpoint-activations \
-    --seq-length 512 \
-    --max-position-embeddings 512 \
+    --seq-length 1024 \
+    --max-position-embeddings 1024 \
     --log-interval 10 \
     --fp16 \
     --no-load-optim \
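The commit bumps --seq-length and --max-position-embeddings together from 512 to 1024. The two must stay in sync because learned position embeddings are a fixed-size lookup table: a sequence longer than the table produces out-of-range position indices. Below is a minimal sketch of that constraint, assuming a hypothetical PositionEmbedding class and an illustrative hidden size; it is not Megatron-LM's actual implementation.

import torch
import torch.nn as nn

class PositionEmbedding(nn.Module):
    """Hypothetical learned position-embedding table with a fixed maximum size."""

    def __init__(self, max_position_embeddings: int, hidden_size: int):
        super().__init__()
        self.max_position_embeddings = max_position_embeddings
        self.embedding = nn.Embedding(max_position_embeddings, hidden_size)

    def forward(self, seq_length: int) -> torch.Tensor:
        # Positions 0..seq_length-1 index into the table; any position beyond
        # max_position_embeddings-1 would be an out-of-range lookup, which is
        # why the launch script keeps the two flags equal.
        assert seq_length <= self.max_position_embeddings, (
            f"seq_length ({seq_length}) exceeds max_position_embeddings "
            f"({self.max_position_embeddings})"
        )
        positions = torch.arange(seq_length)
        return self.embedding(positions)

# hidden_size=1024 is an assumed value for illustration only.
pos_emb = PositionEmbedding(max_position_embeddings=1024, hidden_size=1024)
out = pos_emb(seq_length=1024)  # fine after this commit; 1024 > 512 would have failed before
print(out.shape)                # torch.Size([1024, 1024])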