Commit e005d327 authored by Rayyyyy

modify README: 70B to 8B

parent 5c13d125
@@ -119,7 +119,7 @@ NPROC_PER_NODE=${DCU_NUM} xtuner train ./llama3_8b_instruct_qlora_alpaca_e3_M.py
- Example for the Meta-Llama-3-8B model; for Meta-Llama-3-70B, simply replace the model paths passed to --ckpt_dir and --tokenizer_path.
```bash
torchrun --nproc_per_node 8 example_text_completion.py \
-    --ckpt_dir Meta-Llama-3-70B/original/ \
+    --ckpt_dir Meta-Llama-3-8B/original/ \
--tokenizer_path Meta-Llama-3-8B/original/tokenizer.model \
--max_seq_len 128 --max_batch_size 4
```