Commit edb026d5 authored by wxj's avatar wxj

Update README.md

parent a65d5678
Pipeline #2653 passed with stage
@@ -210,6 +210,21 @@ DATA_PATH="/datasets/oscar-1GB-llama_text_document"
View the training log in `Llama_pretraining.log`
# Fine-tuning
Convert the checkpoint from HF format to PT format:
```shell
python tools/checkpoint/convert.py \
--model-type GPT \
--loader llama_mistral \
--saver megatron \
--target-tensor-parallel-size 1 \
--checkpoint-type hf \
--model-size llama2-7Bf \
--load-dir /models/llama2/Llama-2-7b-hf/ \
--save-dir ./Llama-2-7b-megatron-lm-0108 \
--tokenizer-model /models/llama2/Llama-2-7b-hf
```
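The `--target-tensor-parallel-size` flag sets how many tensor-parallel shards the converted checkpoint is written with (1 here, i.e. unsharded). As a rough sketch of the idea (not Megatron's actual implementation), a column-parallel linear weight is split along its output dimension across TP ranks:

```python
import numpy as np

def shard_column_parallel(weight, tp_size):
    # Split the output dimension (rows) of a linear weight evenly
    # across tensor-parallel ranks; rank i receives shards[i].
    assert weight.shape[0] % tp_size == 0, "output dim must divide evenly"
    return np.split(weight, tp_size, axis=0)

# Toy weight standing in for a transformer linear layer
w = np.arange(24, dtype=np.float32).reshape(8, 3)
shards = shard_column_parallel(w, 2)
print(len(shards), shards[0].shape)  # → 2 (4, 3)
```

With `--target-tensor-parallel-size 1` no such split happens, which keeps the converted checkpoint loadable on a single GPU; a larger value would write one shard per rank.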
# References
- [README_ORIGIN](README_ORIGIN.md)