Commit 223a7500 authored by Rayyyyy

update evaluation infos

parent 53d602e7
### Docker (Method 1)
```bash
docker pull image.sourcefind.cn:5000/dcu/admin/base/pytorch:2.1.0-ubuntu22.04-dtk23.10.1-py310
docker run -it -v /path/your_code_data/:/path/your_code_data/ -v /opt/hyhal/:/opt/hyhal/:ro --shm-size=32G --privileged=true --device=/dev/kfd --device=/dev/dri/ --group-add video --name docker_name imageID bash
cd /your_code_path/llama3_pytorch
pip install -e .
pip install deepspeed-0.12.3+gitfe61783.abi0.dtk2310.torch2.1.0a0-cp310-cp310-manylinux2014_x86_64.whl
pip install bitsandbytes-0.43.0-py3-none-any.whl
pip install -U xtuner # 0.1.18
pip install mmengine==0.10.3
```
```bash
cd /your_code_path/llama3_pytorch
pip install -e .
pip install deepspeed-0.12.3+gitfe61783.abi0.dtk2310.torch2.1.0a0-cp310-cp310-manylinux2014_x86_64.whl
pip install bitsandbytes-0.43.0-py3-none-any.whl
pip install -U xtuner # 0.1.18
pip install mmengine==0.10.3
```
The special deep learning libraries this project requires for DCU GPUs can be downloaded from the [光合](https://developer.hpccube.com/tool/) developer community.
```bash
DTK driver: dtk23.10.1
python: python3.10
torch: 2.1.0
xtuner: 0.1.18
```
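To double-check that an environment actually matches these pins, a minimal sketch (assuming the pip distribution names match the names listed above; `REQUIRED` and `check` are illustrative helpers, not part of the repo):

```python
# Illustrative sketch: compare installed package versions against the pins above.
from importlib import metadata

REQUIRED = {"torch": "2.1.0", "xtuner": "0.1.18", "mmengine": "0.10.3"}

def check(required):
    """Return a dict mapping package -> (required version, installed version or None)."""
    report = {}
    for pkg, want in required.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            have = None  # not installed in this environment
        report[pkg] = (want, have)
    return report

if __name__ == "__main__":
    for pkg, (want, have) in check(REQUIRED).items():
        status = "OK" if have and have.startswith(want) else "MISSING/MISMATCH"
        print(f"{pkg}: required {want}, found {have} [{status}]")
```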
```bash
pip install -e .
pip install deepspeed-0.12.3+gitfe61783.abi0.dtk2310.torch2.1.0a0-cp310-cp310-manylinux2014_x86_64.whl
pip install bitsandbytes-0.43.0-py3-none-any.whl
pip install -U xtuner # 0.1.18
pip install mmengine==0.10.3
```

```bash
torchrun --nproc_per_node 1 example_chat_completion.py \
    --max_seq_len 512 --max_batch_size 6
```
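For intuition, `--max_batch_size` above caps how many prompts the example script processes at once; a toy sketch of that chunking (the `batches` helper is illustrative, not part of the repo):

```python
# Illustrative only: chunk a list of prompts into groups no larger than
# --max_batch_size, mirroring the cap the example script enforces.
def batches(prompts, max_batch_size=6):
    for i in range(0, len(prompts), max_batch_size):
        yield prompts[i:i + max_batch_size]

print([len(b) for b in batches(["hello"] * 14)])  # → [6, 6, 2]
```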
## Verification
1. Install `llama-recipes` and `lm-eval`:
```bash
# Download llama-recipes
git clone https://github.com/meta-llama/llama-recipes.git
cd ./llama-recipes/recipes/evaluation/
# Edit line 15 of eval.py: replace "from lm_eval.utils import make_table" with:
#   from lm_eval.evaluator import make_table
# Edit line 121 of eval.py: change the default value of num_fewshot to 0:
#   default=0
# Return to the home directory
cd ~
# Download lm-eval
git clone http://developer.hpccube.com/codes/chenych/lm-evaluation-harness.git
cd ./lm-evaluation-harness/
pip install -e .
cd ../
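The two hand edits above can also be applied as text replacements, which is less brittle than counting line numbers across versions. A sketch (the `patch_eval` helper is hypothetical, and the `default=None` spelling of the original argparse default is an assumption):

```python
# Illustrative sketch: apply the two eval.py edits as string replacements.
from pathlib import Path

OLD_IMPORT = "from lm_eval.utils import make_table"
NEW_IMPORT = "from lm_eval.evaluator import make_table"

def patch_eval(text: str) -> str:
    """Swap the make_table import and force num_fewshot to default to 0."""
    text = text.replace(OLD_IMPORT, NEW_IMPORT)
    # Assumes the --num_fewshot argument was declared with default=None.
    text = text.replace("default=None", "default=0", 1)
    return text

if __name__ == "__main__":
    path = Path("llama-recipes/recipes/evaluation/eval.py")  # adjust to your checkout
    if path.exists():
        path.write_text(patch_eval(path.read_text()))
```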
2. Set the **pretrained** argument to the path of the model under test, e.g. `/home/Meta-Llama-3-8B-Instruct`. Note that currently only the `hellaswag` dataset is supported for this verification. Run the following:
```bash
cd /path/of/llama-recipes/recipes/evaluation
python eval.py --model hf --model_args pretrained=/home/llama3/Meta-Llama-3-8B-Instruct,dtype="float" --tasks hellaswag --device cuda --batch_size 8
```
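The run prints a results table; if you save the results as JSON, a small sketch for pulling out the hellaswag accuracy (the metric key spelling varies across lm-eval versions, so both common forms are tried; `hellaswag_acc` and the sample dict are illustrative):

```python
# Illustrative sketch: extract the hellaswag accuracy from an lm-eval results dict.
def hellaswag_acc(results: dict) -> float:
    metrics = results["results"]["hellaswag"]
    for key in ("acc,none", "acc"):  # key format differs between lm-eval versions
        if key in metrics:
            return metrics[key]
    raise KeyError("no accuracy metric found")

# Hypothetical sample shaped like lm-eval output; numbers are made up.
sample = {"results": {"hellaswag": {"acc,none": 0.577, "acc_norm,none": 0.751}}}
print(hellaswag_acc(sample))  # → 0.577
```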
## Result
- Meta-Llama-3-8B-Instruct
<div align=center>
FROM image.sourcefind.cn:5000/dcu/admin/base/pytorch:2.1.0-ubuntu22.04-dtk23.10.1-py310