Commit b66a1735 authored by yangql

Fix formatting

parent b9bb0b64
@@ -27,6 +27,7 @@ docker run --shm-size 16g --network=host --name=bert_ort --privileged --device=/
# Activate dtk
source /opt/dtk/env.sh
```
## Dataset
## Inference
### Python inference
@@ -35,7 +36,7 @@ source /opt/dtk/env.sh
```
export PYTHONPATH=/opt/dtk/lib:$PYTHONPATH
```
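As an optional sanity check (not part of the original steps), the sketch below assumes the `onnxruntime` Python package is installed and simply lists the execution providers it can see; on a correctly configured DCU/DTK environment the ROCm provider is expected to appear.
```python
# Optional check (assumption: onnxruntime is installed in this environment).
# Lists the execution providers this onnxruntime build supports; a ROCm-enabled
# build is expected to report 'ROCMExecutionProvider' here.
import onnxruntime as ort

print(ort.get_available_providers())
```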
#### Install dependencies
#### Run the example
```python
# Enter the bert ort project root directory
cd <path_to_bert_ort>
@@ -45,19 +46,11 @@ cd Python/
# Install dependencies
pip install -r requirements.txt
```
#### Run the example
```python
# Run the example
python bert.py
```
The output is:
```
"1":"open-source exascale-class platform for accelerated computing",
"2":"(Tensorflow / PyTorch)",
"3":"scale"
```
In the output, each question id maps to the answer with the highest predicted probability.
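For readers unfamiliar with how that selection works, the sketch below shows one common way of picking the best answer span from a BERT QA model's start/end logits. It is illustrative only: the function and variable names are hypothetical and do not come from bert.py.
```python
import numpy as np

# Hypothetical post-processing sketch: pick the answer span whose combined
# start + end logit score is highest, limited to a maximum span length.
def best_span(start_logits, end_logits, max_answer_len=30):
    start_logits = np.asarray(start_logits)
    end_logits = np.asarray(end_logits)
    best_s, best_e, best_score = 0, 0, float("-inf")
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_s, best_e, best_score = s, e, score
    return best_s, best_e  # token indices of the most probable span

# tokens[best_s:best_e + 1] would then be detokenized into the answer text.
```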
### C++ inference
This example uses the classic BERT model for the question-answering task. The model and tokenizer files can be downloaded from https://pan.baidu.com/s/1yc30IzM4ocOpTpfFuUMR0w (extraction code: 8f1a). Save the bertsquad-10.onnx file and the uncased_L-12_H-768_A-12 tokenizer files under the Resource/ folder. The following describes how to run the C++ code example; a detailed description of the C++ example can be found in Tutorial_Cpp.md in the Doc directory.
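Before building the C++ example, the downloaded model can optionally be sanity-checked from Python. The sketch below is not part of the documented steps: it assumes the `onnxruntime` package is available and that bertsquad-10.onnx was saved under Resource/ as described above.
```python
# Optional sketch (assumptions: onnxruntime installed, model saved in Resource/).
# Loads the downloaded ONNX model and prints its input/output signatures,
# a quick way to confirm the file is intact before building the C++ demo.
import onnxruntime as ort

session = ort.InferenceSession("Resource/bertsquad-10.onnx")
for inp in session.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in session.get_outputs():
    print("output:", out.name, out.shape, out.type)
```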
@@ -76,7 +69,7 @@ source ~/.bashrc
source /opt/dtk/env.sh
```
#### Run the example
```python
```c++
# Enter the bert ort project root directory
cd <path_to_bert_ort>
@@ -86,7 +79,15 @@ cd build/
# Run the example program
./Bert
```
As shown below, enter a question at the prompt in the current terminal to get the predicted answer.
## Result
### Python version
```
"1":"open-source exascale-class platform for accelerated computing",
"2":"(Tensorflow / PyTorch)",
"3":"scale"
```
### C++ version
```
question:What is ROCm?
answer:open-source exascale-class platform for accelerated computing
@@ -95,16 +96,16 @@ answer:tensorflow / pytorch
question:What is ROCm built for?
answer:scale
```
## Application scenarios
### Algorithm category
`Conversational question answering`
### Key application industries
`Retail``Healthcare``Education`
`Retail`, `Healthcare`, `Education`
## Source repository and issue feedback
https://developer.hpccube.com/codes/modelzoo/bert_ort
## References
https://github.com/ROCmSoftwarePlatform/onnxruntime/blob/81120e9e8b377567daa00d55614c902f35b2ae8f/onnxruntime/python/tools/transformers/onnx_model_bert.py