Commit f6d1e016 authored by chenzk

Update sf.md

parent 56eabd55
@@ -4,11 +4,11 @@
## Model Structure
The core of bert_large_squad is the transformer; its structure is shown below:
-![image](https://developer.hpccube.com/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/transformer.png)
+![image](https://developer.sourcefind.cn/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/transformer.png)
## Algorithm Principle
The main parameters of the bert_large_squad model are 24 transformer layers, a hidden size of 1024, and 16 self-attention heads; the principle is briefly illustrated below:
-![image](https://developer.hpccube.com/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/squad.png)
+![image](https://developer.sourcefind.cn/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/squad.png)
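The parameters above (24 layers, hidden size 1024, 16 heads) imply a model of roughly 334M parameters. A minimal sketch of that back-of-the-envelope count, assuming the usual BERT defaults for vocabulary and position sizes (these defaults are assumptions, not stated in this README):

```python
# Hedged sketch: estimating the parameter count of a standard BERT-large
# configuration (24 layers, hidden size 1024, 16 heads). The vocab and
# position-embedding sizes below are the common BERT defaults, assumed here.
L, H, FF = 24, 1024, 4096          # layers, hidden size, FFN inner size (4*H)
VOCAB, MAX_POS, TYPES = 30522, 512, 2

attn = 4 * (H * H + H)             # Q, K, V and output projections (+ biases)
ffn = (H * FF + FF) + (FF * H + H) # two FFN linear layers (+ biases)
norms = 2 * 2 * H                  # two LayerNorms (scale + shift) per layer
per_layer = attn + ffn + norms

embed = (VOCAB + MAX_POS + TYPES) * H + 2 * H  # embeddings + embedding LayerNorm
total = L * per_layer + embed
print(f"{total / 1e6:.0f}M parameters")        # roughly 334M
```

The exact published figure varies slightly depending on whether the pooler and task heads are counted; the transformer stack itself dominates.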
## Dataset
No suitable Chinese dataset is available yet.
@@ -33,7 +33,7 @@ python3 main.py
```
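On a SQuAD-style task, the inference run above ultimately reduces to span extraction: the model produces a start logit and an end logit per token, and the answer is the valid span with the highest combined score. A minimal sketch of that selection logic, with made-up logits and an illustrative function name:

```python
# Hedged sketch of SQuAD-style span extraction over per-token start/end
# logits. The function name, max_len cap, and logits are illustrative,
# not taken from this repository's main.py.
def best_span(start_logits, end_logits, max_len=15):
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        # only consider spans that end at or after the start, within max_len
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

start = [0.1, 2.3, 0.5, 0.2]
end   = [0.0, 0.4, 3.1, 0.3]
print(best_span(start, end))  # -> (1, 2)
```

Real implementations additionally mask out question tokens and special tokens before scoring spans, but the argmax-over-pairs idea is the same.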
## Result
-![image](https://developer.hpccube.com/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/bert_result.png)
+![image](https://developer.sourcefind.cn/codes/modelzoo/bert_large_squad_onnx/-/raw/main/resources/bert_result.png)
### Accuracy
Not available yet.
## Application Scenarios
@@ -42,7 +42,7 @@ python3 main.py
### Key Application Industries
Healthcare, scientific research, finance, education.
## Source Repository and Issue Feedback
-https://developer.hpccube.com/codes/modelzoo/bert_large_squad_onnxruntime
+https://developer.sourcefind.cn/codes/modelzoo/bert_large_squad_onnxruntime
## References
https://github.com/google-research/bert