# Bidirectional Encoder Representations from Transformers (BERT)

## Model Introduction
BERT stands for Bidirectional Encoder Representations from Transformers and is a pre-trained language representation model. Instead of the traditional unidirectional language models, or shallow concatenations of two unidirectional language models, used in earlier pre-training, it adopts the masked language model (MLM) objective, which allows it to produce deep bidirectional language representations.
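
As a toy illustration of the MLM objective (not part of this repository's code), the sketch below hides a few tokens and records them as prediction targets; during pre-training BERT masks roughly 15% of the input tokens in this way:

```python
import random

# Toy masked-language-model input: hide some tokens and keep them as targets.
tokens = ["the", "cat", "sat", "on", "the", "mat"]
masked, targets = list(tokens), {}
for i in random.sample(range(len(tokens)), k=max(1, int(0.15 * len(tokens)))):
    targets[i] = tokens[i]   # the model must predict these positions
    masked[i] = "[MASK]"     # using context on BOTH sides of the blank

print(masked)    # e.g. ['the', 'cat', '[MASK]', 'on', 'the', 'mat']
print(targets)   # e.g. {2: 'sat'}
```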

## Model Architecture
Earlier pre-trained models were constrained by unidirectional language modeling (left-to-right or right-to-left), which limited their representational power: each position could only see context from one direction. BERT instead pre-trains with MLM and builds the whole model from deep bidirectional Transformer components (a unidirectional Transformer is usually called a Transformer decoder, in which each token can only attend to the tokens to its left, while a bidirectional Transformer is called a Transformer encoder, in which each token attends to all tokens). The resulting model therefore produces deep bidirectional representations that fuse left and right context.
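
The difference between the two attention patterns can be made concrete with a small illustrative sketch (not the repository's code): a decoder-style causal mask is lower-triangular, while the encoder mask used by BERT lets every position attend to every other position:

```python
import numpy as np

seq_len = 5
causal_mask = np.tril(np.ones((seq_len, seq_len)))  # Transformer decoder: token i sees positions <= i
full_mask = np.ones((seq_len, seq_len))             # Transformer encoder (BERT): token i sees all positions

print(causal_mask)
print(full_mask)
```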

## Python Inference

This section describes how to run the Python example; a detailed walkthrough of the inference code is provided in Doc/Tutorial_Python.md.

The example uses the classic BERT model for a question-answering task. Download the model and tokenizer files from https://pan.baidu.com/s/1yc30IzM4ocOpTpfFuUMR0w (extraction code: 8f1a), then place the bertsquad-10.onnx file and the uncased_L-12_H-768_A-12 tokenizer files in the Resource/ directory.
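
For orientation, below is a minimal sketch of the MIGraphX Python inference flow, assuming the migraphx Python module shipped in the Docker image pulled below and the input names, shapes, and dtypes of the ONNX model zoo bertsquad-10 model; the repository's bert.py adds the full SQuAD pre- and post-processing described in Doc/Tutorial_Python.md:

```python
# Minimal sketch of the MIGraphX Python inference flow (illustrative only).
import numpy as np
import migraphx

model = migraphx.parse_onnx("Resource/bertsquad-10.onnx")  # load the ONNX graph
model.compile(migraphx.get_target("gpu"))                  # compile for the DCU target

# Inspect the parameter names/shapes the compiled program expects.
print(model.get_parameter_shapes())

# Placeholder inputs; real values come from tokenizing the question/context
# with the uncased_L-12_H-768_A-12 WordPiece vocabulary.
seq_len = 256
outputs = model.run({
    "unique_ids_raw_output___9:0": np.zeros([1], dtype=np.int64),
    "input_ids:0":   np.zeros([1, seq_len], dtype=np.int64),
    "input_mask:0":  np.zeros([1, seq_len], dtype=np.int64),
    "segment_ids:0": np.zeros([1, seq_len], dtype=np.int64),
})
# The outputs include the start/end logits used to locate the answer span.
print([np.array(o).shape for o in outputs])
```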

### Pull the Docker Image

The inference Docker image can be pulled from the SourceFind (光源) registry; the image for BERT inference is:

```bash
docker pull image.sourcefind.cn:5000/dcu/admin/base/custom:ort1.14.0_migraphx3.0.0-dtk22.10.1
```

### Inference Example

1. Set PYTHONPATH as described in the MIGraphX tutorial (《MIGraphX教程》).

2. Install the dependencies:

```bash
# Enter the bert migraphx project root directory
cd <path_to_bert_migraphx>

# Enter the example directory
cd ./Python/

# Install dependencies
pip install -r requirements.txt
```

3. From the Python directory, run the example program with:

```bash
python bert.py
```

The output is as follows:

<img src="./Doc/Images/Bert_05.png" style="zoom:90%;" align=middle>

In the output, each question id is matched with the answer that has the highest predicted probability.
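
How the highest-probability answer is selected from the model's start/end logits can be sketched as follows (a simplified, hypothetical version; the repository's post-processing is more elaborate, handling n-best lists and token-to-text mapping):

```python
import numpy as np

def best_span(start_logits, end_logits, max_answer_len=30):
    """Pick the (start, end) token indices whose combined logit score is highest."""
    best, best_score = (0, 0), -np.inf
    for s in np.argsort(start_logits)[::-1][:20]:      # top-20 start candidates
        for e in np.argsort(end_logits)[::-1][:20]:    # top-20 end candidates
            if e < s or e - s + 1 > max_answer_len:    # skip invalid or overlong spans
                continue
            if start_logits[s] + end_logits[e] > best_score:
                best_score, best = start_logits[s] + end_logits[e], (s, e)
    return best

# tokens[start:end + 1] of the tokenized context then form the answer text.
start, end = best_span(np.random.randn(256), np.random.randn(256))
```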

## C++ Inference

This section describes how to run the C++ example; a detailed walkthrough of the inference code is provided in Doc/Tutorial_Cpp.md.

As in the Python inference section above, pull the inference Docker image from the SourceFind (光源) registry.

### Modify CMakeLists.txt

- On Ubuntu, change the dependency library path in CMakeLists.txt:
  replace "${CMAKE_CURRENT_SOURCE_DIR}/depend/lib64/" with "${CMAKE_CURRENT_SOURCE_DIR}/depend/lib/"

- **MIGraphX 2.3.0 and later require C++17**


### Build the Project

```bash
rbuild build -d depend
```

### Set Environment Variables

Add the dependency libraries to the LD_LIBRARY_PATH environment variable by appending the following to ~/.bashrc:

**CentOS**:

```bash
export LD_LIBRARY_PATH=<path_to_bert_migraphx>/depend/lib64/:$LD_LIBRARY_PATH
```

**Ubuntu**:

```bash
export LD_LIBRARY_PATH=<path_to_bert_migraphx>/depend/lib/:$LD_LIBRARY_PATH
```

Then run:

```bash
source ~/.bashrc
```

### Inference Example

Run the BERT example program with the following commands:

```bash
# Enter the bert migraphx project root directory
cd <path_to_bert_migraphx>

# Enter the build directory
cd ./build/

# Run the example program
./Bert
```

As shown below, the program prompts for a question in the terminal and prints the predicted answer to that question.

<img src="./Doc/Images/Bert_06.png" style="zoom:100%;" align=middle>

## Previous Versions

https://developer.hpccube.com/codes/modelzoo/bert_migraphx

## References

https://github.com/ROCmSoftwarePlatform/AMDMIGraphX/tree/develop/examples/nlp/python_bert_squad