# Bidirectional Encoder Representations from Transformers (BERT)

## Model Introduction
BERT stands for Bidirectional Encoder Representations from Transformers and is a pre-trained language representation model. Unlike earlier approaches that pre-train with a traditional unidirectional language model, or with a shallow concatenation of two unidirectional models, BERT uses a masked language model (MLM) objective, which allows it to learn deep bidirectional language representations.

## Model Structure
Earlier pre-trained models were constrained by unidirectional language modeling (left-to-right or right-to-left), which limited their representational power: each position could only see context from one direction. BERT instead pre-trains with MLM and builds the whole model from deep bidirectional Transformer components (a unidirectional Transformer is usually called a Transformer decoder, where each token only attends to the tokens to its left; a bidirectional Transformer is called a Transformer encoder, where each token attends to all tokens). The resulting representations are therefore deep, bidirectional, and fuse context from both sides.
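
To make the masked-language-model idea concrete, the short sketch below masks one token and lets a pre-trained BERT encoder fill it in using context from both sides. It assumes the Hugging Face `transformers` package and the public `bert-base-uncased` checkpoint, neither of which is part of this project; it is only an illustration of MLM, not part of the example code.

```python
# Minimal MLM illustration (assumes the optional `transformers` package and the
# public bert-base-uncased checkpoint; not part of this repository).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The encoder sees both the left context ("The capital of France,") and the
# right context (", is a large city.") when predicting the masked token.
for candidate in fill_mask("The capital of France, [MASK], is a large city."):
    print(candidate["token_str"], round(candidate["score"], 3))
```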

## Python Inference

This example uses the classic BERT model for a question-answering task. The model and tokenizer files can be downloaded from https://pan.baidu.com/s/1yc30IzM4ocOpTpfFuUMR0w (extraction code: 8f1a); save the bertsquad-10.onnx file and the uncased_L-12_H-768_A-12 tokenizer files in the Resource/ directory. The steps below show how to run the Python example; a detailed description is given in Tutorial_Python.md under the Doc directory.

### Download the Docker Image

Pull the MIGraphX image from SourceFind (光源):

```shell
docker pull image.sourcefind.cn:5000/dcu/admin/base/custom:ort1.14.0_migraphx3.0.0-dtk22.10.1
```

### Set the Python Environment Variable

```shell
export PYTHONPATH=/opt/dtk/lib:$PYTHONPATH
```
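
To check that the setting took effect, you can try importing the MIGraphX Python bindings (the module name `migraphx` follows the upstream MIGraphX examples; if the import fails, re-check the path above):

```python
# Quick sanity check that the MIGraphX Python bindings are visible on PYTHONPATH.
import migraphx

print("MIGraphX Python module loaded from:", migraphx.__file__)
```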

### Install Dependencies

```shell
# Enter the bert_migraphx project root directory
cd <path_to_bert_migraphx>

# Enter the example directory
cd Python/

# Install dependencies
pip install -r requirements.txt
```

### Run the Example

Run the example program from the Python directory with the following command:

```shell
python bert.py
```

The output is:

```
"1":"open-source exascale-class platform for accelerated computing",
"2":"(Tensorflow / PyTorch)",
"3":"scale"
```

In the output, each question id is mapped to the answer with the highest predicted probability.
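
For orientation, the sketch below shows the general shape of such a pipeline using the MIGraphX Python API: parse and compile the ONNX model, run it on tokenized inputs, and report the answer span with the highest start/end scores. The helper `build_features()` and the tensor names are hypothetical placeholders rather than the actual bert.py code; see Tutorial_Python.md for the real implementation.

```python
# Sketch of the inference flow, not the actual bert.py implementation.
# build_features() and the tensor names below are hypothetical placeholders.
import numpy as np
import migraphx

# Parse the ONNX model and compile it for the GPU target.
model = migraphx.parse_onnx("Resource/bertsquad-10.onnx")
model.compile(migraphx.get_target("gpu"))

question = "What is ROCm?"
context = "..."  # the passage the question is asked about

# Hypothetical helper: turns (question, context) into padded WordPiece id,
# attention-mask and segment-id arrays using the uncased_L-12_H-768_A-12 vocabulary.
input_ids, input_mask, segment_ids = build_features(question, context)

outputs = model.run({
    "input_ids:0": input_ids,
    "input_mask:0": input_mask,
    "segment_ids:0": segment_ids,
})

# The model produces per-token start and end logits; the span with the highest
# scores is reported, which is why each question id maps to the most probable answer.
start_logits = np.array(outputs[0]).reshape(-1)
end_logits = np.array(outputs[1]).reshape(-1)
start, end = int(np.argmax(start_logits)), int(np.argmax(end_logits))
answer_ids = input_ids[start:end + 1]  # map these ids back to text with the tokenizer
```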

## C++ Inference

This example uses the classic BERT model for a question-answering task. The model and tokenizer files can be downloaded from https://pan.baidu.com/s/1yc30IzM4ocOpTpfFuUMR0w (extraction code: 8f1a); save the bertsquad-10.onnx file and the uncased_L-12_H-768_A-12 tokenizer files in the Resource/ directory. The steps below show how to run the C++ example; a detailed description is given in Tutorial_Cpp.md under the Doc directory.

### Download the Docker Image

Pull the MIGraphX image from SourceFind (光源):

```shell
docker pull image.sourcefind.cn:5000/dcu/admin/base/custom:ort1.14.0_migraphx3.0.0-dtk22.10.1
```

### Build the Project

```shell
rbuild build -d depend
```

### Set Environment Variables

Add the dependency libraries to the LD_LIBRARY_PATH environment variable by appending the following line to ~/.bashrc:

```shell
export LD_LIBRARY_PATH=<path_to_bert_migraphx>/depend/lib64/:$LD_LIBRARY_PATH
```

Then run:

```shell
source ~/.bashrc
```

### Run the Example

```shell
# Enter the bert_migraphx project root directory
cd <path_to_bert_migraphx>

# Enter the build directory
cd build/

# Run the example program
./Bert
```

As shown below, type a question at the prompt in the current terminal to get the predicted answer.

```
question:What is ROCm?
answer:open-source exascale-class platform for accelerated computing
question:Which frameworks does ROCm support?
answer:tensorflow / pytorch
question:What is ROCm built for?
answer:scale
```

## Source Repository and Issue Reporting

https://developer.hpccube.com/codes/modelzoo/bert_migraphx

## References

https://github.com/ROCmSoftwarePlatform/AMDMIGraphX/tree/develop/examples/nlp/python_bert_squad